The Age of Wonder: Science as a Means to Emancipation

Richard Holmes’ 2009 tome is aptly titled. It’s a wonder, and it takes an age to read it. Right. I wanted to get that out of the way, as the fact of its lengthiness weighs on me as I consider penning a reaction to its substance. It feels really long.

But, as with many efforts, it is worth it. The Age of Wonder is an exhaustive chronicle of the Romantic era of science — indeed, the dawn of the very term. It focuses primarily on a small cluster of main “characters,” beginning with the intrepid Joseph Banks (and his utterly fascinating adventures in Tahiti) and running all the way through the Herschel lineage (William, his sister Caroline, and William’s son John), ending just before Charles Darwin takes his voyage on the Beagle. It is a tale of presumptions shattered, egos inflated and exploded, and orthodoxies forever upended — and not just those of stodgy religionists, but of even the most open-minded of explorers and philosophers. As Humphry Davy, perhaps the most prominent of Holmes’ subjects, said, “The first step towards the attainment of real discovery was the humiliating confession of ignorance.” There is a lot of that documented here.

Perhaps the most prominent theme throughout the book, with all of its detailed (often to a fault) recountings of experiments, arguments, and internal struggles, is that of the development of a professional discipline whose aim is more than the sum of its parts. What would eventually be known as science would become a practice not simply of confirming or denying the veracity of hypotheses, but perhaps the one great force that ushers humanity beyond its terrestrial and provincial understanding of itself. Holmes summarizes the thinking of Samuel Taylor Coleridge on this subject:

… Coleridge was defending the intellectual discipline of science as a force for clarity and good. He then added one of his most inspired perceptions. He thought that science, as a human activity, ‘being necessarily performed with the passion of Hope, it was poetical’. Science, like poetry, was not merely ‘progressive’. It directed a particular kind of moral energy and imaginative longing into the future. It enshrined the implicit belief that mankind could achieve a better, happier world.

This was not so simple, of course. Even the previously noted Davy faced his own crisis of conscience, as the idea of reason as the basis of a moral, and not merely practical, philosophy challenged even the least superstitious of minds. In 1828 Davy wrote,

The art of living happy is, I believe, the art of being agreeably deluded; and faith in all things is superior to Reason, which, after all, is but a dead weight in advanced life, though as the pendulum to the clock in youth.

But “living happy” is not the same as living well, not the same as progress, not the same as advancing overall well-being.

There were those of this time who began to see not merely a happy illusion being stripped away, but a means to the liberation of the species, a reigniting of the Enlightenment’s flame. Holmes offers the words of Percy Shelley, written as the technology of ballooning became the center of international awe and controversy.

Yet it ought not to be altogether condemned. It promises prodigious faculties for locomotion, and will allow us to traverse vast tracts with ease and rapidity, and to explore unknown countries without difficulty. Why are we so ignorant of the interior of Africa? — Why do we not despatch intrepid aeronauts to cross it in every direction, and to survey the whole peninsula in a few weeks? The shadow of the first balloon, which a vertical sun would project precisely underneath it, as it glided over that hitherto unhappy country, would virtually emancipate every slave, and would annihilate slavery forever.

This did not happen literally, of course, but it reminds us that within genuine understanding of all things lies the potential to transcend them.

Side note:

Also rife within The Age of Wonder are examples of the seemingly timeless wars between religion and science, and science’s struggle to be seen as something other than raw atheism. Holmes tells of the profession’s coming to terms, as it were, with its own moniker, and the old demons are ever-present:

There was no general term by which these gentlemen could describe themselves with reference to their pursuits.
‘Philosophers’ was felt to be too wide and lofty a term, and was very properly forbidden them by Mr. Coleridge, both in his capacity as philologer and metaphysician. ‘Savans’ was rather assuming and besides too French; but some ingenious gentleman [in fact Whewell himself] proposed that, by analogy with ‘artist’, they might form ‘scientist’ — and added that there could be no scruple to this term since we already have such words as ‘economist’ and ‘atheist’ — but this was not generally palatable.

The analogy with ‘atheist’ was of course fatal. Adam Sedgwick exploded: ‘Better die of this want [of a term] than bestialize our tongue by such a barbarism.’ But in fact ‘scientist’ came rapidly into general use from this date, and was recognised in the OED by 1840. Sedgwick later reflected more calmly, and made up for his outburst by producing a memorable image. ‘Such a coinage has always taken place at the great epochs of discovery: like the medals that are struck at the beginning of a new reign.’

This argument over a single word — ‘scientists’ — gave a clue to the much larger debate that was steadily surfacing in Britain at this crucial period of transition 1830-34. Lurking beneath the semantics lay the whole question of whether the new generation of professional ‘scientists’ would promote safe religious belief or a dangerous secular materialism.

Same as it ever was.

This Isn’t the William Shirer You’re Looking For: Thoughts on Steve Wick’s “The Long Night”

Readers of this blog may already be aware of my deep affection for the thousand-plus-page tome The Rise and Fall of the Third Reich, journalist William Shirer’s invaluable 1960 history of Hitler and his Germany. It was with great delight, then, that I was made aware of a history of that history, Steve Wick’s The Long Night, telling the story of Shirer’s years covering the tumult in Europe, mostly from the eye of the storm itself, Berlin.

Though I feel it is missing a crucial chapter, it is a stirring tale. As Wick himself notes, it reads as much more of an adventure tale than a formal history or biography. Shirer struggles daily for over a decade with Nazi censorship, separation from his wife and child, a lack of support from his employers back home, his deep disappointment with the German people, and his own hubris and failings.

We learn a great deal about the mindset of the period, as Shirer was a tuned-in, worldly journalist who had come from extremely humble, rural beginnings. Of particular note to some of this blog’s readers is Shirer’s impression of the Scopes “Monkey Trial,” the event in American history that in many ways began the culture wars in which we struggle today:

As Shirer saw it, the drama unfolding in Tennessee in anticipation of the upcoming trial was reason alone to take leave of his country. “I yearned for some place, if only for a few weeks, that was more civilized, where a man could drink a glass of wine or a stein of beer without breaking the law, where you could believe and say what you wanted to about religion or anything else without being put upon, where inanity had not become a way of life, and where a writer or an artist or a philosopher, or merely a dreamer, was considered just as good as, if not better than, the bustling businessman.”

Even then, the willfully ignorant mob was making the rest of civilization feel unwelcome, just as the Tea Party imbeciles do today. Indeed, even Shirer’s struggles with a supposed journalistic need for “balance” over a human being’s honest impression ring true today. And like today, honesty did not always win the day over bland neutrality:

As for Hitler’s speech proposing peace for Europe, Shirer knew it was a lie. He was disgusted with himself for not declaring it so flat out. But he knew he could not, nor could he find a German outside the government to say it, and the frustration ate at him. “The proposal is a pure fraud, and if I had any guts, or American journalism had any, I would have said so in my dispatch tonight,” he wrote. “But I am not supposed to be ‘editorial.’ ”

But as a fan of Shirer’s definitive work, I concluded my reading with a slight sting of disappointment. Wick omits from his tale the writing of Rise and Fall; the process of putting this all-important book together is almost totally absent. Wick himself tells us near the book’s end that to do so would mean a wholly separate volume. “A biographer will someday write the story of the enormous hurdles Shirer had to climb to sell the book,” demurs Wick, and one can’t help but wish that this hypothetical book already existed within the one we were reading.

What a herculean effort it must have been to pen such a book! Ten years of Shirer’s life were poured into it, and its influence will be felt for generations. Surely this story deserves to be told as well as the formative experiences in Europe that led to the book’s genesis. It is not Wick’s fault that this is missing (though having the words “The Rise and Fall of the Third Reich” in the subtitle does lead one on), but its absence is palpable and deflating.

That said, the book as it is holds up, and it is a story that needed to be told. We learn so much about what it means to be a journalist, a pro-democracy American, a liberal, and a vulnerable human being caught in a volatile, insane world.

What’s a Book Review For?

Miranda Celeste Hale (my Bespectacled Blog Twin™) writes thoughtfully and passionately in favor of the continued existence of the book review, a counterargument to a piece from n+1 by Elizabeth Gumport. On the whole, I agree with Miranda’s take. Here’s the meat of her argument:

Although Gumport would deny it, there’s a reason why many of us still read publications such as The New York Times Book Review or The New York Review of Books: we hope to find expert analyses of the merits of literary texts. The simple truth is that some voices are more credible than others. I strongly believe in the democratization of knowledge, but not in the devaluing of earned authority and expertise.

And so do I. But I also thought it worth adding that not all book reviews are created equal — or should I say, not all book reviews serve the same purpose, not for me anyway. Indeed, if anything I feel that the great swath of (let’s call them) upper-middle-brow book reviews (such as those from the publications Miranda mentions) tend to have a similar construction: a general statement about the overall theme, a lengthy synopsis, and a conclusion as to whether the author’s intent is realized. I find this construction to be perfect in some cases, woefully wrong in others.

Here’s where this becomes relevant to Miranda’s argument. We have to decide what a given review’s function is — particularly when we’re talking about what Miranda’s talking about: literature, as opposed to nonfiction. For literature, we’re dealing with something with a story, a plot with characters taking part in events. In this case, it seems to me there needs to be a kind of division of labor: a “you haven’t read the book yet” section that simply explains why the book is or is not worth your time, along with a small hint of the book’s plot; and a “come back after you’ve read it” section that can deliver the kind of deep analysis that Miranda wants. If it’s not divided as such, I feel that the review (wrongly) treats the literature as nonfiction.

With nonfiction, the review that handles the whole shebang makes sense. I want a learned mind to tackle the questions posed by the author of the book in question, and to weigh in on how well the author achieves what they set out to achieve. This is precisely why I enjoy turning to the New York Review of Books. Life is short, one can only read so many books, and the NYRB can be a wonderful digest of material one might not ever return to. And if the review sparks more fervent interest, then it can serve to turn me to the book itself.

And as Miranda states, I want that direction. If I were to go about it as Gumport would have us, I would have such a limited mental store of words as to be tragic. There is a serendipity, much like that explained by Alan Jacobs, that a good book review publication can facilitate, and it would be madness for a lover of books to forego that serendipity altogether.

For my own writing, I probably fall somewhere in between the utility desired by Miranda and Gumport. I write book reviews for this blog, but they are personal responses, flavored with whatever “expertise” or experience I bring (naturally). They are not from a trained literary scholar by any means, but I think I have enough “merit” that my reviews are useful to friends, like-minded thinkers, and the general passer-by. I might be wrong, of course.

And one other side note: I am very sympathetic to Miranda’s eye-rolling over Gumport’s seeming fixation on there being something “orgiastic” about one’s relationship to a book — or, more specifically, an author’s relationship to one’s patron. I find it tedious when something pseudo-scholarly reaches for the over-sexualization card in what often feels like an attention grab.

 I’d write in iambic pentameter for that.

But that said, Gumport is far from off-base with her read on the author-patron relationship of old. Let’s check in with my old “employer,” Mr. Shakespeare, and see what he has to say in his dedication for The Rape of Lucrece, written to one (rather mischievous-looking) Earl of Southampton:

The love I dedicate to your lordship is without end; whereof this pamphlet, without beginning, is but a superfluous moiety. The warrant I have of your honourable disposition, not the worth of my untutored lines, makes it assured of acceptance. What I have done is yours; what I have to do is yours; being part in all I have, devoted yours. Were my worth greater, my duty would show greater; meantime, as it is, it is bound to your lordship, to whom I wish long life, still lengthened with all happiness.

Your lordship’s in all duty,
WILLIAM SHAKESPEARE.

Yowza. Take that, New York Times Book Review!

A Natural Worshiper of Serendipity and Whim: Alan Jacobs’ “Pleasures of Reading”

Alan Jacobs, whom readers of this blog (all ten of you) may know from previous references to his excellent blog TextPatterns, has recently released a wonderful book about reading that I simply can’t recommend highly enough. The Pleasures of Reading in an Age of Distraction is just the sort of pithy, sympathetic tract that our times demand — it encourages bibliographic exploration and celebrates chance literary encounters, while offering sincere understanding for the would-be “well-read” among us who fear missing out on an overly massive menu of “great works.”

Those chance literary encounters are the subject of this passage, which I found so delightful, and even moving, that I thought I’d share it here.

The cultivation of serendipity is an option for anyone, but for people living in conditions of prosperity and security and informational richness it is something vital. To practice “accidental sagacity” is to recognize that I don’t really know where I am going, even if I like to think I do, or think Google does; that if I know what I am looking for, I do not therefore know what I need; that I am not master of my destiny and captain of my fate; that it is probably a very good thing that I am not master of my destiny and captain of my fate. An accidental sagacity may be the form of wisdom I most need, but am least likely to find without eager pursuit. Moreover, serendipity is the near relation of Whim; each stands against the Plan. Plan once appealed to me, but I have grown to be a natural worshiper of Serendipity and Whim; I can try to serve other gods, but my heart is never in it. I truly think I would rather read an indifferent book on a lark than a fine one according to schedule and plan. And why not? After all, once upon a time we chose none of our reading: it all came to us unbidden, unanticipated, unknown, and from the hand of someone who loved us.

As the daddy of a toddler who absolutely loves to be read to, I find this strikes a chord. Jacobs reminds us that just as we trusted our parents to bring the world of words to us when we could not yet even speak sentences, so we can, as adults, allow the myriad chaotic forces around us to drop texts in our path, and accept them as they come, rather than worry over the time not spent on things we feel we are “supposed to” read.

Jacobs, incidentally, also confirms my feelings about the benefits of dedicated e-readers such as the Kindle. Particularly at this time in our technological lives, when so many other gizmos promise to inundate us with all manner of simultaneous stimuli, Jacobs recognizes that this gizmo can help to cleanse the palate and provide an oasis.

… people who know what it is like to be lost in a book, who value that experience, but who have misplaced it … They’re the ones who need help, and want it, and are prepared to receive it. I had become one of those people myself, or was well on my way to it, when I was rescued through the novelty of reading on a Kindle. My hyper-attentive habits were alienating me further and further from the much older and (one would have thought) more firmly established habits of deep attention. I was rapidly becoming a victim of my own mind’s plasticity, until a new technology helped me to remember how to do something that for years had been instinctive, unconscious, natural. I don’t know whether an adult who has never practiced deep attention—who has never seriously read for information or for understanding, or even for delight—can learn how. But I’m confident that anyone who has ever had this facility can recover it: they just have to want that recovery enough to make sacrifices for it, something they will only do if they can vividly recall what that experience was like.

So beyond Jacobs’ excellent prose and insight, perhaps one of the things that recommends this book to me so strongly is validation. I can live with that.

Why We ‘Refudiate’ the Brasolaeliocattleya: Thoughts on “The Lexicographer’s Dilemma”

Jack Lynch’s fascinating book, The Lexicographer’s Dilemma, is full of original insights, refreshing perspective, and delightful trivia about our mother tongue. It spans history and academia to lend understanding to what it means for a word to be considered an “official” part of the English language. The gist, as you might surmise, is that there is no such thing as the official version of the language. Dictionaries and pedants have over the centuries set down guidelines about propriety, some more sternly than others, but on the whole, the language is an ever-evolving, gelatinous swarm of words, idioms, and ideas. Lynch would have it no other way, and has little regard for those prescriptivists who attempt to nail it down.
To give an idea of the book’s overall theme, see Lynch’s take on the word/non-word “ain’t,” which he describes as…

… the most stigmatized word in the language … [which] every five-year-old is taught is not a word. But why not? Just because. It originally entered the language as a contracted form of am not (passing through a phase as an’t before the a sound was lengthened) and first appeared in print in 1778, in Frances Burney’s novel Evelina. We have uncontroversial contractions for is not (isn’t) and are not (aren’t), so what’s wrong with reducing am not to ain’t? The problem is that it was marked as a substandard word in the nineteenth century, people have been repeating the injunction ever since, and no amount of logic can undo it. It’s forbidden simply because it’s been forbidden.

You see where he’s coming from. We can so easily take for granted notions of which words are “real” and which are not (I am more guilty of this kind of parsing than most), forgetting that the real arbiters of these disputes are not thick books of alphabetically arranged terms, nor English textbooks, but actual human beings using those words. We don’t fault Shakespeare, for example, for inventing new words — whether they were based on existing words or made from whole cloth. In fact, the earliest lexicographers used great writers such as Shakespeare as the starting point for what was and was not an English word. But any such effort made before Shakespeare would have missed his substantial contributions.

So what other kinds of words tend to get nudged out of “proper” or “official” English? It can be pretty surprising when one considers what gets to stick around. For example, it makes perfect sense that newer words like “blog” or even the latest sense of the word “tweet” should be given general lexicographical approval, but what about words based entirely on error — not on some creative use of language? I’m thinking, of course, of the recent decision of the folks behind the New Oxford American Dictionary to name Sarah Palin’s “refudiate” their word of the year, a word muddled entirely from her ignorance of its two roots. If a “wrong” word falls into common use by millions, that’s one thing. When a narcissistic anti-intellectual mob-inciter like Palin screws something up, I have trouble understanding why that should be given any credibility, even if it is half-tongue-in-cheek.

Another example of those terms that are real English words in the sense that Anglophones use them, but don’t get dictionary codification because of their arcaneness in the eyes and ears of the general populace: scientific terms. Lynch again:

… if including everything scientific is impossible, so is excluding everything scientific. Everyone recognizes the need to include some scientific words like fruit fly, koala, carbon, and salt. But why should a lexicographer include daffodil and atom but omit brasolaeliocattleya (a kind of orchid) and graviscalar bosons (theoretical subatomic particles)? There’s no difference in the character of the words, only in the familiarity of the things they identify. If some future technological breakthrough makes us all familiar with graviscalar bosons, they’ll eventually show up in the major general dictionaries. Until then, they have to remain in the language’s antechamber.

I’m pulling for graviscalar bosons. I see no reason why the term shouldn’t be in general use, if for no other reason than that it’s a delight to say. Try it.

But, like “ain’t,” some words are beyond the language’s antechamber and instead find themselves in the house’s hidden dungeon. These are of course our “dirty words.” The origin of the concept of utterly-unacceptable words may surprise you, and be especially enlightening to my atheist readership:

The notion that particular words are taboo can probably be traced back to primitive beliefs about sympathetic magic, in which language can be used to injure people at a distance. It’s telling that many of our unseemly words are known as curses, since the conception of offensive language seems to have derived from a belief in the power of a malefactor to place a curse on an enemy.

So not only are “curse words” arbitrary and, on the whole, harmless in and of themselves, but their supposed power derives from notions of the supernatural, as though uttering them could do actual physical or spiritual damage. Makes the case for their enfranchisement even stronger, as no one will be made mysteriously ill or forced to reincarnate as a dung beetle by my typing the word “fuck.”

Of late, there may be no one who better illustrates — through written and verbal usage — the delightfully changeable nature of language than humorist Stephen Fry, who wrote a few years ago in an ever-relevant essay:

Convention exists, of course it does, but convention is no more a register of rightness or wrongness than etiquette is, it’s just another way of saying usage: convention is a privately agreed usage rather than a publicly evolving one. Conventions alter too, like life… . Imagine if we all spoke the same language, fabulous as it is, as Dickens? Imagine if the structure, meaning and usage of language was always the same as when Swift and Pope were alive. Superficially appealing as an idea for about five seconds, but horrifying the more you think about it.

If you are the kind of person who insists on this and that ‘correct use’ I hope I can convince you to abandon your pedantry. Dive into the open flowing waters and leave the stagnant canals be.

But above all let there be pleasure. Let there be textural delight, let there be silken words and flinty words and sodden speeches and soaking speeches and crackling utterance and utterance that quivers and wobbles like rennet. Let there be rapid firecracker phrases and language that oozes like a lake of lava. Words are your birthright.

This being so, we should make better use of this birthright. Embrace the changes, relish the experimentation, the creative truncations, the inventions, and at the same time, educate yourself. Learn the words that are unfamiliar. You can’t do Jackson Pollock-type abstract painting until you learn to reproduce the works of the impressionists. You can’t do improvisational jazz until you have mastered, note for note, the classical works of centuries past. Likewise, don’t presume to change the language until you are sufficiently familiar with it that your creativity means something — be aware of what you and those around you are doing to the language hundreds of millions of us share. And as Fry says, in this, find pleasure.

‘A Better Pencil’: A Good Point That Needs a Better Book

Apart from some interesting bits about the challenges presented by, and the romanticism associated with, various writing tools and implements, Dennis Baron’s A Better Pencil: Readers, Writers, and the Digital Revolution is a very repetitive book with little to say. Essentially, Baron gives laborious, truly unnecessary explanations of some of the most common and basic writing means — from pencils and typewriters to Facebook and IMs — fit only for those to whom these technologies are totally alien (so perhaps it will be of use to folks 500 years from now).

On the positive side, there’s a point made by Baron that, while not needing a book’s length to make, is important and worth remembering: every new means of setting words down has elicited both an exciting expansion of the ability to write and publish and anxiety over the alleged dire consequences for our culture. And every time, we seem to agree that the advance was worth the ensuing mess and uncertainty. But it’s fun to note that, yes, even the pencil once seemed a bridge too far for some folks, and we can keep that in mind when we wonder at the wisdom of things like Tumblr and Twitter and what they might be doing to the art of writing.

Baron also uses the book as a clumsy sledgehammer to attack those he sees as Luddites and tech skeptics. I’m sympathetic to Baron’s position, certainly, but it’s not enough to save the book. Interestingly, Baron may be one among a very rare species: the pro-technology curmudgeon.

But skip this one for now, at least for the next 500 years.

Lest We Forget: Thoughts on “Rise and Fall of the Third Reich”

The edition that I own of William Shirer’s The Rise and Fall of the Third Reich advertises that the book is one that “shocked the conscience of the world.” I saw this mainly as an indication of what the book must have meant to a public that might not have been as familiar with the crimes of the Nazis as we are today, nor as accustomed to frequent and thoughtless analogies. From goofy Mel Brooks Hitler parodies to the Soup Nazi, as a society we seem to have digested this period of human history as just that, a period of history, distant and with little relevance.

I think we may be doing a disservice to ourselves. I don’t mean to say that this terrible period should not be the subject of humor and satire — it must! — but having now completed Shirer’s enormous book, I am beginning to think that we are forgetting too much.

It’s easy to say that, for example, the tea-baggers calling Obama Hitler and comparing the health care bill to the Final Solution are out-of-bounds, an example of overheated rhetoric. But in a way, saying that these kinds of comparisons “go too far” really doesn’t go nearly far enough. And it may upset some of the more bloodthirsty liberals as well to hear that, yes, even doing a Bush or Cheney-to-Hitler comparison is way, way off base.

Let’s not even deal with the Obama/health care comparisons; they make no sense in the least. But the Bush/Cheney comparisons usually stem from the idea that the Bush team was imperialistic, hungry for the resources of other nations, and mainly heartless about who it hurt in its quest for power. Fine. All of that was true of Hitler. But it’s also true of just about every other imperial power in human history. You can’t be imperial unless you build an empire. You can’t build an empire unless you take someone else’s territory. You usually can’t do that without committing — or at least sincerely threatening — unthinkable violence.

But we use Hitler and Nazism as the standard of human evil for good reason. The Holocaust might be the most evil, horrific event of our species’ history even if it had been merely a mass extermination — but it was neither the first nor the last genocide, nor the first or last slaughter of millions, that humans have known. The Holocaust was that plus, if it can be imagined, several additional levels of cruelty: the starvation, the slavery under unimaginable conditions, the insane medical experiments, the sadism of the Nazi captors, and the raw industrialism of the killing — rounding up the populations of already-rotted-out villages and systematically executing whole neighborhoods and families at once, forcing the soon-to-be dead to jump into pits filled with their dead neighbors and relatives before they themselves were murdered.

And when all was lost for the Third Reich, it was not enough to lose the war. Had Hitler had his way in his final days and hours, the entirety of Germany would have crumbled with him, as he ordered every aspect of German life — stores, waterworks, utilities, factories — destroyed so there would be nothing left for the Allies to take. As horrifically as he had treated his enemies, he was about to let the same happen to his own “superior” people for no other reason than pride.

Perhaps it’s not worth trying to figure out whether anyone in human history was “worse.” I’m no historian by any means. There are probably men and systems that were more evil but had less opportunity to do such harm (I don’t put it past the likes of Al Qaeda or the regime in North Korea to behave so madly and cruelly given the means to do so), and those who may have done more damage and caused more suffering but are not remembered in the same way. But to trivialize the terror that was Nazism in our daily parlance, to use its imagery as something applicable to our current politics, is to forget. It is to forget the tens of millions who died because of Hitler and his henchmen, and to forget the deep, unspeakable suffering of all those who found themselves beneath the Nazi boot.

And it is to forget what it is that brought Nazism to the forefront of German life. It is instructive that Hitler never succeeded in some violent takeover of Germany, despite attempts to do so. In the end, Hitler achieved power through “official” channels, bit by bit gaining the approval and acquiescence of the government and institutions, and bit by bit exploiting a frustrated and angry populace by stoking its rage, its fears, and its pride. Shirer himself, in a 1990 edition of his 1960 book, wondered whether a then-newly-reunified Germany might be ripe for another similar episode. Twenty years later, his fears have not come true. Not there, anyway.

But he was right to be watchful. To trivialize the Third Reich today is to lose sight of how it could happen again, not in the Obama-is-Hitler sense, but in the sense of a charismatic person or persons taking advantage of a weakened and frightened public and a spineless government, and doing things in their name that they did not think human beings were capable of. It can happen again, but if we don’t learn the right lessons from history, we’ll miss it. And it will be too late.

All that said, do your brain a favor and take the big chunk of time you’ll need to read Shirer’s book. Learn something, why don’t you.

Bryson’s “At Home”: A Delightful Slog through Human Misery

About halfway through Bill Bryson’s At Home: A Short History of Private Life, one can’t help but come to a couple of stark conclusions. One, that most of humanity’s domestic life, for the vast majority of the time we have had domestic lives, was full of suffering and misery the likes of which we moderns can barely imagine. Two, that the tiny percentage of the species blessed with an overabundance of money and/or status have not been content to simply live well, but have wasted vast economic resources to spoil and aggrandize themselves in ways that would make Ozymandias cringe.

Bryson is a wonderful writer, and his storytelling is as usual conversational while remaining high-minded, as he clearly glories in his research and discoveries while allowing the space for the reader to catch up to him.

But his subject, I suppose, necessitated the retelling of these two central themes I’ve mentioned: the misery of the underclasses (disease, vermin, cold, being overwhelmed by feces, etc.) and the unabated vanity of the rich (who also, it should be noted, were subject to disease and other unpleasantness, but who in Bryson’s telling often faced ruin by their own ignorance or hubris). But if it is necessary, it is also relentless. Story after story, anecdote after anecdote is a tale that either makes one feel deep pity for those crushed under the weight of their poverty or nausea over the largesse of the aristocracy. In between are the triumphs, the brilliant ideas, the advances, but it becomes almost exhausting when one contemplates the mayhem from which the victories emerge.

Here’s a good summation from the book, a quote from Edmond Halley (of comet fame), that I feel gets to the heart of the long crawl of human domesticity — human daily life — over the centuries.

How unjustly we repine at the shortness of our Lives and think our selves wronged if we attain not Old Age; where it appears hereby, that the one half of those that are born are dead in Seventeen years.… [So] instead of murmuring at what we call an untimely Death, we ought with Patience and unconcern to submit to that Dissolution which is the necessary Condition of our perishable Materials.

And in the meantime, invent the telephone and the flush toilet and make it a little easier.

A recommended read; a slog, but a delightful slog.

Armstrong’s “The Case for God”: A Case Not Made

Few religious thinkers have done more to ease the consciences of spiritual liberals, anti-fundamentalist religious moderates, and functional nonbelievers unwilling to stake any affirmatively atheistic ground than Karen Armstrong. For years she has been asserting that her scholarship proves that the “great” monotheisms ought not be associated with the fear, xenophobia, irrational faith in the absurd, violence, or misogyny that they so often encourage, but that they have their “true” foundations in love and tolerance — and anyone who doesn’t think so hasn’t been doing it right. As much as that assertion causes many skeptics to arch their eyebrows, it at least sounds like a good thing to which the faiths could aspire if they were so inclined. Alas.

Her latest book, The Case for God, is not meant to explain the various faiths’ dispositions or ideological foundations, but to convince the reader that the most commonly held notions of God, those of a being that created the universe and “exists,” are false, and that in actuality, God is an unknowable, unfathomable concept for which the very term “existence” is too limited. If you think that sounds like a pretty weak basis for an argument when dealing with such a grand concept’s veracity, you’re right. And despite Armstrong’s impressive breadth of knowledge and her nuanced grasp of various thinkers’ positions throughout the generations, her case never adds up.

Part of the trouble, of course, is that her book’s premise is challenged by her own explanation of what God is. It is nigh impossible for me to understand how someone can build a case for God if the central thesis is that God is an unknowable pseudo-entity-but-not-really, something that mere humans are wholly incapable of defining. Where does that leave your book?

One would hope that an intellect like Armstrong’s would note this paradox. Instead, she throws her lot in with apophatic explanations of God from a chosen slate of theologians, limiting the “correct” discussion of the nature of God to defining what God is not. Thus, The Case for God is not so much a set of proofs for God’s existence as a selection of and elaboration on the apophatic positions of particular religious thinkers, united in their unwillingness to pin God down to the realm of the perceivable. It is not a case, per se, but a series of similarly themed guesses.

There is another paradox in Armstrong’s venture. Repeatedly throughout the book, she laments “rationalist” notions of God (pretty much all of them since the Enlightenment, a period about which she seems to have mixed feelings) that picture a supreme being of some form that “exists” in the way most (okay, just about all) people define the term, and she is quite clear that such a position is flatly incorrect and — what might be the worst sin in her eyes — “idolatrous.” But by the very act of asserting that some notions of God are incorrect, she gives the lie to her position that God is unknowable. If God truly cannot be known, how can she know who is and is not correct in their beliefs?

Armstrong has a particular bone to pick with what we understand today as atheism, most vigorously with the New Atheists, who she says choose the idol-God of the incorrectly religious as the target of their denials. But Armstrong herself, as many have pointed out before me, defines God out of all notions of existence anyway, leaving nothing to believe in to begin with. Speaking for atheists (if I may for the moment), I think it is safe to say that whether we are talking about a vengeful Old Testament Yahweh or a non-definable, quasi-existent infinite ultimateness of the divine logos, both are equally unprovable, devoid of evidence, and not worthy of acceptance. Like many of the New Atheists’ critics, she complains that they are not sufficiently well-versed in theology, and are therefore in no position to weigh in on the question of God’s existence. This is akin to saying that one cannot assert the nonexistence of the Starship Enterprise unless one has studied every episode of every Star Trek series, earned a degree in startrekology, and published scholarly articles on the debate as to whether resistance truly is futile.

To Armstrong, this rationalist line of thinking shuts out alternate means of arriving at “truths,” for Armstrong rests on the also-unprovable notion that truth is a fluid, utterly subjective concept that can be realized by means other than reason. This mindset obviously opens up a formidable can of worms, as any cockamamie “methodology” that someone chooses, and any absurd answers they turn up, suddenly become equally valid. For Armstrong, there is rational truth and religious truth, and religion — something she insists should be viewed as a discipline and practice rather than a belief system — is just as capable of arriving at truth as science is. What truths religion is supposed to reveal is, of course, not terribly well defined, and one is forced to infer that the correct truths are those that Armstrong has revealed to us: God is too out-there for mere existence, thinking otherwise is wrong, and we should only talk about what God is not, though we’re pretty sure it’s all about love and tolerance. Throw in a few dashes of meaningless terms like “the infinite,” “ultimate truth,” “inner essences,” and things that have “no qualities,” and you have some idea of what Armstrong is talking about. Or, more likely, you don’t.

I would be remiss if this review did not also highlight what was particularly troubling about the book: Armstrong’s twisting of the practice of science to imply that at its best it is grounded in some theological inspiration. Armstrong praises those greats of ages past, Newton, Descartes, Kepler, and claims that they practiced a “science rooted in faith” because they were personally inspired by religious feeling to pursue knowledge. Now, it may very well be that all these men were driven to discovery out of religious fervor, but that does not in any way make the method they chose, the scientific method, somehow dependent upon or tainted by superstition. Their motivations may have been faith-based; their science was not, and could not have been. Her assertion otherwise is as meaningless as saying that their work was rooted in money if they were paid for their research. She also offends the sensibilities by looping in modern physicists who, since they often deal in invisible abstracts, are presented as examples of science-as-faith-exercise. I would imagine that most of those physicists would balk at the idea of their work being classified as religious in this way.

And while she praises those scientists who have been open to the supernatural, she castigates Galileo for that same disposition. The difference is that Galileo was punished for his pursuit of actual truth, and that punishment was delivered by the earthly representatives of the ruling faith tradition. This is apparently an uncomfortable and inconvenient bit of history for Armstrong, as it is an instance in which religion actively suppresses and penalizes understanding of reality, and so she is careful to let us know that Galileo shares the blame with Pope Urban VIII for his house arrest and public shaming, because Galileo was “perversely intent on reconciling” his scientific findings with scripture. This kind of faith-inspired science is perverse to Armstrong. Is it because he, unlike Kepler and Newton, was punished for it by the ministers of that very faith?

There is more to be said about Armstrong’s puzzling take on the purpose of religion: something of a set of rituals designed to make one a better person in some undefined way, the vaguest kind of self-improvement. But if every believer on Earth were religious in the Armstrongian sense, there probably wouldn’t be too much of a need for affirmative atheism or a secularist movement. Goodness knows, she has won the congratulations of the fuzzily-spiritual and not-quite-religious all over the mainstream press, particularly from liberals. But her God is as flimsy a hypothesis as any other more “idolatrous” version, whether she would deign to allow her position to be subject to such earthly terms as “hypothesis” or not. The case for God is not only a weak one, but it is, like Armstrong’s own God, for all intents and purposes non-existent.