You are currently browsing the monthly archive for October 2008.

[Chuck Walker has decided to squander some of his precious time today by posting on the 1746 Lima earthquake. Chuck's extraordinary new book, on the same subject, can be found here. Thanks, Chuck, for agreeing to join us.]

On this day in 1746 a massive earthquake walloped Lima, Peru, the center of Spain’s holdings in South America. Tumbling adobe walls, ornate facades, and roofs smothered hundreds of people and the death toll reached the thousands by the next day. About ten percent of this city of 50,000 died in the catastrophe. The earthquake captured the imagination of the world, inspired Lima’s leaders to try to rethink the city, and unified the city’s population–in opposition to these rebuilding plans. With constant aftershocks and horrific discoveries of the dead and wounded, despair as well as thirst and hunger set in quickly. Life was miserable for a long time. Limeños took to the street in countless religious processions, bringing out the relics of “their” saints such as Saint Rose of Lima or San Francisco de Solano. People took refuge in plazas, gardens, and the areas just outside of the walled city.

Things were worse in the port city of Callao, ten miles to the west of Lima. Half an hour after the earthquake, a tsunami crushed the port, killing virtually all of its 7,000 inhabitants. Some survivors had been inland or in Lima while a few made it to the top of the port’s bastions and rode it out. Several washed up alive at beaches to the south, telling their miracle stories to all who would listen. One woman had floated on a painting of her favorite saint. Merchants in Lima who kept houses, shops, and warehouses in Callao claimed that when they arrived the next morning they could not find the site of their property—no landmarks remained. The water not only ravaged people, animals, and structures but also swallowed up the city’s records. For years people petitioned the courts about their identity or property, unable to show their papers in the well-documented Spanish colonial world.

The earthquake/tsunami takes us into areas where historians cannot normally venture—we have descriptions of where people were sleeping and life (and death) in cloistered convents. It also serves as an entryway into the mental world of the era, as people displayed their fears and priorities. Some worried about recovering their property and others about rebuilding social hierarchies; some more subversive-minded members of the lower classes saw it as an opportunity. Everyone suffered, however, with the loss of life and the misery of the following months.

The Viceroy and his inner circle did a remarkable job of stemming panic, assuring water and food supplies, and rebuilding the city. José Manso de Velasco, who would be titled for his efforts the Count of Superunda (“Over the Waves”), quickly gathered his advisors, surveyed the city on horseback, and took measures to rebuild water canals and find food from nearby towns and beached ships in Callao. His social control measures contained an ugly racialized side as authorities and elites fretted about slaves liberating themselves, maroons raiding the city, and free blacks dedicating themselves to crime. Martial law apparently stopped a crime wave although stories circulated about people tearing off jewelry from the dead or dying. People lined up at nearby beaches to collect washed-up goods and many died when ransacking tottering houses—either from the caved-in structures or angry crowds.

The viceroy managed to assure food, water, and relative calm. The Bush administration would envy his success or at least his public relations coup. In part it reflected Manso de Velasco’s experience as a city builder (he had done this in Chile where he was stationed previously) and his relative hands-on approach. But absolutist authorities were good at emergency relief; in some ways, this is what they did. They couldn’t allow the people to go hungry (or to eat cake) and so Manso de Velasco requisitioned workers and flour and banned profiteering. People of all castes and classes lauded him for his willingness to sleep in the central Plaza.

Yet the Viceroy’s plans went far beyond immediate efforts in public safety. He and his advisors sought to change the city in classic eighteenth-century fashion. After deciding not to move it elsewhere (in part because they so fretted over the thought of maroons taking over the remains of the former viceregal capital), they sketched out a city with wider streets, lower buildings, and less “shady” areas. They wanted people, air, and commodities to circulate with greater ease, with all movement ultimately leading towards the Viceregal Plaza in the main square. Think of a frugal version of Versailles. His plan, quite brilliant in theory, failed miserably. It managed to bring almost all Peruvians together, but in opposition. The upper classes rejected limiting their rebuilt houses to one story and tearing down some of their heavy facades, status symbols that proved deadly in earthquakes. They claimed he didn’t have the right and fought him for years. The Church saw the plan to prevent them from rebuilding some churches, convents, and monasteries (there were an astounding 64 churches in Lima at the time) as a dangerous step towards secularization. They stressed their role in aiding the wounded and destitute and reasoned that since the earthquake was a sign of God’s wrath, did it make sense to anger him even more?

The lower classes did not like the plan. Afro-Peruvians resisted the social control campaigns aimed squarely at them and Indians (surprise, surprise) did not flock to the city to volunteer for rebuilding efforts. In fact, blacks and Indians organized a conspiracy that rocked the city less than four years after the earthquake. Authorities arrested the conspiracy’s leaders when a priest passed along information he had learned in confession. They executed six conspirators, displaying their heads for months. The rebels had planned to flood the central Plaza and then kill Spaniards when they fled their homes. One participant proposed that any rebel who killed a Spaniard would assume his political position. While nipped in the bud in Lima, the uprising spread into the nearby Andean region of Huarochirí. After rebels took several towns and imprisoned and even killed local authorities, officials in Lima fretted that they would take the city, link with a messianic movement in the jungle, and even ally with the pernicious English. A Lima battalion, however, captured and executed the leaders. Manso de Velasco proved highly capable in efforts immediately after the earthquake. He was much less successful in using the catastrophe to create a new Lima.

There is an old, perhaps dead, journalism standard that third-world disasters merit little if any attention. Stories on “earthquake in South America kills hundreds” should accordingly receive an inch or two on page 19, at best. Did international savants and writers overlook Lima’s earthquake-tsunami? Not at all. Earthquakes fascinated the erudite in Europe and what became the United States. They debated their causes in this pre-plate tectonics era, contributing to trans-Atlantic polemics about whether the Americas constituted a younger, more humid, and inferior continent. The Comte de Buffon and Thomas Jefferson feuded on this question. Accounts of earthquakes also interested some Englishmen with imperial yearnings, who saw them as an opportunity to take over from the “weak” Spanish. This interest increased greatly in 1755 when the Lisbon earthquake took place and became an obligatory topic for virtually all European intellectuals. Lima plays a role alongside Lisbon in Voltaire’s Candide. One account of the Lima earthquake was published in Spanish, Portuguese, Italian, Dutch, and English, including a lovely edition by Benjamin Franklin (Philadelphia, 1749). The 1746 earthquake was by no means an overlooked and distant “third world” disaster.

I was in Lima for the destructive August 15, 2007 earthquake and confirmed that the Peruvian state’s response to earthquakes has changed over the centuries, not necessarily for the better. President Alan García’s efforts were more Bush than Manso de Velasco. While making impassioned speeches and taking the obligatory tour of the ruins in Pisco, the epicenter south of Lima, his government was unable to guarantee food, water, and shelter. And while international organizations and governments promised relief, the people of Pisco still live in makeshift tents and most of the city remains in shambles. The May 2008 China earthquake pushed Peru out of the press. Perhaps Peru needs a viceroy or at least a more committed and efficient leader; the world could always use a Voltaire and a Benjamin Franklin.

Via the super-cool litbrit at cogitamus.

On this date sixty years ago, a poisonous fog descended on the small town of Donora, Pennsylvania. An industrial community 28 miles south of Pittsburgh, Donora depended on the American Steel and Wire Plant (a two-factory complex owned by US Steel) and the Donora Zinc Works. Although the three plants provided the livelihood for thousands of workers, air pollution had been a problem in the region since before World War I, as farmers reported periodic livestock deaths and crop damage. Several lawsuits were settled out of court during these years; a routine air sampling program, however, was halted in 1935.

On 26 October 1948, effluents from the town’s factories — including sulfur dioxide, fluoride, carbon monoxide, and dusts from assorted heavy metals — were trapped by an air temperature inversion that swaddled Donora’s 13,000 residents in a deadly haze for five days. During an inversion, the air at ground level becomes cooler than the air above it, halting the convection currents that would ordinarily lift poisonous industrial gases into the atmosphere. As the temperature inversion took place, observers reported that smoke from Donora’s three factories rolled out from the stacks and settled across the town’s rooftops like a thick, sweet-smelling blanket.

Russell Davis, driver for the Donora Fire Department, described the scene as he responded to emergency calls while townspeople began to cough up blood and lose consciousness:

There never was such a fog. You couldn’t see your hand in front of your face, day or night. Hell, even inside the station the air was blue. I drove on the left side of the street with my head out the window, steering by scraping the curb. We’ve had bad fogs here before . . . Well, by God, this fog was so bad you couldn’t even get a car to idle. I’d take my foot off the accelerator and – bango! – the engine would stall. There just wasn’t any oxygen in the air. I don’t know how I kept breathing. I don’t know how anybody did. I found people laying in bed and laying on the floor. Some of them were laying there and they didn’t give a damn whether they died or not. I found some down in the basement with the furnace draft open and their head stuck inside, trying to get air that way. What I did when I got to a place was throw a sheet or a blanket over the patient and stick a cylinder of oxygen underneath and crack the valves for fifteen minutes or so. By God, that rallied them. I didn’t take any myself. What I did every time I came back to the station was have a little shot of whiskey. That seemed to help. It eased my throat. There was one funny thing about the whole thing. Nobody seemed to realize what was going on. Everybody seemed to think he was the only sick man in town.

By mid-day on October 27, eleven people had died and the Board of Health advised residents with chronic respiratory or cardiac problems to evacuate Donora. Within three days, the death count stood at eighteen; when the air inversion lifted and rain dispersed the remnants of the fog, as many as fifty additional townspeople died of lung and heart ailments. The health of hundreds more was permanently undermined by the lingering effects of the Donora fog.

Formal investigations by the United States Public Health Service were inconclusive, blaming the weather rather than the chemical effluents or the companies themselves. The PHS results were not surprising. Oscar Ewing, head of the Federal Security Agency — where PHS was housed at the time — was formerly a top lawyer for Alcoa, which was fending off multiple lawsuits throughout the United States as a result of wartime air pollution. Although the medical symptoms in Donora were consistent with fluoride poisoning, the final report refused to single out any particular chemical for blame for the deadliest air pollution disaster in United States history.

Unfortunately for researchers, the PHS records related to the Donora Fog have been permanently misplaced or destroyed; the investigative records of US Steel, which evidently still exist, are closed to public scrutiny.

Also, the Fonz was wr— … I mean, he was wr— …

Below the fold you’ll find a copy of the paper I presented today. As I’ve said before, when I write a talk, I write a talk. I don’t write an essay that just so happens to be read aloud. I revise based on what I hear when I read aloud, so as to avoid speaking sentences that can’t be parsed on the fly like, say, this one:

Tina told Mark that John thought Pauline knew what Sam had planned for Justine, but Pauline insisted she had no idea John believed that, nor whether the look Justine exchanged with Mark at work yesterday meant that Tina had inadvertently revealed Sam’s trap before John and his brother Adam could spring it.

My shorthand’s pretty straightforward: ALL CAPS means emphasis, en dash a short pause, em dash a longer pause, &c.  Some of the sentences are, yes, ungrammatical when written down—but when read aloud, they make more sense.  (There are complicated linguistic reasons for this vis-à-vis the relation of written language to spoken, and one day I might get into them, but that day ain’t today.)  That said, my talk:


This, from McCain in 2000, is virtually identical to Obama’s answer today. Isn’t it?

On this day in October 1923, a committee in Stockholm met to consider giving the Nobel Prize in medicine to Frederick Banting, one of the discoverers of insulin.  The Nobel committee’s decision the next day to honor both Banting and another researcher so infuriated Banting that he considered giving the prize back.  The fight between the prickly Banting and University of Toronto Physiology Professor John Macleod over the Nobel Prize was just one of the many strange and remarkable aspects of the discovery of insulin, a scientific breakthrough that saved the lives of millions of diabetics all over the world and helped usher in the modern era of medical research.

Banting, who would become the first Canadian to win the prize, and Macleod, a Scotsman, were jointly honored for the discovery of insulin the previous year. A struggling surgeon with no research experience, no doctorate, and little knowledge of the scientific literature in the field, Banting had approached Macleod, an eminent researcher, in the fall of 1921 with a seemingly quixotic idea for treating diabetes.  In those days, diabetes was often a fatal illness, especially for children who suffered from the more severe form of the disease (now called Type 1).  Type 1 diabetes usually begins when the sufferer’s own immune system, for  reasons still unknown, destroys the insulin-producing cells of the pancreas, which makes it impossible for diabetics to get the energy from their food.

Banting proposed an innovative research project: to take out dogs’ pancreases to make them diabetic, prepare a serum from the removed organs, and then inject the serum back into the dogs.  If it worked, then he would try the elixir of mashed animal pancreas in humans.  Macleod, who initially was appalled by Banting’s ignorance of previous scientific work on diabetes, nevertheless was intrigued enough to give Banting a filthy, unused laboratory, a small budget to buy dogs, and a medical student to assist him.  Though Banting proved to be a careless researcher, he did succeed in producing an elixir — a “mysterious something” — that dramatically reduced the blood sugar in diabetic dogs.

Once the experiments worked, Professor Macleod took an active interest and began to refine and promote Dr. Banting’s work (and, in Banting’s view, steal credit from the real discoverer of insulin).  A biochemist at the University of Toronto detoxified Banting’s serum, and the researchers tested it in humans beginning in 1922.  The results were dramatic.  Before the discovery of insulin, many diabetics starved to death. Once they received Banting’s serum, children sat up and began eating, walking, laughing, and living.  It was a miracle; it was, the newspapers said, the greatest advance in medicine in decades.  The before and after pictures can make you cry:

The discovery helped to elevate the reputation of medicine in the western world.  Before about 1910, if you were sick, you were better off avoiding doctors than consulting one.  By the 1920s, western medicine was beginning to help more people than it hurt; and insulin was one of the most dramatic examples of how modern scientists could become like gods.  In 1921, thousands of diabetic children in the US and Canada were slowly wasting away; in 1923, thanks to Banting’s potion, they rose from their deathbeds to lead normal, active lives. The experiments also invigorated the anti-vivisectionist movement, as many dogs died in the course of the quest. (The mass producers of insulin shifted to using pigs, whose pancreases were more readily available and whose sacrifice did not excite the same sort of public hostility.)

Yet though the discovery of insulin marked a watershed in the history of medicine, it still took place in a scientific world that seems very distant from us today.  Unlike the scientific discoveries of later years — the atomic bomb, say, or the structure of DNA, or the polio vaccine — insulin was not the product of big science, with well-funded teams of scientists in gleaming laboratories around the world racing each other to a world-changing goal.  It was one doctor and his research assistant, mashing up dog pancreases and straining them through a filter.

One might imagine that Banting would be thrilled to receive the Nobel Prize, the ultimate validation of his work.  But one would be wrong. He may not have had a Ph.D., but he did have the temperament of an academic: he reacted with complete, unalloyed fury that he had to share the prize.  In his words:

I rushed out and drove as fast as possible to the laboratory.  I was going to tell Macleod what I thought of him.  When I arrived at the building Fitzgerald [a colleague] was on the steps.  He came to meet me and knowing I was furious he took me by the arm.  I told him that I would not accept the Prize….  I defied Fitzgerald to name one idea in the whole research from beginning to end that had originated in Macleod’s brain – or to name one experiment that he had done with his own hands.*

A witness said Banting was so furious “he could have torn the whole building down.”

Eventually, Banting relented and agreed to accept the prize.  There was his country to consider – “what would the people of Canada think if the first Canadian to receive this honour were to turn it down?”  And there was the reputation of science.

Macleod seems to have regarded Banting as something of a dilettante who happened to get lucky once, and Banting’s future career did nothing to change his mind. Canada’s most honored scientist enjoyed munificent funding from a grateful government, but never produced another significant discovery before he died in a plane crash in 1941.  For the children and adults whose lives were saved by Banting’s miracle elixir, though, that one discovery was more than enough.

*For more, see Michael Bliss, The Discovery of Insulin.

Yesterday afternoon I talked to a very kind and pleasant reporter and camera crew from CBS News for three hours about the Great Depression and New Deal, both as history and as compared to the present crisis, as far as one can. This was for what they called a “turnaround” documentary—i.e., one commissioned quickly for quick production—to appear on the History Channel.

About halfway through I began thinking, boy, I thought live interviews were bad: but maybe tape is worse! In a live interview you might say something dumb because it’s live; in a taped interview—especially one that goes three hours and in which they ask you to recall myriad facts and figures off the top of your head, without telling you which they might be in advance—you’re guaranteed to say something dumb at some point. Probably several somethings.

Here’s the who and the where and the when.  A wonderful time will be had by all.  As Ari will tell you, my slagging of Jonah Goldberg is unparalleled. It isn’t like a Sadly, No! slagging.  It is a very serious, thoughtful slagging that has never been made in such detail or with such care.   (Largely because I quote him at length, but still.)

*Title edited because I can’t read posters.

If you watched the video on Understanding the Financial Crisis, you know I got asked a question something like, “When did the RFC retire its bank stock?” And I said, well, they’d got rid of about a quarter of it in 1935-36, but I don’t know how long it took to get rid of all of it.

I couldn’t find the answer in any obvious place, so I spent a couple hours this morning pulling it out of the Federal Reserve Bulletin 1932 onwards, and a couple of later audit reports tendered to Congress. I include it here for your interest. (Of course once I post it, I’m confident someone will point out that this is readily available in such-and-such standard reference work, but hey, such is the wages of research.) Most of the figures are as of Oct 31 of the year; the last two figures are as of June 30 and include a bit more than just the preferred stock—also notes and debentures.

Here’s what an audit of the RFC said on the subject in 1950:

The balance of investments in bank stock, notes, and debentures of $157,655,807 [at 1947] represents the unliquidated portion of over $1,125,000,000 of such investments made mainly during 1933 and 1934. At the time the Corporation made investments in bank and trust companies it was anticipated they would be retired over a period of 20 years. On this basis approximately 70 percent would have been retired at June 30, 1947; actually 86 percent had been retired at that date.1
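The audit's percentages check out against its own dollar figures. As a quick sketch (using only the numbers quoted in the report above, and assuming the "20 years" clock starts around the end of 1933, i.e. roughly 13.5 years before June 30, 1947):

```python
# Arithmetic check of the 1950 RFC audit figures quoted above.
# Dollar amounts come from the audit report; everything else is illustration.
total_invested = 1_125_000_000   # bank stock, notes, and debentures, mainly 1933-34
balance_1947 = 157_655_807       # unliquidated portion as of June 30, 1947

# Share actually retired by mid-1947.
retired_share = 1 - balance_1947 / total_invested
print(f"actually retired: {retired_share:.0%}")   # matches the report's 86 percent

# The report's expectation: straight-line retirement over 20 years,
# with roughly 13.5 years elapsed by June 30, 1947.
expected_share = 13.5 / 20
print(f"expected on 20-year schedule: {expected_share:.0%}")   # roughly the 70 percent anticipated
```

So retirement ran well ahead of the anticipated 20-year schedule, which is the audit's point.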

The RFC itself went into liquidation under an act of 1953.

1. Serial Set Vol. No. 11426, Session Vol. No. 22, 81st Congress, 2nd Session, H.Doc. 468, Report on audit of Reconstruction Finance Corporation and subsidiaries, p. 52.

On this day in 1941, the Senate approved a $6 billion supplemental Lend-Lease bill, thus bringing the United States closer to joining the war that was consuming the rest of the world.

The original Lend-Lease Act, passed in March 1941, gave President Franklin Roosevelt the power to lend, lease, or otherwise dispose of food, ammunition, and arms to any country he deemed essential to the defense of the United States.  By the fall of 1941, those nations included the Soviet Union and China as well as Great Britain.  By the end of the war, the US would give more than $49 billion to more than 40 nations under Lend-Lease.

Many members of Congress had predicted dire consequences if Lend-Lease became law; indeed, the debate shows elements of what Richard Hofstadter famously called the paranoid style. Despite Roosevelt’s insistence that the law would help the country avoid war, the anti-interventionists knew that Lend-Lease signaled a turning point in U.S. foreign policy, and they put up a tremendous fight against it. They repeatedly invoked the “lessons of history” taught by World War I revisionists. Senator Burton Wheeler, the leader of congressional forces against Lend-Lease, used arguments similar to those George Norris had made in 1917. The “interests” were once again foisting “one war measure after another on you, a peace-loving and unsuspecting people,” he told Congress. The people should respond by refusing to play the game of the Morgans and the Rockefellers. “Remember,” Wheeler told his supporters, “the interventionists control the money bags, but you control the votes.”

The anti-interventionists also stressed the dangers of a leviathan government in wartime, particularly the dangers of an imperial presidency. The peril to the republic, national hero Charles Lindbergh testified to a congressional committee, “lies not in an invasion from abroad. I believe it lies here at home in our own midst.” Senator Gerald Nye decried Congress’s willingness to surrender its constitutional purview to a “power-hungry executive” and reduce itself “to the impotence of another Reichstag.” If Congress was another Reichstag, then Roosevelt, by extension, must be another Hitler. Leaders of the anti-interventionist organization America First maintained that the New Deal’s centralizing bureaucrats wanted, as Senator Wheeler said, to “establish fascism in the United States.” In his opponents’ eyes, the very act of opposing Hitler transformed Roosevelt into an American Hitler. (For more on views of Roosevelt as a dictator, see Ben Alpers’ marvelous book.)

When they insisted that neither side in the war had a righteous cause, the anti-interventionists downplayed Hitler’s brutal and increasingly genocidal policies against the Jews. Indeed, anti-Semitism was the elephant in the room that the more “responsible” anti-interventionists tried to ignore. Some, like journalist John T. Flynn, tried to keep the most vehement anti-Semites out of America First. They also tried to persuade prominent Jews to join the organization. But Lindbergh laid bare the anti-Semitic core of anti-interventionism when he gave a speech in Des Moines in September 1941 that identified the three forces leading the country to war: the Roosevelt administration, the British, and the Jews. Lindbergh singled out the Jews for special criticism: “Their greatest danger to this country lies in their large ownership and influence in our motion pictures, our press, our radio and our Government.”

Most newspapers and public officials condemned Lindbergh’s speech—Wendell Willkie, the 1940 Republican nominee for president, called it “the most un-American talk made in my time by any person of national reputation”—and Flynn and some America First leaders were distressed by it. But many anti-interventionists believed that Lindbergh had simply told the “truth,” that, as the lawyer Amos Pinchot explained, “as a group, the Jews of America are for intervention.” These anti-interventionists shared Lindbergh’s conviction that Americans would never willingly join a war against Germany; instead, they were being forced into it by selfish Brits, a lying executive, and Jewish warmongers. Though they insisted that these beliefs were not anti-Semitic, they ignored the long history of American anti-Semitism that lay behind Lindbergh’s accusation.

The anti-interventionists refused to see the differences between the First World War and the Second, between the British and the Nazis. They did, however, understand that the U.S. government was changing in immense—and, they believed, frightening—ways. Senator Robert Taft, the dean of anti-interventionist conservatives, argued that support for Britain would be the first step down a slippery slope to a national security state. “If we admit at all that we should take an active interest,” he said back in 1939, “we will be involved in perpetual war.” The United States would become more like European countries, with a powerful, centralized government launching wars around the globe. The increase in the coercive power of the government—to draft men, to commandeer resources, to suppress dissent—would imperil Americans’ historic independence and autonomy. It would, as Wheeler said, “slit the throat of the last Democracy still living.”

On this day in 1962, President John F. Kennedy addressed the American people and announced that the Soviet Union had, for some time, been constructing missile installations in Cuba. The United States, Kennedy said, would respond by imposing a blockade on the island nation. Reading this Times article earlier today, I found myself struck by two not very profound thoughts.

First, good journalism — the above piece was written by the incomparable Anthony Lewis — is a powerful thing. The article is filled with sharp, declarative sentences, conjuring a mood of the deepest anxiety. See, for example, these two paragraphs:

In a speech of extraordinary gravity, he [Kennedy] told the American people that the Soviet Union, contrary to promises, was building offensive missiles and bomber bases in Cuba. He said the bases could handle missiles carrying nuclear warheads up to 2,000 miles.

Thus a critical moment in the cold war was at hand tonight. The President had decided on a direct confrontation with–and challenge to–the power of the Soviet Union.

Second, I began wondering if October 22, 1962 was the most terrifying day in American history. Lewis relayed that Kennedy, in his radio address, had said: “the launching of a nuclear missile from Cuba against any nation in the Western Hemisphere would be regarded as an attack by the Soviet Union against the United States.” “It would be met,” the President had warned, “by retaliation against the Soviet Union.” Kennedy had then “called on Premier Khrushchev to withdraw the missiles from Cuba and so ‘move the world back from the abyss of destruction.'”

The world remained perched at the brink of that abyss for another week (for additional source material go here, here, and here). Which, I suppose, prompts a third not very profound thought. The nation is now mired in two failed wars, faces an economic crisis that may rival the Great Depression, and is about to stage an election that feels like it could be make or break. Still, things could be worse.

Ari and I are teaching this in seminar. I say it is an internet tradition with which we are all familiar; Ari says not. Since our commenters are familiar with all internet traditions, we ask you to tell us which of us is correct.

(At any rate, it is cool, isn’t it?)

Via Shahar, McCain ads in the style of John Woo, Kevin Smith and Wes Anderson:

The Woo (0:00) I could take or leave; the Smith (1:42) reminds me (yet again) how poorly Clerks has aged; but the Anderson (2:21) is brilliant. No more commentary from me—I have a talk to write tweak a few minor points in.

Back in July, Sarah Palin famously wondered what the vice president does all day.  In Dick Cheney’s case, of course, the answer involves ten-inch sewing needles, a ball-gag, and a bucket of still-warm goat hearts; for nearly everyone else, the duties have been considerably less interesting. But since no American vice president has ever been quite as much a maverick as Sarah Palin — who just yesterday expressed a wish to transform the spirit of modern political campaigning — it’s safe to say that the office of VP has been sorely wasted on the likes of Garret Hobart, “Cactus Jack” Garner, George Dallas, and that other guy*, none of whom fully utilized the powers Jesus the Constitution gave did not actually give them.

Here’s my dopey governor, doing her best to make sure that Brandon Garcia fails his next social studies test:

Q: Brandon Garcia wants to know, “What does the Vice President do?”

PALIN: That’s something that Piper would ask me! … [T]hey’re in charge of the U.S. Senate so if they want to they can really get in there with the senators and make a lot of good policy changes that will make life better for Brandon and his family and his classroom.

The last person to think the vice president needed to be mucking about in the Senate — as opposed to, say, harassing intelligence analysts, which is evidently much more productive — was Charles Dawes, Calvin Coolidge’s vice president, who spent his first few months in office caterwauling in a counterproductive, mavericky way about how the Senate needed to stop allowing filibusters. Nobody liked Charles Dawes.** But since Dawes didn’t wink (which is to say, make a bunch of middle-aged conservative men experience a sudden rush of blood to the groin), he couldn’t count on anyone to run interference for his stupid ideas about the vice presidency.

* You know — the guy with the shirt. The shirt? With that thing on it? Look, I can’t remember his fucking name. The point is, he had a fucking shirt, and he was the vice president.

** Which is sad.

On this day in 2008, Jim Beaver—a.k.a. Ellsworth—commented on my post about the language of Deadwood. I know that’s not really historical, but damn it, it’s cool. Now for something completely historical:

On this day in 1929, Ursula K. Le Guin was born to Alfred and Theodora Kroeber—though you wouldn’t know it from this article, which makes no mention of his having fathered one of the 20th century’s most influential science fiction writers. Her Wikipedia entry was adapted from a bad student essay, as is evidenced by how thoroughly the narrative of how-I-came-to-learn-this pervades it.

Her mother’s biography of Alfred Kroeber, Alfred Kroeber: A Personal Configuration, is a good source for Le Guin’s early years and for the biographical elements in her late works, especially her interest in social anthropology.

Bully for you, anonymous person, for evaluating your sources. That said, the aforelinked review ain’t much better. We’re told:

Alfred Kroeber (1876-1960) is amply known through his works—more than five hundred publications of which eight are books—and he was familiar in varying degrees to students, because he taught at [universities].

I’ve edited out the list of illustrious institutions, but really, I wish I’d been an academic in an earlier era, when such sentences could be published. (Not too early, though—say, Post-Trilling-as-Columbia’s-sole-Jew or thereabouts.) But I digress. Alfred Kroeber, known through his 500 publications and by students, was a proponent of “salvage ethnography.” Here’s a picture of him with Ishi, who claimed to be the last of the California Yahi:

Cultural preservation takes pride of place in Le Guin’s work, albeit backwardly, via her frequent evocation of cultural obliteration. In her anti-reform novel extraordinaire The Lathe of Heaven, she skewers the idea that society can be changed for the better. All progress, she argues, entails the destruction of a society whose current form is the by-product of an evolutionary process. It may not be a just society, but it’s not an invented one, and thus is far more stable than the proto-totalitarian imaginings of well-intentioned liberals. As Sean McCann and my adviser, Michael Szalay, argue, the novel

offers an all but direct allegory in which a passive aesthetic sensibility comes to replace an illegitimate effort to transform the world through instrumental means. Le Guin’s George Orr discovers that his dreams change the world; almost nightly he has what he calls “effective dreams” that reshape existence. Upon waking, Orr is the only one who recalls what the world used to be like, the only one who realizes that each night his mind refashions the lives of the planet’s billions. Orr turns to government therapists to find assistance in ending his dreams, but is understood instead to be delusional and irrationally afraid of his unconscious. He is thus committed to the care of one William Haber, a state-employed psychiatrist who quickly discovers that Orr does indeed dream effectively, and who then tries to use Orr’s dreams to rid the world of misery. Orr objects, and Le Guin organizes this novel around the ensuing debate between the two men over whether it’s right to change the world . . . .

[But] this kind of idealism comes at a high price. Every time Haber induces Orr to dream a better world, something in Orr resists; when told to solve the color problem, Orr dreams a world in which all are a dull and listless battleship gray; when told to end all human conflict, Orr invents an alien invasion that threatens earth from the sky. Awake, he tells Haber, “it’s not right to play God with masses of people. . . . just believing you are right and your motives are good isn’t enough.” Le Guin’s sympathies are unambiguously with her dreamer, whose resistance to Haber’s megalomania resembles both the New Left’s resistance to traditional politics and the Women’s Movement resistance to the New Left itself. Haber does eliminate the many ills on which he set his sights: he brags to Orr that they have “Eliminated overpopulation; restored the quality of urban life and the ecological balance of the planet. Eliminated cancer as a major killer. . . . Eliminated the color problem, racial hatred. Eliminated war. . . . Eliminated—no, say in the process of eliminating—poverty, economic inequality, the class war, all over the world.” But Orr refuses to grant the importance of these accomplishments because, regardless of the outcome, he doesn’t “want to change things.” These were views consistent with the widely shared sense that technocratic solutions to social problems were invariably misguided. But, like Mailer and many of her contemporaries, Le Guin does not merely worry about the unintended consequences or heedless arrogance of technocratic power; she counters it to what by contrast appears a more fundamental spiritual and political accomplishment—a therapeutic acceptance of reality itself. “We’re in the world, not against it,” Orr responds, “you have to let it be.”

Their treatment of Le Guin (tackled previously) is particularly compelling given what we all hope is the imminent rise of a new technocracy made of Hope and Change and Yes We Cans. I mean, can you believe that? Academic work that’s immediately relevant coming from an English department? What’s the world coming to? (And why can’t it arrive sooner?)

Via approximately everybody, but worth noting anyway.

It looks like the post below on “Understanding the Financial Crisis” was our 1000th post. Stupid balloon drop never works right.
