
On this day, the first of February, in 1934, the New York Times carried Franklin Roosevelt’s proclamation of a new gold value for the US dollar. Previously it had been worth 25 8/10 grains of gold 9/10 fine; now it would be worth 15 5/21 grains of gold 9/10 fine—or, as it is more commonly said, the dollar had been valued at $20.67 to an ounce of pure gold and now it would be $35 to an ounce of pure gold. But the US was not in 1934, nor would it ever again be, on a gold standard.
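The arithmetic behind those two prices is simple, given the one fact not stated above: a troy ounce contains 480 grains. A dollar of 25.8 grains, 9/10 fine, contains 23.22 grains of pure gold; a dollar of 15 5/21 grains contains exactly 96/7:

\[
\frac{480}{25.8 \times 0.9} \approx \$20.67 \text{ per ounce of pure gold}, \qquad \frac{480}{15\tfrac{5}{21} \times 0.9} = \$35 \text{ per ounce exactly}.
\]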


Hillary Clinton is taking flak today for her summary repetition of the white supremacist Dunning School of historical interpretation, which held that the attempt in the 1860s and 1870s to provide African Americans with their civil rights was a terrible imposition on the white folks of the South.

[Lincoln] was willing to reconcile and forgive. And I don’t know what our country might have been like had he not been murdered, but I bet that it might have been a little less rancorous, a little more forgiving and tolerant, that might possibly have brought people back together more quickly.

But instead, you know, we had Reconstruction, we had the re-instigation of segregation and Jim Crow. We had people in the South feeling totally discouraged and defiant. So, I really do believe he could have very well put us on a different path.… let’s also think about how we do try to summon up those better angels, and to treat each other, even when we disagree, fundamentally disagree, treat each other with more respect, and agree to disagree more civilly, and try to be inspired by, I think, the greatest of our presidents.

I’ll leave critiques of the Dunning School in other hands because I think they’re obvious, sadly, and Clinton should really know better. I’ll even forgo detail on the obvious point that if you’re a modern Democratic presidential aspirant asked who’s the greatest of the US presidents, your answer is Franklin Roosevelt.1 Instead I want to focus on Clinton’s counterfactual: “had [Lincoln] not been murdered”.

In the current Los Angeles Review of Books Quarterly Journal Michael W. Clune1 writes about odd small episodes, “particularly ephemeral perceptual experiences” we have that may alert us to the gap between how things seem and what they are. Riffing on Rei Terada’s Looking Away, he lists mirages, after-images; “clouds taken for mountains … looking at a landscape with one’s eyes half-closed so that it appears underwater.” He notes that we have nothing to say to each other about these experiences even if we share them, but that they remain with us. They remind us that if we pay too much attention to the mechanism by which we draw meaning from appearance we attenuate that meaning. One of Clune’s examples is an imaginary exchange with a sales clerk about money.


A year from today, the US will inaugurate a new president. But inauguration day has not always been thus fixed.

In the early years of the Republic, habit (rather than statute) placed the date of inauguration at March 4—though even that convention was not quite firm. In 1821, with the incumbent President James Monroe about to take the oath of office for his second term, March 4 fell on a Sunday. Monroe asked of Chief Justice John Marshall whether he could take the oath on the following day, rather than sully the Lord’s day with secular business.


I dislike the term “Great Recession” to describe our times, for technical and political reasons alike. Technically, the severe recession ended in June 2009. But, as the NBER says there,

In determining that a trough occurred in June 2009, the committee did not conclude that economic conditions since that month have been favorable or that the economy has returned to operating at normal capacity.

And indeed it still hasn’t, six and a half years after the recession ended. In fact, as Kevin O’Rourke noted, in August of 2015

the inevitable happened: measured in terms of industrial output, our current recovery was overtaken by that of the interwar period. Pretty dismal stuff.

So now, having avoided quite so severe a contraction as the 1930s, we are suffering a less impressive recovery. What do we call this ongoing period?

“Malaise” is taken, and rather ruined, by Carter-related discourse. I’ve lately been suggesting “the great economic unpleasantness” but without, I confess, really expecting it to catch on. Krugman’s old “Lesser Depression” is looking depressingly correct.

Non-specific plot details discussed herein.

Quentin Tarantino’s The Hateful Eight opens with a close shot of Christ’s face as he hangs on a cross; the frame widens to show us it’s a roadside crucifix somewhere in a desolate snowscape. Along that road sweeps a stagecoach bearing a bounty hunter—whose dual nature is revealed in his nickname, “the hangman,” and his surname, which is Ruth, meaning mercy—together with his prisoner, a murderer and thief.

The film so swiftly displaces the lingering shot of the stationary crucifix with the rush of the coach, which is seeking to beat a blizzard to shelter, that perhaps by the end of the film a viewer has forgotten the opening invitation to contemplate the image, and concept, of redemption-through-suffering. But that is where the movie starts, and also where it ends.

The opening shot looks a lot like the beginning of Samuel Fuller’s The Big Red One, which also opens on a long close shot of a crucifix, this one in France during World War I—or rather, in the hours just after the armistice, when Fuller’s protagonist slays one more German before finding out his license to kill has expired.

Fuller’s movie is largely about how arbitrarily, but how completely, the declaration of war permits human slaughter. It mocks the idea of meaning emerging through violence; it mocks meaning altogether, I think. There is only survival and, where permitted, empathy. By invoking it—we know Tarantino admires Fuller, and shares some of his preoccupations—is The Hateful Eight doing the same?

Maybe; I think it may go further. In Fuller’s movie the hate between antagonists switches off instantly once the war ends. In Tarantino’s, the hate persists and intensifies after the war. Denied the outlet of legitimate killing, hate finds other ways to erupt. War gives way to guerrilla fighting, terrorism, outlawry; murder after murder.

For Tarantino, “war” here is specifically the US Civil War. In the film, the artifact embodying the myth of this war’s meaning is revealed literally to be a lie. The Christ crucified of the Civil War—Lincoln—has brought no redemption. Indeed the only “Redemption” in the offing—the end to any attempt at Reconstruction of the defeated South—will repudiate the purpose for which the war was fought.

Tarantino’s Pulp Fiction can only give its main characters what seems like a happy ending by twisting the order of the tale, and ending in (what the film lets us know is) the middle of the story. Apart from a fairly conventional flashback, The Hateful Eight proceeds more or less chronologically to its conclusion. That flashback shows us an evidently idyllic Minnie’s Haberdashery in the hours before the movie’s main action begins—a place where black and white people live in blissful harmony and women are in charge, driving the action.

On an uncareful reading, one might assume this flashback is supposed to tell us that the West really could have been a new and better society (it is, after all, Wyoming territory, the first to enact woman suffrage)—but Tarantino undercuts this illusion too, making Minnie’s hatred of Mexicans into a plot point. There was already a serpent in the garden then.

Straightened out, the film’s narrative begins with this false Eden and ends with a mocking Calvary. One of its most thoughtful monologues comes from Tim Roth’s character, who muses that justice demands that an executioner act without passion. A passion play ends with a crucifixion. This movie ends with another kind of execution, and none are saved.

Peter Hitchens is less well known in the United States than his late brother, but when asked to write for the New York Times, he delivers his Mail columnist goods, full-strength. Regarding Robert Tombs’s The English and Their History:

Even in free countries it is sometimes necessary to alter the past to suit the present. For instance, I recall the day at my English boarding school in the early 1960s when our sober, patriotic old history books were gathered up and carted away to a storeroom. In their place we were handed bright, optimistic replacements, with a good deal less to say about the empire, the Protestant martyrs or what we had been taught without embarrassment to call the Glorious Revolution.… Older English people look back fondly on 1940, when we supposedly stood alone. In fact we were a major industrial and exporting power with a global navy, more or less self-sufficient, nationally cohesive and bolstered by the tribute of a still-great empire. Now all of that is gone. Is it possible that, after a thousand astonishing years, our island story has finally come to a full stop? Will the next great history of our nation and people be written in Chinese?

Now that George MacDonald Fraser has died, the sources for such views of the empire and its history seem fewer each day.

The estimable Heather Cox Richardson sympathizes with George Will in his despair over Bill O’Reilly’s book, Killing Reagan. Will decries “today’s cultural pathology of self-validating vehemence with blustery certitudes substituting for evidence.”


In The New Republic, Jeet Heer says that Donald Trump is not a populist, he’s “the voice of aggrieved privilege—of those who already are doing well but feel threatened by social change from below, whether in the form of Hispanic immigrants or uppity women.” Or the voice of the white American man enraged at the possibility he might lose his ill-gotten privilege. Heer doesn’t use the f-word, but it’s the elephant in the room.

[Image: A relevant elephant]

For the alleged misunderstanding of Trumpism as “populism,” Heer blames the historian Richard Hofstadter, who in the middle 1950s explained he was interested in “that side of Populism” that sounded to Hofstadter a lot like McCarthyism. Hofstadter was right: there was a side of Populism, and not a trivial side, that sounded like McCarthyism—and Trumpism too.

Ronald Reagan in “A Time for Choosing,” the Gipper’s speech for Barry Goldwater in 1964:

Welfare spending [is] 10 times greater than in the dark depths of the Depression. We’re spending 45 billion dollars on welfare. Now do a little arithmetic, and you’ll find that if we divided the 45 billion dollars up equally among those 9 million poor families, we’d be able to give each family 4,600 dollars a year. And this added to their present income should eliminate poverty.

David Brooks in the New York Times, regarding the case of Freddie Gray in 2015:

The problem is not lack of attention, and it’s not mainly lack of money. Since 1980 federal antipoverty spending has exploded. As Robert Samuelson of The Washington Post has pointed out, in 2013 the federal government spent nearly $14,000 per poor person. If you simply took that money and handed it to the poor, a family of four would have a household income roughly twice the poverty rate.

Annie Lowrey points out why Brooks’s argument is numerically bogus: just as conservatives don’t count millions of government employees as employed in the 1930s, Brooks doesn’t count federal money as money in the 2010s:

Brooks is claiming that federal spending on anti-poverty programs is not lifting families out of poverty… when the government specifically does not include the value of those very programs in its poverty calculations.… A fuller accounting shows that food stamps alone lift 4 million people above the poverty line. The earned-income tax credit lifts nearly 6 million above it. Which is to say that “not bringing down the official poverty rate” is not a good yardstick by which to judge these programs.

But I would like to take David Brooks up on his suggestion: with exactly the same degree of sincerity as 1964-era Reagan, he’s supporting a straight-up transfer of wealth from the rich to the poor. It is a radical solution to poverty, this long-standing Republican proposal, but perhaps one that we should consider.
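For the record, the arithmetic in the two quotations runs as follows; the 2013 poverty threshold of roughly $23,550 for a family of four is my addition, from the HHS guidelines, not a figure in either excerpt:

\[
\frac{\$45\text{ billion}}{9\text{ million families}} = \$5{,}000\text{ per family}, \qquad \$14{,}000 \times 4 = \$56{,}000 \approx 2.4 \times \$23{,}550.
\]

(Reagan’s own division, note, comes out somewhat above the $4,600 he named.)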

[Image: cover of Jennifer Teege’s memoir, My Grandfather Would Have Shot Me]

With Abenesia in the news, I thought it might be useful to talk about another Axis nation’s complicated struggle with the memory of the Second World War. Jennifer Teege found out, at the age of 38, that not only was her grandfather a Nazi, he was an especially infamous Nazi: Amon Goeth, the commandant of Płaszów concentration camp, the man played by Ralph Fiennes in Schindler’s List. On trial after the war, Goeth sneered at the witnesses against him, “What? So many Jews? And we were always told there’d be none left.” He gave a Hitler salute on the gallows.

Hence the title of Teege’s memoir: she has an African father, and a Nazi grandfather who would have regarded as subhuman a person of African descent. The book is a great deal subtler than the title suggests. It is saturated in memory, and forgetting, and the fault lines between memory and history run throughout it. Teege describes her attempt to reconcile what she learns about her grandfather with the kind – but, she now knows, complicit – grandmother she remembers. The book presents Teege’s reminiscence and necessarily somewhat therapeutic work alongside the sober, reportorial voice of Nikola Sellmair, whose dry factual rendering of verifiable history often undermines Teege’s hopeful, emotional writing.

There are different kinds of memory in the book. Teege’s adoptive German family had a more usual relationship to the Nazi era – her father didn’t really know the extent to which his family had taken part in Nazi crimes. Sellmair discusses such modern Germans, summarizing Harald Welzer’s study “Grandpa Wasn’t a Nazi.” Latter-day Germans seize on any opportunity to construct a guiltless, even noble past for their forebears – as with the French, they were all in the resistance.

Teege’s brief narrative also encompasses the memory kept by Holocaust survivors and their descendants: before Teege found out about her grandfather, she traveled to Tel Aviv, made friends, and lived there. Her discovery imposes silence between her and her Jewish friends. She doesn’t know what she can say. Her grandfather might have shot their grandparents.

“There is no Nazi gene,” Teege insists, struggling against the idea that she must bear some guilt for her grandfather. But she clearly feels that guilt. We all inhabit the world the bloodthirstiest conquerors made; only some of us grew up with them, personally.

[Image: TLS cover. Now, or recently, at newsagents]

In the TLS for 17 April, you can find my essay on Nicholas Wapshott’s The Sphinx, about the presidential election of 1940, the isolationists, and how Franklin Roosevelt engineered the US shift toward war. The essay starts like this:

Franklin Roosevelt recognized the threat Adolf Hitler posed from the moment of the German Chancellor’s appointment. In January of 1933, Roosevelt—not yet inaugurated, though already elected, President—told an aide that Hitler’s ascent was “a portent of evil”, not just for Europe but “for the United States”. He “would in the end challenge us because his black sorcery appealed to the worst in men; it supported their hates and ridiculed their tolerances; and it could not exist permanently in the same world with a system whose reliance on reason and justice was fundamental”. From then onward, Roosevelt’s policies raced Hitler’s: the New Deal was not merely a programme for recovery from depression, but one to rebuild economic strength while preserving democracy in the United States so the nation would be ready to fight Nazism when the time came.

The New Deal gave Americans not only the material capacity to fight fascism, but faith in American institutions. Which is why, of course, the prevalence of remarks like this one remains so appalling.

Paul Campos writes in the New York Times about what he claims is the “real reason” for higher college tuition in the USA:

far from being caused by funding cuts, the astonishing rise in college tuition correlates closely with a huge increase in public subsidies for higher education.… a major factor driving increasing costs is the constant expansion of university administration

And he singles out the California State University (CSU) system as an example:

while the total number of full-time faculty members in the C.S.U. system grew from 11,614 to 12,019 between 1975 and 2008, the total number of administrators grew from 3,800 to 12,183 — a 221 percent increase
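Taking the quoted figures at face value, the percentages are easy to check:

\[
\frac{12{,}183 - 3{,}800}{3{,}800} \approx 2.21 \quad (\text{a 221 percent increase}), \qquad \frac{12{,}019 - 11{,}614}{11{,}614} \approx 0.035 \quad (\text{about 3.5 percent}).
\]

A 221 percent rise in administrators against a 3.5 percent rise in faculty is Campos’s argument in miniature.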


[Image: The Money Makers, forthcoming in September from Basic Books]

As readers of this blog know, Franklin Roosevelt declared he had taken the US off the gold standard on March 6, 1933, as the first substantial act of his presidency. But scholars have not been so quick to accept this date or, with firmness, any other.

When Roosevelt first said he had taken the US off the gold standard, he didn’t want to make too great a fuss about it because he was trying to quiet a panic that had nearly broken the Federal Reserve System. He hoped Americans would bring their gold back for deposit in the nation’s vaults. And they did. Even though the papers were reporting that the president might issue scrip for temporary currency; even though the Emergency Banking Act provided that Federal Reserve notes could be backed by commercial bank assets, people generally preferred paper money to gold, so long as they trusted the paper money – which, with Roosevelt’s assurances, they did, as you can see from the chart.

[Image: The Money Makers, forthcoming in September from Basic Books]

On this day in 1933, it was the first Friday of Franklin Roosevelt’s administration, and the new president met reporters to talk to them about ending the bank holiday with which he had begun his term. The Federal Reserve Banks would open on Saturday so that member banks of the Federal Reserve System could open on Monday. A reporter asked if the banks would be open “[a]ll along the line, Mr. President; that is, all functions?” Roosevelt replied, “Yes, all functions. Except, of course, as to gold. That is a different thing. I am keeping my finger on gold.”

The president’s week had started with his inauguration on March 4, the previous Saturday, when he had told the American people they had only fear itself to fear, and promised them an “adequate but sound currency.” The next day Roosevelt worked all through the day with members of his team and holdovers from Herbert Hoover’s to draft orders to close the banks and halt all payouts of gold. They worked so hard that Sunday that it was late by the time the order was ready for presidential signature – so late that Federal Reserve counsel Walter Wyatt urged the president to wait a little longer to sign, so that it would be Monday, and not so sacrilegious. Roosevelt did wait, and then, on signing the order said – gleefully, according to one account – “We are now off the gold standard.” That was in the wee hours of March 6; later that morning, Americans began a week of doing business without access to banks.

At Roosevelt’s first press conference, on Wednesday March 8, he told reporters that “what you are coming to now is really a managed currency.… It may expand one week and it may contract another week.” The end of the dollar’s convertibility to gold was not temporary but “part of the permanent system so we don’t run into this thing again” – which was what he repeated when he said, on Friday, that he was keeping his finger on gold.

Roosevelt was strikingly consistent in his monetary policy declarations.1 The US went off the gold standard with the bank holiday; he never meant to go back, and he didn’t. He wanted to establish an international system of managed currencies, with an agreement that would allow them to remain stable for long periods, but adjustable in case of need – that was what he told the World Economic Conference at the end of summer 1933, and that was why it broke up – because other countries weren’t yet ready to join the US. At the end of 1933, Roosevelt talked up the dollar in value, stabilizing it in January 1934, while saying “I reserve the right … to alter this proclamation as the interest of the United States may seem to require.”

Despite some often-quoted barbs at Roosevelt’s method of getting there (complaining that the president’s talking up of the dollar by unpredictable amounts looked “more like a gold standard on the booze than the ideal managed currency of my dreams” or a “game of blind man’s buff with exchange speculators”), John Maynard Keynes approved of the destination, writing in January 1934 that the president’s policy “means real progress.” Roosevelt had “adopted a middle course between old-fashioned orthodoxy and the extreme inflationists.” He had done nothing “which need be disturbing to business confidence,” and the monetary policy was “likely to succeed in putting the United States on the road to recovery.” Roosevelt’s adoption of a value for the dollar to be kept generally stable, if altered at need, also opened the possibility for an international conference on money, to “aim for the future not at rigid gold parities, but at provisional parities from which the parties to the conference would agree not to depart except for substantial reasons arising out of their balance of trade or the exigencies of domestic price policy.”

In other words, before Roosevelt had been in office a full year, he had articulated, with Keynes’s approval, all the elements of what would become the Bretton Woods monetary policy in 1944: currencies would be kept at stable exchange rates, but would be adjustable in keeping with the needs of economic prosperity in each country.


1Historians have a real problem recognizing this, owing I think to the influence of a couple of misleading memoirs by disaffected Roosevelt advisors who didn’t like his monetary policy, and who departed the administration early and therefore got their licks in early. Maybe I’ll write a post about this particular thing.

Possibly the right response to Gordon Wood’s “History in Context” in The Weekly Standard – a “get off my lawn” essay, as one historian says – is parody. After all, Wood does begin the essay by saying his mentor Bernard Bailyn is woefully under-appreciated, and then proceeds to mention that Bailyn has two Pulitzers.1 What else can one do but mock?

Well, one can take Wood earnestly, as is one’s wont, and ask, what happened to the younger Gordon Wood? How would he fare before the stern tribunal of Weekly Standard Wood?

I ask because Wood the elder expresses dissatisfaction with those historians “obsessed with inequality,” who

see themselves as moral critics obligated to denounce the values of the past in order to somehow reform our present. They criticize Bailyn’s work for being too exquisitely attuned “to the temper of an earlier time” and, thus, for failing “to address the dilemmas of its own day.”…

These historians need to read and absorb Bailyn’s essay on “Context in History,” published in this collection for the first time. Perhaps then they would be less eager to judge the past by the values of the present and less keen to use history to solve our present problems. In some sense, of course, they are not really interested in the past as the past at all.

But, as another Bailyn student pointed out to me, Wood was not always so scornful of judging and using the past for present purposes, nor so principled about letting the past be past. Consider this important passage from Wood’s remarkable first book, The Creation of the American Republic, 1776-1787:

Considering the Federalist desire for a high-toned government filled with better sorts of people, there is something decidedly disingenuous about the democratic radicalism of their arguments.… In effect they appropriated and exploited the language that more rightfully belonged to their opponents. The result was the beginning of a hiatus in American politics between ideology and motives that never again closed. By using the most popular and democratic rhetoric available to explain and justify their aristocratic system, the Federalists helped to foreclose the development of an American intellectual tradition in which differing ideas of politics would be intimately and genuinely related to differing social interests.… and thereby contributed to the creation of that encompassing liberal tradition which has mitigated and often obscured the real social antagonisms of American politics.… the Federalists fixed the terms for the future discussion of American politics.

Listen to what Wood the younger is saying here: “disingenuous” surely sounds morally critical, as does “appropriated and exploited.”

Talking about what might “never again” be, and even about “the future” certainly doesn’t sound like thinking about “the past as the past.”

And bringing up “differing social interests” and “real social antagonisms” sounds like it might entail concern about, if not obsession with, inequality.

Perhaps Wood the younger would have to get off Wood the elder’s lawn.

I am actually more interested in what Wood the younger would say to his older self, concerned as he was with arguments that foreclosed discussion of genuine social antagonism. I have never really found persuasive Wood the younger’s argument that 1787 marked some kind of end-of-Eden, after which honest political discourse was never again possible in the United States. Rather, I think the Federalists’ disingenuous behavior must constantly be emulated, and that initial foreclosure constantly reenacted, to keep differing social interests unexpressed.


1A feat rarely matched, and then only by the likes of another giant among colonial historians, Alan Taylor.

I recently re-watched Do the Right Thing and found the ending a little shocking. No, not the violent part – which has, sadly, only become more familiar in the quarter century since 1989 – but the actual last scene.

The morning after the movie’s climax, the camera shifts up and away from the street while in voiceover we hear the storefront DJ, Mister Señor Love Daddy (Samuel L. Jackson). He has served throughout the film as a kind of Greek chorus and now he’s the last voice we hear, after the assault and the murder and the burning of Sal’s, and he says … “Register to vote. The election is coming up.”

Which struck me, in 2015, as awfully anemic. Is that really the conclusion we’re meant to draw, after all that heat, after repeated invocations of Martin Luther King, Jr., and Malcolm X? Register and vote?

I wondered if maybe Jackson’s performance had thrown me off and made me expect more of Love Daddy than I should have. After all, Jackson’s real talent is for the veneer of geniality over the threat of violence (see Jules or, in a different register, Nick Fury) – for conveying hidden weight, in the manner of a lead-filled sap with a polished leather finish.

But those roles came later. Maybe Mister Señor Love Daddy is supposed to be a bit of a buffoon. After all, during the climax of the movie, the camera catches him in his window, and his response to the police turning firehoses on his neighbors is to yell and … change his hat. Maybe we’re supposed to see him as impotent, inept – the kind of guy who would, on reflection, respond to brutality by delivering the Polonian advice, “Register to vote.”

Or maybe Spike Lee meant it seriously. There’s evidence he does, or did. On the twentieth anniversary Blu-ray, you’ll find an interview in which Spike Lee mentions he wrote and filmed Do the Right Thing in the midst of Ed Koch’s administration – but now, he says, everything’s different.

Those were heady days, 2009, to be sure, when maybe elections could fill you with hope and change. But: enough to, in retrospect, justify that flat-footed ending? “Vote”? After a movie that began with Public Enemy urging, “Fight the Power!”, and whose first line of dialogue had Love Daddy himself shouting, “Wake up!”?

I write to record a position on a subject already treated in more specialist fora.

I refer, of course, to a matter routinely, if implicitly, raised by the auditors of curricula, every time they ask for samples of a syllabus: if they request more than one, what do they say they want? Syllabi? Or syllabuses?1

Doubtless this issue vexes few of the hoi polloi, nor troubles many alumni of great universities. Indeed, academics – from the Hebrides to the Antipodes – seem often to use “syllabi.”

A highly scientific anti-prescriptivist study has it that the answer is “syllabi.” I prefer “syllabuses,” though.

If your etymological antennae are twitching, you can find a detailed account of the story of “syllabus” at the specialist links in the first sentence of the post. But the short version is, it’s a made-up word, erroneously thought to be adopted into Latin from the Greek, which it wasn’t. I.e., there isn’t a true proper correct answer, horribile dictu.


1I’ve never actually heard anyone insist it’s syllabūs.

For my new book, I spent long hours trawling through the many, many reels of the microfilmed diaries of Henry Morgenthau, Jr. We didn’t have them at my university, so I had to order a few at a time from Interlibrary Loan, wait, and then seize upon them and go through them before they were due back at their home institution. Working through them at that speed, and on the microfilm reader whose lens & screen combination wasn’t quite right to show a full page, invariably gave me motion sickness.

Then they showed up, digitized, free to download. The joke was on me.

Except, for some reason, the digitized edition seems to begin with Book 1. Which you would think was okay – except the first book is actually Book 00. And that’s the book that covers the beginning of the Roosevelt administration – a critical period during which decisions were made about monetary policy that lasted for the duration of Roosevelt’s terms in office.

A peril, perhaps, of the digital archive.

California’s measles outbreak has now reached more than 70 cases.1

Populations especially at risk are those born after 1957 and vaccinated between 1963 and 1967, or not vaccinated at all. People born before 1957 would have been exposed to measles naturally and are OK; those never exposed to the virus in the wild will be vulnerable. People vaccinated between 1963 and 1967 might have got the “killed virus” vaccine, which the Centers for Disease Control now say is ineffective, and they will be vulnerable.

Unvaccinated people will be vulnerable2 for what ought to be obvious reasons.

California permits unvaccinated students to enroll in public schools if their parents file a form saying their beliefs do not permit vaccination.

In some Sacramento-area schools, more than fifty percent of students are unvaccinated.

As the historian Robert Johnston remarks, scholars used to treat anti-vaccination activists as “the deluded, the misguided, the ignorant, the irrationally fearful” but now they command “if not sympathy, at least a modicum of respect.”

I suppose we should respect those whom we can rationally fear.


1This is the outbreak that the press keep saying, correctly if punctiliously, began at “Disneyland Park and Disneyland California Adventure,” as if there were some important meaningful reason they couldn’t say “Disneyland”; Disneyland is offering a pretty good discount right now, by the way.

2A correspondent, I think correctly, points out in comments that all are potentially vulnerable once we drop below a percentage where we have “herd immunity.”
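The percentage the correspondent has in mind is conventionally estimated from the basic reproduction number; the measles figures below are standard textbook values, not anything from the comments:

\[
p_c = 1 - \frac{1}{R_0}, \qquad R_0 \approx 12\text{–}18 \;\Rightarrow\; p_c \approx 92\text{–}94\text{ percent}.
\]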
