In The New Republic, Jeet Heer says that Donald Trump is not a populist, he’s “the voice of aggrieved privilege—of those who already are doing well but feel threatened by social change from below, whether in the form of Hispanic immigrants or uppity women.” Or the voice of the white American man enraged at the possibility he might lose his ill-gotten privilege. Heer doesn’t use the f-word, but it’s the elephant in the room.
For the alleged misunderstanding of Trumpism as “populism,” Heer blames the historian Richard Hofstadter, who in the middle 1950s explained he was interested in “that side of Populism” that sounded to Hofstadter a lot like McCarthyism. Hofstadter was right: there was a side of Populism, and not a trivial side, that sounded like McCarthyism—and Trumpism too. Read the rest of this entry »
Ronald Reagan in “A Time for Choosing,” the Gipper’s speech for Barry Goldwater in 1964:
Welfare spending [is] 10 times greater than in the dark depths of the Depression. We’re spending 45 billion dollars on welfare. Now do a little arithmetic, and you’ll find that if we divided the 45 billion dollars up equally among those 9 million poor families, we’d be able to give each family 4,600 dollars a year. And this added to their present income should eliminate poverty.
David Brooks in the New York Times, regarding the case of Freddie Gray in 2015:
The problem is not lack of attention, and it’s not mainly lack of money. Since 1980 federal antipoverty spending has exploded. As Robert Samuelson of The Washington Post has pointed out, in 2013 the federal government spent nearly $14,000 per poor person. If you simply took that money and handed it to the poor, a family of four would have a household income roughly twice the poverty rate.
Annie Lowrey points out why Brooks’s argument is numerically bogus: just as conservatives don’t count millions of government employees as employed in the 1930s, Brooks doesn’t count federal money as money in the 2010s:
Brooks is claiming that federal spending on anti-poverty programs is not lifting families out of poverty… when the government specifically does not include the value of those very programs in its poverty calculations.… A fuller accounting shows that food stamps alone lift 4 million people above the poverty line. The earned-income tax credit lifts nearly 6 million above it. Which is to say that “not bringing down the official poverty rate” is not a good yardstick by which to judge these programs.
But I would like to take David Brooks up on his suggestion: with exactly the same degree of sincerity as 1964-era Reagan, he is supporting a straight-up transfer of wealth from the rich to the poor. It is a radical solution to poverty, this long-standing Republican proposal, but perhaps one we should consider.
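Both quoted passages rest on simple division, and the figures are easy to check. A quick sketch, using only the numbers the passages give (the 2013 poverty-line figure for a family of four, roughly $23,550, is my addition from the federal guidelines, not from either quote):

```python
# Back-of-envelope check of the two quoted calculations.

# Reagan, 1964: $45 billion in welfare spending, 9 million poor families.
reagan_per_family = 45_000_000_000 / 9_000_000
print(reagan_per_family)  # 5000.0 -- the speech rounds this down to $4,600

# Brooks/Samuelson, 2013: roughly $14,000 per poor person, family of four.
brooks_family_income = 14_000 * 4
poverty_line_2013 = 23_550  # approx. federal guideline, family of four (my assumption)
print(brooks_family_income / poverty_line_2013)  # about 2.4 -- "roughly twice"
```

Either way the arithmetic works out to the same conclusion: a direct per-family transfer of the quoted spending would clear the poverty line comfortably.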
With Abenesia in the news, I thought it might be useful to talk about another Axis nation’s complicated struggle with the memory of the Second World War. Jennifer Teege found out, at the age of 38, that not only was her grandfather a Nazi, he was an especially infamous Nazi: Amon Goeth, the commandant of Płaszów concentration camp, the man played by Ralph Fiennes in Schindler’s List. On trial after the war, Goeth sneered at the witnesses against him, “What? So many Jews? And we were always told there’d be none left.” He gave a Hitler salute on the gallows.
Hence the title of Teege’s memoir: she has an African father, and a Nazi grandfather who would have regarded as subhuman a person of African descent. The book is a great deal subtler than the title suggests. It is saturated in memory, and forgetting, and the fault lines between memory and history run throughout it. Teege describes her attempt to reconcile what she learns about her grandfather with the kind – but, she now knows, complicit – grandmother she remembers. The book presents Teege’s reminiscence and necessarily somewhat therapeutic work alongside the sober, reportorial voice of Nikola Sellmair, whose dry factual rendering of verifiable history often undermines Teege’s hopeful, emotional writing.
There are different kinds of memory in the book. Teege’s adoptive German family had a more usual relationship to the Nazi era – her father didn’t really know the extent to which his family had taken part in Nazi crimes. Sellmair discusses such modern Germans, summarizing Harald Welzer’s study “Grandpa Wasn’t a Nazi.” Latter-day Germans seize on any opportunity to construct a guiltless, even noble past for their forebears – as with the French, they were all in the resistance.
Teege’s brief narrative also encompasses the memory kept by Holocaust survivors and their descendants: before Teege found out about her grandfather, she traveled to Tel Aviv, made friends there, and lived there. Her discovery imposes silence between her and her Jewish friends. She doesn’t know what she can say. Her grandfather might have shot their grandparents.
“There is no Nazi gene,” Teege insists, struggling against the idea that she must bear some guilt for her grandfather. But she clearly feels that guilt. We all inhabit the world the bloodthirstiest conquerors made; only some of us grew up with them, personally.
Now, or recently, at newsagents
In the TLS for 17 April, you can find my essay on Nicholas Wapshott’s The Sphinx, about the presidential election of 1940, the isolationists, and how Franklin Roosevelt engineered the US shift toward war. The essay starts like this:
Franklin Roosevelt recognized the threat Adolf Hitler posed from the moment of the German Chancellor’s appointment. In January of 1933, Roosevelt—not yet inaugurated, though already elected, President—told an aide that Hitler’s ascent was “a portent of evil”, not just for Europe but “for the United States”. He “would in the end challenge us because his black sorcery appealed to the worst in men; it supported their hates and ridiculed their tolerances; and it could not exist permanently in the same world with a system whose reliance on reason and justice was fundamental”. From then onward, Roosevelt’s policies raced Hitler’s: the New Deal was not merely a programme for recovery from depression, but one to rebuild economic strength while preserving democracy in the United States so the nation would be ready to fight Nazism when the time came.
The New Deal gave Americans not only the material capacity to fight fascism, but faith in American institutions. Which is why, of course, the prevalence of remarks like this one remains so appalling.
Paul Campos writes in the New York Times about what he claims is the “real reason” for higher college tuition in the USA:
far from being caused by funding cuts, the astonishing rise in college tuition correlates closely with a huge increase in public subsidies for higher education.… a major factor driving increasing costs is the constant expansion of university administration
And he singles out the California State University (CSU) system as an example:
while the total number of full-time faculty members in the C.S.U. system grew from 11,614 to 12,019 between 1975 and 2008, the total number of administrators grew from 3,800 to 12,183 — a 221 percent increase
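Campos’s percentage can be verified directly from the figures he quotes; a quick check, nothing more:

```python
# Growth in CSU administrators vs. full-time faculty, 1975-2008,
# using only the figures quoted from Campos's op-ed.
faculty_1975, faculty_2008 = 11_614, 12_019
admin_1975, admin_2008 = 3_800, 12_183

admin_growth = (admin_2008 - admin_1975) / admin_1975 * 100
faculty_growth = (faculty_2008 - faculty_1975) / faculty_1975 * 100
print(round(admin_growth))    # 221
print(round(faculty_growth))  # 3
```

So on his own numbers: administrators up 221 percent, faculty up about 3 percent, over the same thirty-three years.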
Forthcoming in September, from Basic Books
As readers of this blog know, Franklin Roosevelt declared he had taken the US off the gold standard on March 6, 1933, as the first substantial act of his presidency. But scholars have not been so quick to accept this date or, with firmness, any other.
When Roosevelt first said he had taken the US off the gold standard, he didn’t want to make too great a fuss about it because he was trying to quiet a panic that had nearly broken the Federal Reserve System. He hoped Americans would bring their gold back for deposit in the nation’s vaults. And they did. Even though the papers were reporting that the president might issue scrip for temporary currency; even though the Emergency Banking Act provided that Federal Reserve notes could be backed by commercial bank assets, people generally preferred paper money to gold, so long as they trusted the paper money – which, with Roosevelt’s assurances, they did, as you can see from the chart. Read the rest of this entry »
Forthcoming in September, from Basic Books
On this day in 1933, it was the first Friday of Franklin Roosevelt’s administration, and the new president met reporters to talk to them about ending the bank holiday with which he had begun his term. The Federal Reserve Banks would open on Saturday so that member banks of the Federal Reserve System could open on Monday. A reporter asked if the banks would be open “[a]ll along the line, Mr. President; that is, all functions?” Roosevelt replied, “Yes, all functions. Except, of course, as to gold. That is a different thing. I am keeping my finger on gold.”
The president’s week had started with his inauguration on March 4, the previous Saturday, when he had told the American people they had only fear itself to fear, and promised them an “adequate but sound currency.” The next day Roosevelt worked all through the day with members of his team and holdovers from Herbert Hoover’s to draft orders to close the banks and halt all payouts of gold. They worked so hard that Sunday that it was late by the time the order was ready for presidential signature – so late that Federal Reserve counsel Walter Wyatt urged the president to wait a little longer to sign, so that it would be Monday, and not so sacrilegious. Roosevelt did wait, and then, on signing the order said – gleefully, according to one account – “We are now off the gold standard.” That was in the wee hours of March 6; later that morning, Americans began a week of doing business without access to banks.
At Roosevelt’s first press conference, on Wednesday March 8, he told reporters that “what you are coming to now is really a managed currency.… It may expand one week and it may contract another week.” The end of the dollar’s convertibility to gold was not temporary but “part of the permanent system so we don’t run into this thing again” – which was what he repeated when he said, on Friday, that he was keeping his finger on gold.
Roosevelt was strikingly consistent in his monetary policy declarations.1 The US went off the gold standard with the bank holiday; he never meant to go back, and he didn’t. He wanted to establish an international system of managed currencies, with an agreement that would allow them to remain stable for long periods, but adjustable in case of need – that was what he told the World Economic Conference at the end of summer 1933, and that was why it broke up – because other countries weren’t yet ready to join the US. At the end of 1933, Roosevelt talked up the dollar in value, stabilizing it in January 1934, while saying “I reserve the right … to alter this proclamation as the interest of the United States may seem to require.”
Despite some often-quoted barbs at Roosevelt’s method of getting there (complaining that the president’s talking up of the dollar by unpredictable amounts looked “more like a gold standard on the booze than the ideal managed currency of my dreams” or a “game of blind man’s bluff with exchange speculators”), John Maynard Keynes approved of the destination, writing in January 1934 that the president’s policy “means real progress.” Roosevelt had “adopted a middle course between old-fashioned orthodoxy and the extreme inflationists.” He had done nothing “which need be disturbing to business confidence,” and the monetary policy was “likely to succeed in putting the United States on the road to recovery.” Roosevelt’s adoption of a value for the dollar to be kept generally stable, if altered at need, also opened the possibility for an international conference on money, to “aim for the future not at rigid gold parities, but at provisional parities from which the parties to the conference would agree not to depart except for substantial reasons arising out of their balance of trade or the exigencies of domestic price policy.”
In other words, before Roosevelt had been in office a full year, he had articulated, with Keynes’s approval, all the elements of what would become the Bretton Woods monetary policy in 1944: currencies would be kept at stable exchange rates, but would be adjustable in keeping with the needs of economic prosperity in each country.
1Historians have a real problem recognizing this, owing I think to the influence of a couple of misleading memoirs by disaffected Roosevelt advisors who didn’t like his monetary policy, and who departed the administration early and therefore got their licks in early. Maybe I’ll write a post about this particular thing.
Possibly the right response to Gordon Wood’s “History in Context” in The Weekly Standard – a “get off my lawn” essay, as one historian says – is parody. After all, Wood does begin the essay by saying his mentor Bernard Bailyn is woefully under-appreciated, and then proceeds to mention that Bailyn has two Pulitzers.1 What else can one do but mock?
Well, one can take Wood earnestly, as is one’s wont, and ask, what happened to the younger Gordon Wood? How would he fare before the stern tribunal of Weekly Standard Wood?
I ask because Wood the elder expresses dissatisfaction with those historians “obsessed with inequality,” who
see themselves as moral critics obligated to denounce the values of the past in order to somehow reform our present. They criticize Bailyn’s work for being too exquisitely attuned “to the temper of an earlier time” and, thus, for failing “to address the dilemmas of its own day.”…
These historians need to read and absorb Bailyn’s essay on “Context in History,” published in this collection for the first time. Perhaps then they would be less eager to judge the past by the values of the present and less keen to use history to solve our present problems. In some sense, of course, they are not really interested in the past as the past at all.
But, as another Bailyn student pointed out to me, Wood was not always so scornful of judging and using the past for present purposes, nor so principled about letting the past be past. Consider this important passage from Wood’s remarkable first book, Creation of the American Republic, 1776-1787:
Considering the Federalist desire for a high-toned government filled with better sorts of people, there is something decidedly disingenuous about the democratic radicalism of their arguments.… In effect they appropriated and exploited the language that more rightfully belonged to their opponents. The result was the beginning of a hiatus in American politics between ideology and motives that never again closed. By using the most popular and democratic rhetoric available to explain and justify their aristocratic system, the Federalists helped to foreclose the development of an American intellectual tradition in which differing ideas of politics would be intimately and genuinely related to differing social interests.… and thereby contributed to the creation of that encompassing liberal tradition which has mitigated and often obscured the real social antagonisms of American politics.… the Federalists fixed the terms for the future discussion of American politics.
Listen to what Wood the younger is saying here: “disingenuous” surely sounds morally critical, as does “appropriated and exploited.”
Talking about what might “never again” be, and even about “the future” certainly doesn’t sound like thinking about “the past as the past.”
And bringing up “differing social interests” and “real social antagonisms” sounds like it might entail concern about, if not obsession with, inequality.
Perhaps Wood the younger would have to get off Wood the elder’s lawn.
I am actually more interested in what Wood the younger would say to his older self, concerned as he was with arguments that foreclosed discussion of genuine social antagonism. I have never really found persuasive Wood the younger’s argument that 1787 marked some kind of end-of-Eden, after which honest political discourse was never again possible in the United States. Rather, I think the Federalists’ disingenuous behavior has constantly to be emulated and that initial foreclosure reenacted to keep differing social interests unexpressed.
1A feat rarely matched, and then only by the likes of another giant among colonial historians, Alan Taylor.
I recently re-watched Do the Right Thing and found the ending a little shocking. No, not the violent part – which has, sadly, only become more familiar in the quarter century since 1989 – but the actual last scene.
The morning after the movie’s climax, the camera shifts up and away from the street while in voiceover we hear the storefront DJ, Mister Señor Love Daddy (Samuel L. Jackson). He has served throughout the film as a kind of Greek chorus and now he’s the last voice we hear, after the assault and the murder and the burning of Sal’s, and he says … “Register to vote. The election is coming up.”
Which struck me, in 2015, as awfully anemic. Is that really the conclusion we’re meant to draw, after all that heat, after repeated invocations of Martin Luther King, Jr., and Malcolm X? Register and vote?
I wondered if maybe Jackson’s performance had thrown me off and made me expect more of Love Daddy than I should have. After all, Jackson’s real talent is for the veneer of geniality over the threat of violence (see Jules or, in a different register, Nick Fury) – for conveying hidden weight, in the manner of a lead-filled sap with a polished leather finish.
But those roles came later. Maybe Mister Señor Love Daddy is supposed to be a bit of a buffoon. After all, during the climax of the movie, the camera catches him in his window, and his response to the police turning firehoses on his neighbors is to yell and … change his hat. Maybe we’re supposed to see him as impotent, inept – the kind of guy who would, on reflection, respond to brutality by delivering the Polonian advice, “Register to vote.”
Or maybe Spike Lee meant it seriously. There’s evidence he does, or did. On the twentieth anniversary Blu-ray, you’ll find an interview in which Spike Lee mentions he wrote and filmed Do the Right Thing in the midst of Ed Koch’s administration – but now, he says, everything’s different.
Those were heady days, 2009, to be sure, when maybe elections could fill you with hope and change. But: enough to, in retrospect, justify that flat-footed ending? “Vote”? After a movie that began with Public Enemy urging, “Fight the Power!”, and whose first line of dialogue had Love Daddy himself shouting, “Wake up!”?
I refer, of course, to a matter routinely, if implicitly, raised by the auditors of curricula, every time they ask for samples of a syllabus: if they request more than one, what do they say they want? Syllabi? Or syllabuses?1
A highly scientific anti-prescriptivist study has it that the answer is “syllabi.” I prefer “syllabuses,” though.
If your etymological antennae are twitching, you can find a detailed account of the story of “syllabus” at the specialist links in the first sentence of the post. But the short version is, it’s a made-up word, erroneously thought to be adopted into Latin from the Greek, which it wasn’t. I.e., there isn’t a true proper correct answer, horribile dictu.
1I’ve never actually heard anyone insist it’s syllabūs.
For my new book, I spent long hours trawling through the many, many reels of the microfilmed diaries of Henry Morgenthau, Jr. We didn’t have them at my university, so I had to order a few at a time from Interlibrary Loan, wait, and then seize upon them and go through them before they were due back at their home institution. Working through them at that speed, and on the microfilm reader whose lens & screen combination wasn’t quite right to show a full page, invariably gave me motion sickness.
Then they showed up, digitized, free to download. The joke was on me.
Except, for some reason, the digitized edition seems to begin with Book 1. Which you would think was okay – except the first book is actually Book 00. And that’s the book that covers the beginning of the Roosevelt administration – a critical period during which decisions were made about monetary policy that lasted for the duration of Roosevelt’s terms in office.
A peril, perhaps, of the digital archive.
California’s measles outbreak has now reached more than 70 cases.1
Populations especially at risk are those born after 1957 and vaccinated between 1963 and 1967, or not vaccinated at all. People born before 1957 would have been exposed to measles naturally and are OK; those not exposed to the virus in the wild will be vulnerable. People vaccinated between 1963 and 1967 might have got the “killed virus” vaccine, which the Centers for Disease Control now say is ineffective, and they will be vulnerable.
Unvaccinated people will be vulnerable2 for what ought to be obvious reasons.
California permits unvaccinated students to enroll in public schools if their parents file a form saying their beliefs do not permit vaccination.
The percentage of unvaccinated students in Sacramento-area schools is over fifty percent in some cases.
As the historian Robert Johnston remarks, scholars used to treat anti-vaccination activists as “the deluded, the misguided, the ignorant, the irrationally fearful” but now they command “if not sympathy, at least a modicum of respect.”
I suppose we should respect those whom we can rationally fear.
1This is the outbreak that the press keep saying, correctly if punctiliously, began at “Disneyland Park and Disneyland California Adventure,” as if there were some important meaningful reason they couldn’t say “Disneyland”; Disneyland is offering a pretty good discount right now, by the way.
2A correspondent, I think correctly, points out in comments that all are potentially vulnerable once we drop below a percentage where we have “herd immunity.”
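The “herd immunity” percentage the correspondent mentions has a standard back-of-envelope form: with a basic reproduction number R0, the threshold is 1 − 1/R0 of the population immune. Measles R0 is usually put at roughly 12 to 18 (those figures are my addition, not the post’s):

```python
# Herd-immunity threshold 1 - 1/R0, computed for the usual range
# of measles R0 estimates (roughly 12 to 18).
for r0 in (12, 18):
    threshold = 1 - 1 / r0
    print(f"R0={r0}: about {threshold:.0%} of the population must be immune")
```

Which is why measles is the canary: at a required immunity level above ninety percent, even a modest cluster of exemption-filing parents can open a hole in the herd.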
I still haven’t whittled that blog post down to size. In fact it’s now bigger. Meantime here’s another something on the web: a TLS essay I wrote on Martin Wolf’s The Shifts and the Shocks. There’s no paywall. Here’s a snippet, which provides the piece its rather nice illustration:
In the 2011 film Margin Call, which dramatizes the onset of our dismal era, the banker character played by Jeremy Irons delivers a monologue with which he attempts to justify his ruthless self-interest. Events like this just happen, he says. He lists a series of dates corresponding to financial panics, the modern part of which runs like this: “1819, 1837, 1857, 1884, 1901, 1907, 1929, 1937” – and then Irons pauses slightly before continuing – “1974, 1987 . . .”. It is a slight pause, but there is room in it: room for, after the war, what the French call les trente glorieuses, the decades of widespread economic growth and prosperity. That gap in the string of crises undermines the Irons character’s argument: the disasters do not have to happen. During the period in that pause, banking was tightly regulated, capital movements were controlled, exchange rates pegged (if adjustable). As Wolf says, “finance was repressed. That certainly prevented crises”.
In the print edition the essay is called “Missing dates,” which I actually like a bit better, and which derives from this section.
I started writing a blog post yesterday but it’s now up to 1500 words and I don’t know what to do with it, so instead I’ll urge you: Plan your Saturday night now! Here’s a preview: In a world …
Aww yeah, baby: C-SPAN 3.1 Saturday, January 24, at 8pm and midnight, Eastern time. Or 5pm and 9pm, here on the edge of the American West.
This is a lecture2 from the World War II class that Ari and I debuted as a co-taught course in 2012, but which this year I taught all by my lonesome; I’m sure it shows. The subject is some of the various ways in which the motive of revenge overtook the strategy of the Axis and the Allies.
1The connoisseur’s C-SPAN.
2Yes, C-SPAN 3’s Saturday night programming is a filmed college lecture; look, either you’re the audience for this or you aren’t.
We took the kids to see Selma, and I think you should see it too. (I mean, my God: it’s got both Stephen Root and Wendell Pierce.) Its historical liberties notwithstanding, it’s a great piece of historical fiction. As a sometime practitioner of both history and historical fiction, let me explain why.
First, here’s John Steinbeck on the scholar and the truth; the fisherman and the fish:
the Mexican sierra has “XVII-15-IX” spines in the dorsal fin. These can easily be counted. But if the sierra strikes hard on the line so that our hands are burned, if the fish sounds and nearly escapes and finally comes in over the rail, his colors pulsing and his tail beating the air, a whole new relational externality has come into being—an entity which is more than the sum of the fish plus the fisherman. The only way to count the spines of the sierra unaffected by this second relational reality is to sit in a laboratory, open an evil-smelling jar, remove a stiff colorless fish from a formalin solution, count the spines, and write the truth “D.XVII-15-IX.” There you have recorded a reality which cannot be assailed—probably the least important reality concerning either the fish or yourself.
It is good to know what you are doing. The man with his pickled fish has set down one truth and has recorded in his experience many lies. The fish is not that color, that texture, that dead, nor does he smell that way…. [W]e were determined not to let a passion for unassailable little truths draw in the horizon and crowd the sky down on us.
(Yes, it’s a favorite passage.)
So clearly, it’s the business of the historian to count those spines (and get the count right). Historians go further, too: we traffic in permissible artifice. Call it cautious narrative, which indicates more often than it depicts: maybe, to press the analogy, we’re allowed to stuff and mount that fish in a lifelike posture that nevertheless permits the observer to see those spines and plainly ascertain their number.
Beyond that we daren’t go.
But purveyors of historical fiction aren’t trying to do that, at all: instead, they want to give us that other, otherwise unreachable, truth: the fisherman and the fish, the leap, the flash, the struggle. That too is true.
Historians can tell us it happened: fictionalizers can make us see it happening and feel the fight between angler and prey.
So, Selma gives us that fight, and how. The night march and the murder of Jimmie Lee Jackson are terrifying and heartbreaking. Bloody Sunday is grippingly staged and shot. David Oyelowo is a great King, Tom Wilkinson is a great Johnson, Oprah Winfrey can actually act, in case you didn’t remember. (Only why, in a movie that has Martin Luther King, Jr., Lyndon B. Johnson, and George Wallace, do they all have to be played by Brits?)
And I’m not greatly bothered by the depicted conflicts between King and the younger activists – Zeitz says the movie overplays them, but they were real.
I even think making Johnson a foot-dragger who has to have his mind changed is actually fine-ish. It didn’t happen this way, not in 1965. But it did pretty well happen. Johnson did help make the Civil Rights Act of 1957 weaker. And he did at last push the Voting Rights Act through. King’s activism helped propel him forward.
(And even in 1965, Johnson was still incredulous at the thought that he hadn’t done enough. “Could anybody do better? What do they want?” he asked.)
But. I do think it’s stepping over the line to make Johnson responsible for the infamous “suicide letter” to King. This is like telling us the fisherman leaped into the water and wrestled the fish into submission with his bare hands. I mean, someone did catch and kill that fish, but not like that.
The suicide letter (which appeared in Athan Theoharis’s From the Secret Files of J. Edgar Hoover and was recently analyzed in full by Beverly Gage) was a genuinely vicious thing. The FBI sent it to King, with tape-recordings of his sexual infidelities, saying “There is but one way out for you. You better take it before your filthy, abnormal fraudulent self is bared to the nation.”
As Zeitz points out, this letter had nothing to do either with Selma or with Johnson. But the movie has Johnson saying he needs to put King off, picking up the phone, and barking “Get me J. Edgar Hoover,” then cutting to Coretta Scott King listening to the tape with her husband. In the language of film, that’s as much as saying, Johnson ordered the sending of that letter and that tape.
Which is a shame, really; DuVernay doesn’t need Johnson to sink that low for the narrative to work. I suspect she did want to get in some evidence of King’s infidelities, and the complexity of his relationship with his wife, and this was the way to do it. To establish Hoover as acting independently from Johnson would have taken up too much screen time in a movie already packed with incident (Malcolm X is in it!); as it is, I’m not confident all viewers will remember Hoover from his single, brief, establishing scene.
But it is a shame.
Still, Lyndon Johnson was a big man with a secure place in history, and I bet wherever he is, he can take it. And you and yours should still see the movie. Which is fiction, if historical fiction.
(Also recommended: NPR’s Pop Culture Happy Hour, with friend of this blog Gene Demby, discussing this question, too.)
On this day in 1940 the United States House of Representatives passed a bill imposing fines on county or state officials who negligently failed to protect persons in their custody from seizure by a mob who injured or killed those persons – or, as it was better known, an anti-lynching bill.1
“Why is President Roosevelt so strangely silent on this bill?” asked Franklin Roosevelt’s own local Congressman, Representative Hamilton Fish, Republican of New York.2
Scarcely a soul did not know the answer to Fish’s question. The bill split the Democratic Party. Sponsored by Joseph Gavagan, a Democrat and the Congressman for Harlem, the bill had nearly universal support in the North and none in the South. The vote in the House was 252 to 131 in favor. No Southerner voted for the bill; Democrats from the North, Midwest, and West favored it by nearly five to one.
Similar bills had passed in 1937 and 1938. They met their demise in the Senate, whose filibuster rules permitted a Southern minority to kill the bills.
The NAACP campaigned vigorously for the anti-lynching bills. Its leader, Walter White, noted “the states have continued as they have in the past, to do nothing about lynching. The federal government must act.”3
Senator Robert Wagner, Democrat of New York, supported the bill. So did any number of high profile Democrats, including Eleanor Roosevelt.
Gallup polls showed Americans closely divided but favoring the measure, with 50% in favor, 41% against, and 9% with no opinion.4
Senator Pat Harrison, Democrat of Mississippi, addressed himself to the president, saying, “We see the people of the South confronted with the terrible situation of a Democratic majority betraying the trust of the Southern people.… The next thing, in all probability, will be to provide that miscegenation of the races cannot be prohibited, and when that has accomplished … to say that every colored man in every Southern State should take part in the primaries in the State!”5
That outcome remained far in the future. It was 1940; the Gavagan bill had no chance when the president viewed the fight against Nazism as his chief priority.
He had seen the world that way, really, since the moment Adolf Hitler became German chancellor in January 1933. Roosevelt then told Rexford Tugwell that Hitler’s “black sorcery appealed to the worst in men; it supported their hates and ridiculed their tolerances” and his rise was a “portent of evil for the United States.”
And so from the moment he took office, Roosevelt was racing the clock; recovery from Depression meant not only ending the crisis and restoring prosperity, but restoring the United States to a physical and moral strength sufficient to fight Nazi Germany.
But to do it, as Tugwell said, “He had to compromise with Hitlerites in our own electorate.”6
1“Anti-lynching Bill is Passed by House,” NYT 1/11/40, p. 17.
2“Quick Passage of Measure,” Chicago Defender 1/13/40, p. 1.
3Cited in Harvard Sitkoff, A New Deal for Blacks, 221.
4Gallup Poll, January 1940. Retrieved January 10, 2015, from the iPOLL Databank, The Roper Center for Public Opinion Research, University of Connecticut. http://www.ropercenter.uconn.edu/data_access/ipoll/ipoll.html
5Cited in Sitkoff, 220.
6Rexford Tugwell Papers, Franklin D. Roosevelt Presidential Library. See also the brilliant Ira Katznelson, Fear Itself (Liveright, 2013).
Ahem. Is this thing on?
RIP Levon Helm, who was not only of course the voice of The Band, but also of The Right Stuff, the voice warning softly,
There was a demon that lived in the air. They said whoever challenged him would die. Their controls would freeze up, their planes would buffet wildly, and they would disintegrate. The demon lived at Mach 1 on the meter, seven hundred and fifty miles an hour, where the air could no longer move out of the way. He lived behind a barrier through which they said no man could ever pass. They called it the sound barrier.
Helm also played Ridley, the trusted friend of Chuck Yeager, as depicted by Sam Shepard. I thought it was a kind of subversive genius, casting those two countercultural Dylan-associated types as these otherwise strait-laced American heroes.
As Pierce says, and as seems appropriate in this particular sidelight on Helm’s career, Godspeed.
Did anyone notice that Steve Rogers’s politics are basically those of PM?