On this day, the first of February, in 1934, the New York Times carried Franklin Roosevelt’s proclamation of a new gold value for the US dollar. Previously it had been worth 25 8/10 grains of gold 9/10 fine; now it would be worth 15 5/21 grains of gold 9/10 fine—or, as it is more commonly said, the dollar had been valued at $20.67 to an ounce of pure gold and now it would be $35 to an ounce of pure gold. But the US was not in 1934, nor would it ever again be, on a gold standard.
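The conversion between those statutory weights and the familiar per-ounce prices is straightforward arithmetic: 480 grains to the troy ounce, discounted for fineness. A quick sketch in Python (the function name is mine, purely for illustration):

```python
GRAINS_PER_TROY_OZ = 480  # one troy ounce is 480 grains

def dollars_per_ounce(grains_per_dollar: float, fineness: float = 0.9) -> float:
    """Price of a troy ounce of pure gold when $1 is defined as
    `grains_per_dollar` grains of gold at the given fineness."""
    pure_grains_per_dollar = grains_per_dollar * fineness
    return GRAINS_PER_TROY_OZ / pure_grains_per_dollar

print(round(dollars_per_ounce(25.8), 2))         # 20.67
print(round(dollars_per_ounce(15 + 5 / 21), 2))  # 35.0
```

The second figure comes out to exactly $35 because 15 5/21 grains of nine-tenths-fine gold contains 96/7 grains of pure gold, and 480 divided by 96/7 is 35.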
Hillary Clinton is taking flak today for her summary repetition of the white supremacist Dunning School of historical interpretation, which held that the attempt in the 1860s and 1870s to provide African Americans with their civil rights was a terrible imposition on the white folks of the South.
[Lincoln] was willing to reconcile and forgive. And I don’t know what our country might have been like had he not been murdered, but I bet that it might have been a little less rancorous, a little more forgiving and tolerant, that might possibly have brought people back together more quickly.
But instead, you know, we had Reconstruction, we had the re-instigation of segregation and Jim Crow. We had people in the South feeling totally discouraged and defiant. So, I really do believe he could have very well put us on a different path.… let’s also think about how we do try to summon up those better angels, and to treat each other, even when we disagree, fundamentally disagree, treat each other with more respect, and agree to disagree more civilly, and try to be inspired by, I think, the greatest of our presidents.
I’ll leave critiques of the Dunning School in other hands because I think they’re obvious, sadly, and Clinton should really know better. I’ll even forgo detail on the obvious point that if you’re a modern Democratic presidential aspirant asked who’s the greatest of the US presidents, your answer is Franklin Roosevelt.1 Instead I want to focus on Clinton’s counterfactual: “had [Lincoln] not been murdered”.
In the current Los Angeles Review of Books Quarterly Journal Michael W. Clune1 writes about odd small episodes, “particularly ephemeral perceptual experiences” we have that may alert us to the gap between how things seem and what they are. Riffing on Rei Terada’s Looking Away, he lists mirages, after-images; “clouds taken for mountains … looking at a landscape with one’s eyes half-closed so that it appears underwater.” He notes that we have nothing to say to each other about these experiences even if we share them, but that they remain with us. They remind us that if we pay too much attention to the mechanism by which we draw meaning from appearance we attenuate that meaning. One of Clune’s examples is an imaginary exchange with a sales clerk about money.
A year from today, the US will inaugurate a new president. But inauguration day has not always been thus fixed.
In the early years of the Republic, habit (rather than statute) placed the date of inauguration at March 4—though even that convention was not quite firm. In 1821, with the incumbent President James Monroe about to take the oath of office for his second term, March 4 fell on a Sunday. Monroe asked of Chief Justice John Marshall whether he could take the oath on the following day, rather than sully the Lord’s day with secular business.
I dislike the term “Great Recession” to describe our times, for technical and political reasons alike. Technically, the severe recession ended in June 2009. But, as the NBER says there,
In determining that a trough occurred in June 2009, the committee did not conclude that economic conditions since that month have been favorable or that the economy has returned to operating at normal capacity.
And indeed it still hasn’t, six and a half years after the recession ended. In fact, as Kevin O’Rourke noted, in August of 2015
the inevitable happened: measured in terms of industrial output, our current recovery was overtaken by that of the interwar period. Pretty dismal stuff.
So now, having avoided quite so severe a contraction as the 1930s, we are suffering a less impressive recovery. What do we call this ongoing period?
“Malaise” is taken, and rather ruined, by Carter-related discourse. I’ve lately been suggesting “the great economic unpleasantness” but without, I confess, really expecting it to catch on. Krugman’s old “Lesser Depression” is looking depressingly correct.
Non-specific plot details discussed herein.
Quentin Tarantino’s The Hateful Eight opens with a close shot of Christ’s face as he hangs on a cross; the frame widens to show us it’s a roadside crucifix somewhere in a desolate snowscape. Along that road sweeps a stagecoach bearing a bounty hunter—whose dual nature is revealed in his nickname, “the hangman,” and his surname, which is Ruth, meaning mercy—together with his prisoner, a murderer and thief.
The film so swiftly displaces the lingering shot of the stationary crucifix with the rush of the coach, which is seeking to beat a blizzard to shelter, that perhaps by the end of the film a viewer has forgotten the opening invitation to contemplate the image, and concept, of redemption-through-suffering. But that is where the movie starts, and also where it ends.
The opening shot looks a lot like the beginning of Samuel Fuller’s The Big Red One, which also opens on a long close shot of a crucifix, this one in France during World War I—or rather, in the hours just after the armistice, when Fuller’s protagonist slays one more German before finding out his license to kill has expired.
Fuller’s movie is largely about how arbitrarily, but how completely, the declaration of war permits human slaughter. It mocks the idea of meaning emerging through violence; it mocks meaning altogether, I think. There is only survival and, where permitted, empathy. By invoking it—we know Tarantino admires Fuller, and shares some of his preoccupations—is The Hateful Eight doing the same?
Maybe; I think it may go further. In Fuller’s movie the hate between antagonists switches off instantly once the war ends. In Tarantino’s, the hate persists and intensifies after the war. Denied the outlet of legitimate killing, hate finds other ways to erupt. War gives way to guerrilla fighting, terrorism, outlawry; murder after murder.
For Tarantino, “war” here is specifically the US Civil War. In the film, the artifact embodying the myth of this war’s meaning is revealed literally to be a lie. The Christ crucified of the Civil War—Lincoln—has brought no redemption. Indeed the only “Redemption” in the offing—the end to any attempt at Reconstruction of the defeated South—will repudiate the purpose for which the war was fought.
Tarantino’s Pulp Fiction can only give its main characters what seems like a happy ending by twisting the order of the tale, and ending in (what the film lets us know is) the middle of the story. Apart from a fairly conventional flashback, The Hateful Eight proceeds more or less chronologically to its conclusion. That flashback shows us an evidently idyllic Minnie’s Haberdashery in the hours before the movie’s main action begins—a place where black and white people live in blissful harmony and women are in charge, driving the action.
On an uncareful reading, one might assume this flashback is supposed to tell us that the West really could have been a new and better society (it is, after all, Wyoming territory, the first to enact woman suffrage)—but Tarantino undercuts this illusion too, making Minnie’s hatred of Mexicans into a plot point. There was already a serpent in the garden then.
Straightened out, the film’s narrative begins with this false Eden and ends with a mocking Calvary. One of its most thoughtful monologues comes from Tim Roth’s character, who muses that justice demands that an executioner act without passion. A passion play ends with a crucifixion. This movie ends with another kind of execution, and none are saved.
Peter Hitchens is less well known in the United States than his late brother, but when asked to write for the New York Times, he delivers his Mail columnist goods, full-strength. Regarding Robert Tombs’s English and their History:
Even in free countries it is sometimes necessary to alter the past to suit the present. For instance, I recall the day at my English boarding school in the early 1960s when our sober, patriotic old history books were gathered up and carted away to a storeroom. In their place we were handed bright, optimistic replacements, with a good deal less to say about the empire, the Protestant martyrs or what we had been taught without embarrassment to call the Glorious Revolution.… Older English people look back fondly on 1940, when we supposedly stood alone. In fact we were a major industrial and exporting power with a global navy, more or less self-sufficient, nationally cohesive and bolstered by the tribute of a still-great empire. Now all of that is gone. Is it possible that, after a thousand astonishing years, our island story has finally come to a full stop? Will the next great history of our nation and people be written in Chinese?
Now that George MacDonald Fraser has died, the sources for such views of the empire and its history seem fewer each day.
In The New Republic, Jeet Heer says that Donald Trump is not a populist, he’s “the voice of aggrieved privilege—of those who already are doing well but feel threatened by social change from below, whether in the form of Hispanic immigrants or uppity women.” Or the voice of the white American man enraged at the possibility he might lose his ill-gotten privilege. Heer doesn’t use the f-word, but it’s the elephant in the room.
For the alleged misunderstanding of Trumpism as “populism,” Heer blames the historian Richard Hofstadter, who in the middle 1950s explained he was interested in “that side of Populism” that sounded to Hofstadter a lot like McCarthyism. Hofstadter was right: there was a side of Populism, and not a trivial side, that sounded like McCarthyism—and Trumpism too.
Ronald Reagan in “A Time for Choosing,” the Gipper’s speech for Barry Goldwater in 1964:
Welfare spending [is] 10 times greater than in the dark depths of the Depression. We’re spending 45 billion dollars on welfare. Now do a little arithmetic, and you’ll find that if we divided the 45 billion dollars up equally among those 9 million poor families, we’d be able to give each family 4,600 dollars a year. And this added to their present income should eliminate poverty.
David Brooks in the New York Times, regarding the case of Freddie Gray in 2015:
The problem is not lack of attention, and it’s not mainly lack of money. Since 1980 federal antipoverty spending has exploded. As Robert Samuelson of The Washington Post has pointed out, in 2013 the federal government spent nearly $14,000 per poor person. If you simply took that money and handed it to the poor, a family of four would have a household income roughly twice the poverty rate.
Annie Lowery points out why Brooks’s argument is numerically bogus: just as conservatives don’t count millions of government employees as employed in the 1930s, Brooks doesn’t count federal money as money in the 2010s:
Brooks is claiming that federal spending on anti-poverty programs is not lifting families out of poverty… when the government specifically does not include the value of those very programs in its poverty calculations.… A fuller accounting shows that food stamps alone lift 4 million people above the poverty line. The earned-income tax credit lifts nearly 6 million above it. Which is to say that “not bringing down the official poverty rate” is not a good yardstick by which to judge these programs.
But I would like to take David Brooks up on his suggestion: with exactly the same degree of sincerity as 1964-era Reagan, he’s supporting a straight-up transfer of wealth from the rich to the poor. It is a radical solution to poverty, this long-standing Republican proposal, but perhaps one that we should consider.
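The arithmetic in both claims is easy enough to check. A quick sketch (the 2013 poverty guideline of $23,550 for a family of four is my addition, from HHS figures, not a number from either column):

```python
# Reagan, 1964: $45 billion in welfare spending, 9 million poor families
reagan_per_family = 45e9 / 9e6
print(reagan_per_family)  # 5000.0 -- straight division; the speech said $4,600

# Brooks, 2015: roughly $14,000 in federal spending per poor person
brooks_family_of_four = 14_000 * 4
poverty_guideline_2013 = 23_550  # HHS guideline for a family of four
print(brooks_family_of_four / poverty_guideline_2013)  # about 2.4 times the line
```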
With Abenesia in the news, I thought it might be useful to talk about another Axis nation’s complicated struggle with the memory of the Second World War. Jennifer Teege found out, at the age of 38, that not only was her grandfather a Nazi, he was an especially infamous Nazi: Amon Goeth, the commandant of Płaszów concentration camp, the man played by Ralph Fiennes in Schindler’s List. On trial after the war, Goeth sneered at the witnesses against him, “What? So many Jews? And we were always told there’d be none left.” He gave a Hitler salute on the gallows.
Hence the title of Teege’s memoir: she has an African father, and a Nazi grandfather who would have regarded as subhuman a person of African descent. The book is a great deal subtler than the title suggests. It is saturated in memory, and forgetting, and the fault lines between memory and history run throughout it. Teege describes her attempt to reconcile what she learns about her grandfather with the kind – but, she now knows, complicit – grandmother she remembers. The book presents Teege’s reminiscence and necessarily somewhat therapeutic work alongside the sober, reportorial voice of Nikola Sellmair, whose dry factual rendering of verifiable history often undermines Teege’s hopeful, emotional writing.
There are different kinds of memory in the book. Teege’s adoptive German family had a more usual relationship to the Nazi era – her father didn’t really know the extent to which his family had taken part in Nazi crimes. Sellmair discusses such modern Germans, summarizing Harald Welzer’s study “Grandpa Wasn’t a Nazi.” Latter-day Germans seize on any opportunity to construct a guiltless, even noble past for their forebears – as with the French, they were all in the resistance.
Teege’s brief narrative also encompasses the memory kept by Holocaust survivors and their descendants: before Teege found out about her grandfather, she traveled to Tel Aviv, made friends there, and lived there. Her discovery imposes silence between her and her Jewish friends. She doesn’t know what she can say. Her grandfather might have shot their grandparents.
“There is no Nazi gene,” Teege insists, struggling against the idea that she must bear some guilt for her grandfather. But she clearly feels that guilt. We all inhabit the world the bloodthirstiest conquerors made; only some of us grew up with them, personally.
[Now, or recently, at newsagents]
In the TLS for 17 April, you can find my essay on Nicholas Wapshott’s The Sphinx, about the presidential election of 1940, the isolationists, and how Franklin Roosevelt engineered the US shift toward war. The essay starts like this:
Franklin Roosevelt recognized the threat Adolf Hitler posed from the moment of the German Chancellor’s appointment. In January of 1933, Roosevelt—not yet inaugurated, though already elected, President—told an aide that Hitler’s ascent was “a portent of evil”, not just for Europe but “for the United States”. He “would in the end challenge us because his black sorcery appealed to the worst in men; it supported their hates and ridiculed their tolerances; and it could not exist permanently in the same world with a system whose reliance on reason and justice was fundamental”. From then onward, Roosevelt’s policies raced Hitler’s: the New Deal was not merely a programme for recovery from depression, but one to rebuild economic strength while preserving democracy in the United States so the nation would be ready to fight Nazism when the time came.
The New Deal gave Americans not only the material capacity to fight fascism, but faith in American institutions. Which is why, of course, the prevalence of remarks like this one remains so appalling.
If you’re interested, Slate just posted an excerpt from a chapter — on Appomattox and memory — from Battle Lines, the graphic history of the Civil War I’ve written with Jonathan Fetter-Vorm.
Perhaps my favorite story of the Civil War comes from Lee’s surrender to Grant at Appomattox, which took place 150 years ago today. Here’s an excerpt, below the fold, from a piece I wrote last year about that episode.
Paul Campos writes in the New York Times about what he claims is the “real reason” for higher college tuition in the USA:
far from being caused by funding cuts, the astonishing rise in college tuition correlates closely with a huge increase in public subsidies for higher education.… a major factor driving increasing costs is the constant expansion of university administration
And he singles out the California State University (CSU) system as an example:
while the total number of full-time faculty members in the C.S.U. system grew from 11,614 to 12,019 between 1975 and 2008, the total number of administrators grew from 3,800 to 12,183 — a 221 percent increase
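Campos’s percentage checks out against the figures he quotes, and running the same calculation on the faculty numbers shows how lopsided the growth was:

```python
faculty_1975, faculty_2008 = 11_614, 12_019
admin_1975, admin_2008 = 3_800, 12_183

def pct_increase(old: int, new: int) -> float:
    """Percentage growth from `old` to `new`."""
    return (new - old) / old * 100

print(round(pct_increase(faculty_1975, faculty_2008), 1))  # 3.5 percent
print(round(pct_increase(admin_1975, admin_2008)))         # 221 percent
```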
[Forthcoming in September, from Basic Books]
As readers of this blog know, Franklin Roosevelt declared he had taken the US off the gold standard on March 6, 1933, as the first substantial act of his presidency. But scholars have not been so quick to accept this date or, with firmness, any other.
When Roosevelt first said he had taken the US off the gold standard, he didn’t want to make too great a fuss about it because he was trying to quiet a panic that had nearly broken the Federal Reserve System. He hoped Americans would bring their gold back for deposit in the nation’s vaults. And they did. Even though the papers were reporting that the president might issue scrip for temporary currency; even though the Emergency Banking Act provided that Federal Reserve notes could be backed by commercial bank assets, people generally preferred paper money to gold, so long as they trusted the paper money – which, with Roosevelt’s assurances, they did, as you can see from the chart.
[Forthcoming in September, from Basic Books]
On this day in 1933, it was the first Friday of Franklin Roosevelt’s administration, and the new president met reporters to talk to them about ending the bank holiday with which he had begun his term. The Federal Reserve Banks would open on Saturday so that member banks of the Federal Reserve System could open on Monday. A reporter asked if the banks would be open “[a]ll along the line, Mr. President; that is, all functions?” Roosevelt replied, “Yes, all functions. Except, of course, as to gold. That is a different thing. I am keeping my finger on gold.”
The president’s week had started with his inauguration on March 4, the previous Saturday, when he had told the American people they had only fear itself to fear, and promised them an “adequate but sound currency.” The next day Roosevelt worked all through the day with members of his team and holdovers from Herbert Hoover’s to draft orders to close the banks and halt all payouts of gold. They worked so hard that Sunday that it was late by the time the order was ready for presidential signature – so late that Federal Reserve counsel Walter Wyatt urged the president to wait a little longer to sign, so that it would be Monday, and not so sacrilegious. Roosevelt did wait, and then, on signing the order said – gleefully, according to one account – “We are now off the gold standard.” That was in the wee hours of March 6; later that morning, Americans began a week of doing business without access to banks.
At Roosevelt’s first press conference, on Wednesday March 8, he told reporters that “what you are coming to now is really a managed currency.… It may expand one week and it may contract another week.” The end of the dollar’s convertibility to gold was not temporary but “part of the permanent system so we don’t run into this thing again” – which was what he repeated when he said, on Friday, that he was keeping his finger on gold.
Roosevelt was strikingly consistent in his monetary policy declarations.1 The US went off the gold standard with the bank holiday; he never meant to go back on it, and he didn’t. He wanted to establish an international system of managed currencies, with an agreement that would allow them to remain stable for long periods, but adjustable in case of need – that was what he told the World Economic Conference at the end of summer 1933, and that was why it broke up – because other countries weren’t yet ready to join the US. At the end of 1933, Roosevelt talked up the dollar in value, stabilizing it in January 1934, while saying “I reserve the right … to alter this proclamation as the interest of the United States may seem to require.”
Despite some often-quoted barbs at Roosevelt’s method of getting there (complaining that the president’s talking up of the dollar by unpredictable amounts looked “more like a gold standard on the booze than the ideal managed currency of my dreams” or a “game of blind man’s bluff with exchange speculators”) John Maynard Keynes approved of the destination, writing in January 1934 that the president’s policy “means real progress.” Roosevelt had “adopted a middle course between old-fashioned orthodoxy and the extreme inflationists.” He had done nothing “which need be disturbing to business confidence,” and the monetary policy was “likely to succeed in putting the United States on the road to recovery.” Roosevelt’s adoption of a value for the dollar to be kept generally stable, if altered at need, also opened the possibility for an international conference on money, to “aim for the future not at rigid gold parities, but at provisional parities from which the parties to the conference would agree not to depart except for substantial reasons arising out of their balance of trade or the exigencies of domestic price policy.”
In other words, before Roosevelt had been in office a full year, he had articulated, with Keynes’s approval, all the elements of what would become the Bretton Woods monetary policy in 1944: currencies would be kept at stable exchange rates, but would be adjustable in keeping with the needs of economic prosperity in each country.
1. Historians have a real problem recognizing this, owing I think to the influence of a couple of misleading memoirs by disaffected Roosevelt advisors who didn’t like his monetary policy, and who departed the administration early and therefore got their licks in early. Maybe I’ll write a post about this particular thing.
Leonard Nimoy’s Spock made the most powerful case for the value of emotional intelligence* that I’ve ever seen. I’m also pretty sure that the Roanoke episode of “In Search Of…” made me the historian I am today. And, more important still, his life offscreen suggested that personal decency can far outstrip fame. RIP to a wonderful entertainer and a better person.
I’d like to think that he wouldn’t want this forgotten:
And here’s Nimoy on Roanoke:
Also, as commenter Kevin points out, Nimoy was good for the Jews:
* Yeah, it’s not a great phrase, but there it is.
As this article makes clear, I’m a plucky outsider who beat long odds!
Nothing brings me more joy than Edgar Allan Poe’s obituary.