You are currently browsing the monthly archive for February 2012.
Students in my Civil War class tend to be fascinated by the disjuncture between the Lincoln of memory, who stands tall as the Great Emancipator, and the Lincoln of history, who only very gradually embraced emancipation as a necessity of war and then later as a moral imperative.* One of the crucial moments in that evolution was the controversy over treating slaves as contraband of war, an episode during which several of Lincoln’s generals, in 1861, outstripped their Commander-in-Chief and began practicing not-quite-emancipation on the ground. They refused to return slaves who crossed the Union lines to their former owners, leaving those people in an odd situation: not quite free, but no longer enslaved either.
The BBC has a story up about one of the sites where that controversy played out: the South Carolina Sea Islands. I have to admit that this is the kind of article that threatens to get under my Civil War historian’s skin but then ends up totally tickling my historian of memory’s fancy. The story is “little known” and a “secret history,” says the Civil War historian? Not so! There’s a great book on the subject! And Chandra Manning is working on a new monograph about contraband camps! But then the historian of memory says, “stop being such a nit-picking jerk and check out the stuff about the two Mitchels discussing their sense of shared heritage.”
Anyway, it’s an interesting (and annoying, yes) story that’s at least in part about the enduring nature of a certain kind of reconciliationist Civil War narrative.
* This formulation is decidedly too simple, I know. Consider it a kind of shorthand for the sort of thing about which scholars write long books that win prizes.
Anita Creamer has an article in the Sacramento Bee on Executive Order 9066 and the effect of internment on Japanese Americans.
Today Japanese American educators and researchers say that the community’s third generation – the Sansei, most of them born after the war to parents who had been imprisoned – has inherited a complicated generational legacy that has played out in the Japanese American culture ever since the days of camp.
“A lot of what people experience in adulthood can be traced back to the trauma their parents passed on intergenerationally,” said Satsuki Ina, 67, a psychotherapist and retired Sacramento State professor who was born to Nisei (or second) generation parents at the Tule Lake camp in Northern California.
Through her research, which culminated in an Emmy-winning PBS documentary, “Children of the Camps,” she discovered that post-traumatic stress scarred the lives of Nisei and their Sansei children in the years after they were released from camp.
In her own case, she said: “I think of what it was to be a baby carried in the arms of a mother who wrote in her diary, ‘Is today the day they’re going to shoot us?’ “
As one Telegraph blog says,
Students of economic history are in for a treat. An official studying deep in the bowels of the US Treasury library has recently uncovered a prize of truly startling proportions – an 800 page plus transcript of the Bretton Woods conference in July 1944, the meeting of nations which established the foundations of today’s international monetary system. … Those who have seen it say it is hard to point to any outright revelation about the talks, in which for Britain, the economist John Maynard Keynes was a leading player. But the level of intellectual debate is said to have been extraordinarily impressive, with exactly the same arguments as to voting rights and undue Western influence at the IMF and World Bank as exist today. The Indian delegation is said to have been particularly outspoken, despite the fact that India was still then a colony of the UK.
It’s not actually surprising that the Indian delegation was outspoken; this was one of the UK’s complaints about the conference. The problem was that the UK owed India money, and India wanted to get paid. Of course what you read in the transcripts about the Indian debt conversations isn’t everything that went on – “These conversations were carefully stage-managed by Lord Keynes, Sir Jeremy Raisman, and Dr. White,” as one official of the UK Commonwealth office said after the conference.
But the transcripts are in fact terrific. In my view they do hold some revelations, or at least illuminating passages, concerning the discussion over convertibility – this is one of the areas where the Indian delegation is outspoken – and the clauses defining the goals of the IMF.
We are fortunate to have a guest post today from Robin Averbeck. Ms. Averbeck is a doctoral candidate working on the community action programs of the Lyndon Johnson administration, and has some insights appropriate to the current interest in Charles Murray’s new book and the idea of a culture of poverty. It’s always a privilege to work with a student whose research is interesting on its own terms and also engages current events in an intriguing way.
In the winter of 1963, the sociologist Charles Lebeaux argued that poverty, rather than merely a lack of money, was in fact the result of several complex, interrelated causes. “Poverty is not simply a matter of deficient income,” Lebeaux explained. “It involves a reinforcing pattern of restricted opportunities, deficient community services, abnormal social pressures and predators, and defensive adaptations. Increased income alone is not sufficient to overcome brutalization, social isolation and narrow aspiration.”1 Lebeaux’s article was originally published in the New Left journal New University Thought, but also appeared two years later in a collection of essays by liberal academics and intellectuals called Poverty in America. In the midst of Johnson’s War on Poverty, the argument that poverty was about more than money – or, as was sometimes argued, wasn’t even primarily about money – was common currency in both liberal and left-leaning circles.
So Lt. Colonel Daniel Davis criticizes the American effort in Afghanistan (a criticism I don’t agree with, by the way)? Watch as (it sure seems) the Pentagon media machine spins up to discredit him through cooperative media folk, in this case Tom Ricks of Foreign Policy (and formerly the Washington Post). First comes the reasoned response, from a professor at the National War College:
I was prepared for a real critique and came away profoundly disappointed. Every veteran has an important story, but [Davis'] work is a mess. It is not a successor piece to HR McMaster’s book on the Joint Chiefs during Vietnam, or Paul Yingling’s critique of U.S. generalship that appeared in Armed Forces Journal a few years back. Davis is not a hero, but he will go into the whistleblower hall of fame. If years hence, he doesn’t make full Colonel, it will be construed as punishment, but there is nothing in this report that suggests he has any such potential.
Then, two days later, comes the character assassination:
For a reservist, Lt. Col. Danny Davis (author of the recently touted and then critiqued report on the Afghan War) sure gets around. I was told yesterday that when he was a major, he proposed that the United States conduct a ground invasion of Iran by air dropping an armored division northeast of Tehran and then doing a tank assault into the city.
I also was told that he proposed to General Abizaid that he be promoted to lieutenant colonel and put in command of the lead tank battalion in the assault.
Note the blinding passiveness of “I was told” and “I also was told.” Note how Ricks refers to Davis as “Danny” (in the text and the title), a diminutive coinage that seems to be particular to Ricks; other news stories have (as far as I can tell) universally referred to him as “Daniel Davis.” Note how Ricks finishes him off by quoting an awkwardly written email from Davis. Davis shouldn’t have sent the email that way, but Ricks quotes the whole thing, grammatical errors, capitalization problems and all.
All this doesn’t make Davis correct, of course, but it’s an impressive effort nonetheless.
Welcome to the big leagues, Lt. Colonel. I trust you’ve given up on your ambitions to make full Colonel?
(See also this fascinating comment in the post thread.)
Some of the best work on the American Empire is being done by reporters and publications exclusively focused on the various branches of the US military. Sean Naylor, of the Army Times, is an example. His six-part series on covert American activities in Somalia offers enormously valuable insight into the nuts and bolts of global imperial efforts, missions that will go on long after the United States has left Iraq and Afghanistan. An excerpt:
The official referred to Joint Special Operations Command’s notion of “the unblinking eye” — using intelligence, surveillance and reconnaissance assets to keep a target under constant watch. In Iraq and Afghanistan, JSOC was “developing the concept of ‘we don’t want any blinks in our collection’ — the unblinking eye,” the senior intel official said.
But the wars in those countries deprived commanders in the Horn of the overhead assets they needed, “so in Somalia, it was a blink all the time,” the official said, adding that commanders “would go days without any kind of overhead collection capability” they controlled.
Not a military-focused organization, but still intensely informative is The Bureau of Investigative Journalism, a London-based non-profit that “bolsters original journalism by producing high-quality investigations for press and broadcast media.” They’ve created a timeline of known American actions in Somalia.
A map of the world, split into American unified combatant commands:
The counterfactual is what would have happened if Obama had proposed a much larger stimulus to begin with. His political team believed it would have risked delaying the bill or caused it to collapse entirely. Perhaps. It’s also possible it would have simply shifted the frame of the debate, so that “large” was now defined by $1.8 trillion rather than $800 billion, and the “centrist” position would settle in at, say, a trillion and a half or thereabouts.
This is what you would do if you were buying a car or a house. It is elementary bargaining. It is something that even the most lackluster of legislators does, or should, know. Why it is coming as a revelation now, I cannot imagine.
We are not only safer than we think, we are safer than we have ever been, say Micah Zenko and Michael A. Cohen.
The world that the United States inhabits today is a remarkably safe and secure place. It is a world with fewer violent conflicts and greater political freedom than at virtually any other point in human history. All over the world, people enjoy longer life expectancy and greater economic opportunity than ever before. The United States faces no plausible existential threats, no great-power rival, and no near-term competition for the role of global hegemon. The U.S. military is the world’s most powerful, and even in the middle of a sustained downturn, the U.S. economy remains among one of the world’s most vibrant and adaptive. Although the United States faces a host of international challenges, they pose little risk to the overwhelming majority of American citizens and can be managed with existing diplomatic, economic, and, to a much lesser extent, military tools.
This reality is barely reflected in U.S. national security strategy or in American foreign policy debates.
By exaggerating threats, we overemphasize the need for defense spending. It’s a dynamic we saw during the Cold War, though there, Zenko and Cohen say, the threat was genuine if overhyped. Here, they argue, it’s nearly nonexistent.
Zenko and Cohen also say, “Such hair-trigger responsiveness is rarely replicated outside the realm of national security, even when the government confronts problems that cause Americans far more harm than any foreign threat.” I don’t know about this. What about inflation- and deficit-hawkery?
Anyway, the takeaway is not only that we are a nation of cowards, but that we are a nation of cowards shooting ourselves in the feet.
Indeed, the most lamentable cost of unceasing threat exaggeration and a focus on military force is that the main global challenges facing the United States today are poorly resourced and given far less attention than “sexier” problems, such as war and terrorism. These include climate change, pandemic diseases, global economic instability, and transnational criminal networks—all of which could serve as catalysts to severe and direct challenges to U.S. security interests. But these concerns are less visceral than alleged threats from terrorism and rogue nuclear states. They require long-term planning and occasionally painful solutions, and they are not constantly hyped by well-financed interest groups. As a result, they are given short shrift in national security discourse and policymaking.
Which is to say, Zenko and Cohen write, we should stop going nuts over the one percent (or less) threats and concentrate on the 99 percent. Which is a nice translation of the Occupy rhetoric to foreign policy.
In the Washington Post, David Mayhew asks which was the most important presidential election in US history.
This is tough, because not all consequential presidencies derived from consequential elections. Roosevelt sailed to victory in 1932 and 1936, which respectively inaugurated and ratified the New Deal. It’s hard to feel the elections themselves were consequential, because they were nothing like close; it’s the surrounding circumstances and the appeal of Roosevelt’s reactions that were important; the presidency was won or lost before the actual election.
The 1860 election wasn’t close, so it’s more like 1932 that way. The 1960 election was very close, and maybe an earlier Nixon presidency would have made a big difference, but I’m not sure. A Dewey win in 1948 might well have mattered, but again, I’m not entirely sure.
I think the best match of consequential election – where the election itself determined who would go to the White House – to consequential presidency – where the person in the White House really shifted the course of history – was probably 2000.
IF A PERSON WHO PROVIDES CLASSROOM INSTRUCTION IN A PUBLIC SCHOOL ENGAGES IN SPEECH OR CONDUCT THAT WOULD VIOLATE THE STANDARDS ADOPTED BY THE FEDERAL COMMUNICATIONS COMMISSION CONCERNING OBSCENITY, INDECENCY AND PROFANITY IF THAT SPEECH OR CONDUCT WERE BROADCAST ON TELEVISION OR RADIO:
1. FOR THE FIRST OCCURRENCE, THE SCHOOL SHALL SUSPEND THE PERSON, AT A MINIMUM, FOR ONE WEEK OF EMPLOYMENT, AND THE PERSON SHALL NOT RECEIVE ANY COMPENSATION FOR THE DURATION OF THE SUSPENSION. THIS PARAGRAPH DOES NOT PROHIBIT A SCHOOL AFTER THE FIRST OCCURRENCE FROM SUSPENDING THE PERSON FOR A LONGER DURATION OR TERMINATING THE EMPLOYMENT OF THAT PERSON.
2. FOR THE SECOND OCCURRENCE, THE SCHOOL SHALL SUSPEND THE PERSON, AT A MINIMUM, FOR TWO WEEKS OF EMPLOYMENT, AND THE PERSON SHALL NOT RECEIVE ANY COMPENSATION FOR THE DURATION OF THE SUSPENSION. THIS PARAGRAPH DOES NOT PROHIBIT A SCHOOL AFTER THE SECOND OCCURRENCE FROM SUSPENDING THE PERSON FOR A LONGER DURATION OR TERMINATING THE EMPLOYMENT OF THAT PERSON.
3. FOR THE THIRD OCCURRENCE, THE SCHOOL SHALL TERMINATE THE EMPLOYMENT OF THE PERSON. THIS PARAGRAPH DOES NOT PROHIBIT A SCHOOL AFTER THE FIRST OR SECOND OCCURRENCE FROM TERMINATING THE EMPLOYMENT OF THAT PERSON.
B. FOR THE PURPOSES OF THIS SECTION, “PUBLIC SCHOOL” MEANS A PUBLIC PRESCHOOL PROGRAM, A PUBLIC ELEMENTARY SCHOOL, A PUBLIC JUNIOR HIGH SCHOOL, A PUBLIC MIDDLE SCHOOL, A PUBLIC HIGH SCHOOL, A PUBLIC VOCATIONAL EDUCATION PROGRAM, A PUBLIC COMMUNITY COLLEGE OR A PUBLIC UNIVERSITY IN THIS STATE.
Note that in the bill as written the “speech or conduct” is not limited to the classroom. By current FCC standards, I believe this means that, were I a professor in Arizona and this bill were to pass, I could say “shit” after 10 p.m. but never say “fuck.” I won’t even explore the conduct side of things.
Paging George Carlin.
The sources available to historians jump exponentially for the post-1945 era. The rise of typewriters, copy machines, computers, and printers created a blizzard of paper that shows no sign of ending. Add in all the electronic files, email, and the like, not to mention oral history recordings, and historians studying the years after World War II might be forgiven for having a thousand-yard stare and powerful bifocals. Google (which I am using as a generic term for search and indexing of all types; there goes the trademark) has helped some, but has its own problems.
Now comes the flood of video. The Air Force, the linked article notes, collects 6 petabytes (which is technical language for “Holy sh#$%$#%, that’s a lot of data”) of high-definition video per day. Such video could be remarkably useful for military historians (want to watch a combat engagement in real time?) but wading through it will be the work of generations.
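To put that 6-petabyte figure in viewing terms, a back-of-the-envelope calculation helps; the roughly 2 GB-per-hour HD bitrate below is my own assumption for illustration, not a number from the article.

```python
# Back-of-the-envelope: 6 petabytes of video per day (the article's figure);
# the ~2 GB-per-hour HD bitrate (about 4.4 Mbps) is an assumption.
PB = 1e15  # bytes in a (decimal) petabyte
GB = 1e9   # bytes in a gigabyte

daily_bytes = 6 * PB
gb_per_hour = 2  # assumed HD video bitrate

hours_of_video = daily_bytes / (gb_per_hour * GB)  # hours collected each day
years_to_watch = hours_of_video / (24 * 365)       # continuous viewing time

print(int(hours_of_video))    # 3000000
print(round(years_to_watch))  # 342
```

On those assumptions, every single day of collection would take several human lifetimes to watch end to end.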
Yesterday in the Aggie one read,
The UC sent cease-and-desist letters to notehall.com on Nov. 10, 2010, a note-sharing website owned by the Santa Clara company Chegg, as well as coursehero.com on Jan. 10, 2011, appealing to the websites to stop encouraging students to post notes on their sites. They remained in negotiations for several months before the sites removed the content.
Today, I received an email from someone named Tracy King, Content Administrator, reading
Thanks for being part of the Notehall family. We are working hard to expand our services at UNIVERSITY OF CALIFORNIA-DAVIS and need your help.
Apply to be a Note Taker this term.
The job is a flexible semester long position. Depending on the class you cover you can earn up to $450.
• Take notes for a class you’re currently enrolled in
• Create study guide for exams
• Earn commissions
• Make money for being a good student!
I don’t think the UC is making much of an impression on Notehall; if they took down the content, they are apparently now keen to replace it.
Both military historians and the United States military have long had an unhealthy fascination with the German Army of World War II. The Wehrmacht, the thinking goes, was both enormously effective (much more so than its enemies) and apolitical. Supposedly uninvolved in Nazi war crimes, and thus unstained by them, the Wehrmacht was a useful military model. Add to that the start of the Cold War, in which the Soviets became the main enemy, and the American military looked to the Germans for information and inspiration. The apotheosis of this was Colonel Trevor Dupuy’s Quantified Judgment Model, which used statistical analysis to conclude that Germans were more effective soldiers than Americans in World War II. The Germans’ Combat Effectiveness Value, according to Dupuy, was significantly higher than that of the Allies, western and eastern fronts alike. Each German soldier, Dupuy figured, was worth about 1.55 Americans.
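The arithmetic behind a figure like 1.55 is simple even if Dupuy’s full model is not: compare each side’s battlefield output per soldier and take the ratio. A minimal sketch, with hypothetical numbers of my own rather than Dupuy’s data (his actual QJM also adjusts for weapons, terrain, posture, and many other variables):

```python
def combat_effectiveness_ratio(a_score, a_strength, b_score, b_strength):
    """Per-soldier performance of side A relative to side B.

    `score` is any output measure (say, casualties inflicted);
    `strength` is force size. Dupuy's real model layers many
    situational adjustments on top of this bare comparison.
    """
    return (a_score / a_strength) / (b_score / b_strength)

# Hypothetical engagement: 10,000 soldiers on side A inflict 3,100
# casualties; 12,000 on side B inflict 2,400. Per-soldier ratio:
cev = combat_effectiveness_ratio(3100, 10_000, 2400, 12_000)
print(round(cev, 2))  # 1.55
```

The controversy, of course, is over everything packed into the adjustments and the input data, not over division.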
The result of this fascination has come out in a number of ways, including the Marine Scout Snipers who decided that the SS symbol was a good one to adopt as their logo. But it also focused the American Army on the kind of mechanized warfare that it took as the German model. Facing the Soviets across the Fulda Gap during the Cold War, American soldiers found themselves symbolically in the same position as the Germans in WWII (excepting the whole Germans-invading-the-USSR thing, of course). That focus was (arguably) useful in Western Europe, but less so in other theaters, like Southeast Asia. The Germans were notoriously bad at counterinsurgency, and some of the American difficulties, I think, came over from the German model (note that pre-WWII, Americans had been pretty reasonable at counterinsurgency). The fascination with the Germans might also have influenced American imperial behavior; alone (I think) among major imperial powers, the United States did not have a separate imperial military force (like the Indian Army) that it used abroad.
The US military has gotten away from that obsession somewhat in recent years and adapted quite well to the requirements of counterinsurgency (though it shows signs of backtracking), but, as the Marines demonstrated, the fascination with Nazi Germany remains.
Military historians have pushed back against the apolitical image of the Wehrmacht in recent decades.
Dupuy explained his model concisely here (subscription required, sorry). There were substantial problems with the analysis, outlined here and here. Dupuy recognized some of these problems, devoting a chapter called “Fudge Factors” to them in his book on the topic, Numbers, Predictions, and War.
On the jacket of Alexander Field’s new book A Great Leap Forward, my colleague Greg Clark says this:
As we sit mired in the Great Recession, Alexander Field’s exciting reappraisal of the Great Depression offers surprising solace. By showing the Great Depression was coupled with the most rapid technological advance in U.S. history, he fundamentally recasts the history of the 1930s. But he also offers hope that our own depression likely will have no long-run costs to the U.S. economy.
By measuring total factor productivity (TFP), or the improvement in productivity not accounted for by traditional inputs, Field finds tremendous gains during the Depression. They owe in part to private investment in manufacturing efficiencies, chemical processes, and other technical improvements. Historiographically, there’s a major payoff in showing that the vast majority of such innovation came during the Depression, not during the war.
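For readers who want the mechanics: TFP growth is conventionally measured as the Solow residual, output growth minus the growth that capital and labor inputs can account for. A minimal sketch with illustrative numbers of my own, not Field’s estimates:

```python
def solow_residual(output_growth, capital_growth, labor_growth, capital_share=0.3):
    """TFP growth: output growth minus share-weighted input growth.

    capital_share (alpha) is capital's share of income; labor's
    share is 1 - alpha. All growth rates are annual fractions.
    """
    input_growth = capital_share * capital_growth + (1 - capital_share) * labor_growth
    return output_growth - input_growth

# Illustrative: output grows 4% a year while capital and labor each
# grow 1%; inputs explain only 1 point, so TFP accounts for 3.
tfp = solow_residual(0.04, 0.01, 0.01)
print(round(tfp, 3))  # 0.03
```

Because the residual is whatever inputs fail to explain, finding large TFP gains in a decade of collapsed investment and employment, as Field does for the 1930s, is a striking result.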
But (as the bulk of Field’s book is devoted to showing) the productivity improvement owes mostly to transportation infrastructure – to the construction of roads, bridges, and all that made the modern trucking industry possible. Field even goes so far as to say the end of the golden age of productivity in the American economy in 1973 “coincides with [he does not quite say owes to] a tapering off of gains from a one-time reconfiguration of the surface freight system in the United States”.
And this massive public investment in infrastructure, which made possible the postwar suburbanization and boom, went along with financial regulation. Field attributes both the current crisis and that of the 1920s to “a failure to control, or really to be interested in controlling, the growth of leverage.” If we want to come out of the Current Unpleasantness with less than a Great Depression to show for it, we’ll have to see regulation that responds accordingly, he says. “If an even more serious crisis occurs within the next decade, it will be because the regulatory response ended up being less effective than that which was summoned during the New Deal.”
Which makes Field sound a lot less optimistic than Greg. The Great Depression turned out relatively well in the long run because we had not only significant private investment in R&D and other improvements, but also the New Deal – road-building and regulation. Do we have that, or anything like it, now?
The dark secret behind the bucket:
(I know that it’s vintage 2009. Still awesome. h/t Kevin Levin)
Abraham Lincoln, Vampire Hunter:
Michael Bérubé runs the numbers on Penn State:
In 1985, the state provided 45 percent of Penn State’s budget; in 2011 it provided 6 percent. In 1985, in-state tuition was just over $2,500; today it is over $16,000. Over the past twenty-five years, the cost of a public college education has increasingly been offloaded onto individual students and their families, as education has been redefined from a public good to a private investment.
And he concludes:
A fully privatized Penn State no longer has any reason to call itself “Penn State.” Indeed, the name would amount almost to false advertising, since there would be nothing “State” about us. And that means a whole new vista would be open to us – and in different ways, to Temple and to Pitt. In two words: naming rights … Let the bidding begin.
My hopes are in the title.
In a WSJ op-ed called “Forty Years of Paper Money,” Detlev Schlichter, a supporter of the gold standard, begins thus:
Forty years ago today, U.S. President Richard Nixon closed the gold window and ushered in, for the first time in human history, a global system of unconstrained paper money under full control of the state.
Now, with that title, that lede, and Schlichter’s very stern opinions about paper money, you’d think that the paper money era began right then, forty years ago. But as Schlichter himself says in his very next sentence,
It is not that prior to August 15, 1971, there was a gold standard. Far from it. Most countries had severed any direct link between their currencies and gold many years earlier.
Right. So the shift to a paper money system didn’t begin with Nixon. And the monetary thing that existed before Nixon’s intervention – the monetary thing that was not, per Schlichter, a gold standard – the monetary thing that happened to go along with decades of global growth and prosperity, the monetary thing that goes wholly unmentioned in the op-ed, was the Bretton Woods system.