Kevin Drum says he’s adopting Sir Richard Mottram’s extended conjugation as his personal mantra:
We’re all f*cked. I’m f*cked. You’re f*cked. The whole department is f*cked. It’s the biggest cock-up ever. We’re all completely f*cked.
Which reminds me of what I believe Walter LaFeber said was Brooks Adams’s shaving song: the phrase “God-damn” repeated to the melody of the Westminster chimes. I mean, it probably is too much to ask that one’s fatalism be cheerful, but musical seems a reasonable request.
Welcome to the January 17, 2010 edition of the Military History Carnival, a roundup of the best recent military history from around the web. This is the first time that H-War and Edge of the American West have co-hosted. Today’s edition ranges widely, from the Ottoman Empire to the Atomic Bomb to the American Civil War.
Jason presents 1683: Merzifonlu Kara Mustafa Pasha, for the Battle of Vienna posted at Executed Today.
World War I
Rich Landers presents Sarrey, France 1/5/1919 posted at Soldier’s Mail, saying, “Soldier’s Mail features the writings home of U.S. Sergeant Sam Avery from the front lines of American involvement in the Great War. Letters are posted on the same date they were written more than 90 years ago, and make for fascinating eyewitness history from the hot sands along the Rio Grande to the cold mud along the Meuse. Come march along with the Most Gallant Generation!”
World War II
Joseph McCullough presents Osprey Publishing – Military History Books – Blog – Photographs from a life in the Royal Navy posted at Osprey Publishing.
Nikolaos Markoulakis presents The Security Battalions: ‘Quislings’ on Behalf of the King posted at Hoplite, saying, “The reasons for the existence and use of the Axis’ created Security Battalions. The paper focuses on the volunteers of the Security Battalions. Why did they join? Was it only to combat communism? Or were the reasons more complicated?”
Scott Manning presents Nazi Body Count: 20,946,000 Non-Battle Deaths posted at Digital Survivors, saying, “The Nazi Body Count represents non-battle deaths caused by Nazi Germany between 1933 and 1945. This includes genocide, execution of civilians and POWs, forced labor that resulted in deaths, bombing of civilian populations, imposed famine and resulting diseases, and “euthanasia.” These numbers do not include civilians who got caught in the cross-fire of battle.”
Steven Germain presents In The Dead Silence Of The Morning…(Now I am become Death – the destroyer of worlds…) posted at Rough Fractals, saying, “Perhaps the defining moment in the history of man – the capability of annihilation becomes a tool of modern warfare.”
David Gross presents The Unconquerable World (Jonathan Schell) posted at The Picket Line, saying, “In The Unconquerable World Jonathan Schell tells the story of the evolution of the logic of war and political power in a way that might just give it a happy ending after all.”
That concludes this edition of the Military History Carnival. Submit your blog article for the next edition using the carnival submission form.
Past posts and future hosts can be found on our blog carnival index page.
Jean Baudrillard turns out to have had it wrong: I say television creates real communities. Friday Night Lights is no false simulacrum. It’s practically as real as real life—a show about high-school football that’s also about race and class and physical handicap and angst and sex (fraught sex between teenagers, mature sex between their parents) and why people fear things they don’t know. When I watch it, and know that millions of others are—and when I visit its Web site or read chat rooms devoted to the show—I become a part of something.
I think Tomasky’s broad point is correct, though I prefer my virtual television-inspired community at an ironic remove—I never visit a show’s website, but I will go to its Television Without Pity site.
But I want to address his narrower point, about the virtues of Friday Night Lights. He’s right about all the things the show is about, and that—with the exception of the crazy stupid homicide plot in season 2—it does a good job being about these things.
Tomasky might have gone a bit further, though. By being a good show about race, sex, disability, football, and fear of the unknown in a small Texas town, it’s also a powerful show about Texas Republicans as real people, and as good people.
If you’ve got questions, we’ve got answers. Or maybe we don’t. But you can still try. Send your queries, serious or otherwise, to edgeoftheamericanwest AT geemail DOT com, and we’ll do our best to give you absolutely ridiculous answers.
* Adapted from Lillian Hellman.
A reader writes in to ask what I do about students who talk during my lectures. It’s a good question, as the problem seems to be getting worse the longer I teach. Whether I’m getting more boring (likely), my students are getting more unruly (perhaps), or the classroom culture is becoming more and more like the comments section of Matthew Yglesias’s blog (I doubt it, but maybe), I don’t really know.
As for the question, at the beginning of every quarter I talk to my students about my expectations of them, including my desire that they not talk during lecture. Honestly, I no longer care if they sleep, read, or surf the web. So long as they don’t keep other people in the class from listening to me and maybe learning (I can dream, right?), and so long as they’re somewhat respectful of me, we’re cool. Which is to say, I prefer that they not snore loudly while sleeping or make a big show of reading their friends’ facebook pages. Other than that, though, whatevs.
But a few years back, I singled out some backward-ball-cap-wearing kewl kidz for repeatedly talking and laughing during classes in the latter part of the term. They were recidivists, in other words, and should have known better. They had initially ignored my subtle looks and later my not-so-subtle glares. And they were clearly going to keep up their shenanigans until I smacked them down. So I did. I didn’t say anything too terribly harsh, something along the lines of, “Please stop talking during class. It’s incredibly hard for me to concentrate when you guys behave in this way.” And they stopped. For the rest of the term. Mission accomplished, right?
Well, I later wished that I had talked to them individually*, as the punishment — public humiliation — seemed to outstrip the crime. So since then, I’ve either tried to pull people aside after class or make an announcement, to the whole room, without looking at the offending parties, during my lecture: “You’ll recall that on the first day I said that I really can’t stand it when people talk during class. Please try to keep that in mind, okay?” And that seems to do the trick. But I’m open to new ideas, as the older I get, the more crowded my lawn seems to become.
* Of course I felt guilty. Because I always feel guilty. (Seriously, always. It never stops.) Put another way, one’s approach to classroom management is almost certainly going to vary depending on one’s personality. Not to mention one’s gender, which obviously has a huge impact on how one approaches these issues. So while I’m a guilt-ridden Jew, I’m also a relatively big** guy, which means that I probably have to deal with less of this crap than many other people out there do.
** Fine, fat. You’re so mean. This is supposed to be a safe space, you know.
In the hands of a writer sick with ambition, these subjects might have become the occasion for a meditation on the virtues of discipline; for a writer poisoned by sentiment, they might have become treacly elegy. But Ebert seems these days just to be writing because he really wants to tell you how it is, and it’s very good writing indeed.1
1Which is not to say that I’ve never felt misled by his movie criticism. Not to go too deeply into things, but I would leap to play the dour Siskel to his thumbs up for Indiana Jones and the Crystal Skull and Synecdoche, New York.
On my return from San Diego and another American Historical Association Convention, I received the following in the mail:
What “quaint and curious volume of forgotten lore” was this? And how would I know if it was good or not? Then I spied the supreme endorsement on its back:
From the web edition of Jobs for Philosophers, put out by the American Philosophical Association:
306. SAINT MARY’S COLLEGE OF CALIFORNIA, MORAGA, CA. POSTDOCTORAL RESIDENT, COUNSELING CENTER. Saint Mary’s College of California – Moraga, CA. For the 2010-2011 academic year. The Residency requires a 9.5 month, 5 day per week commitment in order to meet California licensing requirements of 1500 hours supervised postdoctoral experience. Qualifications: Psy.D, Ph. D. in Clinical/Counseling Psychology. College/University Counseling Center experience at practicum and/or internship level (desired). Fluency in Spanish (desired). Salary and benefits are competitive and subject to the availability of funding sources. Complete details are available at http://jobs.stmarys-ca.edu. Preferred deadline is 01/18/10. Open until filled. EOE. www.stmarys-ca.edu. (184W), posted 1/11/10
Yes, that’s right. Once again, our esteemed national organization has accepted a listing for someone with a psychology degree. Wrong APA! I’m sure there are charitable explanations, but my preferred non-charitable one is that the national organization is suffering from the same confusion as the typical acquaintance who hears a philosopher explain that he does philosophy and then asks if he’s allowed to prescribe Prozac or if he needs a medical degree to do that.
For nascent military historians, this is well worth it:
The West Point Summer Seminar in Military History seeks to improve the teaching of military history at the collegiate level by broadening its Fellows’ knowledge of military history and improving their ability to teach it. The Seminar brings together a select group of historians for an intensive series of seminars, lectures, and staff rides to local American Revolutionary War and American Civil War battlefields. These activities—led by members of the USMA faculty as well as a variety of noted military historians—seek to facilitate detailed discussions of historiography and pedagogy within the field of military history. Upon completion of the seminar, participants are prepared to return to their home institutions and develop or enhance a program in the study of military history.
Further information here.
I had a pleasant conversation with a new acquaintance at the AHA that turned, at one point, to the merits of the 1873 Slaughterhouse decision.1 Most historians know the case because in it the Supreme Court offers a highly constrained reading of the Fourteenth Amendment‘s critical first section:
All persons born or naturalized in the United States, and subject to the jurisdiction thereof, are citizens of the United States and of the State wherein they reside. No State shall make or enforce any law which shall abridge the privileges or immunities of citizens of the United States; nor shall any State deprive any person of life, liberty, or property, without due process of law; nor deny to any person within its jurisdiction the equal protection of the laws.
In Slaughterhouse, the Court says that this passage clearly grants two forms of citizenship, citizenship of the United States and of the state of residence; moreover, the clause protecting citizens from abridgment of their “privileges and immunities” clearly only covers those privileges and immunities people enjoy by virtue of their citizenship in the United States. It does not cover such privileges and immunities as people may enjoy by virtue of citizenship in a state.
Remember what I said about false positives? An interesting list from Kevin Drum.
One thing I’d add is that I’m not inclined to make too much of his being rejected for a visa by the UK, nor would I want to see a policy where being rejected by one country entails automatic suspicious treatment by all other countries, because being rejected for a visa or denied entry is both highly subject to the particulars of the admitting country’s laws and usually requires little in the way of evidence. A US CBP officer can deny you entry even if you have a valid visa; they have a lot of discretion. You can be rejected for one kind of American visa while retaining your eligibility for others. And it’s not clear in this case that it would have helped much: Abdulmutallab’s visa wasn’t renewed, not because anyone thought he was a terrorist, but because the UK cracked down on those abusing the sponsorship of student visas by starting fake colleges. (He had been a genuine student at a UK university in the past.)
While Iraq has been cooling down, Afghanistan has been heating up:
The conflict there has usually been divided between summers and winters, with summers seeing most of the combat. The above chart shows the divide pretty clearly, and it also shows that coalition fatalities in-country are increasing in both seasons.
The worst month for coalition forces in Afghanistan was July 2009, when 77 fatalities occurred. That month still does not rival the worst months of Iraq (141 in November 2004 and 131 in May 2007), but each summer and each winter has seen a higher number of fatalities, and January 2010 is, only 12 days into the new year, already the second-worst January of the war.
1. With all due respect to Lizardbreath, she’s going about her business plan completely the wrong way. I’m sure there are merits to the Paleo diet, yuppies thinking that raw grass-fed ground beef approximates paleolithic mastodon meat aside, but if you think about it, designing a diet for an organism that can thrive just as well on whale blubber and the occasional plant as it can on soybeans and rice really shouldn’t be all that challenging. Omnivores are adaptable! You might not lose weight, but it’s probably not going to kill you. If we analyze it conceptually, we can see that a fad diet consists of the following elements: a) a ban or near-ban on pre-packaged foods; b) a ban or near-ban on one kind of macronutrient; c) some form of calorie restriction; d) an exhortation to exercise; and e) a story about why this is so, the more romantic the better.
For example, in the caveman diet, we take “follow our conception of the caveman” as a maxim, thus, we do not eat packaged foods, cereal grains (Grok don’t farm), we eat lots of meat (Grok hunt) and berries and nuts (Grok forage), we fast now and then (Grok no have refrigerator), and we sprint and jump (Grok surprise mastodon.)
Thus, the growth industry here is in telling a good story. I therefore present the Path of Hypatia, designed especially for women, drawn from the wisdom of the ancients who said let thy food be thy medicine and thy medicine thy food. The program recommends that you eat primarily olives, cheese, bread, lemons, fish, and wine, and recommends both quiet time for contemplation (stress makes you retain belly fat, so do your metaphysics!) and vigorous Spartan exercise every day, for is it not said, “wealth, flutes, and instruments generally?” I’m sure LB can come up with a good lawyer saga.
2. Someone give me an argument why this isn’t simply common sense. Here are the terms of the mortgage: pay it back, or the bank can take back the house and your credit rating will take a hit. Everyone agreed to this mortgage. There is no clause that says “you can let the bank take the house back only if you’ve bankrupted yourself in other ways first” or (in many states) “if the house isn’t worth what the loan is, the bank can come after you for the balance.” So by “walking away” we mean “fulfilling the terms of the contract.”
Now, sure, I think there’s generally a moral obligation not to be a dork, and there would be repercussions for the lending market generally if lots of people defaulted strategically. But look: there are lots of repercussions when the housing market collapses, and I haven’t seen a call for the banks independently to do the moral thing and let people pay 33% of their income on their mortgage in order to stabilize prices. Why is it that the homeowner is held to a higher standard?
3. I just have to note this because the second item is cracking me up.
I was talking about this fantastic bit of parenting with a friend recently:
Notoriously, [Auberon Waugh] claimed that [his father] Evelyn [Waugh], after the war, had made all his children sit round and watch while he scoffed their banana rations with cream and sugar.
[Auberon’s son] Alexander believes that the banana story was true: “He was a very greedy little boy, and he definitely would have remembered the bananas and he definitely would have resented them. But my point in the book is that you cannot trust the testimony of a very greedy jam tart thief, who would rather have a jam tart than meet his father.”
And it came out that someone we knew didn’t know that Evelyn Waugh was a man. Shameful! But then again I was in graduate school before I realized that L.A. Paul is a man, so I’m not one to talk. Hilary Putnam, too.
Did you know Evelyn Waugh’s first wife’s name was also Evelyn? Their friends called them He-Evelyn and She-Evelyn. Cute!
Of all places, however, the Republic of Texas (1836-1845) was the undisputed champion of political suicide during the 19th century. No other state witnessed the self-murder of so many of its early leaders, with no fewer than five prominent Texans taking their own lives between independence and the end of the US Civil War. George Childress, for instance, one of the republic’s founders, gutted himself with a Bowie knife in 1841 after three attempts at establishing a law practice came to naught. Two other founders, Thomas Jefferson Rusk and Royal Tyler Wheeler, shot themselves in 1857 and 1864, respectively.
In mid-July 1838, the republic’s presidential campaign — a contest to succeed Sam Houston — was thrown into chaos when two of the major contenders committed suicide within a span of 48 hours. On July 8, Peter Wagner Grayson — Houston’s attorney general and heir apparent — ended his two-decade-long struggle with mental illness by shooting himself in Bean’s Station, Tennessee, one day after complaining in a letter to a friend that his mind had been taken over by “fiends.” On July 11, James Collinsworth, another notable founder and member of the republic’s first senate, concluded a week-long bender by launching himself into Galveston Bay. (Mirabeau B. Lamar — a bitter enemy of Houston’s — wound up with the presidency, an outcome that so irritated Houston that he delivered a three-hour farewell address at Lamar’s own inauguration.)
When statehood came in 1845, Texans named entire counties in honor of Collinsworth, Grayson and Wheeler. Rusk and Childress were commemorated with counties as well as towns.
Anson Jones, the last president of the Republic of Texas, was similarly honored. Jones County, Texas — one of 46 dry jurisdictions in the state — was named after him, with the town of Anson serving as the county seat. A physician by training, Jones had renounced medicine and migrated from Great Barrington, Massachusetts to Texas during the early 1830s, and he played important roles in both the war with Mexico and — a decade later — the annexation of the republic by the United States. After Texas’ absorption into the union in 1845, Jones was bitterly disappointed not to be appointed to the US Senate. Sam Houston and Thomas Jefferson Rusk were awarded the seats instead. In a spite-soaked letter to a friend, Jones predicted that his tombstone would someday read, “Murdered by a Country He Served and Saved.”
During the last decade of his life, Jones wallowed in his disappointment, which was only accentuated by a crippling arm injury that came when he fell from his horse in 1849. In 1857, Rusk vacated his seat by committing suicide; his wife had recently died of tuberculosis, and a tumor was discovered in his neck. Coupled with Sam Houston’s decision to run for governor, the death of Thomas Rusk meant that Anson Jones’ senatorial ambitions suddenly appeared nearer to realization. Buoyed by misplaced optimism, he returned to the state capital, where he expected a triumphant welcome and a quick election to the upper house of Congress. Instead, Jones’ arrival was virtually unnoticed in Austin, and he spent his days in his hotel room, brooding over his memorabilia from days gone by, ruminating over newspaper clippings and old letters that he believed would vindicate him in the eyes of history. He received exactly zero votes in the state legislature, which instead preferred James Pinckney Henderson, another former attorney general whom Jones dismissed as a “gamester and a sot.” (Henderson enjoyed the briefest of terms in office, dying of pleurisy in August 1858.)
Rejected by the country he served and saved, Jones’ life spiraled toward its conclusion. After selling his plantation for a quarter of its value, Jones traveled to Houston in January 1858 and sequestered himself in the Old Capitol Hotel for four dismal, lonely nights. There, he opened a letter from his wife, Mary, who expressed confidence that “this little trip will be of service to you.” She urged him to “blot out the past” and forget Texas’ “ingratitude toward you.”
On the morning of January 10, it was reported that the forlorn Texas statesman had been discovered “lying across his bed this morning at half past 8 o’clock, a discharged pistol in his hand and his brains blown out. This is all the particulars of this lamentable affair we have been able to obtain.”
As the only US historian at my tiny university, I’m obligated to teach both halves of the survey as well as all sorts of courses that take me outside my areas of expertise. And lacking many of those, I spend much of my time in the classroom wondering if I’m not selling expired merchandise, interpretations or approaches that more accomplished professionals with better time management skills might have dismissed years if not decades ago.
That said, I’m relieved to report that at least where my treatment of the Revolutionary War is concerned, I manage to avoid these seven myths, described and debunked in nice detail by John Ferling. I come somewhat close to repeating two of the items on the list — I probably understate the possibility that Britain might have won the war after 1778, and I probably overstate the turning-pointedness of Saratoga — but I think a jury of Ferlings would probably vote to acquit.
The minute Abdulmutallab’s father walked into a U.S. Embassy with news that his son was a potential terrorist, the official in charge was duty-bound to see this through. Every scrap of paper and every byte of data on the suspect should have been called up and frozen. That’s why we have embassies. When the information was passed to the first special agent at the CIA, he or she was duty bound to see it through. When the information was passed to the first administrator at the National Counterterrorism Center, he or she, too, was duty bound to see it to the end.
Everyone who read the name “Umar Farouk Abdulmutallab” prior to December 25, 2009 should be reprimanded and fired.
Much has been made of the fact that Abdulmutallab’s father, in a modern Euthyphro dilemma, informed on his own son. What has been made has generally taken one of two forms: jokes about how hard it must be for a Nigerian banker to get his proposals taken seriously FOR OUR MUTUAL BENEFIT, and incredulity that when the man was turning in his son we didn’t immediately arrest the young man or at least put him on the no-fly list or revoke his visa.
The second form is an understandable reaction (his son!), but a moment’s reflection on our recent adventures in Afghanistan and Iraq should show the intelligent observer the perils of concluding that someone is a terrorist based on the say-so of a relative, colleague, or acquaintance. If I recall correctly, this is now one of the many lingering problems there: it turns out that when you detain and torture someone based on the say-so of an informant, chances are that person is innocent, and your best-case scenario is hoping you didn’t radicalize a formerly innocent person who now has plenty of reason to hate you.
Now, obviously, no one’s suggested that Abdulmutallab should have been hauled off and tortured, but the point really needs to be made that just because someone says another person is a terrorist doesn’t necessarily mean that they are, and it’s a good thing if the United States doesn’t act like that. That it was his father who informed on him is fascinating, but not proof of anything particularly.*
All that aside, there’s an interesting puzzle here that’s being overlooked.
Let’s treat the proposition someone is a terrorist as a proposition for which we can gather evidence that warrants belief that the proposition is true. Let’s treat all the evidence that we can gather — parental informants, ties to radical imams, patterns of study, religion, country of origin, one-way ticket, lack of checked luggage** — like a test, and let’s stipulate further that we have a very reliable terrorist test. If we present the test with the profiles of 100 terrorists, it will correctly identify 99 of them as terrorists; 1% slip through. For simplicity’s sake, it also misidentifies an innocent person as a terrorist 1% of the time.
Now suppose a person about whom nothing is known comes to the attention of the powers that be, and they administer the test, and it’s positive for terrorism.
What is the chance that the person really is a terrorist? Formulate your answer and then follow the jump.
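If you want to check your answer, the arithmetic is just Bayes’ theorem, and the key missing ingredient is the base rate: how rare terrorists are in the population being screened. Here’s a minimal sketch; the base rate of 1 in 100,000 below is a purely hypothetical assumption (the puzzle as stated doesn’t supply one), and the function name is mine.

```python
def posterior_terrorist(base_rate, sensitivity=0.99, false_positive_rate=0.01):
    """P(terrorist | positive test), via Bayes' theorem.

    base_rate: prior probability a randomly chosen person is a terrorist.
    sensitivity: P(positive | terrorist) -- here, 99 of 100 terrorists flagged.
    false_positive_rate: P(positive | innocent) -- here, 1% of innocents flagged.
    """
    true_positives = sensitivity * base_rate
    false_positives = false_positive_rate * (1 - base_rate)
    return true_positives / (true_positives + false_positives)

# With a (hypothetical) base rate of 1 in 100,000, even this "very
# reliable" test implies a flagged person is almost certainly innocent:
p = posterior_terrorist(1 / 100_000)
print(f"{p:.4%}")
```

The point of the puzzle survives any plausible base rate: when the condition being tested for is rare, the 1% of innocents who test positive vastly outnumber the true positives, so a positive result alone warrants very little.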