Kevin Levin has been having some fun with Larry Schweikart’s recently published — and oddly titled — 48 Liberal Lies About American History. (I mean, only 48? Seriously? He couldn’t find two more? Clearly, he hasn’t reviewed the latest scholarship on George Washington’s ursine sex fetishes and contributions to the early cocaine trade, to say nothing of his extra testicles and his callous disregard for the British children.)
Anyhow, Schweikart — last seen writing a book that should have embarrassed his mother — has discovered some remarkable untruths that are, he claims, standard leftist issue in US History texts. Among them:
- “John F. Kennedy was Killed by LBJ and a Secret Team to Prevent Him from Getting Us Out of Vietnam”
- “Ronald Reagan Knew ‘Star Wars’ Wouldn’t Work but Wanted to Provoke a War with the USSR.”
- “September 11 Was Not the Work of Terrorists. It Was a Government Conspiracy.”
It hardly needs mentioning that none of these claims are even remotely endorsed by any current US history textbooks — or at least those that haven’t been self-published by unmedicated crazy people — and that Larry Schweikart must be confusing “liberal US history textbooks” with “amateur videos I found on YouTube.” None of that will matter to the Texas School Board, for whom I’d guess Schweikart is eagerly preparing a high school version of his Patriot’s History, complete with its reassurances that the men who died at the Alamo were “freedom fighters” and that Mexico’s finest soldiers ran from San Jacinto like screaming children.
But since Schweikart seems particularly concerned about the alleged presence of delusional conspiracy theories in American history texts, perhaps it’s worth reviewing his and Michael Allen’s treatment of the Oklahoma City bombing for a point of comparison. From pp. 785-786 of A Patriot’s History of the United States, the authors treat us to this:
[I]n his haste to lay the blame on antigovernment extremists, Clinton and the entire U.S. intelligence community missed several troubling clues that perhaps McVeigh and Nichols had not acted alone. Nichols, for example, was in the same part of the Philippines — and at the same time — as Al Qadea [sic] bomb maker Ramzi Yousef. Moreover, numerous witnesses testified that McVeigh and Nichols lacked sufficient bomb-making skills, but that their bomb was a near-perfect replica of the 1993 World Trade Center bomb devised by Yousef.
The footnotes to this section lead us to a handful of books published by the distinguished Regan Press and — the phrase “no shit” comes to mind here — World Net Daily’s publishing house. All of which makes me wonder if the University of Dayton’s history department allows Larry Schweikart to teach its undergraduate methods seminar. At any rate, the “Third Terrorist” theory has long been a staple of right-wing mythology and was promoted vigorously in 2001 and 2002 by such totally credible experts as Bill O’Reilly, Frank “Sharia” Gaffney and Larry “Whitey Tape” Johnson. The fact that the theory has no basis in evidence hardly disqualifies it from inclusion in Schweikart’s book; apparently, its top-shelf wingnuttery more than compensates for its actual flaws. It’s an impressive trick, though, to follow up this sort of insane conspiracy-peddling by publishing a book that indicts “liberal” historians for circulating conspiracy theories they’ve actually done nothing to promote.
Today we remember the hapless epicure Adolf Frederick, a twig from the Holstein family tree who reigned as King of Sweden from 1751 until his death on this date in 1771. His two decades on the throne were significant only to the extent that he presided helplessly over the decline of the Swedish kingdom.
As sovereign, Adolf Frederick was almost completely powerless and functioned more or less as an ornament while the riksdag managed the affairs of state, which included Sweden’s commitment to the Seven Years’ War — the first truly global war in human history, provoked by a poisonous mixture of European internal politics and colonialism. Sweden’s contribution to the struggle consisted mainly of throwing a series of tiny, scrofulous armies into the field against Frederick II of Prussia. The Prussian ruler was so unimpressed with the Swedish effort that when hostilities ended in 1762, he expressed mock surprise upon learning that he had been at war with Sweden in the first place. The war upended Swedish national life, resulting in the temporary collapse of the political hegemony enjoyed by the upper nobility (known as the Hats); the lower nobility (known as the Caps) assumed power over the government but faced fierce resistance from the Hats as well as constant foreign interference from Prussia, Russia and Denmark. King Adolf Frederick was irrelevant to the plot. An avid woodworking hobbyist, he spent most of his time playing with his beloved turning lathe.
Among his other passions in life, Frederick was an avid collector of biological specimens. During his years as the Swedish crown prince, he served as an important resource for Carl Linnaeus, who studied the prince’s cabinet while sorting out the details of his famous taxonomic system. On February 12, 1771, Frederick’s two decades of idle monarchy came to an end. That night, he celebrated Fettisdagen — better known to us as Fat Tuesday — by gathering his final collection of specimens for a titanic pre-Lenten feast of lobster, boiled meats, caviar, sour cabbage, smoked herring, turnips and champagne. For dessert, the king gobbled fourteen servings of semla, a traditional wheat pastry usually served in warm milk with cinnamon and raisins. He died that night — propped up on Queen Louisa Ulrica’s knees — of a massive digestive event, the details of which have sadly not been preserved in Swedish historiography.
Whatever we might offer on the merits of Zinn’s signature work, I think we can all unite around the knowledge that it beats A Patriot’s History of the United States like a rented, red-headed step-mule. I don’t ordinarily have favorite paragraphs in anything I write — preferring instead to focus on paragraphs I hate slightly less than all the rest — but this one was a lot of fun to produce:
Worse, in their chapters on recent U.S. history, the authors make claims that are not even remotely endorsed by the footnoted sources. In excoriating the Great Society, for instance, Schweikart and Allen observe that one “malignant result of AFDC’s no-father policy was that it left inner-city black boys with no male role models” (p. 689). In support of this Gingrichian pronouncement, the authors cite a single 1989 study from Social Forces — an article that makes no mention of AFDC, inner-city black youth, or role models and indeed has almost nothing to do with the argument to which it is attached. In the same paragraph, we read further that after the 1960s, “gang leaders from Portland to Syracuse, from Kansas City to Palmdale, inducted thousands of impressionable young males into drug running, gun battles, and often death” (p. 689). For this dramatic observation, the authors rely on two broad studies of family structure and drug use, each published eight years apart in the Journal of Marriage and the Family. Among the phrases that do not appear in either study: “Gang leaders,” “Portland,” “Syracuse,” “Kansas City,” “Palmdale,” “impressionable young males,” “drug running,” “gun battles,” and “death.” With little effort, this reviewer has identified nearly a dozen such cases in which the authors have tortured their sources to score points against social programs they oppose, political philosophies to which they object, or historical actors whom they do not like.
Wow, that was a shitty book.
Of all places, however, the Republic of Texas (1836-1845) was the undisputed champion of political suicide during the 19th century. No other state witnessed the self-murder of so many of its early leaders, with no fewer than five prominent Texans taking their own lives between independence and the end of the US Civil War. George Childress, for instance, one of the republic’s founders, gutted himself with a Bowie knife in 1841 after three attempts at establishing a law practice came to naught. Two other founders, Thomas Jefferson Rusk and Royall Tyler Wheeler, shot themselves in 1857 and 1864, respectively.
In mid-July 1838, the republic’s presidential campaign — a contest to succeed Sam Houston — was thrown into chaos when two of the major contenders committed suicide within a span of 48 hours. On July 8, Peter Wagner Grayson — Houston’s attorney general and heir apparent — ended his two-decade-long struggle with mental illness by shooting himself in Bean’s Station, Tennessee, one day after complaining in a letter to a friend that his mind had been taken over by “fiends.” On July 11, James Collinsworth, another notable founder and member of the republic’s first senate, concluded a week-long bender by launching himself into Galveston Bay. (Mirabeau B. Lamar — a bitter enemy of Houston’s — wound up with the presidency, an outcome that so irritated Houston that he delivered a three-hour farewell address at Lamar’s own inauguration.)
When statehood came in 1845, Texans named entire counties in honor of Collinsworth, Grayson and Wheeler. Rusk and Childress were commemorated with counties as well as towns.
Anson Jones, the last president of the Republic of Texas, was similarly honored. Jones County, Texas — one of 46 dry jurisdictions in the state — was named after him, with the town of Anson serving as the county seat. A physician by training, Jones had renounced medicine and migrated from Great Barrington, Massachusetts, to Texas during the early 1830s and played important roles in both the war with Mexico and — a decade later — the annexation of the republic by the United States. After Texas’ absorption into the union in 1845, Jones was bitterly disappointed not to be appointed to the US Senate. Sam Houston and Thomas Jefferson Rusk were awarded the seats instead. In a spite-soaked letter to a friend, Jones predicted that his tombstone would someday read, “Murdered by a Country He Served and Saved.”
During the last decade of his life, Jones wallowed in his disappointment, which was only accentuated by a crippling arm injury that came when he fell from his horse in 1849. In 1857, Rusk vacated his seat by committing suicide; his wife had recently died of tuberculosis, and a tumor was discovered in his neck. Coupled with Sam Houston’s decision to run for governor, the death of Thomas Rusk meant that Anson Jones’ senatorial ambitions suddenly appeared nearer to realization. Buoyed by misplaced optimism, he returned to the state capital, where he expected a triumphant welcome and a quick election to the upper house of Congress. Instead, Jones’ arrival was virtually unnoticed in Austin, and he spent his days in his hotel room, brooding over his memorabilia from days gone by, ruminating over newspaper clippings and old letters that he believed would vindicate him in the eyes of history. He received exactly zero votes in the state legislature, which instead preferred James Pinckney Henderson, another former attorney general whom Jones dismissed as a “gamester and a sot.” (Henderson enjoyed the briefest of terms in office, dying of pleurisy in August 1858.)
Rejected by the country he served and saved, Jones’ life spiraled toward its conclusion. After selling his plantation for a quarter of its value, Jones traveled to Houston in January 1858 and sequestered himself in the Old Capitol Hotel for four dismal, lonely nights. There, he opened a letter from his wife, Mary, who expressed confidence that “this little trip will be of service to you.” She urged him to “blot out the past” and forget Texas’ “ingratitude toward you.”
On the morning of January 10, it was reported that the forlorn Texas statesman had been discovered “lying across his bed this morning at half past 8 o’clock, a discharged pistol in his hand and his brains blown out. This is all the particulars of this lamentable affair we have been able to obtain.”
As the only US historian at my tiny university, I’m obligated to teach both halves of the survey as well as all sorts of courses that take me outside my areas of expertise. And lacking many of those, I spend much of my time in the classroom wondering if I’m not selling expired merchandise, interpretations or approaches that more accomplished professionals with better time management skills might have dismissed years if not decades ago.
That said, I’m relieved to report that at least where my treatment of the Revolutionary War is concerned, I manage to avoid these seven myths, described and debunked in nice detail by John Ferling. I come somewhat close to repeating two of the items on the list — I probably understate the possibility that Britain might have won the war after 1778, and I probably overstate the turning-pointedness of Saratoga — but I think a jury of Ferlings would probably vote to acquit.
In any case, I want to make clear that there is no way I would ever send Cornel West a box of fried chicken. If we’re going to indulge in identity politics, let me just mention that I come from a Southern working-class family. If I had a box of fried chicken, I would eat it myself. Cornel West earns more in a weekend of public speaking than I do from a year of writing. Let him buy his own food.
The basic problem with my love relationships with women is that my standards are so high — and they apply equally to both of us. I seek full-blast mutual intensity, fully fledged mutual acceptance, full-blown mutual flourishing, and fully felt peace and joy with each other. This requires a level of physical attraction, personal adoration, and moral admiration that is hard to find. And it shares a depth of trust and openness for a genuine soul-sharing with a mutual respect for a calling to each other and to others. Does such a woman exist for me? Only God knows and I eagerly await this divine unfolding. Like Heathcliff and Catherine’s relationship in Emily Bronte’s remarkable novel Wuthering Heights or Franz Schubert’s tempestuous piano Sonata No. 21 in B flat (D.960) I will not let life or death stand in the way of this sublime and funky love that I crave!
#1: Is Cornel West admitting that he is a zombie? And if so, by “sublime and funky love,” does he actually mean “sweet, nourishing brains?”
#2: Um. It’s been almost 20 years since I read Wuthering Heights, but somehow I don’t recall the novel being a useful guidebook for the fully-unfolded openness and flourishing of sublime and funky love. Not that I didn’t also crave back then a love fully-blown with funkiness and sublimity, mind you. Indeed, I — like Cornel West and Schubert — refused to allow anything (e.g., my homely appearance, my regrettable hygiene, my social clumsiness) to stand in the way of my craving for the sublime and the funky, love-wise. But when I imagined what a love flourishing with soul-sharing funk and mutual cravings for the eager and trusting sublime might look like, a character like Heathcliff — who, if memory serves, hanged his wife’s dog for the fun of it — would have seemed like a pretty unconvincing role model as I searched for evidence of God’s unfolding revelation of death-defying and life-affirming funkiness. But setting aside animal cruelty and spousal abuse, I will not let life or death stand in the way of this sublime and funky love that I crave!
What about you? What — if anything — will deter your funk-related cravings?
So I’m currently suffering my way through Sarah Palin’s book, in a style not altogether dissimilar to Jesus’ ordeal in the hands of the Roman Empire. I won’t pollute the air around here with too many details from the book, but I was amused to see that my former governor repeats the cherished myth that Americans mocked 19th century maverick William Seward for writing a Facebook note about “death panels” arranging the purchase of Alaska in 1867.
Critics ridiculed Seward for spending so much on a remote chunk of earth that some thought of as just a frozen, inhospitable wilderness that was dark half the year. The $7.2 million purchase became known as “Seward’s Folly” or “Seward’s Icebox.” Seward withstood the mocking and disdain because of his vision for Alaska. He knew her potential to help secure the nation with her resources and strategic position on the globe. . . . [D]ecades later, he was posthumously vindicated, as purveyors of unpopular common sense often are.
As Richard Welch pointed out more than a half century ago in the pages of the American Slavic and East European Review — a title that I’ll concede is likely not a part of Sarah Palin’s titanic reading list — the “Seward’s Folly” narrative has very little evidence to support it. Americans in fact knew quite a bit about the Russian territory prior to its purchase. Anyone connected to the whaling and fishing industries of New England, or to the West Coast fur trade, would have understood the potential value of securing Alaska; anyone who appreciated the value of thwarting British ambitions to round out their Canadian empire would have been pleased as well. (This would have included those Americans who still subscribed to Polkian-era fantasies about capturing British Columbia up to the 54th parallel. With the purchase of Alaska, the westernmost British possessions were now in “an American vice,” as Seattle’s Puget Sound Gazette theorized.) Moreover, there was a great deal of emerging scientific literature on the territory, with recent expeditions funded by the Smithsonian Institution as well as by other public and private backers.
So far as public opinion was concerned, most newspapers actually supported the purchase. The major exception was the New York Tribune, which was owned by Horace Greeley, a Republican who was nevertheless one of William Seward’s avowed enemies. (Greeley believed Seward had been too radical on the slavery issue, among other things.) Even Democratic-aligned papers in the North — while not missing the opportunity to crack wise about polar bears and walruses — tended to support the purchase, mainly because there was no compelling reason to oppose it. And at the end of the day, the treaty with Russia passed the US Senate by a vote of 37-2, with no significant expressions of opposition during the floor debate.
What’s odd — or not, depending on what view you take of Palin’s intelligence — is that most educated Alaskans are aware of all this, at least in its broad outline. It’s taught in the schools, and the few textbooks that have been written about Alaskan history all incorporate Welch’s findings into their treatment of the Alaskan purchase. Certainly someone who claims to know and love the state as much as the abdicated governor does should know that the “Seward’s Folly” myth survives because most people outside the state know very little about Alaska and are perfectly comfortable substituting fable for fact when thinking about its history, culture and geography. But since Sarah Palin’s entire schtick requires an audience that believes the myth — that believes, for example, that we can drill the shit out of the state without wrecking its ecology — I’m not surprised that she believes it as well. It’s certainly not the only bit of nonsense she’s peddling, but it’s a revealing bit at that.
…As an added bonus, Palin describes William Seward as just the sort of “colorful” character — like Soapy Smith and Skookum Jim Mason — that the Alaskan territory attracted. I don’t think anyone has ever described Seward as “colorful,” but I’m going to assume that Palin is actually thinking of William Seward Burroughs, whose fondness for guns and drugs would indeed have suited him well for an authentic Alaskan life.
Steve Benen, in the course of making an argument that most of his commenters don’t want to hear, overstates FDR’s intentions with the Social Security Act.
Roosevelt, the towering political figure of the 20th century, with an electoral mandate, a Democratic Congress, and the stench of a failed Republican president fresh on the nation’s mind, had to take what he could get on Social Security, which was far less than what he wanted.
Now, in a perfect world, a unicorn or magic pony of some kind would have written a history of the Great Depression and the New Deal that corrected this gentle myth in a short, introductory fash–OMIGOD! LOOKEE HERE!
The report [Committee on Economic Security] sent to Roosevelt called for universal coverage of the American elderly by pensions paid for partly by their own contributions and increasingly, over time, out of the general revenues of the U.S. Treasury. Roosevelt rejected this plan, declaring it was “the same old dole under another name” — he wanted a self-financing plan under which old-age pensions worked on the model of insurance premiums. Workers and their employers would pay into a fund a percentage of their paychecks. In the event of retirement in old age, workers would draw a pension funded by their savings. The program would thus constitute “a wholly contributory scheme with the government not participating,” as Roosevelt asked. Critics immediately pointed out the drawbacks of this plan. No other country financed social insurance this way, and for good reason.
Contributions calculated as a percent of payroll put a relatively heavier tax burden on poorer earners. Within the administration, Harry Hopkins pointed out the regressivity of the payroll taxes and recommended a tax on wealthier Americans’ incomes instead. In the press, opinion-makers fretted that “the law is almost a model of what legislation ought not to be,” as the New Republic wrote.
The administration’s concern with fiscal soundness also prevented the Social Security system from reaching all Americans. Because the United States came late to the business of old-age insurance, it had the advantage of other countries’ experience to examine. As Abraham Epstein, an advocate of old-age insurance, noted in 1922, “It is evident that it can only be made to apply to persons who are in regular employment. It is next to impossible to collect contributions from persons who are irregularly employed, from agricultural laborers, from those who are not their own employers, from women who work at home not for wages, from small merchants, and so forth.” The Roosevelt administration therefore sought to follow other countries that had excluded farm workers and domestic servants from their old-age pension policies at the start, and Congress complied.
Now, you could certainly argue that because Roosevelt and the CES trimmed their interest in health insurance as part of the bill, what they got in the end was “less than what [they] wanted.” And we could point out that Congress did make revisions (for example, to the morals test for “mothers’ pensions”) that made the bill more conservative than perhaps the administration would have preferred. But Benen’s suggestion that FDR had to swallow major concessions to get the Social Security Act passed really isn’t well founded. There’s little doubt that FDR wanted — at least at first — a less comprehensive, less expensive, and more regressive program than what his own advisers outlined. Moreover, there’s plenty of evidence that FDR was ambivalent about aspects of the bill (including the old age pensions) that were not directly related to unemployment relief, which was his strongest motivation for pursuing social insurance in the first place. Even so, when conservative Democrats like Missouri’s Bennett Clark tried to give private employers a way to opt out of contributing to Social Security, Roosevelt successfully fought to preserve the system he’d proposed. All things considered, he pretty much got what he wanted.
If future civilizations needed to reverse-engineer my Pollocky whiteboard splatter to learn something about the state of historical knowledge in 2009, I honestly can’t imagine what sense they’d make of it all.
That wheel-like thing there in the second picture? That’s the US banking system in 1929. The thing is, it really looked like that! But someone, someday, will probably see that as a really terrible visual metaphor for . . . something or other. On the other hand, I’m particularly proud of the stick figure buying $100 worth of stock on margin. I think I did a nice job of capturing the “I’m Really Getting Screwed Here, But I don’t Yet Know It” look.
In the wake of George Wallace’s June 1963 “Stand in the Schoolhouse Door” — when the recalcitrant governor made good on his campaign pledge to “Stand Up for Alabama” by attempting to block two black students from enrolling at the state’s flagship university — Wallace began entertaining dreams of greater glory. Unlike certain young, recently-elected, revanchist governors in our own historical moment, George Wallace believed he would be better positioned for a run at the presidency if he were actually sitting in office at the time of the campaign; with 1968 in mind, he asked the Alabama legislature to amend the state constitution so that he might win a second term in 1966. While he waited — fruitlessly, as it would happen — Wallace began considering an intra-party challenge to John Kennedy. (He would eventually announce his intentions in Dallas in November 1963, not far from where Kennedy would die less than a week later.)
Urged on in this illusion by telegrams and letters he received from whites outside the South, Wallace seems to have attached a kind of Lost Cause mythology to his encounter with the Kennedy boys. Though overrun by a power-mad federal government bent upon the destruction of the south’s racial folkways, Wallace could imagine himself as a noble hero who — by keeping the segregationist faith — would soon enough be redeemed. One of the keys to Wallace’s perception of himself was, oddly enough, the belief that he was acting in a non-violent and dignified fashion, that his June encounter demonstrated strength rather than weakness before the law; in standing alone, he simultaneously embodied the spirit of all “true Alabamans” while demonstrating that he could keep their bloodiest impulses at bay. Styling himself a man of law and order, Wallace contrasted his own conduct with the actions of civil rights protesters around the country, the degenerate berserkers whom the governor believed were aiming to destroy the nation. His governorship was a blessing to the White Citizens’ Councils, who also believed in their own “respectability” and rewarded Wallace with enduring and unflinching loyalty.
In the course of arguing that Congress should really do virtually nothing about health care, Joe Lieberman approvingly cites the Civil Rights movement as a model of incremental change. Why he believes it was a good thing that nearly a century passed before the federal government outlawed racial discrimination and provided meaningful substance to the Reconstruction Amendments, I won’t bother to speculate, but one would have to be a complete tool not to recognize that if a society is morally obligated to dismantle an exploitative and violent caste system, there’s no especially good reason to advocate that such change should take place “in steps.”
It’s another thing entirely to recognize that such changes did take place incrementally, though it’s worth pointing out that the legislation of 1964 and 1965 were dramatic and comprehensive by comparison with anything the previous ten decades had produced from the institution Lieberman allegedly serves. It’s also worth recalling exactly why the gradual transformations Lieberman celebrates were so long in coming:
The leadership of two major political parties colluded for decades to avoid dealing with an evident national problem; when even the mildest of remedies were suggested, they relied on parliamentary tactics to block debate and preserve minority rule. “Sensible” opinion-makers argued that change, while acceptable perhaps in theory, should be delayed for the time being because current economic growth would solve all problems, or because economic catastrophe required that other issues receive more immediate attention, or because there was a war to be won. Advocates of change were cited by their opponents as evidence that a pernicious, foreign ideology was eagerly seeking the republic’s destruction. Dray loads of irate throwbacks, styling themselves patriots, organized themselves and vowed to preserve the status quo by all available means.
If Holy Joe wants to align himself with that history, he should at least recognize that he is, as the cliche goes, on the wrong side of it.
Oh, the things you’ll learn from Google Books:
You know, for a man who apparently knew his way around a kitchen — having once been discovered making sweet, sweet love to Carrie Fulton Phillips on the kitchen table of his Marion, Ohio home — Warren Harding had some pretty unadventurous taste in waffles. Though I suppose we can grant him some courage (albeit anachronistically) for admitting he liked waffles in the first place. He’d never hear the end of it if he made the same mistake today.
…and a recipe for corn bread, from the last bearded man to run for US President on a major party ticket.
Clearly, this is the most excellent cookbook ever.
Imagine, if you will, a football field of standard dimensions. The faculty in Computer Science and Engineering, Mathematics and Statistics, along with a smattering of faculty from other sciences and a few in the Humanities, are sitting in the stands, spectating. The rest of the faculty are crowded together down on the field, wearing football helmets and running into each other at random, over and over and over again. There are no referees.
Such is the electronic behavior of the LSU faculty this afternoon, after geniuses in the registrar’s office decided that all 5000 faculty at LSU needed to be added to an unmoderated listserv. Yes, you read that correctly: an unmoderated listserv. We may never understand how the office arrived at this decision, especially in consideration of the fact (announced proudly on the listserv when it sent its first broadcast this morning) that the listserv would be used infrequently to make general announcements (which we already get through the usual university e-mail broadcast system used by, yes, the registrar).
Every single one of 5000 faculty members has now received several hundred messages containing short little phrases like “sign me up!” later followed by “remove me from this list” and much later by “remove me from this goddamn list you fucking idiots!” (I saved that last one, and I keep rereading it because it makes me happy.)
Something like this happened once on a conference call; I wish I could recall the context, but there were literally hundreds of people involved, at least half of whom were not evidently aware of that fact. And so the first ten minutes were wasted as seemingly every other caller piped in and then announced his/her name and affiliation, to the cascading dismay of everyone else on the line. Eventually, all participants had to be muted so that the meeting could actually proceed.
This is 31 flavors of stupid:
“There was a conscious effort on our part to counter some of the criticism of The Inquirer as being a knee-jerk liberal publication,” Mr. [Harold] Jackson [the Inquirer’s editorial page editor] said. “We made a conscious effort to add some conservative voices to our mix.”
Asked if the release of the memos affected his view of hiring Mr. Yoo, Mr. Jackson said: “From a personal perspective, yes. We certainly know more now than we did [in 2008], but we didn’t go into that contract blindly. I’m not going to say the same decision wouldn’t have been made.”
But Mr. Tierney said the memos did not alter his opinion.
“What I liked about John Yoo is he’s a Philadelphian,” [the paper’s publisher, Brian] Tierney said. “He went to Episcopal Academy, where I went to school. He’s a very, very bright guy. He’s on the faculty at Berkeley, one of the most liberal universities in the country.”
It would be hard to adequately describe the laziness at work in those two explanations. I certainly have no problem with the Inquirer highlighting, albeit unintentionally, the fact that most contemporary conservatives have no evident qualms about using the power of the state to break people in half; that defenders of waterboarding fail to grasp the difference between consent and force is, as my blogging colleague djw pointed out the other day, “easily among the seven or eight creepiest things about the contemporary right.” The more people come to understand this, the better the world will be. But if I were a different sort of conservative — one who, say, objected to tokenism or believed that presidential authority pulled up somewhere short of the right to crush a child’s testicles — I’d probably wonder why Jackson and Tierney couldn’t have found someone whose main function, it seems, will be to placate me while pissing off readers who believe (among other things) that attorneys shouldn’t be rewarded for urging their clients to break the law. Beyond that, what exactly are Yoo’s merits as a public intellectual? His column the other day was bog-standard Republican crap about activist judges and affirmative action, thrown together with a few Amity Shlaes talking points about FDR and the New Deal.
Why, it’s almost enough to make one question the intellectual mien of the administration that took his other, even more repugnant ideas so seriously.
My wife just received a certified letter alerting me to the odd news that I’ve somehow received tenure and promotion to the rank of associate professor.
As a companion to Eric’s post below, Kevin Murphy offered a helpful survey of the recent efforts among conservatives to say exceedingly dumb things about the past. (He wrote it a week ago, but [insert several tedious excuses here on the subject of infants and toddlers and the howling nexus thereof, plus some stuff about grading and general existential lethargy] and so I’m just beginning to catch up on my rss feeds.)
I don’t know if there’s something qualitatively unique about the historical butchery that’s dominated right wing discourse during the first few months of the Obama administration. As Eric and many others have been chronicling across the internets, the history of the Great Depression and the New Deal has been grotesquely misrepresented in a variety of venues — like cable news, the Washington Post or the Wall Street Journal — that the young and/or the gullible continue to revere. As a result, my US survey course this semester had to devote a non-trivial amount of time to refuting an array of wingnut talking points about the causes, consequences and cures of the Great Depression. And yeah, I had to explain who Amity Shlaes is and why she’s properly regarded as a dishonest hack. In a way, though, it was a useful exercise, particularly to the degree that it allowed us to think about how historical knowledge matters to debates about public policy; to discuss historical methods in a preliminary way; and to raise questions about what historical study can and can’t teach us about the past. I was also able to make use of all the new economic history I’ve been trying to learn recently (e.g., Eric’s book, DeLong’s podcast), since that’s an area that’s been an enduring weak spot in my teaching.
But Murphy’s post covers material that isn’t even close to being serviceable. The Bush era took an especially awful toll on popular historical memory — a gentle casualty compared to all the rest — and perhaps conservatives’ unreconstructed endorsement of the war on Iraq liberated a certain species of them to develop an entirely new, Rumsfeldian lifestyle oriented around the absence of evidence. It’s a remarkable thing, but they make people like Amity Shlaes seem like an honest sparring partner.
Though, as a parent, I have predictably mixed feelings toward Elmo, I can nevertheless appreciate the brilliance of Kevin Clash (see, for example, this interview).
And this is even better: