
(An unrelated link for those who’d prefer to laugh at rather than with me.)

What’s wrong 1) with the modern conservative movement and 2) pandering to it?  This:

I told my wife this morning we can now go vote on November 4th. McCain is not as stupid as we right-wingers thought he was.

Ain’t it sweet of him to enfranchise his wife?  Now she can join the 909 (of 7,500) citizens of Wasilla who elected Sarah Palin mayor and the 114,697 (of 683,478) citizens of Alaska who elected her governor.*  I know I’m being unfair.  This in no way reflects on Palin, who is her own woman, who possesses strong feminist credentials, and who would never let any man speak for her.


*Ignore as irrelevant the fact that the equivalent of the entire state of Alaska (655,923) voted Obama into the Senate.  Because no matter what, come what may, roam if you want to (only don’t), the important thing to remember is Obama lacks executive experience (whatever that is).


Wait for the 1:20 mark. Then watch Michael Moore call Hurricane Gustav proof of god’s existence. He’s kidding, of course. And maybe I should have a better sense of humor about this kind of thing. But it’s hard for me to laugh when I consider that tens of thousands of people along the Gulf Coast are having their lives disrupted by another hurricane. That this horror is unfolding so close to the anniversary of Katrina heaps insult atop injury. So please, people on the political left, try to restrain your glee as Gustav screws up the Republican National Convention.

Even more disgusting? John McCain and Sarah Palin are headed to Mississippi, hoping to use Gustav as a photo-op. Thanks, Senator McCain, we already have an excellent photographic record of your response to major hurricanes. Really, how about if you just stay out of the way? Lord knows you don’t have the strength to lift a sandbag.

Why is Sarah Palin running with John McCain if she hates old people so much? From the Anchorage Daily News, 6 August 1997, we read of Palin’s successful efforts to rid Wasilla of septuagenarian museum curators. Born during the New Deal and nurtured to maturity during the high point of American economic liberalism, these women probably harbored sympathetic attitudes toward the very state apparatuses that hundreds of renegade Wasillans had elected Sarah Palin to drown in a bathtub.

Behold:

Opal Toomey, Esther West and Ann Meyers don’t seem like politically active types. There are no bumper stickers on their cars, no pins on their lapels. But the three gray-haired matrons of Wasilla’s city museum decided to take a stand last week. Faced with a $32,000 budget cut and the prospect of choosing who would lose her job, the three 15-year-plus employees decided instead to quit en masse. They sent a letter to the mayor and City Council announcing they plan to retire at the end of the month, leaving the museum without a staff. They also sent a message: They’d rather quit than continue working for a city that doesn’t want to preserve its history.

“We hate to leave,” said Meyers, who at 65 is the youngest of the three. “We’ve been together a long time. But this is enough.” If the city were broke, it would be different, she said. “If they were even close to being broke.”

Instead, the city is flush thanks to a 2 percent sales tax passed in 1994 that has left it with $4 million in reserves. There is no reason the museum’s budget should be cut, Meyers said . . . .

The women are only the latest to leave the city payroll, noted John Cooper, who was the museum’s director until Palin fired him last fall.

In addition to Cooper, Wasilla Police Chief Irl Stambaugh left last winter after Palin fired him, and planning director Duane Dvorak and Public Works director John Felton turned in their resignations this summer.

“People are voting with their feet,” he said.

Palin maintains she is doing what voters asked. To have $4 million in reserves is prudent. That’s not even an entire year’s budget, she said.

Much of the latest flap over the museum is a misunderstanding, she said.

All the council wanted was to cut back the museum’s hours in winter from seven days a week to five. The women made the decision to resign, Palin said.

West, Toomey and Meyers disagree. They say they were told that one of them would have to leave in September.

Probably by drawing straws to see who might be processed into bland, green wafers.

Our government once again demonstrates its commitment to conduct unbecoming a free people.

Why did new Mayor Palin fire the police chief and librarian of Wasilla?

Palin said in letters she wanted a change because she believed the two did not fully support her administration…. Palin refused Friday to detail her complaint … saying only that “You know in your heart when someone is supportive of you.”

“Wasilla librarian keeps job,” Daily Sitka Sentinel, 2/3/97*

Political loyalty above all else! Look into your heart! If that doesn’t sound like four more years of the same, I don’t know what does.

*Looks like the librarian stayed but the police chief went.

So, did you hear? Sarah Palin was a beauty queen. I figured I’d offer a reason why going after her on beauty-queen grounds is bad for more than just its sexism. And due to contractual terms here at EotAW, I’ll nerd it up for you!


[Editor’s note: Senior frontier correspondent, awc, sends along the following dispatch from the wilds of Alaska’s history. Please be aware that this post likely will be updated and edited when awc’s team of intrepid sled dogs, carrying in their intrepid maws more history, arrives later today at EotAW world headquarters.]

One of the most amusing storylines being floated by Republicans is the idea that Sarah Palin is a maverick. Certainly, she appears less corrupt than her peers, no hard feat in the nation’s most crooked state. And she has diverged from party orthodoxy on a few issues.

The most obvious theme that emerges from Palin’s bio, however, is a Bush-esque obsession with loyalty. We all now know about her alleged attempts to pressure the state police to fire her former brother-in-law. Less well known are her actions as mayor of the city of Wasilla.

Shortly after her election in October 1996, she asked the police chief, librarian, public works director, and finance director to resign. In addition, she eliminated the position of city historian. Her critics charged the directors were dismissed for supporting her opponent, John Stein, in the preceding election. Palin further rankled city employees by issuing a gag order, forbidding them to speak to the media without her permission. Her controversial behavior led to demands for her recall. The press characterized the situation like this:

Four months of turmoil have followed in which almost every move by Palin has been questioned, from firing the museum director to hiring a deputy administrator at a cost of $50,000 a year to a short-lived proposal to move the city’s historic buildings from downtown. Critics argue the decisions are politically motivated. Palin says people voted for a change and she’s only trying to streamline government.

Daily Sitka Sentinel, 2/11/1997, p. 13.

The matter was only resolved three years later, when a federal judge upheld her right as mayor to remove the chief without cause.

Palin claimed she was trying to streamline government, and we might believe her were it not for the trooper incident. But the pattern of evidence suggests that she believes public employees should be politically, ideologically, and personally faithful to her. And if anything has been discredited by this administration, it’s the idea that loyalty is more important than ability.

Heckuva VP choice, Johnny!

What sort of executive is Sarah Palin?

The newly elected mayor of Wasilla has asked all the city’s top managers to resign in order to test their loyalty to her administration…. She’s also been criticized by the local semiweekly newspaper for a new policy requiring department heads to get the mayor’s approval before talking to reporters. An editorial in The Frontiersman labeled it a ‘gag order.’

“Wasilla’s new mayor asks officials to quit,” Daily Sitka Sentinel, 10/28/96, p. 3.

Does that sound like a management style familiar to you? (What was that phrase? “Mayberry Machiavellis”?)

More to come, thanks to intrepid frontier correspondent awc….

Here’s exclusive video of John McCain announcing his selection of Sarah Palin as his Vice Presidential candidate. Eli is playing the role of the United States.

I have more extended thoughts here.

At the American Political Science Association annual meeting, in Boston’s Hynes Convention Center, room 107.

7:58 All the panelists have their laptops up and open. Henry has brought donuts to entice the audience.

For an 8AM panel, there’s pretty good attendance—30-or-so people [by 8:15, more like 35; if this trickling keeps up, I’ll have the biggest audience of the panelists] in a room that might hold 50—I think we call this the Krugman effect. Or else it’s the donuts.

Krugman will talk first, then Pierson, McCarty and me.

Perlstein sees my laptop screen and says, “Hi Mom!”

[Editor’s Note: Vance Maverick, the real original maverick, returns for another guest post. Thanks, Vance, for doing this. We very much appreciate it.]

On August 29, 1952, at the Maverick Concert Hall in Woodstock, New York, David Tudor gave the first performance of John Cage’s composition 4’33” — consisting, notoriously, of nothing but silence. It remains Cage’s best-known piece: many more people have been provoked by it (take our own Dave Noon) than have ever attended a performance. In one sense it’s unrepresentative of Cage’s work — he wrote many, many pieces before and after it, full of all kinds of sound — but it’s a landmark, and understanding it does shed light on his extraordinary career. Further, it’s the representative extreme of a tendency in his work which is well worth learning to listen to — and in the right frame of mind, it’s 4’33” well spent.

John Cage was born in 1912 in Los Angeles, the son of an inventor. He was the valedictorian of Los Angeles High School, and went on to Pomona College. But he soon dropped out, and traveled the world as a young bohemian would-be writer. On his return to Los Angeles, he decided to devote himself to music, and studied composition, most notably with Arnold Schoenberg. There was admiration and resistance on both sides — Schoenberg called him “not a composer, but an inventor, of genius.”

In the later 1930s, he made a name for himself with a series of percussion pieces. Cage was not the first composer to work with percussion alone (Varese, for one, was there first), but he made good use of found instruments like automobile brake drums, and was a witty showman and spokesman.

His next, and richer, innovation was the “prepared piano”. This is a piano temporarily modified by attaching various small objects to the strings, each one adding a characteristic buzz or jangle, or muffling the note, turning the piano into a one-man percussion orchestra. It was a provocative gesture to tinker with the grand piano, master instrument of the European nineteenth century — but a gentle provocation, making the piano quieter, less resonant, and less standardized, and setting it back to rights after the performance.

At the same time, Cage was growing more interested in composition by system, and under severe procedural constraints. (One rewarding example is the String Quartet in Four Parts, of 1950.) The obvious inspiration for this is the twelve-tone system he learned from Schoenberg. But Cage’s purpose was the opposite of his teacher’s, or of the latest refinements in serial technique (Babbitt or Boulez). Rather than seeking to control his music in all its details, to maximum expressive effect, Cage realized that he wanted to get away from direct conscious control, in order to open his ears, and his audience’s, to something beyond what an individual could intend.

His next steps in this direction, in 1951, were to use chance procedures in composition and performance. For Music of Changes, he used the I Ching to make compositional decisions. And in Imaginary Landscape IV, he specified what the performers should do, but gave them an intrinsically unpredictable instrument, the radio. (At the premiere, the concert ran so late that most stations were off the air, and the realization was mainly silence and static. Cage took this in stride, but his advocate Virgil Thomson was not amused.)

From there, it was only natural that Cage should take the step of not making any sound at all. He had been interested in silence for years. On the one hand, it’s worth listening to: we’ve all experienced highly charged silences between musical sounds, or between words, or in special physical places. And on the other, silence is never truly silent — there are always “incidental” sounds, which we generally bracket out of our experience of music. The concert environment, with its social ritual to focus the attention collectively, is an opportunity to bring the listening skills we’ve trained on Beethoven to bear on silence. And in the outdoor setting at Woodstock, there were plenty of ambient sounds to listen to. That evening, though, Cage didn’t do much to prepare the audience (the program was terse). Some were irritated, and some walked out.

Having done 4’33”, Cage had no need to repeat it. He developed his toolkit of chance techniques in a seemingly endless series of pieces and improvisational events. His influence at home and abroad, already considerable in 1952, only continued to grow. He branched out into other arts. His books, particularly Silence, are very well worth reading; he has rightly been anthologized among poets, and he was a master of anecdote. And he also did strong graphical work — his music scores were better looking than anyone’s, and in the 1980s he made a beautiful series of prints. But of course his main work remains musical: he changed the way we all hear and make music, by making it and by writing about it.

The only performance of 4’33” I’ve attended myself was in a Cage tribute concert at Mills College in Oakland, a year or two after his death in 1992. The concert was four hours thirty-three minutes long, a kind of continuous variety show, with many Bay Area avant-gardists (including Pauline Oliveros and Terry Riley) performing in groups. It began and ended with 4’33” itself. The audience was warm, and well in tune with Cage (and one another), and the silences were rich. The first rendition felt a little long, but the second rang with all the sounds of the evening, and with the history of all the experimental music Cage inspired.

On this date in 1963, the March on Washington for Jobs and Freedom brought several hundred thousand Americans together in the nation’s capital, where — depending on whom you might have asked — they had convened in support of national civil rights legislation, to chastise the Kennedy administration for its meandering commitments to racial justice, to summon young African Americans into action, or to dramatize the “Beloved Community” of which Martin Luther King, Jr., had so often spoken. Though united by a wish to rivet the nation’s attention to the cause of black freedom, the event’s organizers famously disagreed on nearly everything else. Indeed, the march has to be considered the symbolic high point of the post-war freedom struggle as well as the point at which generational, regional and tactical fault lines began to slip.

Arguably, 1963 was the most consequential moment for the civil rights movement. In this, the centennial year of the Emancipation Proclamation, the SCLC and other groups won a crucial victory in Birmingham by stuffing its jails and forcing the city to defend its apartheid with a spastic display of police violence. The effects of “Project C” — as the Birmingham campaign was known — were transformative. It brought thousands of poor, working-class, urban blacks into the struggle, and it drew greater attention to the meshed relationship between segregation per se and the broader patterns of economic and residential inequality that shaped the lives of blacks nationwide. The “package settlement” that activists won from the city of Birmingham encouraged civil rights leaders to push for what Whitney Young called a “domestic Marshall Plan” for black America. Others in the movement saw Birmingham as a template for grassroots action. In the wake of Birmingham, groups like the Congress of Racial Equality (CORE) and the Student Non-Violent Coordinating Committee (SNCC) — which had pushed the tactics of the movement toward direct action and confrontation — hoped to spur a new wave of protest, especially in the urban North and West.

When President Kennedy announced his support for a federal civil rights bill in early June, a coalition of groups assembled to revive the notion — posed two decades earlier by labor organizer A. Philip Randolph — of a mass civil rights demonstration in the nation’s capital. The Kennedy administration, for its part, was terrified that “a big show at the Capitol” might accomplish precisely what its younger activists had hoped it might. Envisioning a city brought to its knees, and fearful that a march would cost him the support of whites in states like Michigan and Illinois, Kennedy pressured King to call the whole thing off. From the Justice Department Robert Kennedy and Burke Marshall warned King that he and his “communist” associates were jeopardizing the bill and imperiling the President’s political future. Meantime, J. Edgar Hoover helpfully offered to tap King’s phones.

Somewhat belatedly, JFK endorsed the march in July — all the better to try and contain it. “If we can’t stop it,” he huffed, “we’ll run the damn thing.” And on August 28, they more or less did. The event was rigorously scripted, and the “march” itself consisted of a short, unobtrusive walk from the Washington Monument to the Lincoln Memorial. Mindful of the fact that CBS would be broadcasting everything, the White House sought to convert the march from a black protest to an administration pep rally. Organizers were instructed to lard the crowd with as many conservatively-dressed whites as they could find, and speeches — most notably that of SNCC’s John Lewis — were trimmed of their inflammatory barbs. (To some extent, the debate over Lewis’ speech was moot. If Lewis or anyone else had, in fact, shaken their fists too vigorously, administration aides were prepared to cut the microphone and play a Mahalia Jackson record instead.) The day was a triumph of moderation.

In the intervening 45 years, the march — and King’s synecdochal address at the Lincoln Memorial — has been converted into perhaps the most recognizable expression of American civic nationalism, a day-long ode to aspirations deferred and fulfilled, a colorblind vindication of the American creed, a Lincolnesque utopia from which angry Negroes and peckerwood throwbacks alike had been effortlessly dismissed. We’ve inherited a comfortable, self-congratulatory and ahistorical mythology about that day, a story easily assimilated into liberal, minor-key tales of progress as well as the obnoxious, conservative efforts to claim Martin Luther King, Jr., as an opponent of multiculturalism and affirmative action. (King’s epic speech, with the perverse assent of his own estate, has even been used to shill for fiber optic companies that manufacture components for “smart” bombs and missile “defense” systems.)

To believe in the myth, it’s necessary to forget the intra-movement rivalries that only grew in intensity after the march (and to a great extent because of it); to forget that the march failed to sway Congressional support for the civil rights bill; to forget that the march did little to enhance local civil rights organizing; to forget that the event was bracketed by white supremacist violence throughout the South (including the assassination of Medgar Evers and the obliteration of four young girls at a Birmingham church); and of course to forget that King himself lived another five years, during which time he articulated truths about his country that remain a thousand times more relevant than any words he delivered 45 years ago today.

(cross-posted at LGM)

Barack Obama became the first African-American man in this country’s history to be nominated by a major political party to serve as its candidate for the presidency. And in this moment, if only just for a moment, I can forget my cynicism, my growing outrage at the mainstream media, and my occasional disappointments with the Obama campaign and instead feel proud to be a Democrat and an American.

Update: Let me take another crack at this. The Democratic Party, historically the party of slavery, secession, and segregation*, just nominated a black man as its candidate for the presidency. Sometimes things get better — if only incrementally.

Update II: All due respect to Senator Clinton, who knows how to seize a moment.

* And, it should be said, later the party of Civil Rights. Like I said, change happens.

So all right, publius says this about Sean Wilentz on Obama in Newsweek:

this particular column is not merely inaccurate, it calls his larger credibility as a historian into question. I don’t say that lightly, so hear me out.

Historians are supposed to apply an empirical methodology to their trade. You spend years poring over documents and you go where the evidence takes you. Because it’s so difficult for the lay reader to verify accuracy, it’s particularly important that historians refrain from letting personal opinions and blinding emotions taint their work. When that happens, inconvenient facts get ignored and historical research becomes an exercise in cherry-picking to support pre-existing subjective opinions.

That, in a nutshell, is precisely what Wilentz’s column did.

And Lemieux piles on.

Let’s stipulate that I disagree with essentially every substantive claim in the Wilentz article.

Let me repeat and amplify: Sean Wilentz is wrong—about Obama, about what liberal intellectuals ought to say about Obama—the whole thing.

Does this mean the column “calls his larger credibility as a historian into question”? No.

Let me tell you a story.

[Editor’s Note: Bryan Waterman, associate professor of English at NYU, joins us today to talk about, well, read it for yourself. Bryan was gracious enough to send along a bevy of links so that I could do some research and “make fun of [him].” To which I’d reply, friend, I’m not sure you understand the seriousness of this blog. And also: I do research at my day job. Anyway, Bryan’s first book, Republic of Intellect, is here. And he blogs, among other places, at a history of new york, where you can visit, if only virtually, Yonah Schimmel’s Knishery and experience some of the things about New York that are missing from your goyishe life in California’s Central Valley. Wait, did I say that out loud?

Thanks, Bryan, for doing this.]

On August 26, 1970, the fiftieth anniversary of the Nineteenth Amendment, the notorious feminist author and activist Betty Friedan, outgoing president of the four-year-old National Organization for Women, led tens of thousands of women in a march down Fifth Avenue toward Bryant Park, where, packed on the lawns behind the New York Public Library, the crowd heard addresses from Friedan, Gloria Steinem, Bella Abzug, and Kate Millett, among others.

The Women’s Strike for Equality, as it was billed, called on women to withhold their labor for a day as a way to protest unequal pay—roughly 60 cents to every dollar a man made at the time—though the march itself didn’t begin until after 5 pm in case potential marchers elected to stay on the job. Organizers also asked housewives to refuse work: “Don’t Cook Dinner—Starve a Rat Tonight,” a typical sign read. The Equality march even included some who were old enough to have paraded for women’s suffrage over a half century earlier, and some marchers demanded complete constitutional equality under the Equal Rights Amendment, which, once it passed the House in 1971 and the Senate in 1972, would spend the next decade being debated, ratified (and in some cases rescinded) by the states, yet ultimately fall short of ratification.

(August 26, 1970, also happens to have been the day I was born, across the continent in the rural Southwest, a world away from New York City and Women’s Lib alike. A few years later I would ride with other children on a July 4th parade float, dressed as a tree holding a stop sign that read: “STOP THE ERA!”

But I digress.)

The Times coverage seems by turns both excited by the prospect of the women’s movement and bewildered by the day’s spectacle, noting the support of state and national political figures for commemorative celebrations as well as the apparently surprising fact that the Bryant Park rally was uninterrupted by hecklers. The article also reports on oddball moments: for instance, a smaller crowd had gathered earlier in Duffy Square (Broadway between 46th and 47th), where one “Ms. Mary Ordovan, dressed in cassock and surplice as a ‘symbolic priest,’” consecrated the spot for a statue of Susan B. Anthony, which would replace the one of Father Francis Duffy, a WWI chaplain and Hell’s Kitchen reformer. Crossing herself, Ordovan called on the name of “The Mother, the Daughter, and the Holy Granddaughter. Ah-Women, Ah-Women.”

In a brief aside, the reporter then explains that “‘Ms.’ is used by women who object to the distinction between ‘Miss’ and ‘Mrs.’ to denote marital status.” (Within a year Ms. magazine would be founded by Steinem.)

I first came across this Times article—which was itself my introduction to the history of the Women’s Strike for Equality—a decade ago when, as a grad student in American Studies, I had the chance, by an odd set of circumstances, to teach several semesters of U.S. Women’s History. The experience was rewarding and humbling for several reasons—not least because the classes often included one or two elderly women who spent their retirements as “evergreen” students, taking a class a semester in topics that interested them. Their presence initially made me somewhat uncomfortable once we’d reach the 1940s and I’d realize that from here on out some of my students had lived—as women—through the very history I had to lecture on, as a 28-year-old male.

But the courses were also made challenging by the advent of what was just then being called “post-feminism,” a fact that made me somewhat uncomfortable when I’d inevitably realize that a lot of my younger students thought they had no need for feminism in their own lives. To them the world was all a hold-hands-and-sing Coca-Cola Christmas commercial; they thought gender inequality belonged to the past or to distant cultures whose traditions, short of female circumcision and slavery, needed to be respected. When I asked them to recall Hillary Clinton’s controversial “stay home and bake cookies” moment during the 1992 campaign—after all, it had happened only five or six years earlier—they reminded me that they had been in middle school at the time; such things were as remote to them as playground bullies and kickball.

Only a quarter-century after the Women’s Strike for Equality, as we were routinely told in the late 1990s, the television series Ally McBeal had driven the last nails in the movement’s coffin. Remember that Time Magazine cover? Looking back, it also seems like a watershed moment when feminist studies in the academy gave way to cultural studies of feminism; rather than argue about what women had or hadn’t gained, how they’d done it, and when, we’d henceforth talk, for better or worse, about how feminists exploited or were exploited by celebrity culture and mass media. Was the Equality march really a landmark event in American women’s history? Or had Friedan’s media tactics simply ensured it would be remembered that way?

Either way, what those 50,000 women had done—their march spilling over from the police-approved single lane, filling the Avenue from curb to curb—seemed almost impossible to imagine, not so much because their feminism seemed outdated, but because so many younger women had become politically apathetic, appeased by a modest set of gains that masqueraded as equality. The media were full of stories about younger women who bought the line that feminism had done them wrong, powerful women who decided to quit their jobs, once they’d begun to reproduce, and give traditional stay-at-home motherhood a chance. And voila! We have contemporary Park Slope, Brooklyn, and its hordes of organic, free-range—but highly monitored—children.

At 3 pm on August 26, 1970, according to the Times,

Sixty women jammed into the reception area of the Katherine Gibbs School, on the third floor of the Pan Am building at 200 Park Avenue, to confront Alan L. Baker, president of the secretarial school, with their charges that the school was ‘fortifying’ and ‘exploiting’ a system that kept women in subservient roles in business. Mr. Baker said he would ‘take a good look’ at the question.

About 10 members of NOW, starting at 9 A.M. and continuing on into the afternoon, visited six firms, business and advertising agencies, to present mocking awards for allegedly degrading images of women and for underemploying women.

Among the businesses they visited, the article concludes somewhat dryly, was the New York Times itself. Who knew that NOW anticipated Michael Moore by all those years? Too bad they hadn’t taken more cameras with them.

Betty Friedan, the “mother of modern feminism,” died in 2006 on her 85th birthday; her landmark 1963 book The Feminine Mystique, reductively credited with jump-starting the movement, is now generally considered quaint—even offensive in places—if surprisingly compelling.

Gloria Steinem, on whom I developed a mad, Harold-and-Maude-style crush on hearing her speak in the early 90s, is now in her 75th year; during the recent primary season she endorsed Clinton and wrote in a Times op-ed that gender, rather than race, remained the bigger obstacle to equality in American life.

Bella Abzug wore big hats and talked refreshingly brash talk until she died in 1998; I hope she was spared the debate about Ally McBeal‘s impact on the movement.

Kate Millett, who in 1970 had just published her excoriating if wooden Columbia Ph.D. dissertation as Sexual Politics (the only really exciting parts are the summaries and quotations from dirty, sexist books) survived years of troubled relations with media outlets and, more recently, Bowery developers; though her Christmas tree farm has gone the way of her downtown loft, she continues to run an upstate artist’s colony for women at age 74.

Can anyone name four feminist leaders of their stature—or even their celebrity—today? If not, whose fault is it?

On this day in 1964, the Democratic National Convention’s credentials committee

approved a compromise that would permit the seating of an all-white Mississippi delegation plus two members of a competing, integrated delegation from that state

as Tom Wicker reported in the New York Times.

This is what happened: in the midst of Freedom Summer, black and white Civil Rights workers put together a racially integrated slate of delegates to represent the state of Mississippi, challenging the legitimacy of the customary all-white Jim Crow delegation.

Although Lyndon Johnson had lobbied for and signed the Civil Rights Act earlier in the summer, he was not so wholeheartedly committed to the cause that he wanted any disturbances at the convention that would nominate him, in his own right, for the presidency. So he (in the words of historian James Patterson) “leaned on” the credentials committee to put forth a proposal seating the segregationists as Mississippi’s delegation, plus seating two members of the integrated Mississippi Freedom Democratic Party (MFDP)—a white one and a black one, at Johnson’s direction. And they wouldn’t vote. And the black one, Johnson said, couldn’t be Fannie Lou Hamer, who on August 22 had delivered an account of what happened to her when she encouraged other blacks to register to vote:

I was carried to the county jail…. And it wasn’t long before three white men came to my cell where they had two Negro prisoners. The state highway patrolmen ordered the first Negro to take the blackjack…. And I laid on my face. The first Negro began to beat, and I was beat until he was exhausted…. The state highway patrolman ordered the second Negro to take the blackjack. The second Negro began to beat….

Johnson didn’t want to give Hamer any more of the spotlight. Hence his efforts to give something to both sides.

But of the credentials proposal Hamer declared, “We didn’t come all this way for no two seats.” And the MFDP rejected the proposal. On the other side, segregationist Mississippi delegates left the convention. So did some Alabamans. And both states went for Arizona Republican Senator and GOP nominee Barry Goldwater in the election, along with a small band of Deep South states.

Well, as Nina Simone says, “Everybody knows about Mississippi Goddam.” At the time it looked as though Johnson had isolated these relics of an older era. Of course, as Joe Crespino points out, soon these states led the way to changing the GOP and indeed the country in their own image.

The estimable Robert Farley has already said most of what needs to be said on this matter, and not to pile on PZ, but this just isn’t going to work. I mostly agree with Myers: science classes are for science. But that’s part of why I want to argue about one specific thing here: this claim of Myers’s, in response to ‘is there a God?’, which is (to use a technical term) wrongheaded:

One is to accept the usual open-ended, undefined vagueness of the god entity and point out that the reason it can’t be answered is that it is a bad question — it’s not even wrong. Science doesn’t answer it, but then no discipline can, because it’s a garbage question like “what color are invisible elephants?”

Look, Myers is a scientist, so of course he’s going to like the tools of science, but this question — whether it is rational to have faith in God — is a question that has been, like, seriously investigated. By atheists and theists. By philosophers! And philosophically-minded scientists! At well-respected universities! At a level far above gotchas!

I’m not going to pretend that the debate is settled one way or another. It’s not.

But here’s the thing. Because there’s been a lot written about this, in the academy and in the popular press, lots has trickled down. The evangelical students are getting exposed to amateur theodicies all the time. As one example, the Jack Chick-type comic books have panels pretty much designed to deal with the caricature of the evil high school biology teacher who is going to make the Christian renounce their faith and feed them to the plesiosaurs. (One was passed around in my high school biology class! It had a brave Christian student taking down the evil biology teacher who had a picture of a gorilla in a frame that said ‘Our Father.’) A high school teacher who tries to toss off a few gotcha questions to prove God doesn’t exist is going to get roasted intellectually.

So, what to do? One can raise it above the level of gotchas, but then it really starts to turn into a philosophy of religion class. Which I’d be all for if done properly. But for the high school teacher not inclined to learn philosophy just to teach about crossing pea plants and dissecting squids? Maybe take a page from Galileo: if both science and religion are true, then they’re necessarily compatible, so the job of science is to learn about the world the best we can. Also, here is what is on the test.

(If the teacher is feeling brave:

I do not feel obliged to believe that the same God who has endowed us with sense, reason, and intellect has intended us to forgo their use.

Probably not the most effective. But infinitely more fun than gotchas!)

You asked for it, Henley.


In Bosnia, prisoners used pigeons as drug couriers.

Bert was unavailable for comment.

Everyone else is talking about it. I just wonder what the rates are for universal default due to a VP selection.

This is officially an award-winning blog

HNN, Best group blog: "Witty and insightful, the Edge of the American West puts the group in group blog, with frequent contributions from an irreverent band.... Always entertaining, often enlightening, the blog features snazzy visuals—graphs, photos, videos—and zippy writing...."