You are currently browsing the monthly archive for June 2008.
(Hearing Obama’s remark about respecting those who “make us uncomfortable with their words” got me thinking: what would happen if my irate former student got his hands on Roget’s regnant tome and started bloviating about Obama on the Internet?)
First in the foremost if you are stimulated by new ideas because you are the kind of person like me and you can think for yourself rather than simply accept what Obama says as ultimate truth I think you will find this post importantly of interest to you.
I guess I should start by saying that what Obama thinks is not what he says and even what he says he thinks only provides cover for an incoherent agenda of sorts. Even if we overlook the logistical impossibilities of such ideas as the ones Obama has profligated the underlying premise of all of them is flawed and you can tell this is the ontological truth of the matter because of the insults he employs like raucous scoundrel and pugnacious.
But if you look soberly and carefully at the evidence all around you on the television you will obviously see Obama’s grandisonant effusions are not pedantic treatises expressing theories or extravaganzas dealing in fables or fancies they are substantial sober outpouring from the very soul of demagogism.
Now stay with me for a moment here because I am making a point. Specifically that there may be nothing we can do to prevent Obama from making good on his word to destabilize and undermine the already volatile social fabric he purportedly aims to save.
When we compare this disturbing conclusions I have drawn from the premises presented we can only conclude that the comforting picture purveyed by his conspirators and feel what is called by experts cognitive dissonance from it. Our only recourse is to kick ass and take down names.
If his attempts to perpetuate what we all know is a corrupt system have spurred us to challenge his uncouth libidinous assumptions about merit then Obama may have accomplished a useful thing because if we are to set our sights on eternity then we must be guided by a healthy and progressive ideology not by the salacious ornithological ideologies Obama promotes.
It is possible that Obama does not realize this because he has been imbibed by so much of the propaganda of quislingism and if that is the case as I believe it to be then I tremulantly recommend we halt the malmish adulation heaped upon wretched misfits. If anything Obama has hair in his brain and even though we all are to some extent as hairy Obama sets the bar on the curve that much higher up the mountain.
Does Obama have trouble living with himself knowing that he gets perfervid about unilateralism because whenever that question is asked Obama and his conspirators run and hide. I suspect that that is precisely what they are going to do now so as to avoid hearing me say that Obama’s bloodthirsty ipse dixits leave the current power structure untouched while simultaneously killing countless children through the starvation of disease.
ARE THE CHILDREN HIS ENEMIES?
As you know doubt realize by now that is a particularly timely question because in fact I just half an hour ago heard someone express the opinion that Obama may be reasonably cunning with words and what he says even though he is nerdy with everything else.
Obama believes that ageism resonates with the body’s natural alpha waves but unfortunately as long as he believes such absurdities he will continue to commit atrocities.
Let us not sink to his level and let us combat plagiarism by exercising our right to speak out and denounce his publicity stunts as totally unrepresentative of the values of this society and let me conclude by stating that Obama’s brain must work very differently from mine because it is black.
You can quote me on that.
A map derived from War Plan Orange, the US war plan for defense of American possessions in the Pacific, as drawn up in the years before the first world war. Or rather, perhaps a plan for the inability to defend American possessions in the Pacific: Orange asked the US garrison in the Philippines to hold out for sixty days against a Japanese assault until relieved by a fleet sailing from the Atlantic. The theory was that the fleet sailing from the Atlantic would engage the Japanese near Guam.
Only, of course, the facts couldn’t hope to match the theory, as the planners themselves admitted. The fleet wasn’t up to the chore, for although it had plenty of battleships, it didn’t have enough support ships or staff. And the army wasn’t going to be able to hold out for two months.
From p. 15 of J. A. S. Grenville, “Diplomacy and War Plans in the United States, 1890-1917,” Transactions of the Royal Historical Society 11 (1961), 1-21.
On this day in 1971, the Supreme Court by a 6-3 majority ruled that the US government could not stop the publication of the Defense Department’s “History of U.S. Decision-Making Process on Viet Nam Policy,” better known as the Pentagon Papers.
The short per curiam opinion merely “held that the Government had not met that burden” required for such restraints on the press. Some of the concurrences went into greater detail on the inadequacy of the government’s claims, particularly its minatory generalities on the subject of “national security.”
The word “security” is a broad, vague generality whose contours should not be invoked to abrogate the fundamental law embodied in the First Amendment. The guarding of military and diplomatic secrets at the expense of informed representative government provides no real security for our Republic.
The Government says that it has inherent powers to go into court and obtain an injunction to protect the national interest, which, in this case, is alleged to be national security….
A debate of large proportions goes on in the Nation over our posture in Vietnam. That debate antedated the disclosure of the contents of the present documents. The latter are highly relevant to the debate in progress.
Secrecy in government is fundamentally anti-democratic, perpetuating bureaucratic errors. Open debate and discussion of public issues are vital to our national health.
The entire thrust of the Government’s claim throughout these cases has been that publication of the material sought to be enjoined “could,” or “might,” or “may” prejudice the national interest in various ways…. Even if the present world situation were assumed to be tantamount to a time of war, or if the power of presently available armaments would justify even in peacetime the suppression of information that would set in motion a nuclear holocaust, in neither of these actions has the Government presented or even alleged that publication of items from or based upon the material at issue would cause the happening of an event of that nature.
… under the Constitution, the Executive must have the largely unshared duty to determine and preserve the degree of internal security necessary to exercise that power successfully. It is an awesome responsibility, requiring judgment and wisdom of a high order. I should suppose that moral, political, and practical considerations would dictate that a very first principle of that wisdom would be an insistence upon avoiding secrecy for its own sake. For when everything is classified, then nothing is classified, and the system becomes one to be disregarded by the cynical or the careless, and to be manipulated by those intent on self-protection or self-promotion. I should suppose, in short, that the hallmark of a truly effective internal security system would be the maximum possible disclosure, recognizing that secrecy can best be preserved only when credibility is truly maintained.
Taking these excerpts as statements as much of basic political philosophy as of law, it appears that to at least five of the Justices in 1971, the use of a general claim that national security requires secrecy posed a presumptive threat to American democracy, and ought to be viewed with the greatest suspicion.
“The central question,” Senator Howard Baker (Republican of Tennessee) said on this day in 1973, in an effort to focus the minds of a crowded room, “is what did the President know? And when did he know it?”
He was questioning John W. Dean III in pursuit of the elusive truth about the welter of misdeeds known as Watergate. If those misdeeds failed utterly to destroy the American republic, it’s because the Congress and the courts exposed them, with the help of members of both parties, and passed laws to prevent their recurrence.
And at the time–in the middle of a long, drawn-out, ideological war–the administration was not shy of shouting “national security” every time it seemed its secrets might get out.
But the courts weren’t having it; as Justice Black had written two years before, “The word ‘security’ is a broad, vague generality whose contours should not be invoked to abrogate the fundamental law[.]”
Amid the general threat posed to the United States by the expansionist ambitions of the communist Soviets, a general threat sharpened by the specific need to support our troops mired in a war whose purpose was uncertain and end indefinite, the repeated claim that it would imperil national security to expose what the administration had done in the name of American citizens gained force.
Yet some politicians managed to do the right thing anyway, to ignore that specious claim, and seek the truth.
For the Baker quotation: Christopher Lydon, “Dean Navigates Watergate Shoals with Ease,” NYT 6/29/1973, p. 22.
On this day in 1969, police raided the Stonewall Inn in New York City, touching off riots that some historians and journalists argue began the modern gay rights movement. The above video is Charlie Rose’s show from June 24, 1994, an episode marking the 25th anniversary of the riots. Featured on the show are historian Martin Duberman, who literally wrote the book on Stonewall, along with Jim Fouratt, Barbara Smith, Tony Kushner, Andrew Sullivan, Bruce Bawer, Donna Minkowitz, and Ian McKellen. I try never to insult Charlie Rose, and shows like the above are the reason why. Also, he loves his MacBook Air as much as I do mine. And he’s way better than Terry Gross, who I really should insult more often.
* Written with tongue in cheek. Please don’t hate me.
The reason I’m an historicist instead of an historian—besides the obvious, like being in an English department—is that I’m a bit of a romantic when it comes to my history. For example, to appropriate a line from (of all things) Elizabethtown, I’m a connoisseur of first looks. When, I ask myself, did this figure of future historical significance first enter the national consciousness? Who, for example, is buried in this paragraph from the 30 July 1967 edition of the LA Times?
The flames quickly spread to the hangar deck … setting off bombs, rockets and other ordnance, while touching off many jet planes, all of which were fully loaded with fuel and heavily laden with ordnance.
You can almost picture him in one of the “many jet planes,” “bombs, rockets, and other ordnance” exploding around him; but as of 30 July he has no name, no face. For the moment, this anonymous pilot sits alone, as yet undisturbed by history, on the burning pitch of an aircraft carrier in the middle of the Tonkin Gulf.
The next day, the New York Times provided a fuller, albeit tentative, account of the fire that took the lives of more than 130 US sailors:
For some unknown reason, a plane parked near the carrier’s island, midway up the 1,045-foot flight deck, experienced an “extreme wet start.” This malfunction, comparable to what happens when a cigarette lighter is ignited after having been filled too full, occurs about once a week on attack carriers, but almost never so severely as it did yesterday.
A thick tongue of flame lashed backward from the parked jet, igniting a missile on one of the dozen or so planes parked near the fantail, their engines turning over in readiness for a strike launching scheduled for 11 A.M.
The rocket “shot across the deck,” Captain Beling said, “and by a quirk of fate smashed into a fuel tank under a plane on the port side.”
No one aboard the Forrestal seemed to know today which plane the missile had hit — but it was probably either the Skyhawk whose cockpit was occupied by Lieut. Comdr. John S. McCain 3d or the one immediately to his right.
He is become historical, mentioned by name in the paper of record.
Not that this is the first time his name has appeared, as the father and grandfather with whom he shares it appear regularly in Vietnam and WWII reportage; but this is our first glimpse of the man who will lose to Obama in November—alone, almost exploded, surrounded by the agonized screams of the likely dead.
What can I say? I’m like Paul Harvey. (Only macabre.)
The Bee had a nice article today on the new Civilian Conservation Corps memorial in Sacramento.
“I didn’t have a job. I didn’t have anything. There was no work at the time. I had finished the eighth grade and there was no high school to go to…. They put me on a train and we went all the way across the state to the Black Hills.”
The CCC was one of the earliest, most remarkable, and least controversial of the New Deal’s relief programs. It hired young unemployed men and put them to work saving the nation’s natural resources. You might could read Neil Maher’s book; also, Sarah Phillips’s book.
Or, you know, for the big picture, you know where to go.
Every department houses one person who doesn’t understand the purpose of the departmental listserv. What is, for the majority of its recipients, a simple means of alerting interested parties to talks, job offers, and other relevant departmental business is, for this person, a public confessional. If an email opens:
LGBT Event at Cross-Cultural Center
This person’s outraged response will look like this:
LGBTQ Event at Cross-Cultural Center
Or perhaps like this:
GLBTQ? Event at Cross-Cultural Center
Point being, this person will find fault with whatever the first email said because the entire department must be constantly reminded that his or her devotion to justice entails questioning of any potentially privileging statement. If the “G” follows the “L,” the author replicates—and in replicating legitimizes or, even worse, appears to legitimize—divisive structures of representation within the queer community (as broadly as the word can be defined).
Such grievous offense requires a brave crusader to parse every last permutation and then decide, in the end, that since every possible combination privileges something, the only thing to do is consider the sending of any email ever a pregnant “teaching moment” and append something like the 5,000 words he or she has just written to every one of them … for great justice.
Should someone respond to this email with a sarcastic recommendation that the letters be stacked, so as to avoid any possibility of untoward LTR directional privilege, this person will write another 5,000 words, this time about how unfunny the resulting mess would be:
Only it is not merely unfunny—nothing is ever “merely” anything for this person—it is also symptomatic of the desire by hegemons to obliterate the unique identities of the various groups represented by each letter by transforming their hard-won bonds into an amorphous blob of non-identity, thereby effacing the personal struggles of people, like this person, some of which he or she will now share with the entire department no matter how inappropriate discussing the loss of your virginity on a departmental listserv might seem.
Positive responses to this cathartic effusion will be deemed patronizing; negative, evidence of insensitivity born of privilege; silence, a vile attempt to shame this person back into the closet in which the queer-uncomfortable majority clearly believes he or she belongs. Any attempts to pull this person aside—via email, in the break room, it matters not—and convince him or her that no offense was offered or intended will result in yet another 5,000-word missive about the conspiracy to silence him or her. Were someone to pipe in with a reminder of listserv decorum, complete with a lengthy quotation of its charter, the conspiracy becomes institutional and still another 5,000-word letter must be written, this time only CC’d to the listserv as a courtesy when it is sent to the Office of Equal Opportunity and Diversity.
When this person finally leaves the department, he or she will write a lengthy email reminiscing about how crucial the listserv was to his or her personal and professional development and apologizing if exchanges were ever unnecessarily heated. Sooner or later, word will come through the grapevine or the Chronicle of Higher Ed that this person has moved up in the world—what was once limited to interdepartmental listservs is now featured on the local news.
And I will laugh at the poor television producers who have no idea what they’ve gotten themselves into.
On this day in 1963, John F. Kennedy gave his “Ich bin ein Berliner” address. More than 100,000 people looked on as the President of the United States said that he was a jelly doughnut. Or that Berliners were jelly doughnuts. Well, it turns out not really. But that would have been totally awesome.
So, I have a question: beyond the political theater, especially the image of East Germans watching in silence as Kennedy spoke, can somebody spell out for me why this speech is still considered a big deal? I understand that JFK was identifying with the plight of Berliners. And I understand that Berlin at the time served as a symbol for the dangers of the Soviet Union’s ostensible program of world conquest. That explains, I suppose, why the BBC says that this “was seen as a turning point in the Cold War.” But such a claim hinges on a pretty loose definition of “turning point,” right? Not to mention the convenient use of the passive voice. In the end, the “Ich bin ein Berliner” speech seems to me like little more than anti-Communist demagoguery. I mean, JFK was no Mayor Quimby. Or am I missing something? I suppose we should just be glad that Kennedy didn’t try to give Chancellor Adenauer a backrub. That would have been embarrassing.
On this day in 1876, just a week before the nation celebrated its centennial, George Armstrong Custer, along with more than 200 men of the Seventh Cavalry, died at the Battle of the Little Bighorn. Or, if you prefer, the Battle of the Little Big Horn. Or, perhaps, the Battle of the Greasy Grass. Whatever. On the one hand, the history of the fight is so well known that I feel silly blogging about it. On the other, the particulars are unknowable and there’s still so much mythology and hagiography surrounding Custer that both his career and demise remain shrouded in mystery.
That’s where Michael Elliot, a professor of English at Emory, enters the fray. His new book, Custerology: The Enduring Legacy of the Indian Wars and George Armstrong Custer, is an easy read and one of the finest recent works in the field of memory studies. At its best, it hearkens back to some of the canonical literature in the discipline of American Civilization: books by Henry Nash Smith, Leo Marx, and, most aptly, Richard Slotkin. I should note, so there’s no confusion, that this is just about the highest praise I can lavish on a scholar. Well, short of saying that they remind me of Richard Hofstadter (kindly take note of the last paragraph).
As Elliot’s title suggests, he sets his sights on the people who have studied Custer, individuals for whom, in many instances, the Little Bighorn is an ongoing concern. Elliot hits the predictable targets (reenactors, Park Service personnel, history buffs) and all the hot spots (Custer’s hometown, the Crazy Horse memorial, the battlefield itself). But he’s at his very best when he ventures into Indian country, where he considers the meaning of Long Hair and the Battle of the Greasy Grass for the Native people — and their descendants — who fought with Custer and those who killed him. Elliot’s portrait of Crow and Northern Cheyenne country, its people, and their simmering conflicts over history and cultural sovereignty is remarkable.
If I have a complaint about the book — and I suppose I have to come up with something, otherwise you people might think I’m not very smart — it’s that Elliot’s argument is somewhat predictable. Memory is contested in the New West, he says. To which I reply: um, yeah. Still, in this case, because Elliot works across huge cultural divides, and traverses so much time and space, he can be forgiven a thesis that’s something of a cliché in a field that’s still struggling to find its raison d’etre.*
[Author’s Note: Thanks to commenter Levi Stahl for sending me a copy of Custerology. I’m sorry it took me so long to read it. But I’m glad that I finally did.]
* Ooh la la, thees sintence ees vedy French, non?
I’ve been thinking quite a bit about book covers lately — less because I’ll ever produce anything worthy of one than because the covers of most academic books are so dreadful. You’d think the book-designing avant-garde would be thrilled to work with people unconcerned with sales or money, yet the best cover I’ve seen in ages still falls prey to put-a-giant-fingerprint-on-a-book-about-fingerprints logic.
That said, it could be worse:
Had I access to my old laptop and its surplus of image manipulation software, you’d be treated to covers of novels entitled POISONED! featuring drowned bodies.
Or something like this. (Only better.)
See update here.
By popular demand, the bumper sticker has become reality.
Get it here.
Couple points: Look, this really is a pro-Obama bumper sticker. It’s just a realistic, not to say jaded, pro-Obama bumper sticker. And maybe it’s more than a little whiny. You really think you can win the presidency without courting or crafting constituencies that Good People don’t like? I don’t. Of course, I’m a notorious pain; to borrow from the great Michael Bérubé, the number of people whose politics I can wholly accept would fit comfortably in a phone booth.1
Inasmuch as it really is a pro-Obama bumper sticker, albeit one with attitude, The Edge of the American West is not going to profit from it. We clear a dollar on each sale, and those dollars will go to the Obama campaign. Somehow we3 feel this will send the right message.
Second point: I only just ordered one myself, and I haven’t seen the physical thing yet. If it turns out to look crummy, I’ll let you know. You can order one now, but consider yourself an early adopter, downloading beta software, if you do.
1Remember phone booths?2
2Ooh, that’s another demerit from Leslie.
3“We” is Cala, whose idea the slogan was, and me, who designed it, and Ari, who supplies advice.
An anonymous emailer offers this answer to andrew’s question, “why has labor history declined so much as a subfield? I’m not sure I’ve seen a satisfying explanation….”
- Labor history suffers because unions are currently moribund. No strikes=no labor history. You can try to write a history of work, but the absence of conflict drives away the audience.
- Historians are now thoroughly middle class. Many of them not only don’t care about working people, they secretly distrust labor unions and dislike the white working-class men who have historically run them.
- Labor history attracts ideologues drawn to struggle for its own sake. They have an embarrassing crush on the tiny, widely hated IWW because Big Bill Haywood was RADICAL, DUDE!!! This kind of silliness tends to drive serious people away.
- Labor history is no worse off than any number of fields. Given the explosion in the number of books, genres, subjects, and methods, there is no center. You often hear political, diplomatic, business, and economic historians complain that “social historians” have blackballed them. But you know what? Women’s historians have the same bitterness. Their work is also largely ignored. Everybody feels unloved.
So, what attracts the recognition of a wide audience? Race, race, race. It’s the only issue that really resonates with the boomers who currently run the profession.
Whether andrew will find that satisfying, I don’t know. It’s certainly vigorous.
[Editor’s Note: Ben Alpers, author of this excellent book, is back. And since I spent yesterday first driving to San Francisco, then getting on a plane at SFO, then flying to Cleveland, and then driving from the Cleveland airport to the East Side, all with two kids in tow, I really appreciate the help — even more than usual, that is.]
On this day in 1916, Mary Pickford signed a contract with the Famous Players Film Company that made her the most highly paid, and powerful, female star in Hollywood. The contract guaranteed her $10,000 per week for two years, thus totaling over $1 million. Just as importantly, it created a separate production unit within the studio, Pickford Film Corporation, over which Pickford, and her mother Charlotte, would have control. This gave the star enormous say over her roles and even the final cut of her films. She was also able to reduce the number of feature films in which she had to appear to six a year, still a very large number by today’s standards.
Pickford’s ability to garner such a deal reflected both her astuteness as a businesswoman and her status as Hollywood’s biggest star. Born Gladys Smith on April 8, 1892, in Toronto, Canada, Pickford became a child star of the stage in Canada and, in 1907, traveled to New York to pursue a career on Broadway. Two years later, she signed with D.W. Griffith and became part of his Biograph motion picture troupe. Pickford moved with Griffith’s company from New York to Hollywood in 1910. By the time she left Biograph in 1911, she had appeared in seventy of the short one-reelers that dominated the American, and world, film markets until the rise of the feature film in the middle of the 1910s. Griffith famously did not credit his actors, as he feared that they would become too popular and powerful. But Pickford became the first great female star of American motion pictures while still working for him, “The Biograph Girl with the Curls.”
Pickford easily made the transition to features, appearing in eight feature films in 1915 and seven in 1917. In such movies as Tess of the Storm Country (1914), Rebecca of Sunnybrook Farm (1917) and The Poor Little Rich Girl (1917), she created a strong and marketable image. While her beauty was legendary, she usually played plucky, somewhat tomboyish girls, often dramatically younger than the actor’s own age. In The Poor Little Rich Girl, the then-twenty-four-year-old Pickford played a twelve-year-old.
Pickford’s business savvy never left her. In 1919, along with her husband Douglas Fairbanks, Charlie Chaplin, and her former employer D.W. Griffith, Pickford founded the United Artists studio, which was designed to give these screen legends total control over the production and distribution of their films.
But Pickford’s screen persona remained remarkably unchanged, and this eventually proved the undoing of her film career. Pickford continued to play adolescents into the mid-1920s (when the star was in her thirties). But as Pickford aged, these roles became less credible. And audience tastes in female stars, too, began to change. Pickford successfully made the transition to sound, winning a best-actress Oscar for her first talkie, Coquette (1929). But her career was already in a downward spiral. Coquette marked a belated attempt by the actor to remarket herself. She cut her hair, which had been long, into a fashionable bob. And she played a more overtly sexual character closer to her own age. But audiences never warmed to the new Mary Pickford. Pickford retired from the screen in 1933.
A complete Mary Pickford one-reeler, The Dream (1911), as well as clips from two other Pickford silents, Rags (1915) and Little Lord Fauntleroy (1921), can be seen here. A clip from Coquette is available here.
“Mary Pickford” (PBS’s The American Experience)
“Biography” from the Mary Pickford Library Website.
Gaylyn Studlar, “Mary Pickford,” in Geoffrey Nowell-Smith (ed.), The Oxford History of World Cinema (New York, 1996), pp. 56-57.
For one two-year period in the postwar era, during the 80th Congress (1947-1949), the Republicans enjoyed a majority in both the House (246 R to 188 D) and the Senate (51 R to 45 D). The GOP seized its moment, passing the Labor-Management Relations Act, better known as the Taft-Hartley Act, over President Truman’s veto on this day in 1947.
Taft-Hartley gave back to management and also to the government much of what the Wagner Act of 1935 had taken away.1 Where the Wagner Act sought to limit the size and power of government by giving unions the power to bargain legally for workers’ compensation—“[W]e intend to rely upon democratic self-help by industry and labor instead of courting the pitfalls of an arbitrary or totalitarian state,” Wagner explained—the Taft-Hartley Act increased the power of the state to regulate unions.
It was easy to find the reason for Taft-Hartley’s popularity. In 1946, hours of labor lost to strikes reached a record high. Two steel strikes, two coal strikes, and a railroad strike that year all threatened to shut down the national economy.
The reason for the strikes was equally easy to find: the war had ended and with it the no-strike pledge. Pent-up demand for increased wages, of which a generation had been deprived by depression and war, suddenly sprang loose. Not all such demands met with understanding from management. And so the strikes came.
On January 6, 1947, in his State of the Union address, Truman asked Congress for “the early enactment of legislation to prevent certain unjustifiable practices.” Despite the submission of dozens of bills to Congress, the lawmakers reached a decision relatively quickly, and once Truman vetoed their law on June 20, they re-passed it over his objections almost immediately.
Where the Wagner Act had left much basic state law in place, Taft-Hartley increased the scope of federal control. The new law created a provision for “national emergency,” which let the president shut down strikes. It banned the closed shop and permitted states to ban union shops. It offered a list of forbidden kinds of strikes. And it went on and on; as one analyst noted, “It is a long law, covering twenty-nine pages of eight-point type….”2 But the basic point was pretty clear: to give the federal government new powers to curb unions.
Now, it turned out the 80th Congress was not very popular. In 1948, despite a Dixiecrat challenge, the Democrats kept the presidency and won back both houses of Congress. The Democratic platform included a paragraph advocating repeal of Taft-Hartley. Despite the victories, the repeal didn’t come. Perhaps it is because some of the equalization provisions of the law seem to provide a needed, equitable treatment, applying the same restrictions to unions as to management. Perhaps it is because lawmakers like the government’s increased power over labor-management relations, preferring a bigger government to Wagner’s vision of a nation in which the increased power of unions prevented an increase of power to the state. Perhaps it is because the Dixiecrat challenge revealed how beholden the Democrats still were to their southern wing, which, among other things, was not particularly pro-union, and a shift in Democratic policy on unions would have to wait—even longer than the shift in Democratic policy on civil rights.
1More excellent material about the Wagner Act here.
2Sumner H. Schlichter, “The Taft-Hartley Act,” Quarterly Journal of Economics 63, no. 1 (February 1949): 1-31, quotation on 8.
The title is a paraphrase from Harry Millis and Emily Clark Brown, From the Wagner Act to Taft-Hartley.
As a youth I was fortunate that my parents put me in nerd camp—computer programming classes at the Science Center. They had a Honeywell mainframe, in a room full of tape drives and disk drives, the disks that looked like stacks of LPs in a covered cake dish made of clear plastic.1 All that was housed in a room with plate-glass windows, and on the other side was a room full of terminals. Many if not most of them were basically teletypes with keyboards—every time you hit a key, it would dot-matrix the character right onto a roll of perforated paper that just kept on scrolling as you typed. At first I preferred these to the LED screens, because they reminded me of typewriters and if you had to debug code you reached behind the machine and lifted up a yard of paper to scan down it, holding a pencil, making you look like someone reading the stock-ticker or telegraph tape in an old movie. We started in BASIC, and the first program they showed us produced an ASCII art picture of Snoopy.2 Oh, brave new world. I think the appeal of the thing was basically identical to that of playing with an insect or a lizard you found in the yard: you do something to it and it reacts, not always in a predictable way. Maybe you can train it, you think….
When did you first realize you could get along with a computer?
1Not unlike this, but I remember them being cylindrical.
2I think it was this one, but this page has annoying music so maybe you don’t want to open it.
You would assume women would love to date someone who calls himself “The Great White Elf” and writes poetry like:
I am howling all night
and prowling till the early light.
I hunger for blood and desire the fight.
I am the alpha male seeking my mate
and for her alone I will wait.
That’s from “Werewolves Are Everywhere.” More evidence—not that we really needed any—that white supremacists don’t know from meter. What? We do need more evidence? Really? How about this, from “I’ve Got Relatives in Mexico”:
Each time I’ve fought back anger
At the stupidity of their race
Who think a white man wouldn’t know
From where this infestation takes place
Convinced yet? No? Have you read “Vote for Ron Paul”?
This’ll make you mad
At least I think
But the guy you love
Will make things stink!
In this election
None of the candidates are good
Except one, that’s all
Who every freedom lover loves
And that’s Ron Paul
But I digress. This isn’t a post about the inability of white supremacists to grok meter.
This is a post about white supremacy and the contemporary dating scene.
[Editor’s Note: Commenter Matt Dreyer sends along the following as food for thought. I don’t know how many times I need to tell the rest of you people to step up and start pulling your weight. This is a group endeavor.]
On June 21st, 1789, New Hampshire ratified the United States Constitution. It was the ninth state to do so. Article VII of the Constitution states, in full, “The ratification of the conventions of nine states, shall be sufficient for the establishment of this constitution between the states so ratifying the same.” June 21st is therefore, whatever a bunch of crazed July 4th partisans might want to tell you, the birthday of the United States as opposed to the date, give or take, on which a motley collection of colonies declared independence from Great Britain.
On this day in 1975, Universal Studios released the scariest horror film of my youth: Jaws. Speaking of which, while I was in graduate school, my closest friend had a summer house in the Hamptons. We used to spend a fair amount of time there, as you might imagine. Because that’s how we rolled. Anyway, the nicest and closest beach was at East Hampton. And nearly every time we went there, no matter the day or time, we’d see this wiry, George-Hamilton-bronzed older dude in a black banana hammock stretched out on the sand. Although he was often the only other person there, we never paid much attention to him. Until, that is, this one day, when the guy got up, stretched, and turned toward us. It was Roy Scheider! In all the times we saw him there, he never once went in the water. True story.