You are currently browsing the monthly archive for November 2009.
The Legislative Analyst’s Office has a report on the California Master Plan at fifty, or nearly fifty—guess they figured they’d better hurry, the MP might not outlast forty-nine, the way things are going. Some key conclusions:
Key higher education funding decisions have been made without the benefit of clear state policy guidance. For example, the state has no formal policy to guide the setting of student fees at the public colleges and universities. As a result, fee levels have been unpredictable and volatile, with little alignment to the cost of instruction or to students’ ability to pay. Similarly, the state lacks a policy for funding enrollment growth at the public universities. For the past several years, the state budget has not specified any particular enrollment level at the universities, instead allowing the universities’ governing boards to decide for themselves how much enrollment to support with their funding. Moreover, there is not even consensus among state policymakers as to what it does or should cost to educate a university student.
The state’s Cal Grant financial aid programs have been somewhat more consistently funded, generally adhering to statutory eligibility criteria and fully covering educational fees for students at public institutions. However, the state’s ability to meet these commitments has been threatened as the Governor and others have sought to reduce or even eliminate Cal Grant benefits as a way to address the state’s budget deficit. Moreover, recent state budgets have departed from statutory guidelines for setting Cal Grant levels for students at nonpublic institutions.
Some components of the state’s higher education apparatus have also declined or are under threat of elimination. For example, the California Postsecondary Education Commission (CPEC)—the state agency charged with coordinating the state’s higher education efforts—saw its budget and staffing reduced by almost half in 2003, and several past and current bills have sought to eliminate or radically change the commission. Meanwhile, a state law that provided for regulation of for-profit private colleges was allowed to expire, leaving these colleges to operate without state oversight for over two years. (Legislation was passed in fall 2009 that would establish a new regulatory bureau and framework for 2010.)
Finally, demographic changes have altered the types of higher education challenges the state faces. At the time of the Master Plan’s adoption, the state sought to contend with an anticipated “tidal wave” of students seeking access to higher education. Today, the state is facing projected shortages of college graduates and is seeking ways to increase college enrollment. At the same time, incoming students are less prepared for college, resulting in college completion rates far lower than they were 50 years ago.
Overall, the state’s vision for its higher education system is less cohesive than it was a half century ago. There is little methodical state oversight and planning, and the linkage between state budget decisions and policy goals is weak. Instead, the individual segments of higher education are largely left to develop their own policies according to their own priorities, with little guidance from state policymakers.
There’s supposed to be a new plan, maybe in time for Christmas.
Anthony Portantino, D-La Cañada Flintridge (Los Angeles County), who chairs the Assembly Higher Education Committee, said those conversations have already begun in preparation for hearings on overhauling the Master Plan, possibly in December.
“California has dramatically changed in 50 years,” Portantino said. “We need to make sure the promises made are kept.”
Ricardo Gomez of the UC Student Association agrees. But the Cal undergrad is skeptical that conversations and hearings will change the fundamental problem.
“We’ve been lobbying legislators for years telling them that UC is not living up to the Master Plan,” said Gomez, legislative affairs chair for the association.
“We can talk about innovative solutions, but at the end of the day it comes down to fully funding higher education,” he said. “The state needs to increase its revenues.”
Drastic times call for (at least talking about considering) drastic measures.
“I looked as hard as I could at how states could declare bankruptcy,” said Michael Genest, director of the California Department of Finance who is stepping down at the end of the year. “I literally looked at the federal constitution to see if there was a way for states to return to territory status.”
There were no bankruptcy options, and the legislature chose to cut back sharply on education and health care to fill the gap.
Perhaps California could secede, and then hope that as part of the inevitable defeat and reconstruction the federal government would force us to write a proper constitution! Have we considered that?
Seriously, though, what does it mean—this is not a rhetorical question, I’d really like to know and don’t have an answer—when it seems more plausible to engage in constitutional shenanigans than to, for example, restore the vehicle licensing fee to its full former level, and other measures of that sort?
Back when I was in grad school, lots of people were buzzing about Foucault.* But the really hip kids were deep into Walter Benjamin. And being hip**, I hopped on the bandwagon and never jumped off. Benjamin’s work has become especially important for me recently, as I’ve tried to finish my book on the politics of memory surrounding the Sand Creek massacre. Which is all just a long way of pointing out that Terry Eagleton’s study of Benjamin has been re-released (though maybe not in the States). Regardless, it’s worth a read. And now, having said all of that, I find myself wondering: which theorists are the kewl kidz*** reading these days?
* Yes, I’m that old. And also washed-up, but that’s a story for another day.
** Well, not really. But some of my best friends were Europeanists.
*** I know, I know, historians can never really be kewl kidz. Except for Marc Bloch, bitchez, who was kewler than Elvis and Beeker combined.
Michael Bérubé says if you don’t watch this video, he “will come to your house—and you don’t want that.”
Now, in my experience Michael is a perfectly delightful guest, so I don’t know how much of a threat this is. At most I expect he might give you a special JoePa-chair head-slap, and urge you to roll up your pant legs.
But you get the point.
I enjoyed this outburst.
I just don’t get it. I give up. I’m, like, off the bus.
However, a confession: It struck me as I was writing this that Tye simply couldn’t be saying what I was taking him to say… It struck me that nobody could believe that. So I went and tried it out on a couple of philosophy friends … and they agreed that nobody could believe what I was writing that Tye believes. Fair enough, but then, what is one to make of such a passage as this: “An object’s looking F . . . [isn’t] a matter of an object’s causing an experience which represents simply that something is F [sic]. The experience one has of the seen object is one into whose content the seen object itself enters” (my emphasis)….
Now, I’m kind of a Tarskian about meaning. I don’t do “radical interpretation”. So, when someone writes “the experience one has of the seen object is one into whose content the seen object itself enters” I suppose that he is probably saying that the experience one has of the seen object is one into whose content the seen object itself enters. Perhaps someone of a more hermeneutical temperament than mine will correct this reading in next week’s Letters page in the TLS, and I will then feel a perfect goose. For now, however, I shall proceed on the assumption that I have got Tye more or less right.
I’m sure it’s wrong of me to like this passage, not least because I know I would hate anything like it to appear in a review of a book I’d written. And I know full well that could happen—which means I keenly feel how unfair it is. Yet we read reviews with the morals we have, not the morals we might want or wish to have.1
1And yes, I also feel bad about quoting Donald Rumsfeld, but I enjoy that too. I’m really quite morally indefensible.
On this day in 1918, Susan Owen (center in picture) received word that her son, Wilfred, had been killed the previous week while fighting with his unit in the Battle of the Sambre. She thus might have read the words of his death while listening to the bells of the town church peal the news of the Armistice that ended World War I. Peace had come for Britain, if not perhaps for her.
She likely feared such a telegram. Wilfred’s letters to her rarely tried to conceal the situation at the front. One, from 1917, said that:
I can see no excuse for deceiving you about these last 4 days. I have suffered seventh hell. I have not been at the front.
I have been in front of it.
I held an advanced post, that is, a ‘dug-out’ in the middle of No Man’s Land.
Those fifty hours were the agony of my happy life.
Every ten minutes on Sunday afternoon seemed an hour.
I nearly broke down and let myself drown in the water that was now slowly rising over my knees.
No death is preordained, of course, but those of a frontline soldier in World War I came closer than most. One of Wilfred’s poems may have suggested to Susan that her son was at rest, of a sort, while all around people loudly celebrated. At a Calvary near the Ancre:
The scribes on all the people shove
And bawl allegiance to the state,
But they who love the greater love
Lay down their life; they do not hate.
Or perhaps not.
Happy Birthday Sesame Street! And many more! For a wonderful series of posts marking the occasion, see here, here, here, and here. Also, if you’d like to share your favorite Sesame Street moment(s) in the comments, with or without links, that would be lovely. And finally, yes, I know the above clip isn’t exactly celebratory (and that we’ve talked about it here before), but for me it represents the essence of the show. Put another way: it’s my party, and I’ll cry if I want to.
On this day in 1989, once everyone stopped patting themselves on the back for bringing down the Berlin Wall, Secretary of Defense Dick Cheney lamented what East German freedom would do to defense contractors in the pages of the Wall Street Journal:
Come hear Leo Chavez talk about The Latino Threat.
It’ll be good value.
Steve Benen, in the course of making an argument that most of his commenters don’t want to hear, overstates FDR’s intentions with the Social Security Act.
Roosevelt, the towering political figure of the 20th century, with an electoral mandate, a Democratic Congress, and the stench of a failed Republican president fresh on the nation’s mind, had to take what he could get on Social Security, which was far less than what he wanted.
Now, in a perfect world, a unicorn or magic pony of some kind would have written a history of the Great Depression and the New Deal that corrected this gentle myth in a short, introductory fash–OMIGOD! LOOKEE HERE!
The report [Committee on Economic Security] sent to Roosevelt called for universal coverage of the American elderly by pensions paid for partly by their own contributions and increasingly, over time, out of the general revenues of the U.S. Treasury. Roosevelt rejected this plan, declaring it was “the same old dole under another name”—he wanted a self-financing plan under which old-age pensions worked on the model of insurance premiums. Workers and their employers would pay into a fund a percentage of their paychecks. In the event of retirement in old age, workers would draw a pension funded by their savings. The program would thus constitute “a wholly contributory scheme with the government not participating,” as Roosevelt asked. Critics immediately pointed out the drawbacks of this plan. No other country financed social insurance this way, and for good reason.
Contributions calculated as a percent of payroll put a relatively heavier tax burden on poorer earners. Within the administration, Harry Hopkins pointed out the regressivity of the payroll taxes and recommended a tax on wealthier Americans’ incomes instead. In the press, opinion-makers fretted that “the law is almost a model of what legislation ought not to be,” as the New Republic wrote.
The administration’s concern with fiscal soundness also prevented the Social Security system from reaching all Americans. Because the United States came late to the business of old-age insurance, it had the advantage of other countries’ experience to examine. As Abraham Epstein, an advocate of old-age insurance, noted in 1922, “It is evident that it can only be made to apply to persons who are in regular employment. It is next to impossible to collect contributions from persons who are irregularly employed, from agricultural laborers, from those who are not their own employers, from women who work at home not for wages, from small merchants, and so forth.” The Roosevelt administration therefore sought to follow other countries that had excluded farm workers and domestic servants from their old-age pension policies at the start, and Congress complied.
Now, you could certainly argue that because Roosevelt and the CES trimmed their interest in health insurance as part of the bill, what they got in the end was “less than what [they] wanted.” And we could point out that Congress did make revisions (for example, to the morals test for “mothers’ pensions”) that made the bill more conservative than perhaps the administration would have preferred. But Benen’s suggestion that FDR had to swallow major concessions to get the Social Security Act passed really isn’t well founded. There’s little doubt that FDR wanted — at least at first — a less comprehensive, less expensive, and more regressive program than what his own advisers outlined. Moreover, there’s plenty of evidence that FDR was ambivalent about aspects of the bill (including the old age pensions) that were not directly related to unemployment relief, which was his strongest motivation for pursuing social insurance in the first place. Even so, when conservative Democrats like Missouri’s Bennett Clark tried to give private employers a way to opt out of contributing to Social Security, Roosevelt successfully fought to preserve the system he’d proposed. All things considered, he pretty much got what he wanted.
Edward M. Bernstein, the man who supposedly added the words “and Development” to “International Bank for Reconstruction and Development,” talking to an interviewer, Stanley Black, about his good fortune in life; he got hired right out of graduate school to tenure at North Carolina State in 1930, in the midst of the Great Depression.
Bernstein: No, I didn’t start at the bottom. To tell you the truth, although my wife doesn’t like me to say it, all my life I’ve been overappreciated, overhonored, and overpaid. Everywhere I went I got to the top of the scale very fast.
S.B.: It helps to have the talent with words, and writing.
Bernstein: Maybe, oh yes, there’s nothing like being able to write. Being able to write is a remarkable gift. There’s none better, if you can also think.
I really like that caveat. Though the exchange is also nice for Black’s Steve-Martinish interposition.
… was a mistake. Not the address you’re thinking of, American. On this date twenty years ago, East German Politburo spokesman Günter Schabowski was giving a press conference. Immediately before it started, he had been handed a note instructing him to announce some new travel regulations that had been intended to appease East German protesters. These regulations included a provision for the travel of private citizens to the West, and it’s unclear whether it was meant as more than an empty promise, as the East German government had neither conferred with the Soviets nor ordered the armed border guards to stand down.
So into the sleepy little press conference Schabowski read his note, announcing that private travel visas to the West would be granted. “When?” asked the reporters.
And then came the mistake.
“As far as I know, immediately, without delay,” replied Schabowski.
A tiny stumble on a mountainside leads to a great fall. During the evening news broadcast in West Germany, the anchor Hanns Friedrichs joyfully proclaimed that the ninth of November was a historic day, for the East German government had announced that its borders were open. East Germans who listened surreptitiously to the Western broadcast immediately headed for the border crossings. Within an hour, thousands had gathered at the Wall.
In a nearby possible world, this story ends with a bloody riot. Armed guards shoot the boldest of the misinformed citizens; the uninjured retaliate. Guards are killed, the police put down the riot, and the Wall stands, not forever, but for a little while longer as the Soviets ease into openness.
In this world, Harald Jäger, in command at the Bornholmer Gate, decided not to shoot. He called his superiors, who of course had heard of no such policy change, and faced with the gathering, chanting crowds, decided to let a few cross the border; by midnight, he simply opened the gate to all, not taking names or checking identification.
As the news spread and more Ossis came to the Wall, other gates opened throughout the night, and what was done was done. Eleven months later the states of the DDR would join the West, reuniting Germany.
The Times has an interactive graphic up that breaks out the unemployment numbers for different groups of people. It turns out that only 3.9% of college-educated white men between the ages of 25 and 44 are unemployed. Compare that to nearly 50% for young black men who haven’t graduated from high school (I think that’s the group with the highest rate). Nothing here is especially surprising, and I wish The Times had included more variables, but it’s fascinating nevertheless.
“Get off Elmo! You’re not supposed to touch Elmo!” Seriously.
Speaking of period dramas on television, John Rogers recently told me to watch Life on Mars. So I am. And so far it’s really quite good: early Hill Street Blues meets A Connecticut Yankee in King Arthur’s Court (or something).
Anyway, the thing I’m enjoying most is the show’s relentless critique of nostalgia. The main character, a contemporary British detective who finds himself transported back in time to Manchester in 1973, can’t seem to decide if he misses his friends or his cell phone more. When he’s at his most despairing, in the early episodes at least, he focuses on the dearth of creature comforts available to him. Even if you weren’t trained as an environmental historian, the emphasis on material conditions — a lack of central heat, spotty electricity, a studio apartment appointed with a twin bed — is pretty obvious. It’s a healthy reminder that the past, even the recent past — forget the damp and drafty castles of the Middle Ages — was pretty grim.
The point may be that our current age is wondrous, filled with innovations straight out of science fiction, especially in the realm of policing and medicine. Regardless, though I suspect historians are especially cranky about the emptiness of nostalgia, I think the show gets its view of the historical city just right: unlike Mad Men, which makes the early 60s built environment seem awfully appealing — that furniture! that color palette! — Life on Mars suggests that urban life used to suck.
By no means a complete list, but stuff that caught my eye.
- UC Davis asks instructors to teach freshmen writing seminars for free
- UCD library budget, static over the last decade while the student population increased, now to fall
- UC-wide Academic Senate opposes fee increases for graduate students—“fee increases will result in fewer graduate students at UC. Aside from disappointing would-be graduate students, reductions will affect undergraduates and faculty by shrinking the number of teaching assistants, leading to fewer small discussion sections and negative impacts on faculty research.”
The highly recommended crabs, that is, at the Seaside Restaurant in Glen Burnie, MD, close by BWI. The service is great and the ambience just what you want from a place that will wrap your table in brown paper and bring you a load of filter feeders and a mallet. And beer.
They put the spice rub on the shell, so when you pull it apart you get the spices on your fingers, and when you pick out the meat you get the spices on the meat. A fine system.
My brother and I happened to have a chance to visit and partake last week.
If you haven’t seen the episode, feel free to read this. No real spoilers ahead.
As someone who knows a little too much about the subject, I’ve been eagerly awaiting Mad Men’s portrayal of the Kennedy assassination. And when it came, on Sunday night, I thought the episode was beautifully done. But I found myself less intrigued by the portrayal of the event itself than by how the writers used the assassination to advance one of the show’s main themes: that white, middle-class American women suffered from the feminine mystique in the 1960s, and they weren’t going to take it anymore. In Mad Men’s universe, John Kennedy died Sunday night, but Betty Draper is just starting to live.
The writers have dropped hints throughout the season that the show would address the assassination: the brief shot of Margaret Sterling’s wedding announcement, with its portentous date; the eerie recreation of the Zapruder film with the John Deere accident in episode 6; the frequent references to dates as the episodes moved through 1963, with November always looming in the distance. And, when the event finally came, the show handled it with grace and intelligence. I loved the way that Walter Cronkite came on the television set in the background, with the volume on low, as Peter and Harry discussed office politics, completely oblivious to the news, and the sudden emphasis on TV, as everyone gravitated to sets throughout the episode. Overall, there was the general sense of tragedy, loss, and confusion, especially after Oswald’s murder. “What is going on?” both Don and Betty ask, separately.
But the story of Betty’s gradual awakening was integrated with the narrative of the assassination in a way that brought home to me why so many women like this show. I know a lot of men who despise it; they see a loathsome main character, Don Draper, who lies and cheats on his wife; they see pervasive misogyny, and it makes them feel uncomfortable and depressed.
Many women, though, understand and empathize with how the female characters are objectified and mistreated. But at the same time, we know that, while sexism has hardly disappeared, women have a lot more options and, yes, a lot more power than they did in 1963. As the show moves through the early 1960s, the anger builds, but the opportunities unfold. We know where this story is going, and we like the ending.
When US sailors first set foot on Midway (then called Brooks) in 1867, the birds were so numerous on the ground that the men could not walk without stepping on the chicks in their nests. Now we can accomplish the same results without traveling to a remote atoll to do it in person.
So, about half a month ago, when I started writing this post, Yglesias argued that the way celebrity chefs should be helping people eat healthier food is by aiding the production of pre-packaged meals that are better for you. Why?
If over time people were getting poorer, but the number of hours in the day was getting longer, and gender norms were shifting toward the idea that women should get married young and drop out of the workforce in order to do unpaid domestic work, then obviously people would start cooking more. But that’s not what’s happening. Compared to people in 1959, people in 2009 have more money, less time, and less ability to call on socially sanctioned unpaid domestic labor. So obviously they’re going to cook less. Or to look at it another way, there are lots of things you can do in 2009 that you couldn’t do in 1959—read a blog, download an MP3, get a movie from Netflix on Demand. There are also a lot of things you can do in 2009 that were prohibitively expensive in 1959—fly cross-country, make a long-distance phone call to your sister. But there’s no more time in the day. Which implies that people need to spend less time doing the things that you could do in 1959. Sometimes we can get out of this box by finding technological innovations that let us do things more quickly, but you can’t really speed up cooking from scratch.
The good news is that there’s no real reason to think that food you prepare yourself is for some reason intrinsically healthier than food someone else prepares for you.
Eh. I’m not convinced. Ta-Nehisi Coates has a good story about what he learned when he first baked blueberry muffins. Baking treats yourself ensures you know what goes into them.
To that I’d add a couple of points. Portion size is much easier to control when you cook your own food, as is the addition of salt, spices, and fats. It also strikes me as unlikely that the best organic hippie-dippie free-range pre-packaged food imaginable will be free of stabilizers and preservatives. While I don’t want to return to 1959 (though I think the argument that we have less time is somewhat undercut by the idea that blogging and Netflix are these new things we do), I think there’s no way around the idea that home cooking is better for you, especially if you’re in an area where your take-out options are limited to unhealthy fast food.
It’s not a metaphysical certainty, but I know which way I’d bet, and it’s not on Auguste Gusteau’s Frozen Dinners becoming common (or, for that matter, affordable.)
I think the problem here is conceptual. (Shut up, Neddy.) Says Yglesias, “I like to cook. Sometimes. I think it’s fun. And I’m certainly glad I know a few recipes. I hope to learn more. And everyone should know a few,” but there’s a difference between recipes and techniques, and it’s the latter that gets the cook through every day.