You are currently browsing the category archive for the ‘meta’ category.

The latest round of Ron Paul excitement reminds me of this blog’s long and rich relationship with the mad doctor. Herewith a holiday selection of oldies.

Enjoy.

In a belated comment on this post, David Brewster links to a photograph he took that neatly brings the conversation back to the subject of this blog:

This image also debunks one of those vicious smears about the pair.  (This other one?  Not so much.)

The blogospheric dynamic often resembles that of a particularly raucous frat party.  Someone gets the idea in their head to dance on a table.  Suddenly dancing on tables with a bottle in hand is  the best idea to occur to anyone, not dancing on tables is a sign of depravity, and the drunken boys surrounding the table chanting “Chug! Chug! Chug!” reinforce the dancer’s dubious choice.  Hours later the dancer comes to, facedown in a lampshade, and, at this particular party, wondering what on earth he could have done in his stupor to earn the plastic green beads roped around his neck.

Do not mistake my silence on Iran for a lack of interest.  Like everyone else, I’m reading the blogs and the tweets. I find the regime’s violence abhorrent.   I sympathize with the protesters.  They look like the nearby counterparts of my friends and students.  I am impressed by their courage, and I suspect that there is no force fiercer than an Iranian mother.

I’m conscious, however, of how little I know about Iran or Iranian politics, and discretion being the better part of amateur punditry, I didn’t have much to say. Here’s the thing: neither does anyone else know what is going on. We have a narrative that says, truthfully, that oppressive regimes that fake election results and beat up their citizens are bad, and non-violent protesters of those results are good. The problem is with the further embellishment: that “bad” and “good” map onto domestic American political concerns, attitudes, and goals; that the favored protesters would, if they won, be recognized by the U.S. political establishment and chattering classes as allies; that Iranian politics is as familiar to us as our own, so that we can feel confident in the comparisons.

Would we trust a pundit who thought that Barack Obama was a Republican, or that in the United States, Presidents were elected by a simple majority vote?

Daniel Larison noted last week that with a slightly different spin or simply some more information, Mousavi would not look as favorable to the West.  Will Wilkinson’s latest posts on  the “vanity dressed up as elevated moral consciousness” of  Twitter avatars strike me as extremely perceptive.   Exiled Iranian filmmaker Lila Ghobady has a much harsher view of both the current regime and Mousavi as a reformer:

Let us not forget that Mousavi was Prime Minister of Iran in the 1980s when more than ten thousand political prisoners were executed after three-minute sham trials. He has been a part of the Iranian dictatorship system for the past 30 years. If he had not been, he would not be allowed to be a candidate in the first place. In fact in a free democratic state someone like Mousavi should have gone on trial before becoming a presidential candidate for his crimes against thousands of freedom-loving political prisoners who were killed during the time he was Iran’s Prime Minister.

Read her whole column. At the very least, it shows how little confidence we should have in our understanding of the situation, or in the morality of breathlessly cheering it on. I am not arguing that there is no reason for the protesters to take to the streets. Nor am I arguing that Mousavi would be worse, or that the Iranian leadership is a force for good.

I am, however, counseling sobriety.

It is easy to get swept up in a romantic narrative of someone else’s passionate struggle from behind the safety of one’s keyboard. Revolutions are exciting from far away, and the illusion of participation in something greater than oneself… by making your Twitter avatar green… by bloviating about it on the Internet… by changing your Facebook status… by being anti-green Muppets… by protesting the irresponsible consumption of ice cream… is seductive.

It’s unclear right now what will happen. The very real risk on our end is that, no matter how the Iranian political dispute is resolved, this incident will become grist for a future ill-conceived war. The same groups rending their garments over the murder of Neda will be calling for the bombing of her relatives.

So, I think the intellectually and morally responsible thing to do is to hope quietly and read voraciously.

We’re a partially pseudonymous blog (it’s an open secret that I write all Ari’s posts, for example, and mine are written by a collective of political prisoners forced, a la Clockwork Orange, to consume Atlas Shrugged and Pat Boone around the clock). So we have a stake in the outing of publius by Ed Whelan.

But the case looks pretty clear: Whelan, cross that Eugene Volokh had shown him to be wrong and noticing that publius agreed with Volokh, outed publius—after publius told him he had professional and personal reasons for wanting to remain pseudonymous.

There isn’t even the problematic case for outing as presented in Outrage to justify this; publius’s secrets had no bearing on the argument at hand. I can see no reason at all except the desire to strike at an antagonist.

You can make a case for pseudonymity from first principles, and maybe our philosophers here would like to do it, but you know, if it’s good enough for Madison, Hamilton, and Jay, I guess it’s good enough for us historians.

Leiter asks, considering this Kristof piece:

Why do members of the educated public think that it is an objection to philosophical inquiry that it is unintelligible to them (or that it does not have immediate application to the quality of life of pigs, say), whereas no one would think to put such objections against esoteric work in the natural sciences?  Are other humanities subjected to this same expectation of “practical relevance and intelligibility”?

From discussions with other colleagues in the humanities, I gather that they are subject to the same expectation, one as old as the hills, or at least as old as the Gorgias: how is that going to make money and benefit society? (I think philosophers get more questions about pot.) Yet I think there’s an explanation specific to philosophy in the answer to the first question.


Some line has been crossed, here.

The post below is about half common-sense reasoning about the current crisis and half bloggy speculation about its effects on the higher-education biz. Much of it is brain dump, though there are a few useful links thrown in. You have been warned.

This graph, from David Beckworth, is pretty hilarious. It’s unfair, but who minds unfair?

Thanks to commenter, uh, David Beckworth, for pointing this out.

I think what this picture best illustrates is the peril of getting drawn into a debate over the recovery question and the recovery question alone. The New Deal did a lot more than just set about a plan for recovery. For one thing, it saved some considerable number of people from starving, which is a nice thing. For another, it gave us a significantly reformed system of regulating economic downturns: a re-drawn Federal Reserve System, the FDIC, the SEC (which, prior to its gutting, was a pretty good thing), Social Security (which includes not only old-age but also unemployment insurance), and a variety of other similar measures. For yet another, it set about hauling the South out of poverty—a project at which nobody had succeeded despite considerable effort since the Civil War. So it was a lot more than just a recovery program.

But also, thinking just of the recovery program, I guess I think this graph isn’t so terrible at illustrating what I think we ought to notice: there was a deep, deep hole to climb out of in 1933, and the rate of climb was pretty quick. Could you really, realistically, expect much quicker, as broken as things were?

One question for David: what numbers did you use to establish trend? Did you include 1928-9, or just 1923-7 (which is one of the methodological disputes here)?

And one further point: the person who really should be on the list with Krugman, Sirota, and me is the currently-rather-important Christina Romer. Probably also Gauti Eggertsson, of the New York Fed, come to think of it.

My first thought was that this proposal is beyond bizarre. My second thought is that it’s beyond bizarre, but worth kicking around a bit because it hits on some interesting issues about the profession.

So, there’s a perception that academic pedigree, i.e., where one did one’s Ph.D. and with whom one worked, particularly who wrote one’s letters of recommendation, matters disproportionately to search committees, to the detriment of equally good but less prestigious candidates. And there’s a stickiness problem, because it is commonly believed that one’s first job sets the course of the rest of one’s career, meaning that if pedigree really does matter too much, there are plenty of people not getting good jobs their first time out and having no real way to recover.

Portmore’s solution: candidates should submit blind dossiers, including blind letters of recommendation.

I think this is a bad suggestion on both practical and theoretical grounds.


Several people ask of the WPA graphing question, why not use a log scale? Commenter Stinky (no, I don’t know who s/he really is) kindly supplies a graph showing just this. For my money, it speaks for itself—which is to say, it screams, “don’t use me!”

We want to accomplish two things: (1) show how very outsized a chunk of money went to highways and (2) show also meaningful distinctions among lesser expenditures.

The log scale permits (2) while pretty much wiping out (1), unless you know how log scales work. I don’t think the likely consumers of such a graph really do know. But I’m wrong, Stinky says.
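For anyone who wants to see the trade-off for themselves, here is a minimal matplotlib sketch. The category names and dollar amounts below are made up for illustration, not the actual WPA figures, and the original graphs were not necessarily produced this way.

```python
import matplotlib.pyplot as plt

categories = ["Highways", "Public buildings", "Parks", "Sewing rooms", "Airports"]
millions = [4_400, 1_100, 600, 300, 120]  # made-up figures, not the real WPA numbers

fig, (ax_lin, ax_log) = plt.subplots(1, 2, figsize=(10, 4))

# Goal (1): the linear axis makes clear how outsized the top category is...
ax_lin.bar(categories, millions)
ax_lin.set_title("Linear scale")
ax_lin.set_ylabel("Millions of dollars")

# ...but goal (2) suffers: the smaller bars are hard to tell apart.
# The log axis reverses the trade-off.
ax_log.bar(categories, millions)
ax_log.set_yscale("log")
ax_log.set_title("Log scale")

for ax in (ax_lin, ax_log):
    ax.tick_params(axis="x", rotation=45)

fig.tight_layout()
plt.show()
```

On the linear panel the big bar swamps everything else; on the log panel the small bars separate nicely, but the big bar no longer looks big unless you actually read the axis.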

It’s conventional wisdom that when the economy is bad, college enrollments are up. Yglesias recommends that 2008-2009 college grads, unable to find employment*, go to grad school.

Bad idea.  There are two kinds of graduate school.

  1. Kinds you pay for.  Law, MBA, most master’s programs.  Professional degrees.
  2. Kinds you don’t.  Ph.D. programs.

There are some crossovers, but I will ignore them. (If you’re getting a scholarship to Columbia law, odds are you didn’t just decide to study law because the economy was bad.)

In the first category, one does a short (two- to three-year) program, incurs a soul-crushing amount of debt, and then has a degree that arguably makes one more employable. As Neddy is fond of saying, “J.D. is not Latin for ‘meal ticket,’” and it’s entirely possible that one’s employment prospects are just the same two or three years down the road, except now one has even more loan payments.

It’s also not clear how long this economic downturn is going to last.  It would suck if one found an expensive place to hide for two years only to learn in 2011 that 2008 was really thought of as the last good year….

In the second category, ugh, please don’t. A Ph.D. program is a multi-year commitment. And while there’s no reason to be ashamed of beginning a Ph.D. program and leaving after discovering you don’t want to do it, there is something wrong with starting a Ph.D. program (and eating up funding**) in bad faith.

More to the point, a Ph.D. program is not just like undergrad. Undergrads never believe this. There’s a world of difference between liking history or English and being good enough at it to produce it. It can be very rewarding, but it’s essentially a low-paying job that no one thinks is a job (“Are you still in school? When are you joining the real world?”).

I would recommend to anyone considering graduate school, especially in the humanities, that they work a year or two first.  One of two things will happen:  you’ll realize that you only wanted to go to graduate school because you were comfortable with being good at school, and, look at that!  You’ll have a career!  Or, you’ll realize that you really did have a love for the subject, and you have interesting things to say about it.  In which case, you’ll go into graduate school with a clearer idea of why you’re there, and with a little more cash in your pocket.

But this is not meant to be a system to babysit you because no one handed you an i-banking check upon graduation.


*This is advice targeted at elite college grads whose trouble seems to be not that they can’t find any job, but that they can’t find the kind of job that they’d expect to get. We’re not talking laid-off-from-the-Chrysler-plant territory here at all.

**No loans for a Ph.D. Don’t even consider anything that isn’t offering full tuition and some sort of fellowship/teaching income. The market is just too bad, and the job doesn’t pay enough, for hundreds of thousands in loans to be a good risk.

I suppose that the Paulbots and assorted libertarians weren’t enough, so Ari had to call down the wrath of I/P opinion. While that’s raging away, let me welcome as a regular contributor the estimable David Silbey, whose work regular readers already know. (One example; another.) For some reason he wants to join a blog that, having won a Cliopatria, can attain no greater height. We are grateful.

Whether he knows it or not—and “he” being Adam Kotsko, I’ll bet he knows it—this Weblog post is less about the formal fit between epic and the television serial than about the relation of film to the episodic form. I know that sounds backwards—what with MOVIES! being PRESENTED! on SCREENS! the SIZE! of WYOMING!—but the compounded facts of run time and the modern American attention span necessitate that we consider film the proper realm of the self-contained episode. Even films which promise sequels announce their completion in terms of whatever -ology they embrace.

Films should be about something in the original, locative sense of the word. They should surround some subject matter, be “on every side” “wholly or partially,” as per the OED. They should be self-contained. Not that they shouldn’t be sweeping—you can frame Guernica or a sublimely panoramic view of the Hudson River and slap it on a gallery wall without robbing it of sweep—but they should recognize their formal limitations. Films can only intimate narrative epicness. They can’t achieve it.

“But!”

“But But But!” 

Try me.  Start listing epic films and I’ll start listing films with grandiose tableaux.  The Lord of the Rings?  Shot in that sewer of New Zealand.  Blade Runner?  The Lord himself envies Ridley Scott’s matte painters.  With film we confuse the formal qualities of narrative epic for the GIANT! SCALE! presented by the movie screen.  Cases in point: Iron Man and The Dark Knight

Both were hailed as epic upon release, and yet both are far superior films on the small screen. Before you ask: I do remember what I wrote about The Dark Knight on IMAX, and inasmuch as it relates the experience of watching an obscenely high-quality image projected on the side of an eight-story building, I stand by it. Watching the film on a small screen—one on which a bug of a Batman glides between five-inch-tall skyscrapers while Heath Ledger’s Joker licks human-sized lips and establishes human-sized eye contact—it’s impossible to deny that this supposedly epic performance is better suited to the televisual medium. (This goes doubly for Iron Man, which barely passes for “good” on the big screen but shines when we connect with Robert Downey Jr. as a human actor in a corporate world.)

Not that I think we should deny that the serial drama is also better served on the small screen. A solidly written, solidly acted television show can be a better film than most films. To wit: having finished the first four episodes of the blogosphere’s own Leverage, I can’t help but wonder what went so terribly wrong with Ocean’s Twelve and Thirteen.

(x-posted about.)

Kieran weighs in on the question of how to present the WPA data, following up on Duncan.

In effect, what I’ve done here is choose to break a different rule from Duncan. Instead of putting two scales on the same axis, I have made one axis discontinuous between panels, skipping values in order to compress the horizontal size. Hence the reminder at the top of each panel that you’re shifting up an order of magnitude each time. Despite the rulebreaking, there’s still some principle at work because instead of just putting a discontinuity right at the end (to incorporate the largest value) the panels are split consistently by powers of ten, and it makes sense to think of WPA expenditures as falling into groups like “stuff they spent billions on” versus “stuff they spent tens of millions on” or “stuff they only spent a few million dollars on” and so on.
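Here is a minimal matplotlib sketch of the same idea, again with made-up figures standing in for the real WPA data and not a reproduction of Kieran’s actual graph: bucket the categories by the power of ten of their expenditure and give each bucket its own panel, so the scale jumps an order of magnitude from panel to panel.

```python
import math
import matplotlib.pyplot as plt

spending = {  # made-up figures in millions of dollars, not the real WPA data
    "Highways": 4_400,
    "Public buildings": 1_100,
    "Parks": 600,
    "Conservation": 280,
    "Sewing rooms": 45,
    "Art projects": 27,
}

# Group categories by order of magnitude: "stuff they spent billions on,"
# "stuff they spent hundreds of millions on," and so forth.
buckets = {}
for name, value in spending.items():
    buckets.setdefault(int(math.log10(value)), []).append((name, value))

fig, axes = plt.subplots(1, len(buckets), figsize=(4 * len(buckets), 4), squeeze=False)
for ax, power in zip(axes.flat, sorted(buckets, reverse=True)):
    names, values = zip(*sorted(buckets[power], key=lambda kv: -kv[1]))
    ax.barh(names, values)
    ax.invert_yaxis()  # largest bar on top within each panel
    ax.set_title(f"Spent on the order of 10^{power} million")

fig.tight_layout()
plt.show()
```

The panels share no common scale, which is exactly the rule being broken, so the titles carrying the order of magnitude are doing real work here.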

I like this, too. I can foresee an entire lecture on different ways of presenting information about the WPA…. Students will be so happy.

Duncan Agnew, a professor at a university at the very edge of the American West, sends along this solution to the “Tufte I ain’t” problem. I like it. Thanks!

He says anyone can feel “free to use, reproduce, and display this figure in any way you wish, with or without attribution.” Happy new year!

But I’m willing to learn. Properly rebuked for slapdash graphing, I’ve tried to improve the representation of WPA expenditures that appeared in this post, using the Greensboro font from Zapatopi.net.

I’d be delighted to hear further suggestions and critiques. One thing I want to do, but can’t figure out how to do for a graph like this one, is make the biggest line discontinuous—that way I could change the scale so you could make more meaningful distinctions among the other numbers, while still appreciating that the top expenditure goes way off the scale.
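One way to fake that discontinuity, if the graph were built in matplotlib (a sketch only, with made-up numbers; the original figure was not made this way): stack two axes that share the x-axis, give the top one a y-range containing only the outlier and the bottom one a tight range for everything else, so the biggest bar visibly breaks through the gap.

```python
import matplotlib.pyplot as plt

categories = ["Highways", "Public buildings", "Parks", "Sewing rooms", "Airports"]
millions = [4_400, 1_100, 600, 300, 120]  # made-up figures, not the real WPA numbers

fig, (ax_top, ax_bottom) = plt.subplots(2, 1, sharex=True, figsize=(6, 5))

# Draw the same bars on both panels; each panel simply shows a different y-range.
for ax in (ax_top, ax_bottom):
    ax.bar(categories, millions)

ax_top.set_ylim(4_000, 4_600)   # only the outlier reaches up here
ax_bottom.set_ylim(0, 1_300)    # tighter scale keeps the smaller bars distinguishable

# Hide the spines at the break so the two panels read as one interrupted axis.
ax_top.spines["bottom"].set_visible(False)
ax_bottom.spines["top"].set_visible(False)
ax_top.tick_params(bottom=False)

ax_bottom.set_ylabel("Millions of dollars")
fig.tight_layout()
plt.show()
```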

UPDATED: Here it is with a scale in billions, per suggestions.


B has a great post up at her place. You want a little taste? Well, the first one’s free:

When people with PhDs (or in graduate programs) talk about doing something other than professing, we always do so in terms of their “leaving” or “quitting” academia. When I left my tenure-track job, I talked about it in terms not only of leaving a job, but possibly of leaving the profession, though that’s not really what I wanted to do…

But the truth, I think, is that part of what’s so painful about “leaving” academia is that we usually aren’t leaving by choice. More often, academia is leaving us, and all we’re doing is having to slowly come to the point of acknowledging that we’ve been left alone in this big apartment full of books, maybe with a cat or two, and a big pile of bills on the counter. Academia, that bastard; he just up and walked one day, and it took us a while to realize he wasn’t going to come back.

Oh, you know, maybe we could maintain the fiction that the relationship isn’t over. We could seek him out, hang around in the background picking up a few scraps of part-time attention when he needs someone to fill a gap in his schedule and hoping that at some point he’ll realize/remember how great we are and we’ll get back together on a full-time basis. Maybe he’ll even propose someday, and we’ll say yes–of course! does anyone ever say no?–and it’ll turn into a lifetime commitment.

The whole post is a useful reminder of what an absurd crapshoot a career in the academy is. I mean, I feel incredibly lucky to have my job. But that’s just it: I feel lucky. Because I am. How many scholars out there are every bit as smart as I am, work just as hard as I do, have CVs that are identical to mine? And how many of them have jobs they hate? Or don’t have jobs at all? What a weird profession.

To cogitamus. And also to donkeylicious, two cogitamus alums’ new blog. So what if I’m a bit late? It’s the thought that counts.

Kieran sent me this. I am sorely tempted to use it when people email me to ask, what is your telephone number, mailing address, and email? (Stop snickering; it actually happens.) But that would be rude, wouldn’t it? Please tell me, O readership.

Discussion of cabinet-staffing yields:

You don’t want the camel pissing in the tent.

True, but not unproblematic. There are of course two tent metaphors: better inside pissing out than outside pissing in, and the problem with letting the camel get his nose in the tent is that, pretty soon, you get the rest of the camel, too.

And these metaphors have, it seems to me, opposite implications for cabinet-staffing.
