The brief moment of tenure in American universities

Befitting the title and the subject of this post, I’ll try to be brief. Stanley Aronowitz, in his 1998 essay on faculty working conditions called “The last good job in America,” tells us the following:

“Organizations such as the American Association of University Professors originally fought for tenure because, contrary to popular, even academic, belief, there was no tradition of academic freedom in the American university until the twentieth century, and then only for the most conventional and apolitical scholars. On the whole, postsecondary administrations were not sympathetic to intellectual, let alone political, dissenters, the Scopeses of the day. Through the 1950s most faculty were hired on year-to-year contracts by presidents and other institutional officers who simply failed to renew the contracts of teachers they found politically, intellectually, or personally objectionable.

For example, until well into the 1960s the number of public Marxists, open gays, blacks, and women with secure mainstream academic jobs could be counted on ten fingers. And contrary to myth it wasn’t all due to McCarthyism, although the handful of Marxists in American academia were drummed out of academia by congressional investigations and administrative inquisitions. The liberal Lionel Trilling was a year-to-year lecturer at Columbia for a decade not only because he had been a radical but because he was a Jew. The not-so-hidden secret of English departments in the first half of the twentieth century was their genteel anti-Semitism. For example, Irving Howe didn’t land a college teaching job until the early 1950s, and then it was at Brandeis. Women fared even worse. There’s the notorious case of Margaret Mead, one of America’s outstanding anthropologists and its most distinguished permanent adjunct at Columbia University. Her regular job was at the Museum of Natural History. She was a best-selling author, celebrated in some intellectual circles, but there was no question of a permanent academic appointment. Her colleagues Gene Weltfish and Ruth Benedict, no small figures in anthropology, were accorded similar treatment.”

(pp. 207-208)


What strikes me as interesting about this is that tenure, according to Aronowitz, only became generalized in the postwar period (the ’50s or ’60s), as higher education expanded and as America saw the emergence of a tacit social contract between workers and employers that offered stability and decent material conditions. Since the 1970s (according to the usual way of telling this story, which I’m not really competent to evaluate, not being a labor historian), this contract fell apart, for various reasons involving deindustrialization, a shift to the service sector, rising right-wing political opposition to welfare and social services, economic downturns, and so on. As Andrew Ross put it a couple of years ago:

On the landscape of work, there is less and less terra firma. No one, not even in the traditional professions, can any longer expect a fixed pattern of employment in the course of his or her lifetime. The rise in the percentage of contingent workers, both in low-end service sectors and in high-wage occupations, has been steady and shows no sign of leveling off. For youth who are entering the labor market today, stories about the postwar decades of stable Fordist employment are tall tales indulged by the elderly, not unlike the lore of Great Depression hardship that baby boomers endured when they were young. In retrospect, the Keynesian era of state-backed securities for core workers in the primary employment sector, including higher education, was a brief interregnum or, more likely, an armed truce.

This description is, as Ross goes on to emphasize, true of academia as well. As practically every reader here probably knows, today in the United States, tenure-stream faculty amount to less than a third of all instructional staff in higher ed, and the number of temporary, adjunct, part-time, poorly paid, insecure teachers has grown very rapidly (the AFT has details). Ross goes so far as to say that “in no other profession has casualization proceeded more rapidly than in academe.”

That’s not news. But what’s news to me, and what seems worth emphasizing, is that the history of academic labor actually seems not that different, on the whole, from the broader trajectory of American labor relations. Not only have academic jobs gotten more precarious around the same time that precarious employment has generally increased, but academics only started to have tenure as a rule — this is what’s so important about Aronowitz’s comments, if correct — more or less around the same time that other American workers also started getting stable jobs in the post-war period. Sure, there are some important things specific to academia: the tenure movement had been in the works for decades before that (according to the AAUP’s history), and it may have taken a few decades longer to cut pay and working conditions in academia than it did in a factory that moved abroad in the ’70s. According to the only historical statistics I’ve come across, 51% of faculty were tenured in 1969 and 64.4% in 1979, which was still far from unanimous (though many of the remainder may have been on tenure-track appointments). At any rate, there was no golden age when everyone was tenured, although in 1982 the fraction of tenured faculty was still thought to be increasing by “a point or two every year.” These days, of course, the fraction of tenure-stream faculty declines by about the same amount every year instead. I can’t tell you the moment when it began to fall rather than rise, although sometime between 1985 and 1995 seems like a reasonable guess. I’ll look for statistics.

The general point, however, would seem to be that an exceptionalist fantasy in which universities are radically special – in terms of their social organization – is pretty clearly false. American faculty didn’t have tenure in an earlier age; they only had it for a few decades mid-century, which basically seem to correspond to the age of big post-war economic growth and prosperity, though apparently lagging behind it by a decade or two. The conclusion here, it seems to me, is that it makes no sense for academics to defend tenure in itself, without looking at its historical conditions of possibility in the American economy. There’s nothing wrong with demanding a stable job, but it’s irksome when certain academics seem to think that only a professor deserves one in this day and age. Sometimes (as in that interview by Stimpson I’ve linked to) the argument is that tenure facilitates academic work in the public interest, but I’m skeptical of any a priori claim that academic work as such is in the public interest (this would need to be demonstrated, not assumed), and a general argument for stable employment strikes me as far less prone to fantasies of academia’s exceptionality and unique value.

11 thoughts on “The brief moment of tenure in American universities”

  1. I think that last point you make, questioning the supposedly inherent value of professorial tenure to the work that is produced in academia, is pretty astute. The longer I spent in academia–from my time as a wide-eyed undergraduate, until the last, jaded year of grad school–the more it seemed like tenure was the academic version of becoming a “made guy” (in mafia parlance). And I think that this is certainly the case in the Creative Writing MFA field. The entire point of getting a teaching gig at an MFA program is so you can get tenure and be paid to write all day long, while teaching a couple weekly workshops each semester. That tenure adds significant value to an MFA professor’s teaching or “research” activities is dubious, at best.

    Like you, I’m not so sure that the work coming out of academia is uniquely, or even especially, worthy of the “protection” of tenure, at least no more than any other kind of work. That it would be defended by academics and their advocates is not surprising, of course, but I think that the rhetoric used to defend it (that academia must be this uniquely preserved/protected zone, and that tenure is key to that effort) is often overblown and overtly self-serving.

    At the same time, inasmuch as many academic jobs in America are actually public sector jobs, I don’t see why university faculty–not to mention K-12 public school teachers–should be subject to a different set of employment practices than, say, police officers. Police officers can do all kinds of crap and leave with their jobs intact. And yet one only needs to turn on the TV or read a newspaper these days to hear certain segments of the public criticizing tenure for teachers on public payrolls. Granted, K-12 teachers are a far easier target than professors, so I don’t think the “plight” of the latter (largely imaginary) can be so easily rolled up into that of the former (pretty obvious/evident). But still, I think it’s worth taking into account.

  2. I agree with your latter couple of paragraphs. But I don’t share your sense that tenure is a sinecure that allows people to get away with doing little. My sense is that most tenured faculty are working quite a bit more than 40-hour weeks — as are most assistant professors who want to get tenure, and many full-time adjuncts, no doubt. And I do think that, for that minority of professors who do want to do non-mainstream research or take controversial stands, tenure does serve as an important protection. There’s an ugly history of firing dissidents from the academy that would probably be uglier still without it. So that’s the specific value in tenure for academics. It protects the marginal. And I do think that’s worth something. It’s just that I don’t think this protection for the marginal can be grounds for a good, plausible public defense of the professoriate when so much academic research has so little public benefit.

    Do we agree here? Or only mostly?

  3. Eli, thanks for pointing out these facts about tenure. I wasn’t aware of this stuff. I’m not an expert on economic and labor history either, but I want to do my best to balance your description of negative trends with a mention of the positive trends to which they are more or less inextricably linked.

    My understanding is that the reason the post-WWII era had better job security was that firms were more profitable. Why were they more profitable? Because (1) they faced less foreign competition, since foreign countries were educationally and technologically behind the U.S. and perhaps because they suffered more from the war, and (2) they had more monopoly power within our borders as well.

    There are probably other reasons as well. The good news is that the economic competition that made our jobs less secure than they once were has also provided all sorts of comforts that were not available in the past: more living space, better access to information and education, many new forms of entertainment, lower-priced and more diverse food and culture, new medicines, and work that is more interesting for some (and surely less interesting for others; it’s hard to say what the net effect is). I’m not saying that government should do nothing to ameliorate the downsides of economic competition, but if we only pay attention to the downsides then we will probably take actions with unintended consequences.

  4. Hi Mike, you’re right of course that there are certain consumer benefits to be had from what I guess we might as well call recent decades’ economic, or more precisely corporate, globalization. My view is that these consumer benefits are massively not worth the trade for bad/unstable/nonexistent jobs and ever-increasing corporate dominance of our society’s culture, politics and policy. We’re no less dominated by massive firms than ever, as far as I can tell, and I wouldn’t even really agree with calling it an era of increased competition, if increased competition means that goods and services markets are open to more and more firms. Au contraire, it seems to me more like a situation where increasingly large business enterprises are getting ever more consolidated (often internationally) and breaking down the legal, social and economic restrictions on their operations that formerly tended to hold them back within national boundaries and restrain them to narrower spheres of operation. Business has been quite eagerly expanding into the higher education market, for example, and while I don’t know if I really care whether the private or public sector is teaching people how to be computer technicians, it would be a major tragedy to have a total conversion of higher education to for-profit models. That’s not on the horizon now, but it’s in the spirit of GATS and the WTO to go in that direction.

    Basically, I do agree with you that there is a cost/benefit equation here, but the upsides to me seem generally unconvincing and the downsides grave. I wouldn’t advocate for a return to the 50s, obviously, but I think that it’s unacceptable to cast the global workforce into a state of precarity. Of course, I am quite aware that some people (especially people with a great faith in markets) will see the cost/benefit equation differently than I do.

  5. Though I think the last 30-40 years have been more good than bad for Americans, the most important story of that time period is the massive increase in the standard of living in China and India. I’m pretty sure that the share of the American economy accounted for by the 500 largest firms has shrunk, not that this statistic, in isolation, says much. We’ve also had increasing income inequality within countries (but decreasing inequality across individuals worldwide)…

    While I think it’s useful to sometimes debate the big trends and what they portend for people’s well-being, it might be more important/productive to discuss specific policy proposals and what their likely effects are.

  6. As always, I appreciate your intellectual engagement, Mike, and the reminder that, as usual, we disagree about the beneficence of the global economy. But I would note that this post, if it’s about anything, is about the big trends and not about the specific policy proposals!

  7. Eli:

    One of the reasons I hauled out the MFA example is that it seems like such an egregious instance of the “getting made” scenario. I should have added that, depending on the field, tenure’s value to the work being done seems (at least in my mind) to be in flux. So I think we generally agree.

    But what I would also say is that, given current conditions, tenure and all of its benefits, which supposedly protect the marginal, are part of the reason why jobs for full professors are becoming so scarce. Which could be, in itself, considered a marginalizing effect. We’re talking about a generation of academics, after all, who will be subjected to pseudo-employee status for much of their early careers, partly because it’s become too expensive to keep them on full salaries and benefits. That, too, has consequences for the research and study that form our knowledge base. (Though, again, probably not as dramatic an effect as hardcore advocates would argue, at least not in every case.)

  8. Hi Max, I don’t know why I didn’t reply right away. Anyway, I agree that the increasing unavailability of tenure is itself very bad for research (perhaps esp. in the humanities), because of course the logistical demands of juggling four adjunct jobs take up a lot of time that might otherwise be put into research work, not to mention the counterproductive stress, etc. Incidentally, the other day I heard a high-ranking corporate research officer, the research director from Total, give a talk on research management, and I was interested to hear him say that “stress is never good — not even in the private sector!” (He knew he was talking to an audience of academics, of course.)
