I left anthropology for economic and family reasons, but that isn’t what I want to write about. I just want to write about no longer being what I was. What does that mean?
I don’t practice anthropology any more. I don’t teach it. I don’t write it. I don’t read it. I sold my anthropology books for a dime.
What does it mean to unbecome what you were?
Unbecoming an anthropologist: You stop making a mental map of who’s who. You don’t know who got what jobs. You don’t know who just published what.
Unbecoming an anthropologist: You no longer know which theories are alive and which are dead.
Unbecoming an anthropologist: You stop trying to speak in a certain tone, the academic tone of voice. That tone starts to seem thorny and stilted, though it used to feel natural and empowering. It stops making you feel like you are part of something.
Unbecoming an anthropologist: Your existence no longer hinges on your knowledge, your thoughtfulness, your capacity to theorize. You will no longer make cultural analysis into a cornerstone of your professional being in the world.
Unbecoming an anthropologist: Above all, you cease to constantly REPEAT the little ritual incantations that held up your claims to “be” that thing. To “be an anthropologist” involves constantly DECLARING your identity as if it were the most obvious thing in the world. “I’m an anthropologist…”, “I’m in anthropology…”, “As an anthropologist…”
But when you stop repeating those speech acts, when you stop being around those speech acts, the identity itself goes up in smoke.
Was it ever more than a mirage?
Yes: it was a mirage that people fervently believed in. A mirage that was also a landscape some people can spend their lives in.
How do you change a landscape?
The self is subject to the forces of wind and water and it shifts through almost geological processes. To change your self, you just let natural processes of erosion, slippage, and fire take their course. Maybe you can channel them — just a little.
Laboriously, you had made yourself look a certain way. And then you stopped maintaining the look, and soon your face changed.
I’m not an anthropologist: I just worked there for a while.
In leaving a profession, one senses the deep (ideological) gulf between “the professions” and “working for a living.”
“I’m a doctor” vs “I work in a hospital”: they’re totally different things. One of them defines your being in the world; the other merely describes your relationship to capital and wage labor. (This may turn out to be a distinction without a difference, but it still feels like a difference.)
To be clear, I had the social and economic capital to be able to change careers and fairly easily become “something” else. I didn’t unbecome an anthropologist in exchange for nothing. But that is another story.
I don’t think I will ever again try to anchor my whole identity in my relationship to work. At least, I’d rather avoid it.
It’s almost funny that I ever was “an anthropologist.” What a reified thing that was. Why do anthropologists have such an objectified, involuted discourse on their own identity?
But there’s a lot of pleasure and comfort in self-objectification. That’s what I’ve learned and had to give up.
“One of the best ways to get a self is to be a thing,” I wrote once when I was thinking about gender. It would be funny if it weren’t a bit painful.
Perhaps one of the best ways to change a self is to stop being a thing.
And then — and then?
I started this blog on academic culture back in November 2007. I think I might be done with it now.
So here is a little excerpt from the preface that explores the relationship between French Theory and the recent controversy over the HAU journal in my field.
It seems retro to appeal to French Theory as a source for the utopian imagination. From the point of view of cultural anthropology, French Theory now seems outmoded, since the 1960s are long since “past,” and nothing seems less novel than its Great Men, Foucault or Deleuze. What is the point of an ethnography of an outmoded moment of intellectual production? Ironically, though, the very rejection of French Theory lies at the heart of anthropology’s latest crisis of coloniality: a coloniality founded on new pedestals for old men (and, it must be said, some women). It is worth exploring this in some detail, to show how French Theory remains key to reflexive struggles within Anglophone anthropology.
In June 2018, six months after #MeToo, a more specific conflict erupted in anthropology, centering on the journal HAU: Journal of Ethnographic Theory. The journal’s namesake category, hau, had been extracted by Marcel Mauss from a 1909 ethnography of Māori “forest lore” and repurposed in his 1923-24 “Essay on the Gift,” where it became an increasingly decontextualized concept of the “spirit of the gift.”[1] In 2011, HAU’s founders, Giovanni da Col and David Graeber, inaugurated their project by drawing on Mauss. His essay, they said, was “the quintessence of everything that is equivocal, everything that is inadequate, but also, everything that is nonetheless endlessly productive and enlightening in the project of translating alien concepts” (Da Col and Graeber 2011:vii). But it was the journal itself that ultimately became an equivocal, inadequate and productive symbol of the violence of theory.
The insider critiques of the journal chiefly took the form of #MeToo-style public testimony about an abusive workplace. Anonymous letters from the journal’s staff testified that da Col, who was Editor in Chief, had systematically mistreated them. They described financial mismanagement, wage theft, “daily vitriolic reprimands,” “overwork, exhaustion, and de-moralization,” and “inappropriate sexual comments,” and they argued that the journal’s open access mission had been betrayed by transferring its operations to the University of Chicago.[2] Graeber publicly disowned the project, writing an apology for the failed “realization” of what he still called the project’s “brilliant concept.”[3] The journal’s continued defenders, preoccupied by internal reorganizations, declared that the allegations amounted to a smear campaign by disgruntled egotists, confused outsiders and misguided radicals making “destabilizing efforts.”[4] (The phrase became infamous.) It seemed to me that the journal’s defenders never made a very persuasive public case for themselves, while the alleged labor abuses struck me as depressingly common features of precarious academic workplaces.
But the ensuing debates, which circulated on numerous blogs and on Twitter under the hashtag #HauTalk, rightly made HAU into a broader site for critiquing coloniality and elitism in contemporary anthropology.[5] Just as #MeToo had insisted that we not deny our coevalness with sexism, #HauTalk reiterated that colonial structures in anthropology were not a matter of the past, but were an ongoing crisis in the present, as Zoe Todd particularly emphasized (2018). It was commonly observed that HAU was a product of the elite Northern centers of the field: it was based largely on social networks from the University of Cambridge and the University of Chicago (my own alma mater). The Mahi Tahi collective wrote pointedly from New Zealand to ask, “How well have the journal’s recent practices, decisions and approaches lived up to the Māori concept of hau, a concept that the journal has continually stated is its central ethos?”[6] Adia Benton’s comments from 2017 were picked up again; she had been one of the first to say publicly what minoritized anthropologists had been saying privately, that HAU had fixated on “a rather old-fashioned model of canonizing the oldies,” and that these “select few ‘theorists’… skew[ed] white, old and male.”[7] Takami S. Delisle summed up the “core problem” as “white colonial elite masculinity.”[8] Was it a coincidence that the editorial board foregrounded representatives of “old school” anthropology, while the journal seemed to reject contemporary theories of identity, coloniality, race, gender, sexuality, and the intersections of all these?
Let us turn here to re-examine HAU’s founding statement, which turns out to center on a specific melodrama of masculine recognition. For Da Col and Graeber, the widespread influence of French Theory in cultural anthropology had left us a “discipline spiraling into parochial irrelevance” (Da Col and Graeber 2011:xii). Instead of borrowing ideas from Foucault or Deleuze, they argued, we should take refuge in the heartlands of our discipline, distilling concepts from ethnographic data instead of borrowing them from others. “It’s only by returning to the past, and drawing on our own hoariest traditions, that we can revive the radical promise of anthropology” (xxix). Doubling down on territoriality, the HAU founders also pictured the discipline in Leninist terms, declaring that “anthropology has been since its inception a battle-ground between imperialists and anti-imperialists, just as it remains today” (2011:xi). I have nothing against critiquing imperialism, but unlike Lenin, Graeber and Da Col did not link their radical rhetoric to any collective labor politics or political project. On the contrary, the rhetoric of anti-imperialism worked to downplay the journal’s own elite position in the academic field. In theory, Da Col and Graeber sought to diversify anthropology, promising to “promote intellectual diversity across different traditions… outside the North Atlantic and Anglo-Saxon academic juggernauts.” Yet these were the very juggernauts that had seeded their project with its initial academic capital — a contradiction which the authors proved incapable of working through.[9]
Thus if the radical promise of anthropology was ever “revived” at HAU, it was buried alive again the same day. The obvious detachment from contemporary Māori culture—however much it was valorized as a source of ethnographic concepts—was only matched by an equal and opposite disengagement with its French counterpart. As an ethnographer of French academic life, I was struck by how HAU’s founders unwittingly replicated the form of shallow, ahistorical engagement with France that they deplored in others. They treated “French Theory,” “European thinkers,” “Continental philosophy,” and the “Western philosophical tradition” as synonyms for each other, reproducing an essentialized, undifferentiated image of Europe. And instead of seriously analyzing theoretical production in the Cold War moment of decolonization and Western Marxism, they invoked a bizarre analogy with Classic Rock, dismissing “French theorists from the period of roughly 1968 to 1983” as “the intellectual equivalents of Fleetwood Mac and Led Zeppelin” (Da Col and Graeber 2011:xii).
If “Classic Rock” was passé to HAU’s founders, the funny thing is that they then got nostalgic for theory from the era of Dixieland jazz, Tin Pan Alley showtunes and Frank Sinatra. In the first half of the 20th century, they declared plaintively, anthropology had produced “concepts that everyone, philosophers included, had to take seriously” (2011:x). They noted excitedly that Jean-Paul Sartre had written about the potlatch and that Sigmund Freud had written about totems. Yet their casual expression “everyone, philosophers included” was really a misnomer for a narrow Franco-German sphere of white, male, overwhelmingly bourgeois intellectuals. In 1949, which HAU cast as the end of anthropology’s glory days, 68% of French university students were children of the bourgeoisie or of civil servants, while less than 2% emerged from the industrial working class.[10] Meanwhile, in France, anthropology hardly even existed as a discipline.[11] The Big Men of French social theory — Durkheim, Mauss, Lévi-Strauss, Bourdieu — were all initially credentialed to teach philosophy, via a French certifying exam called the agrégation. This philosophical legitimacy, not (contra HAU) some inherent draw of early anthropology, was key to why French philosophers took ethnology “seriously.”
Meanwhile, it is hard to idealize this intellectual epoch, since it was also a factory for vicious colonial and racist ideologies, as Aimé Césaire documented in his Discourse on Colonialism (2000 [1950]). The very French institutions that produced Mauss were themselves organs of structural racism, in a way that HAU never acknowledged. In 1952, Frantz Fanon described the agrégation as sufficiently racist that black men would simply not bother with it. “When an Antillean philosophy graduate says he won’t bother to take the agrégation, citing his color, I say that philosophy has never saved anyone.”[12] I find it disturbing that these seminal critiques of colonialism mark the end of HAU’s preferred era of social theory.
In any case, when HAU went on to call contemporary anthropology an “intellectual suicide,” what they were lamenting was not a failure of political engagement with the communities where we do research, but a failure of renewed recognition from present-day academic elites. This is why I say that HAU was founded on a melodrama of masculine recognition. Its founding mood was embattled woundedness, and its founding relation was the fear of not finding legitimacy in the eyes of the Other — this obscure “everyone” that still seemed to focus on European philosophers. Da Col and Graeber went on to fantasize about creating a “different mode of engaging” with philosophy, but they did not imagine studying philosophers ethnographically (which, of course, is the project here). Instead, invoking a game of competitive one-upmanship, they liked to envision ethnographers showing that Deleuze and Guattari had been wrong about one concept or another (2011:xiv).
I have long appreciated Graeber’s contributions to anarchist anthropology and his activism. But he has never sufficiently processed his own investments in the elite section of the discipline, and I must disagree strongly with his conclusion that HAU was founded on a “brilliant concept” that was poorly realized. On the contrary, the project was always compromised by its basically affirmative stance towards anthropology itself, by its indifference to intersectional critiques of the field, and by its inability to move beyond the elitism and structural violence of its institutional origins. It was sometimes said during #HauTalk that HAU had renounced one locus of white masculinity, French Theory, only to enshrine another instead. Yet if we look at the social institutions of French Theory, it turns out that they are not only the institutions of pure white radicality that they seem to be. Like contemporary anthropology, they too are sites of struggle with coloniality and masculine domination. One reason for an ethnography of French Theory, then, is to learn from a set of French struggles that most of us are not even aware of.
Césaire, Aimé. 2000 [1950]. Discourse on Colonialism. Translated by Joan Pinkham. New York: Monthly Review Press.
Chimisso, Cristina. 2000. “The mind and the faculties: the controversy over ‘primitive mentality’ and the struggle for disciplinary space at the inter-war Sorbonne.” History of the Human Sciences no. 13 (3):47-68.
—. 2005. “Constructing narratives and reading texts: approaches to history and power struggles between philosophy and emergent disciplines in inter-war France.” History of the Human Sciences no. 18 (3):83-107.
da Col, Giovanni, and David Graeber. 2011. “The return of ethnographic theory.” HAU: Journal of Ethnographic Theory no. 1 (1):vi-xxxv.
Fanon, Frantz. 1952. Peau noire, masques blancs. Paris: Seuil.
—. 2008. Black skin, white masks. Translated by Charles Lam Markmann. New York: Grove Press.
Mauss, Marcel. 1990. The Gift: The Form and Reason for Exchange in Archaic Societies. Translated by W.D. Halls. New York: W. W. Norton.
Sauvy, Alfred. 1960. “L’origine sociale et géographique des étudiants français.” Population no. 15 (5):869-871.
Todd, Zoe. 2018. “The Decolonial Turn 2.0: The Reckoning.” Anthrodendum, 15 June 2018. https://anthrodendum.org/2018/06/15/the-decolonial-turn-2-0-the-reckoning/.
Notes
[1] See Mauss 1990:114n24-25.
[2] “Former and current HAU staff letter,” June 14, 2018, https://haustaffletter.wordpress.com/; “An Open Letter from the Former HAU Staff 7,” June 13, 2018, https://footnotesblog.com/2018/06/13/guest-post-an-open-letter-from-the-former-hau-staff-7/.
[3] “HAU Apology,” David Graeber, https://davidgraeber.industries/sundries/hau-apology.
[4] “Letter from the new Board of Trustees,” HAU Journal website, June 11, 2018, https://www.haujournal.org/index.php/hau/announcement/view/17.
[5] An overview of these debates is at “HAU Mess,” https://docs.google.com/document/d/e/2PACX-1vSHK7oM8jxF9ppg_oVnX2VjWofn0VrH3Hf7GMqvlygYSDcuJ3-rSlGVQNEyKeHXLNVjabGBfJnL1Mnx/pub
[6] “An Open Letter to the HAU Journal’s Board of Trustees,” June 18, 2018, https://www.asaanz.org/blog/2018/6/18/an-open-letter-to-the-hau-journals-board-of-trustees.
[7] Tweets by Adia Benton, https://twitter.com/Ethnography911/status/908799037889024000, https://twitter.com/Ethnography911/status/908799682637389824
[8] Tweets by Takami S. Delisle, https://twitter.com/tsd1888/status/1009592747588714502
[9] To be clear, I also got my academic capital from this juggernaut, and I too oppose it in theory while benefitting from it in practice.
[10] See 1949 data in Sauvy 1960:869. I counted as “bourgeois or civil servants” the categories professions libérales, chefs d’entreprise, fonctionnaires, and propriétaires-rentiers.
[11] On French disciplinary recomposition in this period, see especially Chimisso (2000, 2005).
[12] “Lorsqu’un Antillais licencié en philosophie déclare ne pas présenter l’agrégation, alléguant sa couleur, je dis que la philosophie n’a jamais sauvé personne” (Fanon 1952:22). I have modified the English translation somewhat from Markmann’s recent version (Fanon 2008:17).
Is there really such a thing as a biography?
After all, we are not really individuals. We are more like aspen colonies, all grown together but hiding the togetherness underground.
What is a biography but an alibi?
What is a biography but a ceremony?
Who gets to have a biography and who barely even gets counted in the statistics?
One thing at least is sure: that you can’t write about yourself without writing about others, since the comic fact is that we all have otherness inside ourselves and live inside other people too.
There’s something funny about biographies.
But everything here is sadly unremarkable, aside from the gender and sexuality parameters. Workplace hierarchies and precarious gigs are ripe for abuse. Harassment is largely about enjoying power and transgressing other people’s boundaries. It exploits ambiguity and hides in plain sight.
These usual truths are all I have. I still think they’re worth hearing.
It’s an older story now, and I left out the names and places.
* * *
I had just graduated from college with an anthropology degree. I was bi and genderqueer, but not as out as I would get later. I was working as a temp receptionist for eleven dollars an hour at a big urban university.
Next to me at the front desk sat a charismatic gay man. I found him cute.
Let’s call him M.
We flirted a little at work. It was a boring place to work. I felt so very awkward in my totally ill-fitting, unfashionable efforts to dress business casual.
Once as we were leaving work at the end of the day, we kissed on the sidewalk.
M. was a bunch older than me and had a permanent job. He wasn’t my boss, but there was an asymmetry. But let’s be clear, the first time, the kiss was welcome. It was very quick. It had a certain energy, an anxiety. Our coworkers could have seen us. That would have been so weird.
We parted ways at the subway.
Maybe a little later there was another kiss like that. I think there might have been about two okay kisses. I wasn’t taking notes.
But then after the kiss, or two, the office became a miserable space for me.
* * *
It’s hard to explain how miserable it was. Miserable in this nothing-is-what-it-seems way. There was the normal part of office life, and then the other part, the part hidden right in front of everyone.
The succinct version doesn’t do it justice. M. started to hit on me constantly at work. It was only barely clandestine. I really didn’t like it, I didn’t respond positively, and I didn’t know how to make him stop.
He sat right beside me at a long counter, facing the public. Whenever he thought he could get away with it, he would turn towards me and make these come-hither, sit-on-my-lap gestures. I’m sure he found it sexy and fun. I found it mortifying.
The truth is, I suspect he enjoyed my discomfort, or my powerlessness. I frowned back at him sometimes, or wrinkled my brow. These desperate little gestures.
Two women worked right alongside us — literally three, maybe five feet from us. But they never noticed anything. There are always moments when someone is looking the other way, when they’re down the hall. You would never think anything like this could happen in such a well-lit, sterile office environment.
This went on for a while.
I would go home at night and feel awful, like something was happening to me that I didn’t really understand; I just knew I felt trapped. I remember that I began to doubt myself a lot — like maybe it was my fault, or I’d asked for it, or I was misunderstanding something.
Sexual violence of this sort is mostly epistemic violence. Somehow your truth has become unhearable; you’re living something that is invisible; your version of the story is beyond subordinate, it might as well be nothing. All of these are pretty classic feelings for workplace harassment situations, I gather.
It was extra awkward to complain, because someone in my extended family had originally helped me get the job, by putting me in touch with her friend, the HR person for that branch of the university, who in turn sent me to the temp agency. I didn’t think much about complaining.
Meanwhile, some harassment scenarios might be unambiguous, but mine was awkward because there was a consensual part before it was non-consensual. That also made it harder to explain to anyone.
* * *
One day in the elevator, M. jumped me once the doors were shut. As he came up close and kissed me, I protested more directly than usual. “No, stop, M!” But it all happened quickly and he didn’t listen. The brief collision with an unwanted body. The elevator doors opened again soon.
Did he tell himself I was just being coy?
I don’t think he thought he was doing anything wrong. Not consciously.
But I also think he knew I didn’t like it, and that was part of the draw. They always say harassment is really about power — getting off mostly on power — but it only makes sense when you see it in person.
Meanwhile, I wouldn’t say it was capital-T Traumatizing — to get kissed when I really did not want to get kissed — but it left me very rattled. At best, it was a really depressing boundary violation.
* * *
After three weeks on reception duty, I got moved upstairs to a different temp worker task, filing enrollment records.
Though the new work was deathly boring, it was a relief to be away from M.
But somehow, he kept finding excuses to stop by the file room to make sexual overtures. He even showed up and did it once in front of my one friend in the office, a woman.
It was hugely validating to have someone else see how egregious and gross it was. I’d started telling her about it, but I don’t think she really believed me until she saw it herself.
* * *
Anyway, after six weeks there, I quit. I had a new job lined up first.
In the exit interview, I told the temp agency I’d been sexually harassed.
I remember how hard it was to say anything, even then, when nothing professional was at stake. I remember having to get up my nerve. This little moment of hesitation.
The manager was superficially sympathetic. Should we only send women temps to that site, to keep them away from M.? he wanted to know. (Which is a heinous cop-out.) Could we contact the employer?
I did let them contact the employer. The university HR person invited me to come describe my experience — to meet her, I had to sneak past M. at the front door of the building. But when I said that the first kiss had been consensual, she seemed really skeptical. I left her office feeling shaken. Frankly, the complaint process made me doubt myself even more.
* * *
My friend told me that M. did get reprimanded after I complained, and he seemed very chastened. This news didn’t really make me feel a lot better.
A month later, M. reportedly still talked about me, saying “I kind of miss him* but he hates me.”
[*my accepted pronouns at the time.]
He just never really got it, I guess.
Maybe most harassers just never get it. He just thought it was about “hating him” for some random interpersonal reason.
But I still don’t even hate him. I just hate the situation. And the lingering feelings that go with it. The feelings that maybe I am the bad person in the story, the broken one, the unreliable one, the queer subject who wanted something and then not what came next, the one who suffered through becoming an object.
It wasn’t a very bad harassment case. I had an exit. I didn’t suffer professionally. To be honest I don’t think about it most of the time. It was just something that happened.
Something that nevertheless should not have happened.
* * *
Why didn’t you protest more directly? someone might ask. But harassment is about constituting you as an object. And this process can be woefully effective.
When you end up spending so much time putting yourself in question, dealing with the confusions in your own psyche, it can feel like there’s no space to put anyone else in question.
And if you don’t know what harassment is, it’s hard to understand at first that it’s happening to you.
I wish I had been taught something about harassment beforehand. I wish everyone was taught that.
But would any amount of training really have prepared me?
I’m still not sure.
I just know how much I admire the courage of those who protest more successfully than I did.
This was written in 2018, but with some edits in 2020 to slightly improve the analysis.
It’s been about twelve weeks since I left my faculty job in South Africa. I really liked teaching there, partly because the weight of the Apartheid past was still so very present in Stellenbosch that, in an unexpected way, it made it feel especially worthwhile to teach critical social science. But it was just too far from my partner and our kid, who had stayed here. Obviously, we had explored different options. Leaving Stellenbosch ended up being the right thing, and I’m not ambivalent about it, even though I miss the teaching.
Here’s what I wrote on Facebook as I was leaving:
I made so many mistakes in the classroom this year, but for obscure reasons, I also really fell in love with teaching in South Africa, so much that I’m in tears now, writing this. My colleagues told me to be ambitious and teach what I thought was challenging, which was the opposite of my postdoc, where my boss said the students were pretty mediocre and not to expect much. Here I got less afraid of the classroom, and more in touch with which boundaries I need and which ones I don’t. I got better at being myself, at accepting the mistakes and fixing them, at being reflexive in front of the students, and at managing bad affects (mine & other people’s). Some students didn’t like my act and some loved it, and sometimes their critiques of my classes were spot on, and sometimes they came from detached kids who rarely showed up. I think I got better at hearing the critiques and doing my best and just feeling… alive. I don’t necessarily know what people thought or what they took away, and that indeterminacy is important. Teaching is a modest project. But I want to believe that when I was more present, they were more present too. Sometimes I could see it in their faces, I thought; and to my considerable delight, a lot of supposedly “bad students” sneakily turned out to be pretty good ethnographers.
To leave a place is to figure out how to acknowledge your losses, to learn what you will miss.
That experience also taught me, contrary to what I’d imagined, that mixed feelings are not the same thing as ambivalence. Ambivalence in the strong sense, I think, only emerges when mixed feelings are also in conflict with each other, or express some contradiction. But you can feel happy and sad at the same time without having this sense of unresolution that ambivalence provides. Ambivalence is a way of deferring the solutions to struggles, of keeping contradictions open (sometimes, of course, it is a permanent deferral). But me, I’m at peace with my mixed feelings.
Still, on a professional front, it was very hard to leave a teaching job without having another one figured out first. I hope I can find a new university position in America, but we’ll see, since the academic job market in my field is still pretty meager. I still think my work is good. But I’m still writing about precarity, and experiencing it.
We often think of precarity as meaning short-term work contracts. Oddly, though, when I was at Stellenbosch I had a permanent academic position, and it was still precarious, because the geography was so incompatible with my life realities. The trip, one way, was 25 hours and three flights, minimum.
I liked my colleagues and I’m happy that I’m going to stay affiliated with my department in Stellenbosch for the next few years. I’ll go back and visit, maybe even teach a short course.
Meanwhile, I’m doing much more childcare, and trying to finish my book about French disappointed utopians. There are a lot of coffeeshops in this neighborhood, which constitutes a sort of college town for Case Western Reserve University. Maybe I’ll write more about this neighborhood, with its odd class markers and its unusual (for America) degree of racial diversity. The spaces around universities always bring out my ethnographic instincts.
I had lots of different feelings in graduate school, and lots of them weren’t bad. But for me, some of the hardest things were those ritual moments where your very Being is supposed to be under examination. In concrete terms, that meant the big rites of passage: the qualifying exams, the dissertation proposal hearing, and finally the dissertation defense. It’s easier to think about them now that they’re a bit distant in time.
Sometimes you have anxieties that you just can’t explain rationally. None of my advisors thought I needed to worry about my quals.
Yet here’s an entry in my journal from Sept. 29, 2008.
I was feeling practically gleeful about my exam, pleasantly numb, but suddenly
— I was at a Graduate Students United meeting—
my roommate calls it a panic attack but I prefer to call it sick from anxiety
— I began feeling bad, locked myself in a little bathroom, locking both the locks on the door.
Horrible cramps in the abdomen, broke into a sweat, turned cold. I looked in a mirror later and I was convinced I’d turned white, the cheeks and forehead pale.
The scary part was less the physical symptoms than the total sense of not panic but weakness.
There’s the nice kind of vulnerability and then there’s the kind where you just feel terrible, a little desperate, helpless, I told my partner later.
So the bad moment was just sitting there in pain, my vision narrowing as if I were going to faint though I didn’t, and feeling afraid of how to get home, wondering if I could beg someone to bring me to their apartment and let me lie on a couch, prone. This awful feeling that something is wrong, and it had come from nowhere.
Then as I’m still in the bathroom starting to feel better, there’s a jingle of keys and a knock. When I come out I find the janitor, Joe, who works nights. I tell him I think I almost fainted. He says he goes in the basement bathroom and douses his head at times like that.
There used to be a cot down there, in the basement bathroom, but they took it away, I say. Too many people sleeping down there, he observes.
***
This all happened a long time ago and nothing bad came of it, really. But I’m posting it because public vulnerability in academia is very gendered, and I think it’s important for those of us who aren’t women to step up and think publicly about the hard moments that academic socialization wants us to endure.
I think because I’m generally unafraid during public performances, people don’t think I’m the kind of person who would feel anxiety elsewhere, in private.
Of course, not everybody has this much anxiety or experiences it the way I have. I don’t usually have moments like this; I actually forgot that this one even happened until I was re-reading my notebook the other day. That said, even though everyone is different, I do think that a lot of people experience massive and polymorphous anxieties in academic life. These experiences are themselves likely quite gendered, and, to insist on my point above, the way we talk about them is very gendered too. And who talks about them is very gendered.
I was amazed to read an article by some male geographers about “neoliberal anxiety” that had no mention whatsoever of anything personal or experiential. They weren’t even aware of the Cartesian and masculine quality of their discourse.
Indeed, there can be a lot of pressure to conceal these kinds of feelings and experiences, because they’re incompatible with the stoicism and invulnerability that are supposed to be part of professional comportment. As you learn when you teach, you can’t be sick, you can’t suffer, you can’t come undone, and you are supposed to be In Control. I mean, you can do these things, but there is pressure not to; the prohibitions are almost maxims of the profession.
I know some people work against these norms, sometimes even building solidarities with their students in the process. (I do that too, sometimes.)
And it’s funny how bad moments can produce unexpected solidarities. Like me and the janitor who told me about dousing his head.
I wonder what happened to him.
It doesn’t really matter to me what the survey is about. The survey has two fatal flaws:
1. It solicits participation with clickbait rhetoric.
2. It “compensates” participants with a gift card lottery.
Both of these strategies are insulting and, inasmuch as “professionalism” means anything whatsoever, unprofessional.
Literature review: Clickbait is just a bunch of clichés, normally used in titles, that seek to generate a phony desire to read some online article. The standard emotional logic is about generating a feeling of missing out or an epistemic lack — “XYZ happened, you’ll never guess what happened next!”
Here, then, are some phrases used in the survey messages that I consider clickbait: “Don’t miss your chance to take part,” “We have not heard from you yet!”, “AAA needs your help!”, “Please watch your inbox”…
It’s as if they want me to believe that there was an actual personal relationship here and not just the nth request to provide data to an organization that gouges its members on fees and rents its journal portfolio to Wiley-Blackwell… Not to mention that instead of just sending me one email about this survey, they sent me three.
This takes us right into spam territory. Listen, if I’d wanted to participate, I would have. Show some respect for my time and attention.
The question of respect brings me to the atrocious gift card lottery that is supposed to incentivize my participation and compensate me for it.
Look, we’re (ostensibly) professional social scientists here. That makes us experts in how to compensate people fairly for participation in research. If a student of mine proposed to compensate their research participants by giving each of them a lottery ticket, I would explain that that was ridiculous. But giving out a chance to win a gift card — which is exactly the same thing as giving me a lottery ticket, from my perspective as the recipient — is somehow considered appropriate in many university and scholarly contexts.
Back when I was in grad school, for instance, this was how the student health services people tried to get me to click on their survey link:
As an expression of our appreciation for your time and input, all students who complete the survey will be entered into a random drawing with a chance to win one of the following prizes:
1st prize – (1) iPad mini, 64GB tablet with Retina display (mfsr $599)
2nd prize – (1) Kindle Paperwhite 6″ reader (mfsr $139)
3rd prize – (10) $25 gift certificates at the University Bookstore
My disciplinary association, by contrast, is considerably more frugal:
In appreciation for your time, you will have the opportunity to enter a drawing for a chance to win one of ten $25 Gift Cards.
Let’s suppose there are 10,000 members (source) — $250 total in gift cards divided by 10,000 comes out to $0.025 per member.
So basically we are getting a little message here about what our time and attention is worth: 2.5 cents is considered a fair average rate for survey-completion services.
Now they also mention that the survey should take five minutes to complete. From this, we can calculate the hourly rate that the AAA considers fair compensation for survey participation.
$0.025/5min = $0.005/min
$0.005/min * 60 min/hr = $0.30/hr
So here you have it, everybody: for our professional time and energy in contributing to the statistical data banks of our disciplinary association, we are being compensated at thirty cents per hour. That’s just slightly more than 4% of the current U.S. federal minimum wage ($7.25/hr).
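If you want to check the arithmetic yourself, here is a minimal sketch in Python. The figures are just the assumptions already stated above (ten $25 gift cards, a rough estimate of 10,000 members, a five-minute survey); none of them come from the AAA itself.

# Expected compensation for the survey, under the stated assumptions.
GIFT_CARDS = 10
CARD_VALUE = 25.00        # dollars per card
MEMBERS = 10_000          # rough membership estimate used above
SURVEY_MINUTES = 5
FEDERAL_MINIMUM = 7.25    # dollars per hour

expected_payout = GIFT_CARDS * CARD_VALUE / MEMBERS  # expected value per member
hourly_rate = expected_payout * 60 / SURVEY_MINUTES  # implied hourly wage

print(f"Expected payout per member: ${expected_payout:.3f}")  # $0.025
print(f"Implied hourly rate: ${hourly_rate:.2f}/hr")          # $0.30/hr
print(f"Share of federal minimum wage: {hourly_rate / FEDERAL_MINIMUM:.1%}")  # 4.1%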
At this point, it would be less insulting just to ask the research participants to participate gratis.
But this brings me to my real point. Governance by survey is not a satisfactory form of participatory democracy. And it’s not fair to force a group of increasingly precarious professionals to pay a large annual tax to a disciplinary association that fundamentally has no form of participatory governance.
The word for what they do is rent-seeking.
And it is precisely because my disciplinary association is a large, opaque and self-interested entity, seeking primarily to reproduce itself as an organization rather than to help its members, that it resorts to this sort of casino-consumerist substitute for participatory input. It’s bad social science and it’s bad democracy. The irony, however, is lost on the organizers.
Anyway, halfway through the volume, I find a compendium of sexist comments made to women graduate students at the University of Chicago. I thought it would be worth reproducing here, since I haven’t seen this text before and I think it’s good to have this sort of discourse out in circulation. While the general lines of this sort of sexist thought are pathetically familiar, the horror is always in the particulars.
THE HALLS OF ACADEME
The Women’s Caucus, Political Science Department, University of Chicago
Several of our professors have made these comments—some of them in jest— without realizing how damaging comments like these are to a woman’s image of herself as a scholar:
“I know you’re competent and your thesis advisor knows you’re competent. The question in our minds is are you really serious about what you’re doing.”
“The admissions committee didn’t do their job. There’s not one good-looking girl in the entering class.”
“Have you thought about journalism? I know a lot of women journalists who do very well.”
“No pretty girls ever come to talk to me.”
“Jane Jacobs’ book The Death and Life of Great American Cities is the only decent book I’ve ever read written by a woman.”
“Any girl who gets this far has got to be a kook.”
“They’ve been sending me too many women advisees. I’ve got to do something about that.”
“I hear I’m supposed to stop looking at you as a sex object.”
“We expect women who come here to be competent, good students but we don’t expect them to be brilliant or original.”
Student: “No, I wouldn’t stop teaching if I had children. I plan to work all my life.”
Professor: “But of course you’ll stop work when you have children. You’ll have to.”
Professor to student looking for a job: “You have no business looking for work with a child that age.”
Some people would say things are better now than they used to be. Well, are they?
Best,
I have always found best an oddly alienating sign-off. So alienating that I almost want to argue it should be abolished. Though as you’ll see, I’ve become ambivalent about that view too.
What is it that bothers me about Best? It’s rote, of course. It feels a bit cold to me. It’s terse. It’s a perfunctory convention, and while I don’t hate conventions per se, I hate when conventions deprive us of an occasion for some sort of human warmth. It feels to me like fake sociability.
If you read this little word literally, it raises an obvious question — best what? Best… indifference? Best customs? Best formalities? Best is a modifier, but what does it modify? Best wishes? That’s what I think it is usually supposed to mean, but best wishes is truncated, too. I suppose it means something like I’m sending you my best wishes or I wish you the best (in life?).
In South Africa, where I’m teaching, sometimes people do say Best, but there is also an alternative convention: Kind Regards. I actually like it much better than Best, because it seems to mean something more definite and much warmer, but it’s hard for me to use in practice, because it’s not part of my habits. One student told me that “kind regards is overrated anyway.”
Still, instead of writing Best, I like writing yours or take care (for friends) or some adverb expressing a mood or cheers (especially with Brits). One of my grad school teachers was famous for signing messages VBW, S, for “very best wishes, sincerely.” I used to sign off peace when I was in college, but it has long since come to feel like an affectation. Sometimes when I just sign my name, eli, that feels almost unmediated.
But honestly, I mostly sign my work email best wishes because at least it’s best something.
And if I’m even more honest, the fact is that best wishes becomes a bit rote, too.
I want to say we should abolish Best, but I begin to suspect that any valediction will become rote, and thus relatively empty, if it’s overused.
The more I think about it, the less I’m sure what would be a good norm in writing sign-offs. Perhaps there can be no good norm in the way I want, because the very reason we have norms is to save people from having to particularize and humanize their professional relationships.
Is it possible to institutionalize warmth?
It comes to mind that in any cultural system, there is a set of possible greetings and sign-offs that are organized through their differences from each other. Thus, in my part of American culture, the handshake is the standard professional greeting, and it’s distinguished primarily from waves (more informal) and hugs and kisses (more intimate), on an ascending hierarchy of warmth and intimacy. The handshake is a bit impersonal, ultimately, not because of any of its tangible qualities, but because of its symbolic difference from hugs and the other options.
It follows that what makes best impersonal is that it’s marked as colder and more institutional than yours or sincerely, and also more formal than take care, peace, etc. So if we were to put an alternative term like kind regards where best currently sits in our system of sign-offs, it would presumably become just as hollow.
The question is thus not about abolishing best but about figuring out why we need to be able to ritually signal this very minimal warmth. Why do we need to constantly exchange cold warmth with each other? What is the meaning of institutionalized sterility? It’s more than nothing, but by design, it’s not much.
If you write best a lot, what’s it doing for you? Is it just the easiest option, given the cultural environment? Or does it have some aesthetic merits I’m missing?
Maybe best is, perversely, doing something useful for me after all. It gives me a thing to very publicly not do. Does that just mean I’m inhabiting the worst form of hipster culture — in essence, depending on mainstream norms as something to reject, as if that rejection in turn conferred a tiny bit of authenticity?
Maybe. I still don’t want to sign best, though. Like someone said once on TV: “Best is the worst.”
Want to comment? Please be aware that only comments from current AAA members will be approved. AN is supported by member dues, so discussions on anthropology-news.org are moderated to ensure that current members are commenting. As with all AN content, comments reflect the views of the person who submitted the comment only. The approval of a comment to go live does not signify endorsement by AN or the AAA.
On the one hand, this only means that anthropologists who can’t afford the Association’s exorbitant annual dues are going to be further excluded from the Association’s public forums. (There are rumors that many anthropologists only pay the annual dues in years when they are attending the Annual Meetings, because otherwise membership confers few useful benefits.) I am certain that no one is going to be incentivized to join the AAA merely to write a comment on this site, which implies that the policy constitutes a harmful form of economic exclusion within the profession without any identifiable upside.
But on the other hand — and even more importantly — this commenting policy just further emphasizes the Association’s paleolithic relationship to technology (cf. their latest tech fail), and in particular their weak grasp on the culture of web publicity. Websites like AN are public spaces. There are cultural norms about how online discussions work in such spaces. It flagrantly disrespects these norms to provide public commenting facilities — as on any blog-like site — and then to deliberately reject all comments by non-dues-paying members.
To be clear: you don’t charge people cash to comment on your articles, because they are already giving you something for free by writing their comments. To comment is to contribute. To comment is to create a space of exchange where otherwise you just have a one-way transmission into the digital void. It’s fair to ask people to create accounts before commenting, to cut down on abuse, but there’s little precedent for making it into a cash transaction.
If you want to have members-only web forums, the generic convention is to hide them behind a login screen for members, instead of coupling a public comment box to an anti-public message. Thus the current policy is both hostile to the digital public and out of touch with web culture.
The actual experience of sitting on the platform was surprisingly unstructured. We were far enough from the audience that people chatted to each other a good deal, often in low voices to avoid disturbing the proceedings. Everyone was provisioned with a water bottle and a program, and arranged into three long rows of seats facing the audience, behind the higher-ups. There was an amusing hierarchy of chairs, such that the Trustees had brown wooden chairs, while the faculty had white plastic ones. Longtime attendees seemed to have strong views on where to sit, and arranged themselves in the faculty marching lineup with an eye to ending up in their preferred seat. The front row had a better view but was conversely more on display, whereas the back row was somewhat shaded from the harsh sun by a backdrop. Individuals’ seating strategies sometimes led them to depart from their place in the official lineup, which was supposed to be in rank order.
In a sign of the times, a lot of faculty people were on their smartphones during the ceremony. Most people didn’t use their phones the entire time, but did consult them at least every few minutes. If you look carefully at the above picture, the professor with the pink hood has her phone out, possibly taking a photo of the audience.
The ceremony had previously been represented to me as “the most required faculty event of the year,” and I received numerous emails over the course of the Spring semester reminding me that my presence was obligatory, informing me that absences had to be approved by the Dean, and so on. The reality was much more casual: no one seemed to keep track of who actually showed up or how they presented themselves. “Can’t you stand in a straight line?” the Dean shouted out to us mock-seriously as we stood in a ragged file to greet the students. I also noticed that many faculty opted for sun hats rather than the official academic caps.
As the phone usage suggests, not everyone was completely attentive the whole time. “Is it done?” I heard someone ask plaintively during the lengthy presentation of diplomas. At the same time, though, many faculty carefully watched the graduating students going by, often clapping enthusiastically for students they knew personally. There was an obvious social hierarchy encoded in the amount of applause from the audience — the more popular students received more cheers (sometimes supported by their fraternities or sororities). Meanwhile, some relative “outcasts” (often male) received practically no applause.
Where I went to college, the graduating class was far too large to hand out diplomas individually during the ceremony; instead the diplomas were handed out separately at small per-department functions. I must say that handing out hundreds of diplomas individually in front of a huge crowd is an inefficient process, in spite of the obvious efforts at efficiency by the diploma presenters. The graduates were arranged in a queue at stage right and, when their names were called out by the Dean, they walked across the stage, were handed their diploma, shook hands with the President, and exited stage left. A crew of senior faculty was responsible for making sure the diplomas were handed out in the correct order, and they did their task with nearly machinic efficiency. (The process presumed that the graduates had been correctly lined up in a pre-given order, matching the order of diplomas in piles.)
It struck me, watching the process, that the distribution of diplomas was a perfect icon of “education as pipeline.” Everyone has their place in the queue; they all go through the same physical motions; they all end up with the same ritual result (being socially recognized as graduates).
Yet within this visual representation of the education pipeline, there was also this odd, evanescent moment of individuality. You get to have your name read out ritually to a huge audience. Some of the students strutted or struck poses for the audience. Others tripped or dropped their hats as they traversed the stage. This added a minor degree of drama.
Meanwhile, the faculty on stage followed along by reading the alphabetical list of graduates in the program. Alas, a typo appeared over and over at the bottom of almost every page of the program, under the graduates’ names. “Academic Disctinction” somehow slipped past the copyeditor.
This didn’t surprise me too much, because only a few days earlier, a sad email from the campus bookstore had apologized profusely for having stocked a t-shirt where “Whittier” was misspelled. At least one faculty person at graduation pointed out that this shirt would probably have become a collectors’ item if they had decided to keep selling it.
There was in any event something poignant about having the graduation become, in effect, the “firing hall” for faculty whose contracts are up. One normally thinks of graduation as being about the departures of the students, but at places like Whittier where there are plenty of temporary faculty, it’s also a scene of the departure of precarious faculty. (Adjuncts aren’t required to attend, but visiting faculty and postdocs — everyone technically on the “faculty” list — do have to.) “It’s a lot of people’s last day,” one tenured faculty member told me. Behind every ritual, there’s a labor politics.
As I left the ceremony, I decided to walk home through the nature trail that goes uphill from the Whittier football stadium. The parking lot rapidly gives way to dust; you can see the dots of the crowd through the fence at right.
Even from among the leaves, you could still hear the brass band playing exit music.
I’ve long had a memory of having seen this complaint crop up in earlier decades, and I just stumbled back across its source in a 1969 paper by Donald Campbell (in which he critiques the “ethnocentrism of disciplines” and advocates a “fish scale model of omniscience,” but that’s another story). Here’s Campbell critiquing the “scholarly ego ideal”:
While on the theme of recreational reading and the duplication of fish scales, it seems appropriate to deplore the tendency of social scientists to feel that they all should read current newspapers, particularly the New York Times. Certainly the collective perspective would be better if most spent the equivalent time with newspapers of other epochs or with historical, anthropological, archaeological, or literary descriptions of quite other samples of social milieus. Rather than the ego ideal of keeping up with the current worldwide social developments, the young scholar should hold the ideal of foregoing current informedness for some infrequently sampled descriptive recreational literature. Too often our ego-ideals settle for uniform omniscience, knowledge of both past and present, of both here and there, and too often we settle for the same pattern of compromise all our colleagues are settling for. Compromise from the Leonardesque aspiration there must be, but even in leisure reading, one can hold as ideal the achieving of unique compromises.
The motivations for academic work are similarly supposed to be other than pecuniary. One is supposed to work for existential reasons, or out of commitments to higher values that go beyond the purely economic — the “pursuit of knowledge” in some quarters, the dedication to making citizens or producing social justice in others. Yet it’s no criticism of these values to observe, as many have already observed, that these higher values can become alibis for an amplified self-exploitation. “You’re doing it out of personal commitment,” they tell you as you donate your weekend to the institution.
A strange moment in this process, though, comes when colleges and universities beg their own employees for charitable donations.
Thus I’ve been surprised to receive email and paper mail requests numerous times per year from my current employer, Whittier College, originating in their Office of Advancement. As the illustration for this post shows, they even emailed me before the end of 2016 to suggest that “Charitable giving might help reduce your income tax bill.” But the only reason I have a tax bill is that they themselves are paying me a salary. So if I gave them a donation, that would … essentially be returning a portion of my salary to my employer.
Which amounts to asking me to work for free, or anyway for less, as if, again, academic work wasn’t actually something you do for a living. (I say “for a living” and not “for the money” to signal that what motivates me is the practical survival of our household, rather than money for its own sake. For people motivated by the latter goal, academia is obviously an inefficient route.) In any event, this seems a strange message to send to one’s employees. The same thing used to happen when I worked at the University of Chicago, so it isn’t just Whittier College in question; but in that case at least I was actually an alumnus.
I would recommend to academic employers that they at least ask their employees to opt in to the list of prospective donors, rather than handing their names to Institutional Advancement merely because they happen to be employed.
First she comments on the poverty wages and immense structural sexism that characterize the post-PhD situation:
… a whole generation of junior academics is exposed to an ever growing casualization of labor. In Ireland alone, as a study of the collective Third Level Workplace Watch shows, a growing number of casual academics win on average 10 000 € annual income for an average of eight and a half years after finishing their PhD. In 63% of the cases this income is generated by hourly paid work, done in 62% of the cases by women. In Ireland again, a recent study by the Higher Academic Authority has shown that men still get 70% of all permanent academic positions in all seven universities in the country. The situation is similar in other countries where despite the fact that women make for the majority of completed PhD dissertations, the ratio of employment is still at their detriment. Women are particularly exposed to vulnerability with less access to permanent positions, and more emotional labor and care-giving functions both in and out of the academy. While those who have children feel losing the academic game because of the domestic burden of care in ever decreasing welfare conditions, those who do not have children feel deprived of private life due to growing imperative to do replacement teaching and administrative work.
Ivancheva subsequently remarks on the increasingly cruel norm of labor mobility that precarity and underemployment impose:
Beyond national trends, a growing “internationalization” (i.e. transnational flexibilization) of academic work makes it a difficult subject of both research and organized resistance. To stay in the academic game after finishing a PhD in an English-language research institution, one is usually required to put up with flexibility and recurrent migration. Those who get to do a post-doc or get a full-time fixed-contract teaching position are usually pressed to find time outside of work to turn their PhD into publications. The shorter the contract, the higher the probability that they return unprepared to the ever more competitive job market.
Finally, she succinctly notes the costs of this mobility norm:
On the road of celebrated “internationalization,” many are pressed to curtail their previous social and professional networks and to change countries every few months or years, if lucky. Many suffer loneliness and depression, while others have to take on the responsibility of moving their whole families along or commuting across regional or national borders to make ends meet. The others, who – out of choice, or often out of necessity – opt out of the game of transnational mobility, fall easily into the trap of zero-hour teaching and precarious research arrangements in order to stay afloat. Both groups are dependent on local or international clan-like arrangements of loyalty and hierarchy. While university administrations outnumber academic faculty, academics do an ever-growing amount of administrative work of (self-)evaluation to fit the demands of the ‘global knowledge economy’. Individualized contractual arrangements and access to benefits and resources encourage cruel competition among colleagues and friends, and break down solidarity.
In particular, her emphasis on “cruel competition among friends” fits my own disciplinary milieu (U.S. cultural anthropology) all too exactly. The question of how to reinvent professional solidarity among people forced to compete with each other for scarce jobs remains, in my opinion, one of the major challenges facing academic labor organizers.
Viz:
I have to say, not having done much mainstream disciplinary publishing before, I found myself agreeing with the received wisdom that scholarly publishing is a tremendously long process. The first paper went through at least eleven drafts and two journals. For the second paper, which has some nifty animated diagrams, I had something like sixty email exchanges over the past six months with the journal staff who organized and realized the animations. Not all these steps were time-intensive, but cumulatively they added up to quite a bit of work.
One of the inevitable results of the slow publishing process is that some of the work is born dated. For example, one of my claims in the paper on precarity is that a lot of anti-precarity organizing isn’t actually by precarious academic staff themselves, but is rather handled by a set of union delegates who themselves are not precarious. I also suggested that precarious academics tend to avoid identifying personally as precarious. If I were writing the paper this year, I might have changed those claims a bit, because a new “Collective of precarious workers of Higher Education and Research” emerged in France last spring. It seems to be getting a lot of the attention that the traditional union apparatus used to get, and it does speak more in the first person (albeit plural, not singular).
As for the other paper, it turns out that I slipped in the unwarranted assumption that Sarkozy’s presidency was strictly a thing of the past:
The Ronde had initially been launched by French activist academics in March 2009, during Nicolas Sarkozy’s five-year term as president of the French Republic…
Now that Sarkozy is running for President again, I may live to regret that assumption as well. History undoes academic knowledge so rapidly, one might say. It’s hard to know how to narrate the past if you don’t know the future.
In 1984 I began full-time teaching in a tenure-track position at a small college in Ohio. One day, walking across campus with one of the most senior members of the faculty, I was discussing with him some classroom difficulty we were both having. He shook his head in resignation and said something I have heard faculty all over the world say so often, as though it explains everything, “Well, you know, most of our students come from working-class backgrounds.”
This time, for the first time, I did not stand there in shamed silence. Although it was not my most articulate moment, I said, “So what, Richard? So do I!”
He stopped walking as he threw back his head and laughed. Then threw his arm around me and said, “So do I, Diane. So do I.” I don’t know what that moment meant to Richard, but for me, that moment meant that I was able to say that being working class is not an excuse or a sorrow or a shame. It happens to be where I come from.
There are two kinds of social difference that come into contact here like a short circuit: the teacher vs. the student, and the self-that-one-is vs. the self-that-one-was. The premise of this moment — two teachers talking about their classroom problems — is that to be a teacher, one has to objectify one’s students. But then it becomes obvious — at least in this story, which is why it’s even a story — that this kind of objectification depends on a folk sociology. “Well, our students are from XYZ backgrounds…”: there’s a horrible potential there to slip over the line that separates benign objectification from outright essentialism.
But this time, when that line gets crossed, the narrator can’t prevent herself from letting her own social identity come out in protest against the institutional hierarchy that usually precludes teacherly identification with the student masses. And there’s joy and laughter in that moment of deconstructed hierarchy.
I would still observe, though, that one readily stops being working-class if one becomes a tenure-track college teacher. Class origins aren’t everything; they aren’t necessarily identical to class destinations. Which is why Kendig can apprehend her own social origins as something deeply rooted within her, but also as something that has become external to her and thus a bit uncanny.
My name is ——. I am the Lucy Adams Leffingwell Professor in the Department of Anthropology at Case Western University. I am also a lifetime member of the American Anthropological Association and President-elect of the Society for Psychological Anthropology. I am writing to ask that you vote against the boycott of Israeli universities.
My name is Dale Eickelman, the Lazarus Professor of Anthropology and Human Relations at Dartmouth College…
I am Paul Rabinow, Professor of Anthropology at the University of California at Berkeley. I write to urge you to watch this important new video where anthropologists who know something about the matter demonstrate how an academic boycott is ultimately personal.
I am Ulf Hannerz, Professor Emeritus of Social Anthropology at Stockholm University, Sweden. I have been a member of the American Anthropological Association since the 1960s, and I am a former member of its Committee on World Anthropologies. I have voted against the boycott resolution.
My name is Myra Bluebond-Langner. I am a medical anthropologist currently at the Institute of Child Health, University College London where I hold the True Colours Chair in Palliative Care for Children and Young People as well as Board of Governors Professor of Anthropology Emerita at Rutgers University. I am a long-term member of the American Anthropological Association and a recipient of the Margaret Mead Award from the AAA and the Society for Applied Anthropology. I am writing to urge you to vote against boycotting Israeli universities in the AAA’s spring ballot.
I am Tanya Marie Luhrmann, Professor of Anthropology at Stanford University, a member of the AAA for over thirty years. I write to urge you to vote NO on the proposal to boycott Israeli universities in this year’s AAA spring ballot.
I am Michele Rivkin-Fish from the University of North Carolina Chapel Hill. I am writing to urge you to vote NO to boycotting Israeli universities in the AAA’s ballot this month.
These are all the opening lines of the anti-boycott emails. I have to say I’m struck — amazed, really — by the massive recourse to institutional affiliations, titles and credentials. It is as if the most important task for these authors was to establish their own power, as if that in itself conferred authority. None of these people are untenured; none of them are unemployed; none of them are adjuncts; none of them are working-class; all of them are privileged; and we’re meant to know and value that as we imbibe their prose. It’s like a parade of academic capital that you hadn’t planned on watching go by.
One particular slippage that I find interesting is the quite direct equation of the person with their title. “I am XYZ,” not “I work at XYZ.” I find that particularly pernicious, as there is nothing more antithetical to the spirit of democratic inquiry than identifying speech with the institutional trappings of its producers. And yet it turns out that the anti-boycott group has an explicit rationalization of this equation. They note in “ten reasons to vote against the boycott” that
Badges we wear at conferences, by-lines at the top of journal articles, resumes and terms we use to introduce each other all consist of names attached to titles and affiliations – institutional idioms that define who and what we are.
But is it really the titles and affiliations that define who and what we are? It’s an idea fit for a feudalism re-enactment camp, but apparently for this group of academics, the thought can somehow be defended non-ironically. Do they not realize that this proposition amounts to saying that unemployed scholars are nothing? And that their recourse to their own titles tends to make their whole discourse nothing but an argument from authority?
Davydd is best known for participatory action research, so naturally we wanted to devote half of the panel time to working collaboratively with the audience. We planned to ask them questions like these:
We were also hoping that, after sharing our presentations with the audience, we could engage them in trying to collectively generate new questions, new research agendas, and new strategies for re-creating universities. That didn’t entirely work out. What happened instead was experience-sharing – the crowd was small enough that everyone could take a turn at describing their own institutional circumstances and dilemmas. This turned up a wide range of situations and participants: everyone from graduate student unionizers and undergraduates to junior and senior faculty. Correspondingly, the participants shared a wide range of strategies for intervening in their institutions: everything from open-source publishing advocacy to arguing over budgets to militant faculty committee politics. (I did notice, incidentally, that graduate students were under-represented in the audience compared to the conference public in general; I’m not quite sure why.)
But I was left wondering more than ever about the risks of specialization in anthropological work on universities. We — the panelists — were there in a double role: as facilitators of collaboration, but also as experts on the panel topic. I came away feeling that these two roles were more in conflict than one might hope. One might even say, abstractly, that the usual logic of expertise is at odds with the logic of collaboration – “expertise” seems to name a vertical relationship (the most knowledgeable people are privileged) while collaboration hopes to be more horizontal (everyone brings something to the table). Of course, this doesn’t have to be a problem in practice; there are ways of using expertise more collaboratively, less hierarchically. Still, I came away from our panel thinking that we were overly optimistic in asking our audience to instantly talk like higher education researchers, suggesting new research agendas. We said we were hoping for collaboration, but I think we may have wanted something more. Perhaps we had hoped that they would act as our peers.
There’s a logistical element here that’s worth noting. Collaboration, as Davydd reminded me, takes a large investment of time and energy, and it was probably unrealistic to expect a new form of interaction to emerge in the space of 90 minutes. The traditional logic of expertise, by contrast, is a script that academics are automatically trained to practice. In a traditional conference panel, you don’t need to spend time figuring out how to interact, because you come in with a set of expectations and habits that predispose you to participate in the usual logic of deferring to expert knowledge. It’s efficient not to challenge these habits.
In the context of doing critical research on university institutions, this is worrisome. Surely the point of a critical anthropology of universities is to help improve these institutions by working broadly with their denizens. In that context, it would be a terrible irony if we ended up producing narrow expert knowledge in the usual disciplinary mode. (Or at least, if this were mainly what we produced.)
There is a lot of precedent in higher education for “critical” fields like feminism being defanged by walling them off into their own disciplinary space. Is this inevitable for “critical university studies” as well? The recent invention of the “critical university studies” label seems to point in that direction.
If we’re not to end up there, I think we may need to work harder at figuring out how to interrupt the usual logics of expertise even when they help bring attention to our work.
I previously argued that if academic overproduction is in many ways market-like, we might want to push for a better-regulated market in knowledge. I suggested that this could be a complementary strategy to the usual denunciations of market forms in academic life. There is nothing the matter with critiques of market forms, I will stress again; but for all that, they need not be the end point of our thinking.
Continuing that line of thought, I’m wondering whether mass overproduction of academic knowledge may not have some unexpected effects. Its most obvious effect, of course, is the massive amount of “waste knowledge” it generates — the papers that are never read (or barely), citation for its own sake, prolixity for institutional or career reasons, pressures to publish half-finished or mediocre work, etc. All of these are the seemingly “bad” effects of mass overproduction.
But does mass overproduction have any clearly good effects? I like to imagine that one day, machine learning will advance to the point where all the unread scholarly papers of the early 21st century will become accessible to new syntheses, new forms of searching, and so on. We don’t know how our unread work might be used in the future; perhaps it will be a useful archive for someone.
More immediately, I’m also wondering if mass overproduction is creating new forms of self-consciousness in the present. In Anglophone cultural anthropology, it seems to me that mass overproduction is forcing us to constantly ask “what is at stake here?” Older scholarship seldom needed to ask itself that question, as far as I can tell, and certainly not routinely, with every article published. It became common, somewhere along the way, to ask, “so what?”
As one crude measure of this, I checked how often the literal phrase “what is at stake” co-appeared with “anthropology” in works indexed by Google Scholar, divided up by decade (1951-2010). What I found is that this exact phrase occurred last decade in 14,600 out of 853,000 scholarly works in anthropology. (Or at least in works matching the keyword “anthropology.”) This comes to 1.71% of anthropological scholarship published last decade. Obviously, 1.71% is not a large percentage, but what’s important as a barometer of tendencies in the field is that the percentage has risen considerably since the 1950s. Back in 1951-1960, only 35 publications mentioned “what is at stake” (0.2% of the 17,300 works published that decade).
Here’s the data:
Decade               Anthro hits   Hits incl. “what is at stake”   Percent incl. “at stake”
1951-1960                 17,300                              35                      0.20%
1961-1970                 37,700                             160                      0.42%
1971-1980                 89,900                             480                      0.53%
1981-1990                198,000                           1,480                      0.75%
1991-2000                609,000                           5,860                      0.96%
2001-2010                853,000                          14,600                      1.71%
Growth since 1950s         49.3x                          417.1x                       8.5x
Put another way, there was 49 times more anthropology published in 2001-2010 than in 1951-1960, but the expression “what is at stake” was used 417 times more often in 2001-2010 than in 1951-1960, thereby growing a bit more than 8 times as fast as the field in general. Google Scholar’s crude keyword search is too imprecise to measure how much work actually discusses what is at stake one way or another, but I expect that a more sophisticated linguistic analysis would show similar patterns over time.
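For anyone who wants to check the arithmetic, here is a minimal sketch in Python, with the hit counts hard-coded from the table above. (The counts themselves are Google Scholar’s estimates, with all the imprecision that implies; nothing here queries Scholar directly.)

```python
# Reproduce the percentages and growth factors in the table above
# from the raw hit counts reported in this post.

counts = {
    # decade: (hits for "anthropology", hits also incl. "what is at stake")
    "1951-1960": (17_300, 35),
    "1961-1970": (37_700, 160),
    "1971-1980": (89_900, 480),
    "1981-1990": (198_000, 1_480),
    "1991-2000": (609_000, 5_860),
    "2001-2010": (853_000, 14_600),
}

for decade, (total, stakes) in counts.items():
    print(f"{decade}: {stakes / total:.2%} mention 'what is at stake'")

base_total, base_stakes = counts["1951-1960"]
last_total, last_stakes = counts["2001-2010"]
print(f"Field growth since the 1950s:  {last_total / base_total:.1f}x")    # 49.3x
print(f"Phrase growth since the 1950s: {last_stakes / base_stakes:.1f}x")  # 417.1x
print(f"Growth of the percentage:      "
      f"{(last_stakes / last_total) / (base_stakes / base_total):.1f}x")   # ~8.5x
```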
So. Let’s say it’s true that cultural anthropologists now talk about “what is at stake” much more than they used to. The standard explanation for this is basically cultural and political. Cultural anthropologists are just much more self-conscious than they used to be, or so the story goes. They’re attuned to the politics of their representations. They’ve had to ask themselves about the relationship between their theories and colonial regimes. They no longer write under the assumption that producing objective knowledge is possible or even desirable. That’s what many of my colleagues would say, I think.
There’s plenty of truth there. But I wonder whether the sheer fact of overproduction – the massive flood of publications, the massive pressure to publish, the fact that we are not just a small village where everyone knows each other – may not also contribute to a sort of routinization of existential crisis. After all, if we are in a massive market of knowledge and attention that’s driven by the pressure to constantly produce, it stands to reason that the value of our product is constantly under scrutiny. I think that that’s partly what the “stakes” question reveals: an assumption that, until proven otherwise, our epistemic product has no value.
On some level, it is of course ridiculous to constantly have to prove that something major is at stake in every article, because when one is in a system of mass production, it is illogical to demand that the mass-produced part be singular, or even distinctly valuable. On the bright side, this massive existential focus on “the stakes” does help puncture an older generation’s dogma that scholarship is intrinsically virtuous. Existential self-doubt is a healthy thing, in some measure.
The downside, though, is that this focus on the stakes can oblige us to constantly exaggerate the value of our work — if only in order to get published and to attract readers. When everyone has to declare the great stakes of their scholarly products, this opens up a vast new space for self-promotional hyperbole. One might conclude, then, that mass overproduction can produce new forms of existential self-consciousness and self-scrutiny; but ironically, this existential awareness can itself readily become a new self-marketing opportunity.
In any event, it is a curious document. Here it is as of March 2016; I’ll highlight a few important passages.
Disclaimer and Waiver
As a condition of my participation in this meeting or event, I hereby waive any claim I may have against the American Anthropological Association (AAA) and its officers, directors, employees, or agents, or against the presenters or speakers, for reliance on any information presented and release AAA from and against any and all liability for damage or injury that may arise from my participation or attendance at the program. I further understand and agree that all property rights in the material presented, including common law copyright, are expressly reserved to the presenter or speaker or to AAA.
I acknowledge that participation in AAA events and activities brings some risk and I do hereby assume responsibility for my own well-being. If another individual participates in my place per AAA transfer policy, the new registrant agrees to this disclaimer and waiver by default of transfer.
AAA intends to take photographs and video of this event for use in AAA news and promotional material, in print, electronic and other media, including the AAA website. By participating in this event, I grant AAA the right to use any image, photograph, voice or likeness, without limitation, in its promotional materials and publicity efforts without compensation. All media become the property of AAA. Media may be displayed, distributed or used by AAA for any purpose.
By registering for this event, I agree to the collection, use, and disclosure of contact and demographic information. This information includes any information that identifies me personally (e.g. name, address, email address, phone number, etc.). AAA will use this information to: (a) enable your event registration; (b) review, evaluate and administer scholarships or other AAA initiatives; (c) market AAA opportunities you may potentially be interested in; and to (d) share limited information (e.g. title, company, address and demographic information) with third parties that perform services on behalf of AAA. AAA does not distribute email address or phone numbers to third parties or partners performing services on behalf of AAA. AAA may use this information for so long as AAA remains active in conducting any of the above purposes.
The bold points all seem to raise some questions:
These concerns seem in turn to raise some obvious (meta)procedural concerns. Who wrote this document? Who enforces it? How is it used in practice? Who reviewed it and signed off on it? I would be interested in knowing if any other scholarly associations have similar protocols, and if so, what their history is.
My father has a disconcerting habit, especially for people who don’t know him, of pointing to things with his right pinky. Why it’s disconcerting is that his pinky is only a stub. Its top half was sheared off on a conveyor belt while he was working in a feed mill that supplied the many duck farms then dotting a good part of Long Island east of Queens. As a teenager in the early 40s, he loaded eighty-pound bags of feed, then after coming back from driving a half-track across Africa, Italy, and Germany, he forewent the GI Bill to drive a tractor trailer delivering those eighty-pound bags to the duck farmers. While Long Island metamorphosed from farms to suburbs, he took a job as a dispatcher—as he puts it, telling the truckers where to go—at a cement plant that flourished with all the building.
When I was an undergraduate at Stony Brook, founded with the sluiceway of postwar money to universities and serving the people in the new suburbs, I would sometimes show up at the office hours of a well-known Renaissance scholar and Shakespeare critic. He was born the same year as my father and also served in World War II, but after the war signed on for the GI Bill to get through the University of Chicago. He always seemed surprised to see someone appear at his door; he was tough-minded, with a neo-Aristotelian, analytical edge common to Chicagoans of his generation, which put some students of my generation off, but I saw the gleam of ironic humor underneath, plus I liked the challenge. He would typically fuss with his pipe (this was when professors still really smoked pipes, and in their offices) while we were talking. One afternoon in his office, watching him light his pipe, I remember noticing that his fingernails were remarkably long, and polished to a low gloss.
If you’ve ever done what used to be called manual labor for any extended period of time, you’ll know it’s hell on your hands. Or if you’ve ever read Life in the Iron Mills, you’ll realize that class is not just a question of what money you have or don’t have, nor solely a question of status conferred by cultural capital, but that it marks your body. If you look at most fellow academics’ hands, you’ll rarely see calluses.
But personally, if I look at most academics’ hands nowadays, ten years after Williams was writing, I mainly see the capacity to feel pain. The whole image of ourselves as “mental laborers” all too easily leads us to undertheorize the fact that our work process consists largely of interacting with machines. We are professional machine operators, even if we don’t think of it that way, because our work process is computerized: we operate computers for a living. That’s not the only thing academics do, to be sure, but it takes up a great deal of our time, as reading, writing, research, grading, and communicating all get redirected into digital formats.
And these machines can readily damage our bodies. Particularly our hands.
For instance, one former graduate student writes:
By the end of my time in grad school, my wrists were in agony, and my left pinky finger was simultaneously strained, pained and numb.
Even an hour of typing per day would lead to aches and pains (up through my forearm) that lasted through the night.
A psychologist writes:
So it appears our geeky heroine may be showing some early signs of Repetitive Strain Injury (RSI). Damn those long hours of coding physio data and its necessitation of bizarre hand movements!!
The critical geographer Jeanne Kay Guelke wrote in her 2003 article “Road-kill on the information highway: repetitive stress injury in the academy”:
I experienced tingling, numbness and hot-and-cold sensations in my hands, together with pain and muscle spasms in my lower arms and wrists.
[…] Rather than finding the ‘disembodied’ freedom-producing, cyborg-like identity in computing as envisioned by some futurists, word-processing had made my own body all too tangible and limiting.
Guelke also mentions that “96.8% of a sample of students interviewed at San Francisco State University reported some physical discomfort associated with computer use” (392).
In fact, almost twenty years ago, the Washington Post was already commenting on the physical toll computers take on students’ hands:
Pax-Shipley, who graduates this month, has had to face the possibility that she may never completely regain the use of her hands. Accepting that fact, she said, “was hard at first. It was a long grieving process.”
Guelke offers a useful disability studies analysis of the whole topic, as well as a number of important political questions. Why, she asks, are we not demanding that computer manufacturers produce gadgets that are friendlier to our bodies? Why do we not have ergonomics clauses in our contracts or as objects of collective bargaining? How do gender and occupational hierarchy enter the picture? Needless to say, there are politics whenever there is risk of workplace injury, as there should be.
But I think my point is more rudimentary. Academics’ hands are fragile. They are a zone of vulnerability. We shouldn’t read them simply by their external signs (whether the nails are polished, whether we see calluses). Rather, we should become attuned to the hidden injuries of digital labor that can get sent out through their very nerves. We should see the hands not as a sign of privilege but as an object of restraint, the part of our body shackling us to the machinery of our work. Far from being signs of our agency and capacity to act in the world, as they are conventionally construed, academic hands on the keyboard strike me as one of the major instruments coupling us to our institutions.
I’ve gotten interested in programming as a stock of useful metaphors for thinking about intellectual labor. Here I want to think about scholarly reading in terms of what programmers call caching. Never heard of caching? Here’s what Wikipedia says:
In computing, a cache is a component that stores data so future requests for that data can be served faster; the data stored in a cache might be the result of an earlier computation, or the duplicate of data stored elsewhere. A cache hit occurs when the requested data can be found in a cache, while a cache miss occurs when it cannot. Cache hits are served by reading data from the cache, which is faster than recomputing a result or reading from a slower data store; thus, the more requests can be served from the cache, the faster the system performs.
Basically the idea is that, if you need information about X, and it is time-consuming to get that information, then it makes more sense to look up X once and then keep the results nearby for future use. That way, if you refer to X over and over, you don’t waste time retrieving it again and again. You just look up X in your cache; the cache is designed to be quick to access.
Caching – like pretty much everything that programmers do – is a tradeoff. You gain one thing, you lose something else. Typically, with a cache, you save time, but you take up more space in memory, because the cached data has to get stored someplace. For example, in my former programming job, we used to keep a cache of campus directory data. Instead of having to query a central server for our users’ names and email addresses, we would just request all the data we needed every night, around 2am, and keep it on hand for 24 hours. That used up some space on our servers but made our systems run much faster.
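In code, that nightly directory cache looked roughly like the following sketch. The names are hypothetical stand-ins, not our actual system; fetch_from_directory_server() is a placeholder for the real, slow query against the central server.

```python
import time

# A sketch of a time-limited cache: look up a user's directory record once,
# keep it for 24 hours, and only go back to the (slow) central server when
# the cached copy has expired.

CACHE_TTL = 24 * 60 * 60   # keep cached entries for 24 hours (in seconds)
_cache = {}                # username -> (time cached, record)

def fetch_from_directory_server(username):
    # Placeholder for the expensive lookup we are trying to avoid repeating.
    time.sleep(2)
    return {"name": username, "email": f"{username}@example.edu"}

def get_user(username):
    now = time.time()
    if username in _cache:
        cached_at, record = _cache[username]
        if now - cached_at < CACHE_TTL:
            return record                           # cache hit: fast path
    record = fetch_from_directory_server(username)  # cache miss: slow path
    _cache[username] = (now, record)
    return record
```

The tradeoff is visible in the _cache dict itself: every record kept on hand costs memory. That is exactly the time-for-space trade described above.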
One day, I had a thought: scholarly reading is really just a form of caching. When you read, in essence, you are caching a representation of some text in your head. Maybe your cache focuses on the main argument; maybe it focuses on the methodology; maybe on the examples or evidence. In any event, though, what you stick in your memory is always a provisional representation of whatever the original document says. If you are not sure whether your representation is accurate, you can consult the original, but consulting your memory is much faster.
I should probably issue a disclaimer here. I’m intentionally leaving aside a lot of other things about reading in order to make my point. Of course, academic reading isn’t only caching. Reading can be a form of pleasure, a form of experience valuable in itself; it can be a process of imaginary argument, or a way of training your brain to absorb scholarly ideas (which is why graduate students do a lot of it), or a way of forming a more general representation of an academic field. All of that is, of course, valuable and important. But I find that, after you spend long enough in academia, you don’t need to have imaginary arguments with every journal article; you don’t need to love the experience of reading; and you don’t need to constantly remind yourself about the overall shape of your field. Often, you need to read only a relatively well-defined set of things that are directly relevant to your own immediate research.
The analogy between reading and caching becomes important, in any event, when you start to ask yourself a question that haunts lots of graduate students: what should I read? I used to go around feeling terribly guilty that there were dozens, or probably hundreds, of books in my field that I should, theoretically, have been reading. I bought lots of these books, but honestly, I mostly never got around to reading them. That wasn’t because I don’t like reading; I do. It’s because reading (especially when done carefully) is very time-consuming, and time is in horribly short supply for most academics, precarious or not.
Now if we think about reading as a form of caching, we begin to realize that it might be pretty pointless to prematurely cache data that we may never use. For that’s what it is to read books pre-emptively, out of a general sense of moral obligation — you’re essentially caching scholarly knowledge whether or not it has any immediate use-value. To be sure, up to a point, it’s good to read just to get a sense of your field. But there is so much scholarship now that no one human being can, in effect, cache it all in their brain. It’s just not possible to have comprehensive knowledge of a field anymore.
I find this a comforting thought. Once you drop comprehensive knowledge as an impossible academic ideal, you can replace it with something better: knowing how to look things up. In other words, you do need to know how to go find the right knowledge when you need it. If you’re writing about political protests, you need to cache some of the recent literature on protests in your brain. But you don’t need to do this years in advance. You can just do this as part of the writing process.
That’s a rather instrumentalist view of reading, I know, and I don’t always follow it. I do read things sometimes purely because they seem fascinating, or because my friends wrote them, or whatever. But these days, given the time pressures affecting every part of an academic career, we ought to know how to be efficient when that’s appropriate. So: have a caching strategy, and try not to cache scholarly knowledge prematurely.
JESSICA MARIE FALCONE
Kansas State University
Anthropology Program
204 Waters Hall
Manhattan, KS 66502
Here’s another example, from Bonnie Urciuoli’s paper on neoliberal workplace language:
Bonnie Urciuoli
Department of Anthropology
Hamilton College
Clinton, NY 13323
burciuol@hamilton.edu
To be sure, there are good reasons for this information to be available. If you want to ask the author a question, it helps to know their contact information. If you want to get a sense of which universities are supporting certain research topics, it helps to know where a given scholar is working. Or even, if you are trying to do meta-research on academic prestige and hierarchy, it’s pretty handy to be able to see who gets represented and who doesn’t, or maybe to get a really crude measure of gender and racial representation based on the scholars’ names (which inevitably encode certain social characteristics).
That, then, is the case for listing affiliation. But I think there is a stronger case that we should stop listing affiliations in journal articles.
In brief: the naming of affiliation is also the creation of stigma. What kind of stigma, you ask? The stigma of precarious employment. The stigma of being out of work, “unaffiliated.” The stigma of career ambiguity. The stigma of not having an affiliation to put in this box.
You really notice the problems of affiliation if you graduate with a Ph.D., for instance, find a job in some other field, but still want to publish an article. Take my former job working in campus IT. Is a job in campus IT a plausible affiliation? I don’t think so: most employers require that you don’t use your job title for non-job-related purposes. What if your employer doesn’t want to be associated with your findings? Wouldn’t you need to show them what you were publishing beforehand? Whatever you might say about academic freedom, there’s less of it for non-academics.
For a year after I got my doctorate, I just kept listing my graduate department instead of my actual job whenever someone asked me for a scholarly affiliation. It beat writing “independent scholar.”
Underneath the current system of declaring one’s affiliations, there’s an assumption that one’s scholarly identity is equatable with one’s job, with one’s institutional belonging, and with one’s paycheck. I think that as global academia gets increasingly precarious, these things are all getting unbundled. You might not get your paycheck from being a scholar. You might have an institutional affiliation that’s partial, that’s barely declarable. You might be broke and unemployed but need to publish in hopes of getting a job so as to get less broke. All of these conditions are ill-served by the affiliation metadata that journals are requiring.
I think journals should abolish it. These days, you don’t need to publish your academic department and campus address to be contactable; we have Google and academia.edu if we want to find someone’s CV. Publishing an email address is a sufficient form of contact information.
I think it may make sense to still collect metadata about the employment status of scholars who publish in journals, so that it will still be available for meta-analysis. But it doesn’t need to be published with the article. In my modest opinion.
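For what it’s worth, here is a hypothetical sketch of what such unbundled metadata could look like. The schema and field names are invented for illustration; no journal I know of uses them.

```python
# A hypothetical article-metadata record: contact information is published,
# while employment status is collected for meta-research but flagged so it
# never appears with the article. All field names here are invented.

article_metadata = {
    "author": "J. Doe",
    "contact_email": "jdoe@example.org",   # published with the article
    "employment_status": {                 # collected, never published
        "published": False,
        "status": "employed outside academia",
    },
}
```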
If one description of scholarly activity is “producing knowledge,” then logically, wouldn’t we expect that there would be such a thing as “overproducing knowledge”? Can there be an overproduction crisis of scholarship?
It’s been said before. For instance, here’s Tim Burke from Swarthmore writing in 2005:
The drive to scholarly overproduction which now reaches even the least selective institutions and touches every corner and niche of academia is a key underlying source of the degradation of the entire scholarly enterprise. It produces repetition. It encourages obscurantism. It generates knowledge that has no declared purpose or passion behind it, not even the purpose of anti-purpose, of knowledge undertaken for knowledge’s sake. It fills the academic day with a tremendous excess of peer review and distractions. It makes it increasingly hard to know anything, because to increase one’s knowledge requires ever more demanding heuristics for ignoring the tremendous outflow of material from the academy. It forces overspecialization as a strategy for controlling the domains to which one is responsible as a scholar and teacher.
You can’t blame anyone in particular for this. Everyone is doing the simple thing, the required thing, when they publish the same chapter from an upcoming manuscript in six different journals, when they go out on the conference circuit, when they churn out iterations of the same project in five different manuscripts over ten years. None of that takes conscious effort: it’s just being swept along by an irresistible tide. It’s the result of a rigged market: it’s as if some gigantic institutional machinery has placed an order for scholarship by the truckload regardless of whether it’s wanted or needed. It’s like the world’s worst Five-Year Plan ever: a mountain of gaskets without any machines to place them in.
But this isn’t exactly the kind of overproduction that I’m talking about. This is what I would call herd or mass overproduction, a sort of overproduction that “ordinary academics” produce as a matter of survival in a scholarly system that incentivizes publication quantity. (By “ordinary academics” I mean the ones who have jobs, or want jobs, of the kind where publishing is required — which isn’t all academic jobs, by a long shot.)
Herd overproduction, on Burke’s view, is a generic state of being, a thing “everyone” is doing. But what I’m thinking about is a different kind of overproduction: let’s call it elite or star overproduction. That is, the kind of overproduction that academostars often practice. To be clear, not all recognized academostars overproduce in quite the way I mean, and conversely, some singularly prolific academics are not necessarily at the very top of the scholarly prestige hierarchy — but there is a strong correlation between hyper-prolific writing and star status, in my experience.
Star overproduction does something different from merely adding to the mass of expertise. If herd overproduction produces relatively generic, interchangeable, unremarkable research commodities, then star overproduction reinforces big-name scholarly brands; it’s more like releasing a new iPhone every year than building a minor variation on a cheap digital watch. Star overproduction captures an outsized share of academics’ collective attention, more as a matter of general brand loyalty (“I like to keep up on Zizek”) than because it is necessarily the highest-quality academic product. As a corollary to this — and this particularly irks me — certain hyper-prolific academics really let the quality of their work slip as they begin to hyper-overproduce. It’s as if they just have too many obligations, too much exposure, like a decent band that just doesn’t have a dozen good albums in it. Fact-checking sometimes gets iffy; the same arguments get repeated.
All this makes me wonder: after a certain point in a hyper-productive career, might it be more ethical to yield the floor to other, more marginalized academics?
More generally, if there is a market for scholarship, could it be a better-regulated one? Scholars like Marc Bousquet (The Rhetoric of “Job Market” and the Reality of the Academic Labor System) have rightly criticized the notion of a “market” as an adequate description of academic labor allocation, but market-style social dynamics do crop up in a lot of academic life, in my experience, and the critique of market ideology need not preclude regulatory projects of one sort or another. For instance, might we have a collective interest in preventing oligopolies of knowledge? In preventing overly large accumulations of academic capital? Could we help the marginalized publish more by placing limits on publishing success?
Or to be a bit hyperbolic, but also more concrete: could there be, hypothetically, a lifetime quota on publication, a career-length word limit? Suppose, for instance, that such a limit were set at a level so high that most of us would never approach it — but if you did reach it, your time would be up.