I realize it is meaningless to harp on the failures of past authors, but I was still struck by this very blithe statement from a psychoanalytic scholar in 1970, in a paper on “The Concept of Reality Testing.” I suppose I usually think of the 1970s as the beginnings of our intellectual present, rather than as a past epoch. And yet:
Werner (1948) showed how mental development proceeds from syncretic (non-differentiated), diffuse, indefinite, rigid and labile to more discrete, articulated, definite, flexible, and stable functioning. In primitive mental functioning (where Werner convincingly demonstrated formal parallels among higher animals, human children, primitive peoples, and schizophrenic and brain-damaged human adults) objects in the external environment are not apprehended as things with separate, fixed characteristics. Rather, objects tend to be understood in relation to their emotional and motor connection with the perceiver, animistic qualities are often imputed to inanimate objects, and there is an inability to distinguish separate parts, or to discriminate between essential and non-essential characteristics.
I was reading the paper in question because I’ve been thinking about whether the notion of “reality testing” can be analytically useful. In particular, I’ve wondered whether it could apply to moments where people aggressively press the boundaries of institutional reality, to the point where social reality itself seems to be having a breakdown. I saw numerous people in my French fieldwork complain that situations were getting “surreal,” meaning not-normatively-real, and I find it useful to see these as moments where people were testing their normative impressions of reality against their immediate situations.
But it’s demoralizing to see that psychoanalysts, who are the main theorists of reality testing, have also used the notion to reinforce these hyperprimitivist notions of lower, savage, child-like, animalistic thought. If anything can still be made of the notion of reality testing, it would need to be cleansed of this primitivist genealogy.
I’ve been exceptionally dismayed this year by the retrograde, anti-open-access, profit-oriented publication philosophy at the American Anthropological Association. Earlier this year they announced that they were renewing their publishing contract with the corporate behemoth Wiley Blackwell. Now I notice that they also have a horribly misguided commenting policy for their online news site, Anthropology News. Here’s what the policy says:
Want to comment? Please be aware that only comments from current AAA members will be approved. AN is supported by member dues, so discussions on anthropology-news.org are moderated to ensure that current members are commenting. As with all AN content, comments reflect the views of the person who submitted the comment only. The approval of a comment to go live does not signify endorsement by AN or the AAA.
On the one hand, this only means that anthropologists who can’t afford the Association’s exorbitant annual dues are going to be further excluded from the Association’s public forums. (There are rumors that many anthropologists only pay the annual dues in years when they are attending the Annual Meetings, because otherwise membership confers few useful benefits.) I am certain that no one is going to be incentivized to join the AAA merely to write a comment on this site, which implies that this policy constitutes a harmful form of economic exclusion within the profession without any identifiable upside.
But on the other hand — and even more importantly — this commenting policy just further emphasizes the Association’s paleolithic relationship to technology (cf. their latest tech fail), and in particular their weak grasp on the culture of web publicity. Websites like AN are public spaces. There are cultural norms about how online discussions work in such spaces. It flagrantly disrespects these norms to provide public commenting facilities — as on any blog-like site — and then to deliberately reject all comments by non-dues-paying members.
To be clear: you don’t charge people cash to comment on your articles, because they are already giving you something for free by writing their comments. To comment is to contribute. To comment is to create a space of exchange where otherwise you just have a one-way transmission into the digital void. It’s fair to ask people to create accounts before commenting, to cut down on abuse, but there’s little precedent for making it into a cash transaction.
If you want to have members-only web forums, the generic convention is to hide them behind a login screen for members, instead of coupling a public comment box to an anti-public message. Thus the current policy is both hostile to the digital public and out of touch with web culture.
Sometime earlier this spring I asked the students in my Digital Cultures class to each write down a sentence (on a post-it) about what education was for.
“Education is intended to improve people’s intellectual ability. What it really does is create arbitrary competition among people.”
“To gain knowledge, $$, and power.”
I thought their answers were quite interesting, partly for the interrupted way in which a healthy cynicism makes its furtive appearance, and partly because I suspect that my students largely fed back to me the stock narratives that the college was always feeding them (about critical thinking, opportunity, etc). In other words, the students always tell you what they think you want to hear. Or rather, since they rarely know much about you individually, what they think a generic professor would want to hear.
At the same time, perhaps I should give them credit for being quite idealistic on the whole about the value of education. Here’s what they said.
Education is for students to learn how to critically think. Being educated helps you understand the world and aspects within it.
Education is supposed to be for the expansion and knowledge of all people regardless of age, race, gender, or religion. However, education has become a privilege to those who can afford to pay for it and the access to resources.
Education is for the purpose of creating an elite status. Education (for the most part) accelerates an individual to success + subsequent wealth (usually). I think this is the motivation to pursue higher education.
To provide us with options, expand our perspectives & increase understanding/empathy. Also, to let us know how little we really know.
To learn – learning fosters personal & societal growth. So essentially education is for fostering growth.
One could write numerous things about masculine domination in French philosophy, and many have done so. Right now, for instance, I’m engrossed in Michèle Le Doeuff’s programmatic 1977 essay on this question, “Cheveux longs, idées courtes (les femmes et la philosophie),” which appeared in Le Doctrinal de Sapience (n° 3) and was translated in Radical Philosophy 17 (pdf). I hope to write more about that essay in the near future, and its remarkable comments on pedagogical erotics and transference.
But in the meantime, as a sort of tiny case study, it’s also useful to consider specific cases of philosophical or theoretical masculinism. I recently wrote a bit about Derrida and a bit about latter-day Marxist theory. Today I have a few tidbits that I found in David Macey’s 2004 biography of Michel Foucault:
While teaching in Clermont-Ferrand in the early 1960s, Foucault “cause[d] a scandal when he appointed [his partner Daniel] Defert to an assistantship in preference to a better-qualified woman candidate” (p. 64).
When Foucault travelled abroad in 1973, “he was not happy when he had to attend formal receptions where he had to be polite to women in long evening gowns” (109). (I presume that Macey is trying to voice his subject’s own attitude, and not merely showing his own biases.)
Describing Foucault’s general outlook in the early 1970s: “Feminism was of little interest to Foucault and had little impact on him, although he did publicly support the right to abortion and contraception. He has often [been] criticized for his masculinist stance and it is true that neither the book on madness nor that of prisons looks at gender or takes account of the fact that women and men tend to be committed to both prisons and psychiatric hospitals for very different reasons” (103).
An eminent professor at a well-known university on the East Coast once alerted me to two distinctions. First, between students who need to learn that they matter just as much as everyone else, and the students who need to learn that everyone else matters just as much as they do. Then, between students who are smarter than they think they are, and students who think they are smarter than they are. The joy of teaching here is that so many of our students are smarter than they think they are, and need to learn that they matter just as much as everyone else.
On a crude first approximation, these two distinctions could be glossed as “elites vs non-elites” and “narcissists vs self-deprecators.” One might of course guess that the two distinctions sometimes map onto each other, but that’s not what I wanted to say.
What I wanted to say is just that, in my fairly brief experience teaching, there is a weird problem with the first distinction — “between students who need to learn that they matter just as much as everyone else, and the students who need to learn that everyone else matters just as much as they do.” In brief: some non-elite students both don’t think they matter, and are curiously indifferent to the mattering of some further others.
Last Friday, as my last work event at Whittier College (since my postdoc contract is finishing up), I went to graduation. A few observations on graduation as seen from the faculty perspective seem to be in order.
The actual experience of sitting on the platform was surprisingly unstructured. We were far enough from the audience that people chatted with each other a good deal, often in low voices to avoid disturbing the proceedings. Everyone was provisioned with a water bottle and a program, and arranged into three long rows of seats facing the audience, behind the higher-ups. There was an amusing hierarchy of chairs, such that the Trustees had brown wooden chairs, while the faculty had white plastic ones. Longtime attendees seemed to have strong views on where to sit, and arranged themselves in the faculty marching lineup with an eye to ending up in their preferred seat. The front row had a better view, but was conversely more on display; the back row, meanwhile, was somewhat shaded from the harsh sun by a backdrop. Individuals’ seating strategies sometimes led them to depart from their place in the official lineup, which was supposed to be in rank order.
In a sign of the times, a lot of faculty people were on their smartphones during the ceremony. Most people didn’t use their phones the entire time, but did consult them at least every few minutes. If you look carefully at the above picture, the professor with the pink hood has her phone out, possibly taking a photo of the audience.
I’ve taken to writing little end-of-class reflections, which I read to my students on the last day. Here’s my reflection on my last day teaching at Whittier College. (The class was about digital cultures; you can find some of the course materials online at GitHub.)
Coda: Anthropology of Digital Cultures
Let’s start out with a definitional question. Does this expression, “digital culture,” actually mean anything? In one sense, a digital culture is really just a culture. All cultures have technology. All cultures have media. All cultures therefore also have ideologies about their media, which enable people to actually communicate (or have communications breakdowns). So “digital culture” just becomes a synonym for “the world you live in.” That we live in.
If Noam Chomsky had done nothing else, he would have given us one of the strongest critiques of the New York Times as the guarantor of nationalist ideology for the U.S.’s professional-managerial classes. But there’s another good reason not to read the Times besides its obvious ideological problems. Namely: that it promotes an intellectual monoculture. Too many scholars and academics read it to the exclusion of anything else.
I’ve long had a memory of having seen this complaint crop up in earlier decades, and I just stumbled back across its source in a 1969 paper by Donald Campbell (in which he critiques the “ethnocentrism of disciplines” and advocates a “fish scale model of omniscience,” but that’s another story). Here’s Campbell critiquing the “scholarly ego ideal”:
While on the theme of recreational reading and the duplication of fish scales, it seems appropriate to deplore the tendency of social scientists to feel that they all should read current newspapers, particularly the New York Times. Certainly the collective perspective would be better if most spent the equivalent time with newspapers of other epochs or with historical, anthropological, archaeological, or literary descriptions of quite other samples of social milieus. Rather than the ego ideal of keeping up with the current worldwide social developments, the young scholar should hold the ideal of foregoing current informedness for some infrequently sampled descriptive recreational literature. Too often our ego-ideals settle for uniform omniscience, knowledge of both past and present, of both here and there, and too often we settle for the same pattern of compromise all our colleagues are settling for. Compromise from the Leonardesque aspiration there must be, but even in leisure reading, one can hold as ideal the achieving of unique compromises.
One time a friend of mine, Mike Bishop, asked me an interesting question about the ethics of deviating from norms:
“In what sense is deviance important for its own sake, rather than merely being necessary (perhaps even regrettably necessary) because “the good” is not socially acceptable in all contexts?”
A few ways of thinking about this came to my mind:
1. Deviance is always morally necessary because all (known) social systems are imperfect, so it’s just guaranteed that some good things will come across as deviant, no matter what social context you inhabit. Thus, deviance gives flesh to the inevitable clash between normativity and virtue.
2. Deviance is necessary as a way of demonstrating anti-authoritarianism, that is, as a counterforce pushing back against social discipline and authority. While some kinds of authority are admittedly better than others, every authority structure needs to be reminded constantly that it is not absolute or without flaws. Thus, deviance expresses a primordial resistance to domination.
3. Deviance is a good thing because vast seas of cultural likeness are just hideous. Thus, deviance expresses a basic aesthetic of diversity.
Najat Vallaud-Belkacem is the first woman Minister of Education in France, in office since 2014 in the second half of François Hollande’s presidency. (Before becoming Minister of Education she was the Minister for Women’s Rights and subsequently also Minister for Youth, Sports, and Urban Affairs; it turns out she isn’t the first French Minister of Education to use Twitter.) She was born in Morocco (and has had to think plenty about eluding the “diversity” pigeonhole); I’ve long been struck by her charisma as a public speaker (which isn’t to say that her political projects have always been unproblematic, needless to say).
In any case, I came across a recent interview in which she makes an interesting comment on the cultural value of education in France: