I have a PhD in cultural anthropology and I’m a software engineer. How did that happen?

I guess I’ve had a bit of the hacker spirit for a long time. It was part of the culture I was raised with. Building things. Stumbling into new places. Asking questions. Being skeptical, but not just in a negative way: skepticism can be a very hopeful gesture.

Exploring. That’s what it’s about for me: exploring.

I seem to have had an unusual path through tech, compared to people who majored in CS in college and have worked in tech their whole careers. Here are some notes on how that happened.


I was a kid in the late 20th century. When I was little we had a Macintosh SE (from my dad’s graphic design work) and an old PC running DOS. I wrote some little BASIC programs and (believe it or not) some HyperCard stacks. I think the first machine I owned was a Macintosh Performa. I spent a while trying to learn native GUI programming on it, using a thick reference manual for Apple interface building. I think the only thing I ever finished was a screen saver demo animation.

Computers were only one of the technical systems I used to like. For a few years, I was in love with video production. At that point the professional gear was still largely analog. I spent some time in TV studios — they were little, but run by professionals. I crewed some educational broadcasts that went out on satellite; I was an intern at a local cable company for a year; I went on some multicamera shoots, with a van on location. I never tried to do it for a living, not even close. I just loved being around the technology, the visual design part too, framing shots just the right way, cutting clips at just the right moment. I played with editing gear and I made a trippy video of my own about the alienating landscape of my high school.

Then I got deep into lighting for theatres. I worked in summer theatre as a stage electrician; I ran a followspot one season; I climbed a lot of ladders and catwalks, lugged around a lot of gear, and worked late nights for free or really bad pay. I had a lighting design teacher from the local university, which had an MFA drama school. He brought me along as an assistant on one of his professional gigs, doing lighting design for an opera. I learned how to design lights for a show, how to run the lighting console, how to plan the logistics.

There’s a lot of hacker spirit in theatres. You’re building things that you just dreamed up. You’re running at the very limits of your capacities. It’s a wild place.

Meanwhile, I was taking some computer science classes — mainly C-style languages with object-oriented features thrown in, as was the rage in the 1990s. I must have done a year of C, a semester of C++, and a semester of Java. We did quicksort and I wrote a really basic web crawler. Meh.

I just didn’t love CS classes. Whatever the hacker spirit is, they didn’t have enough of it. They were taught in a pretty rote, “memorize this” way. They were dull. The intro ones were all a little too easy. And they didn’t help me (when I was 18 or 19) with the big existential questions I desperately wanted to answer.

So I ended up studying a humanities field, cultural anthropology, that was a lot better at big philosophical questions than anything I found in STEM.

Around the same time, I found out I could get paid to build software without finishing a CS degree.


That was the beginning of a long, meandering period where I learned tech on the job.

I started working for a language laboratory in college, Cornell’s Language Resource Center. I learned some Python, which powered our then-fancy Zope platform. I wrote online quiz software for language learners (we weren’t using commercial learning management systems in those days). Before long we needed non-document-based data storage, so I set up a MySQL instance, made it talk to Python, and learned something about normalized database schemas.
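I don’t have that code anymore, but the normalization lesson stuck. Here’s a minimal sketch of the idea for a quiz app like ours. The table and column names are hypothetical, and Python’s built-in sqlite3 stands in for the MySQL instance; the layout is the point: separate tables linked by foreign keys instead of repeated fields.

```python
import sqlite3

# sqlite3 stands in for the MySQL instance here; the schema idea is the same.
conn = sqlite3.connect(":memory:")

# Quizzes, questions, and attempts each live in their own table,
# linked by foreign keys rather than duplicating quiz info on every row.
conn.executescript("""
CREATE TABLE quizzes (
    id       INTEGER PRIMARY KEY,
    language TEXT NOT NULL,
    title    TEXT NOT NULL
);
CREATE TABLE questions (
    id      INTEGER PRIMARY KEY,
    quiz_id INTEGER NOT NULL REFERENCES quizzes(id),
    prompt  TEXT NOT NULL,
    answer  TEXT NOT NULL
);
CREATE TABLE attempts (
    id          INTEGER PRIMARY KEY,
    question_id INTEGER NOT NULL REFERENCES questions(id),
    student     TEXT NOT NULL,
    response    TEXT NOT NULL,
    correct     INTEGER NOT NULL  -- 0 or 1
);
""")

# Scores are derived by joining, not stored redundantly anywhere.
scores = conn.execute("""
    SELECT q.title, AVG(a.correct) AS score
    FROM attempts a
    JOIN questions qs ON qs.id = a.question_id
    JOIN quizzes q    ON q.id  = qs.quiz_id
    GROUP BY q.id
""").fetchall()
```

Each fact gets stored exactly once, and anything composite (like a student’s score on a quiz) is computed from joins when you need it.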

Then I went straight to grad school in cultural anthropology and didn’t write much code for a few years.

After I finished the field research part of grad school, I got back into web programming at the University of Chicago, in the IT group for the humanities graduate school. At first I worked on their public-facing websites (mainly Drupal); I remember building a custom event planning module for a big annual event. Next door, there was a web applications programmer who was building internal administrative software in Ruby on Rails, which sounded more exciting. He gave me a crash course in Ruby, in the MVC pattern, and in test-driven development. Soon he left for a startup, and I got hired into his position.

I found myself going every day to the office and sitting at a big software development setup with a bunch of monitors.


What I loved about my first full-time software development job was that I had so much to explore.

The culture of technology was wild around then (~2012). In academia, things move really slowly, but in tech there was constant flux: shifting trends, unstable new projects. The “new JavaScript framework every month” thing was just taking hold. A lot of history was happening, somehow.

My boss told me once that I had a particular skill: it wasn’t just that I could write code, it was also that I could take a preliminary set of requirements — usually not very clear ones — and build working systems from that starting place, pretty much by myself, without needing to be micromanaged. I have to say, I enjoyed the autonomy.

I was building administrative applications in Ruby on Rails and JavaScript. We had clients, but they were all internal. I got a salary and we were free from commercial pressures.

I built our testing infrastructure up from almost nothing. I built a realtime dashboard app to monitor activity on our products. I got a lot of practice triaging production exceptions (which ones are urgent? which ones can wait a little bit? do we need extra logging or debugging?). I worked on performance, on authentication systems, on database design. I built lots of things.
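That triage habit boils down to a few standing questions. Here’s a rough sketch of that kind of decision logic in Python (not the Ruby we actually used); the error fields, class names, and thresholds are hypothetical, just to show the shape of the reasoning.

```python
from dataclasses import dataclass

@dataclass
class ProductionError:
    exception_class: str   # e.g. a database error vs. a stray 404
    affected_users: int    # how many people actually hit it
    has_backtrace: bool    # do we have enough context to debug it?

def triage(err: ProductionError) -> str:
    """Decide what gets fixed now, what waits, and what needs more information first."""
    # Anything hitting lots of users, or anything that smells like a data problem, is urgent.
    if err.affected_users > 10 or "Database" in err.exception_class:
        return "urgent: fix now"
    # If we can't see where it came from, the first task is better logging, not a fix.
    if not err.has_backtrace:
        return "needs instrumentation: add logging or debugging output"
    # Everything else goes into the normal queue.
    return "can wait: schedule it"

print(triage(ProductionError("DatabaseTimeoutError", 42, True)))   # urgent: fix now
print(triage(ProductionError("TemplateMissingError", 1, False)))   # needs instrumentation
```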

As long as there were tests and there was a good project plan, I was trusted to be an autonomous professional and to do solid work.

I didn’t realize until later that in bigger commercial environments, you usually don’t run your own projects as a software engineer. But in that job, I was the only web application developer in our group, so I spent quite a bit of time talking with our clients, mainly admin staff who needed software to simplify their work. I knew almost every one of our users by name. They could email me and I would help them if they needed support.

That human connection was possible because we were writing software for only a few hundred users at most.

These days, I have to say I miss that sense of personal connection with the users.


Looking back, it seems so improbable that I could go from playing with HyperCard in the 1990s to being a professional software developer a few decades later. I suppose life is full of those surprises.

I still look out for the moments of joy and exploration in what I do.

And I automate the boring stuff.