novel-gazing: a prologue

It is easy to lose track of time as we grow older. One minute your child is teething, the next they are getting married. An event that must surely have happened a year ago turns out to be a decade or more in the rearview mirror. You wake up one morning and see your PhD advisor's face staring back in the mirror.
That last example of course is specific to the subspecies Homo Academicus, of which I am, for better and worse, very much a member. I have spent 41 of my last 42 years on college and university campuses, the past 25 on the campus on which I was born when my parents were MFA students here in the 1960s. University campuses have become my native ecosystem, and I have passed a good portion of my life tending to my own narrow niche within the system while exploring the range of life that thrives (and doesn't) in these spaces.
Like more natural ecosystems currently under profound threat, the environment of higher education has changed dramatically since I first walked onto a campus in the early 1980s. This is especially so for my own niche: the academic Humanities. I was born into this environment at a time when it seemed, if not central, at least essential to the post-war university. The professors who reared me in college and grad school had been reared in their time by professors who held similar assumptions about the stability of this odd niche and its value to the larger ecosystem. Today, of course, this habitat is under attack, an assault whose ferocity is absurdly out of alignment with the influence of the Humanities in the modern university.
Increasingly struggling to define themselves in a world they cannot fully recognize, many of my vintage are hunkering down in home offices, so beautifully renovated during the pandemic lockdown, refreshing the totals on their retirement accounts. Many younger faculty, finding the halls empty and the ecosystem starved for nutrients, are taking what comfort they can in small clusters of ever more microscopically defined subfields—hoping to recreate on a smaller scale some semblance of the community they had expected to find based on campus novels fifty years out of date.
It is typical of me, I suppose, that I have lived my academic career somewhat in reverse. As I will discuss in more detail in later installments, I began my career struggling against profound despair about ... well, all of it. The academic Humanities of the 1990s in which my career launched in earnest was confident of its own importance to the present and future of the university. Everyone was abuzz with their own cachet, or principled refusal thereof. Everyone was bouncing their way up the ladders of success—some of it measured by the rankings of their various institutional leaps, more of it by the number in the audience waiting to hear them read a paper at the annual MLA conference.
And I hated it. My misery, it would turn out, was not solely due to academia, of course. The story of my decades-long journey through layers of busted wiring towards a center from which I could—with the help of amazing caregivers and many medications—start to build a healthy, happy sense of self will surely be part of what I share here as well. But even if I overestimated the role academic life played in my misery for many years, there is no denying that the fertile soil of academic humanities in which so many of my colleagues thrived was often more noxious than nourishing for me.
Maybe it was my discomfort with the norm that left me able to spot the coming storm earlier than some of my colleagues. I was doing the scheduling for the department from 2003 to 2006, shortly before we turned the job over to professional staff who could manage the complexities of enrollments, classrooms, faculty and grad instructors, and budgets better than an English major like myself. I liked the work, though, and I took it seriously (more on my compulsive relationship to what we academics unpleasantly call "service" in another installment). And from that vantage point I saw the problem earlier than I otherwise would: there was a slight but consistent downward trend after the dot-com bubble burst in 2001. Majors, overall enrollments, all of it. Something was happening.
I flagged it to my chair and colleagues. But if there is one thing professors and university administrators agreed upon at the time, it was that we should always proceed as if the baby boom of the 1950s would magically continue, justifying the need for new departments, majors, and universities. By 2005, universities themselves were starting to shutter in numbers that, while nothing close to our current killing grounds, should have sparked concern. But there were numbers at the university level that justified the sense of confidence in the enterprise as a whole. After all, the percentage of young people in the U.S. attaining college degrees continued to grow.
But wait. Why were numbers of college graduates continuing to grow, sufficient to justify ever more growth in the massive enterprise of American higher education? There were plenty of good reasons why that growth should have peaked as the population flattened. The Internet was a decade on, and despite a major tumble in 2001, it was growing at a breakneck pace. In those early years of the digital revolution, it was still possible to believe that a true democracy of knowledge—both its consumption and its production—was at hand. YouTube had been founded in 2005 (and purchased by Google a year later), and for the first time my film students were showing up at college having already produced and uploaded their own short films. How long, I wondered, before all the knowledge universities had monopolized for so long would be available for free to the public?
As it happens, universities were not worried about this question in the way we in the Humanities should have been. As far back as the early 90s, I had noticed a dramatic shift towards "metrics," as the consultants explained to us at a faculty meeting at Grinnell College, where I began my teaching career. The faculty gathered that day were to a one contemptuous of these outsiders telling us we needed to prove the value of our education to our students' futures in terms of hard data. This was an elite liberal arts college, one dedicated to the tradition with a ferocity that made it feel, even then, like something of a relic. Older faculty actually referred to themselves as "New Critics," something those of us newly hired from our elite coastal universities had been taught to revile as conservative and antiquated (even as most of us still taught students skills that would have been entirely recognizable to any New Critic from the 1940s). How does one quantify the benefits of close reading? Of having read all of Ulysses?
Thirty years later, of course, we no longer flinch when told to engage in assessments that will result in data that can be fed into the charts used to explain return on investment. It is so deeply embedded into the industry of higher education that, even as we know much of what we do in the Humanities remains unmeasurable in terms of short-term assessments, we know that everything from university accreditations to a department's ability to replace a retiring faculty member depends first and last on hard numbers.
In the 90s, as many still do today, Humanities faculty bemoaned the emphasis on such numbers as symptomatic of the new "corporate university"—a sign that Trustees and Administrators (those perennial bogeymen) wanted to run the university "like a business." There is much to say about this sophomoric assessment of contemporary higher education in future installments, but for now suffice it to say that it was a sign of what will be a repeated theme in this particular campus novel: the profound inability of Humanities faculty to accurately see the systems in which they are embedded. And yes, the irony that this is what Humanities faculty think they do particularly well, better than anyone else, in fact, is not lost on me. Nor is the fact that I myself am engaged in such hubris here. Foreshadowing of ironies to come.
In truth, of course, the turn to metrics was part and parcel of a range of forces beginning to apply new pressures and scrutiny to universities, public and private. But it was also a strategy for continuing to increase the pool of applicants even as the college-aged population itself leveled off. If you can't make more teenagers, the solution is to convince more of the existing population that going to college, expensive and exhausting as it is, is an investment in a more secure future.
If the wealth benefit of attending college could be sold as the reason to attend, persuasively enough to draw in students who might not otherwise have attended, it could—and it did—become a self-fulfilling prophecy. The very act of going to college was, and to a diminishing degree still is, a sign of a commitment to future-thinking and self-investment that was in and of itself attractive to the service economy of the 90s and 00s. The skills mattered less than the character and commitment represented by the choice itself. And the more prestigious the university, the more intelligence could supposedly be inferred from the degree, fueling the never-ending race (or, more accurately, churn) of the rankings wars that dominated university-level thinking from the early 90s to today.[[1]]
[[1]]: David Brooks is not someone I am inclined to cite, but strange times, strange bedfellows, and all that. And his recent article in The Atlantic, despite the sensationalist title, hopefully chosen by an intern, has some valuable observations about the meritocracy of elite institutions ginned up well before the start of my own story. https://www.theatlantic.com/magazine/archive/2024/12/meritocracy-college-admissions-social-economic-segregation/680392/
When I arrived at Ohio State University in 1999, we had roughly 1300 majors. Today we have roughly 450. The glimmerings of bleaker prospects I fretted over after the dot-com crash of 2001 would be realized with devastating brutality in the wake of the Great Recession six years later. Not all of the steady decline to our current levels could be directly pinned on the Recession, to be sure, but it was all of a piece. In succession over the course of the next 15 years, we would experience additional blows such as 1) the state lowering the AP score required to place out of the first-year writing requirement; 2) a dramatic push from the state for more early college access, allowing students to place out of ever more Humanities GE courses while still in high school; and, most recently, 3) a university overhaul of the GE itself, which reduced the number of courses students were required to take in the Humanities and eliminated entirely the second-level writing requirement.
While these and related blows would explain a decline in general education and writing class enrollments, they don't account for the fall in English majors. To explain the latter, English faculty are prone to conjure a caricature of a parent unwilling to allow junior to pursue his dream of studying modernist literature. The reality, as any English major knows, is that parents—even those who had been English majors themselves—have been anxious about their kids pursuing Humanities majors for at least the last century. It is hard to imagine the fever pitch parental disapproval would have had to reach to scare away this generation of literary magazine editors and Austen cosplayers after that long history of parental failure.
The painful truth of course is that it is the students themselves who have made this decision. More than their parents (and much more than their English professors), our students are keenly aware of the uncertain future that awaits them. And the economy is only one ingredient in the lava field that is their post-college landscape. To name but a few of the obvious: climate change, the rise of AI, the rise of autocracies at home and abroad, and the darkness at the heart of humanity which social media has made all too visible. Is it shocking they question whether they have the luxury of hoping the English major leads them to meaningful employment (with health insurance)?
I hear my tribe protesting. Yes, there is data to suggest that Humanities majors ultimately fare quite as well as other majors over the course of a career. As discussed above, in the long run, any college degree will provide financial benefit, with for the most part only marginal differences determined by choice of major. In addition, there is evidence that Humanities graduates possess skills that are increasingly in demand, including those skills necessary for lifelong educability in a world where jobs will transmute in the time it takes to onboard. Our majors have a still-undervalued ability to synthesize and make connections across disparate bodies of knowledge, and the belatedly acknowledged value of knowing how to use language and how to play well with others.
The problem is that many of today's undergraduates believe—and not without reason—that it may take several years for those advantages to be properly rewarded in the job market. And many students aren't confident they have the years to secure a nest egg sufficient to provide security when the economy turns inside out yet again. From 9/11 to the Great Recession to COVID to whatever is happening now, the 21st century has proven beyond a shadow of a doubt that uncertainty is the only certainty. Most of the conventional English major starter jobs—publishing, editing, journalism, grant writing for non-profits—are already evaporating as AI promises cash-strapped industries breathing room (at the expense of training the next generation of expertise). And while we tell our students, honestly, that their skills are very much in demand by employers in fields they might not imagine as hungry for English majors, the search for these jobs feels like hunting unicorns when none of those employers show up at job fairs asking for English majors.
Combine an economy (and planet) that renders the future uncertain with ever-increasing tuition, add decades of university marketing selling ROI as the argument for a college degree, and the results are inevitable. With the demographic cliff upon us, we now face a new cliff of our own making as more and more would-be students (especially boys) weigh the lost earning years a college degree requires against the uncertain future of the economy itself. Suddenly the ROI argument for an additional $25K a year seems less persuasive if winter is truly coming.
In my extensive experience since shifting my teaching focus to GE classes over the past 15 years, students are no less interested in the Humanities than they were when I started my career 30 years ago. If anything, they are clearer about what it is they value in the Humanities; precisely because they are deprived of those qualities in their engineering, business, and pre-med programs, they do not take for granted their rare general education class in art history, film, science fiction, or classical mythology. But they hope their degrees will lead to decent pay and health care right out of college, allowing them to pay down their loans and save enough to provide a safety net. And they hope that they will have opportunities to pursue their inquiries in the Humanities once they are settled on the other side. In truth, thanks to the Internet and the infinite jukebox of the digital age, they probably can.
There is more to this story of course. I have many miles of rabbit holes to dive into before I bring this campus novel to a close. But for tonight, this breathless race through the last three decades of my chosen profession brings me back to the strangest part of my own story.
I like my job more today than I did for much of my first two decades as a professor. I should feel guilty confessing this. I do feel guilty that I don't. But in truth, as the academic Humanities of my younger years crumbles, I find myself feeling more excited about the possibilities for the part the Humanities might play in helping to build a new kind of university. As the rest of this post no doubt makes clear, I am far from a utopianist. For all my flaws and bad wiring, I am keenly realistic about what the future (and present) holds. I believe higher education and the nation as a whole are entering into an historically challenging period. Things are bad, and they are going to get worse before they get better.
Any fantasies I have about a reimagined Humanities will be on pause while higher education and other vital institutions of knowledge, research, care, and democracy weather the current assault. My greatest fear is that the Humanities will end up on the chopping block as universities find a way to survive the devastating financial blows raining down on them from Washington. Certainly, I expect what remains on the other side will be smaller. I expect much to be smaller. But I do believe the Humanities will persist because, contrary to what so many faculty believe, university leadership continues to believe in the Humanities, even if many in leadership have a harder time explaining why than their predecessors a generation or two before.
That last part is on us. But an existential crisis of the kind we are entering now is precisely the time for the Humanities to recover its own explanation of its value-added—not in terms of future income nor some Humanities spin on "art for art's sake." I suspect that some of the work ahead will involve a return to some of what we came to scorn following the post-structuralist turn: the Humanities professor as secular priest ministering to the campus and the public as to the best (and worst) in humanity; the Humanities professor as advocate for and counselor to the university's more conventionally profitable enterprises; and the Humanities professor as public servant talking to eldercare groups, public library audiences, and whoever else will have us about whatever it is they want to learn more about.
Of course, two or three generations past the point where being labeled a "Humanist" by a fellow Humanities scholar had become an insult, this will not be easy. And the goal is not to return to business as practiced in the 1940s, for many reasons. But we can't begin to imagine what comes next until we get over our dreams that the 90s are ever coming back. More urgently, we have to get over ourselves, over the fantasy that our readings, arguments, answers, and cleverness are what people want from us. We need to relearn how to help others answer these questions for themselves. Because at its essence that is what the Humanities is for.
So, yeah, I'm optimistic that this profession I reluctantly entered 35 years ago might now have a role in helping to make the future navigable, bearable, human. One of the many burdens I place on this blog is to help me understand whether that optimism is in any way justified. It is also to come to better understand the past of higher education as one tool by which to better predict its possible futures. And it is to come to better know myself and my own complicated marriage with higher education, entered into for all the wrong reasons only to find a deepening love and devotion when all the signs are flashing towards Reno.
There will be no linear order to this "novel," and I will jump from topic to topic as the knots present themselves to me for untangling. If none of it ever adds up to anything coherent by way of the larger answers I am hoping for, at least I will have put down some sense of what it is to be very much at the heart of the enterprise of both the academic Humanities and higher education itself in this unprecedented moment in history. Maybe when the time comes to pick up the pieces and start rebuilding, someone will find something useful—or at least entertaining—in the words to follow.
For now, suffice it to say that I have some explaining to do, if only to myself. And I have some very uncomfortable and deeply unpopular opinions to share, many of which I have kept to myself for fear of being shunned by my colleagues as a traitor to the faith. But while we keep the candles burning so that they will light the way for those who come after, we do so hoping they will not repeat our errors but will instead find their way further along the path of knowledge and understanding than we were able to make it in our time.