At first it happened slowly, then all at once. We are now living in a world that would have been considered science fiction to any twentieth-century novelist. What precisely would a writer like Raymond Carver make of self-driving cars, cryptocurrencies, drones, augmented reality, or even just iPhones? That’s all now, not tomorrow.
To some degree, fiction writers are adapting the ontologies of their work to fit this new world, although I think the response can be characterized as tepid so far. There is now the common stylistic choice to include explicit bubbles of text messages à la Jonathan Franzen’s Purity. I think more often the writer’s first instinct is to sidestep these changes or minimize them for the sake of some kind of stable cosmos, like how in Donna Tartt’s The Goldfinch the characters still write handwritten letters rather than emails. In this respect Franzen’s book comes across as too of-the-moment technologically, and already feels out of date to me, whereas Tartt’s comes across as untethered and fantastically ahistorical, demonstrating how difficult it is for novelists to walk this tightrope. Too much focus on these changes can easily veer into kitsch. In Richard Powers’ Galatea 2.2 an English professor teaches an artificial neural network to pass qualifying exams in literature. That is, until it discovers the internet, learns about nuclear war and GMOs and Brexit or what have you, and turns itself off in protest. It was too good for this world!
Not that any of this is precisely a new problem. In his essay “Fictional Futures and the Conspicuously Young,” David Foster Wallace writes:
A fine and contentious writing professor once proclaimed to our class that a serious story or novel always eschews “any feature which serves to date it,” to fix it in history, because “serious fiction is timeless.” When we finally protested that, in his own well-known work, characters moved about in electrically lit rooms, drove cars, spoke not Anglo-Saxon but postwar English, inhabited a North America already separated from Africa by continental drift, he impatiently amended his ruling’s application to those explicit references that would date a story in the frivolous “Now.”
Wallace was young and the proscriptive professor was old, so it’s easy to parody the point, but I think it’s also worth considering that (a) this is indeed a problem, and (b) there has to be some sort of least inelegant way to account for all this change as it continues to accelerate.
Normally writers move beyond the “Frivolous Now” by communicating about the nature of family, of men and women, of children, of death and birth, of love, of hate, of war. This makes sense. These are the oldest dramatic structures in evolution. They will be around for a while yet. But it gets trickier when the narrative becomes specific to culture or history, never mind to technology, which is ever more a part of people’s lives. Writers are brought to a point where they must engage in intuitive guesswork about what will remain relatively eternal or timeless, or at least not frivolously disappear over the event horizon of technological change. How can writers make something feel contemporary without also immediately dating it?
One possible approach is to dig up the roots of the technological change, to search for the etiology behind it. To find the thing that’s really changing. Which is, in my opinion, the growing secularization and popularity of science in culture. Over the past forty years science has evolved from an occult institution made up of networked experts working in covert circles into a popular industry still producing empirical results but also now entertainment, best-selling books, news, gossip, and talking heads. Pope Francis has 16.8 million Twitter followers. Neil deGrasse Tyson? 11.8 million. There’s really no question that we live in a scientific age, not in that we are any more rational, skeptical, or evidence-based than in times past, but rather that culturally the double helix of DNA packs as much semiotic punch as the symbol of the Christian cross does. Perhaps more. Almost certainly more. By that I mean scientific concepts are increasingly part of felt reality for a growing segment of the population, with no sign that the trend will reverse.
This level of interest in and knowledge about science has been brought about by unprecedented education levels and also the opening up of science’s inherently cloistered nature. The notion of scientific outreach as something to be celebrated, rather than regarded with suspicion, is new within the research community. When Carl Sagan was nominated to the National Academy of Sciences, he was rejected amid backroom whispers of “too much television.” Now every graduate school department has courses and credit devoted to scientific outreach. People sport science t-shirts on the subway and “like” photos from the Hubble telescope on social media.
Never before has popular interest been so high, and this represents an opportunity for novelists, should they choose to take it. Rather than incorporating the technological fads that blur past the creaking wooden gears of the publishing industry, a writer can incorporate the scientific worldview that has seeped into the collective consciousness. Purposefully drawing back the technological curtain avoids the specificity of the “Frivolous Now” while keeping the worldview fundamentally contemporary.
Not that writers should write in any way scientifically—how boring! Let me explain. Go see any medieval triptych in a museum, or find any old church frieze. It is beautiful. You probably have no idea what it means. The art is richly layered with meaning and concepts, but most of the symbolism is lost on the modern viewer, though it would have been legible to anyone steeped in biblical lore, as an educated American probably was until the early twentieth century. I recently reread Herman Melville’s Moby-Dick and I started to mark the pages where there is a clear biblical allusion that I just did not get at all. This started in the first sentence. “Ishmael” is obviously from the Bible, but I didn’t know it meant “God listens,” and that in the Bible Ishmael wanders the desert and is miraculously saved from dying of thirst. Melville’s Ishmael wanders the ocean and is miraculously saved from drowning. In an inversion, land becomes water, salvation always the opposite element. Melville was relying on the conceptual web he would have shared with most readers to endow the name choice with meaning, and yet it is completely lost on a significant chunk of contemporary secular readers.
All novels involve such symbolism, references, and reflections on assumed shared concepts. Writers build their nests out of whatever bits of culture they can carry. Even down at the sentence level in the makings of metaphors and lyricism, these need to be both shared by readers and be symbolically rich enough to be used by the artist or novelist. There’s still so much to get out of Moby-Dick, but clearly I’m reading it along some sort of lower-dimensional wavelength. It’s like listening to music through tinny headphones. What for Melville was so explicitly not the “Frivolous Now,” over the long march of history, became precisely that.
What can now play that role? Science certainly offers a grab bag with more than enough to fill a frieze: the uncertainty of quantum mechanics; the mysteries of the big bang; the tug-of-war between nature and nurture; the law of entropy; that space and time are the same thing; the forces of natural selection; genes and bits, and so on. This provides an opportunity, a language, a space, for writers willing to work with these concepts.
For this to work writers don’t need to be scientists nor inordinately interested in science, any more than Melville was a priest or inordinately interested in the Christian tradition. Rather, they can carry on doing what some have been doing without much fanfare, occasionally to success, occasionally to failure. Here are a handful of works successfully pulling it off: Saturday by Ian McEwan, with its focus on biological determinism and medicine; Toward the End of Time by John Updike, which in its metaphors and themes is steeped in cosmology and physics; White Teeth by Zadie Smith, with its play on genetics and race; and Jeffrey Eugenides’s Middlesex exploring the biological basis of sex and gender. These novels aren’t obsessed with the latest technology, but science infuses their worldview. They are not submissive to science, but have a playful engagement with it, with the result that they actually end up using science as a shared language to make meaning.
Perhaps this represents a new global form of literature for the future. What still comes clearly across after a century and a half is Melville’s naturalist eye. Humans may not always understand the semantic complexities of the cross, but from now until literacy is lost they will know the packed meaning of the double helix. The concepts of science are pretty close to being eternal for any future functioning civilization, as well as semantically rich, complex, structured, and deployable. One could pick up Updike’s Toward the End of Time in a thousand years, or further toward the literal end of time, and the quest for meaning in an entropic world, as well as some of the scientific metaphors and references, wouldn’t be lost. Entropy simply isn’t going away. The story of Ishmael, sadly, gladly, neutrally, might be.
Despite this potential opportunity to use a whole new patois of popular myths (in the Jungian sense), I’ve noticed there is a certain frisson in the air when a writer and a scientist are onstage together. A competition, or duality; both afraid to offend, to sound stupid, to step on toes. Most often there is the acknowledgement of the other’s authority in all matters pertaining to their sphere, accompanied by polite protestations of inexpertise. At the same time, the disparity of epistemological methods often leads to an aggressive defense of territory: facts on the scientist’s side of the court, feelings on the novelist’s.
Of all the sciences, this territoriality is most pronounced with neuroscience. The purviews are dangerously close, though the methods, perspectives, and goals are all different. In casual conversations I’ve found that writers have a healthy fascination with the brain, which makes sense, given the predisposition of fiction to be interested in the internal, the self, the mind. At the same time, neuroscience can be experienced as off-putting. Prior to neuroscience we were like a culture before the invention of mirrors, and now, having been shown our reflection for the first time, it turns out that we are more like plants than people; what we look like beneath bone is a compact odd garden puffing chemicals between elaborately trestled synapses, two eyestalks leading in. The Greeks, unable to differentiate its intricate structure, thought the brain an organ for cooling the body, while the Egyptians discarded it as unnecessary for the afterlife (a mush, really). But we know better. The revelation that comes upon this strange reflection gazing back at us must be similar to the one Santiago Ramón y Cajal felt while sketching the arborization of individual neurons for the first time: these tiny alien trees are the substance of feeling? But how?
With wonder comes threat. Sometimes neuroscience seems set to eat up all the humanities, another big bang expanding voluminously until all the arts are merely syntactic statements in some neural vocabulary. Love of art, or even the production of art, becomes reduced to just some chain of if-then neuronal firing. It doesn’t help that there seems to be a “This Is Your Brain on X” for every subject from imagination to exercise. “This Is Your Brain on Writing” was the title of a 2014 article in The New York Times covering a study on the neural differences between experts in creative writing and novices. The more neuroscientists stick both viewers of art and the artists themselves into brain scanners, the more some kind of fundamental reduction seems possible.
Of course, it’s not the case that neuroscience will somehow stop people from making art—that’s impossible even for the most authoritarian regime—but rather, as the philosopher of mind Thomas Metzinger says in The Ego Tunnel concerning the process of scientific self-realization about what’s inside our skull: “One of the many dangers in this process is that if we remove the magic from our image of ourselves, we may also remove it from our image of others. We could become disenchanted with one another.” And to become disenchanted with the human is to stop writing novels.
Such disenchantment is the consequence critics of “scientism” seek to avoid. Both sides of the scientism debate are easy to strawman, but I think it’s a serious concern, especially when two descriptions of the world are so closely overlapping there is the risk of contradiction. Consider an example: in McEwan’s Saturday, the neurosurgeon protagonist Henry performs a craniotomy in an operating theater. While looking at the exposed brain of his patient, he considers that:
Just like the digital codes of replicating life held within DNA, the brain’s fundamental secret will be laid open one day. But even when it has, the wonder will remain, that mere wet stuff can make this bright inward cinema of thought, of sight and sound and touch bound into vivid illusion of an instantaneous present, with a self, another brightly wrought illusion, hovering like a ghost at its centre. Could it ever be explained, how matter becomes conscious? He can’t begin to imagine a satisfactory account, but he knows it will come, the secret will be revealed—over decades, as long as the scientists and institutions remain in place, the explanations will refine themselves into an irrefutable truth about consciousness. It’s already happening, the working is being done in laboratories not far from this theatre, and the journey will be completed, Henry’s certain of it. That’s the only kind of faith he has.
But luckily, this is precisely where the overlap actually falls apart. Neuroscience still does not understand the workings of consciousness, neither its origins nor its execution, while it is something writers understand implicitly. Not only that, but writers understand consciousness from the inside in a way that science never will. I’ve argued before that novelists are unique in that they take the intrinsic perspective on the world, examining the thoughts and feelings of characters as if those were extrinsic objects. In a novel, unlike, say, on a screen or in a scientific model of the brain, thoughts and feelings are as pointable at, as directly describable, as tables and chairs. There can be no dualism in fiction. In this sense writers remain far ahead of neuroscience in understanding, or at least fluidly representing, human consciousness.
If fiction isn’t afraid of its most direct scientific reflection, neuroscience, it shouldn’t be afraid of any of the other sciences either. It can take on the same capacity it has many times before—that of critic, muse, explorer. It can even find drama in the process of science itself. Such secular myth making is important, not to help some particular cause or act against religion or other dominant myths, but rather because it is an honest take on the actual world that we now live in. And novels in particular, with their intrinsic view from the inside, are uniquely poised to pilfer from and also critique the new cultural giant that is science. That’s a kind of fiction for the future.
Wallace, David Foster. “Fictional Futures and the Conspicuously Young.” Review of Contemporary Fiction, 1990.
Melville, Herman. Moby-Dick. New York, New York: Penguin Classics, 2003.
Metzinger, Thomas. The Ego Tunnel: The Science of the Mind and the Myth of the Self. New York, New York: Basic Books, 2009.
McEwan, Ian. Saturday. New York, New York: Nan A. Talese, 2005.