my mind is
a big hunk of irrevocable nothing
The light of the sun and moon cannot be outdistanced, yet mind reaches beyond them. Galaxies are as infinite as grains of sand, yet mind spreads outside them.
Biology gives you a brain, life turns it into a mind.
All brains gather intelligence; to lesser or greater extents, some brains acquire a state of mind. How and where they find the means to do so is the question raised by poets and philosophers, doctors of divinity and medicine who have been fooling around with it for the past five thousand years and leave the mystery intact. It’s been a long time since Adam ate of the apple, but about the metaphysical composition of the human mind, all we can say for certain is that something unknown is doing we don’t know what.
Our gathering of intelligence about the physical attributes and behaviors of the brain has proved more fruitful. No small feat. The brain is the most complicated object in the known universe, housing 86 billion neurons, no two alike and each connected to thousands of other neurons, passing signals to one another across as many as 100 trillion synaptic checkpoints. Rational study of the organism (its chemistries, mechanics, and cellular structure) has led to the development of the Human Genome Project and yielded astonishing discoveries in medicine and biotechnology—the CT scan and the MRI, gene editing and therapy, advanced diagnostics, surgical and drug treatment of neurological disorder and disease. All triumphs of the intellect but none of them answering the question as to whether the human mind is flesh giving birth to spirit or spirit giving birth to flesh.
Mind is consciousness, and although a fundamental fact of human existence, consciousness is subjective experience as opposed to objective reality and therefore outdistances not only the light of the sun and the moon but also the reach of the scientific method. It doesn’t lend itself to trial by numbers. Nor does it attract the major funding (public and private, civilian and military) that in China, Europe, and the Americas expects the brain sciences to produce prompt and palpable reward and relief.
my mind is
The scientific-industrial complex focuses its efforts on the creation of artificial intelligence—computer software equipped with functions of human cognition giving birth to machines capable of visual perception, speech and pattern recognition, decision making and data management. Global funding for AI amounted to roughly $30 billion in 2016, the fairest share of the money aimed at stepping up the commercial exploitations of the internet. America’s military commands test drones that decide for themselves which targets to destroy; Google assembles algorithms that monetize online embodiments of human credulity and desire, ignorance and fear.
We live in an age convinced that technology is the salvation of the human race, and over the past fifty years, we’ve learned to inhabit a world in which it is increasingly the thing that thinks and the man reduced to the state of a thing. We have machines to scan the flesh and track the blood, game the stock market, manufacture our news and social media, tell us where to go, what to do, how to point a cruise missile or a toe shoe. Machines neither know nor care to know what or where is the human race, why or if it is something to be deleted, sodomized, or saved. Watson and Alexa can access the libraries of Harvard, Yale, and Congress, but they can’t read the books. They process words as objects, not as subjects. Not knowing what the words mean, they don’t hack into the vast cloud of human consciousness (history, art, literature, religion, philosophy, poetry, and myth) that is the making of once and future human beings.
The contributors marking this tenth-anniversary issue of Lapham’s Quarterly regard the mystery of consciousness not as a problem to be solved but as a wonder to behold. The approach is the Quarterly’s own. The all but infinite extent of human ignorance—about the nature of our own minds, as about most everything else in the universe—is the provocation that rouses out the love of learning, kindles the signal fires of memory and the imagination, turns to the lessons of history. We have no other light with which to see and maybe recognize ourselves as human beings. The catching and reflecting of that light has been the Quarterly’s unscientific method for each of its forty previous issues, all of them focused on the shifting states of mind that underwrite the coming and going of wars and wealth and monuments.
History is not what happened two hundred or two thousand years ago. It is a story about what happened two hundred or two thousand years ago. The stories change, as do the sight lines available to the tellers of the tales. To read three histories of the British Empire, one of them published in 1800, the others in 1900 and 2000, is to discover three different British Empires on which the sun eventually sets. The must-see tourist attractions remain intact—Napoleon still on his horse at Waterloo, Queen Victoria enthroned in Buckingham Palace, the subcontinent fixed to its mooring in the Indian Ocean—but as to the light in which Napoleon, the queen, or India are to be seen, accounts differ.
Wine Drinking in a Spring Garden (detail), Persian silk album leaf, c. 1430. © The Metropolitan Museum of Art, Cora Timken Burnett Collection of Persian Miniatures and Other Persian Art Objects, Bequest of Cora Timken Burnett, 1956.
To consult the record in books both ancient and modern is to come across every vice, virtue, motive, behavior, obsession, joy, and sorrow to be met with on the roads across the frontiers of the millennia. What survives the wreck of empires and the sack of cities is the sound of the human voice confronting its own mortality, points of light flashing in the gulf of time on scraps of papyrus and scratchings on stone, on ship’s logs and totem poles, on bronze coins and painted ceilings, in confessions voluntary and coerced, in five-act plays and three-part songs. The story painted on the old walls and printed in the old books is our own.
It’s been said that over the span of nine months in the womb, the human embryo ascends through a sequence touching on over three billion years of evolution, that within the first six years of life, the human mind stores subjective experience gathered in what is now believed to be the nearly 200,000 years of its existence. How subjective gatherings of consciousness pass down from one generation to the next, collect in the pond of awareness that is every newly arriving human being, is another lobe of the mystery the contributors to this issue of the Quarterly leave intact. It doesn’t occur to Marilynne Robinson, twenty-first-century essayist and novelist, to look the gift horse in the mouth. “We all live in a great reef of collective experience, past and present, that we receive and preserve and modify. William James says data should be thought of not as givens but as gifts…History and civilization are an authoritative record the mind has left, is leaving, and will leave.”
Saint Augustine, fifth-century pillar of the Christian church, concurs:
Memory is like a great field or a spacious palace, a storehouse for countless images of all kinds that are conveyed to it by the senses. In it are stored away all the thoughts by which we enlarge or diminish or modify in any way the perceptions at which we arrive through the senses.
Truly a wonder to behold, but the good bishop has no idea from whence it cometh or whither it goeth:
The power of the memory is prodigious, my God. It is a vast, immeasurable sanctuary. Who can plumb its depths? And yet it is a faculty of my soul. Although it is part of my nature, I cannot understand all that I am…I am lost in wonder when I consider this problem. It bewilders me.
The work of the brain is receiving the presents; the art of the mind is unwrapping them. Playing with them on what the French eighteenth-century philosophe Denis Diderot likens to a clavichord fitted with “sensitive vibrating strings” of memory. Raveling and unraveling them on what Charles Scott Sherrington, twentieth-century English neurophysiologist and Nobel laureate, likens to “an enchanted loom where millions of flashing shuttles weave a dissolving pattern, always a meaningful pattern though never an abiding one.”
The highest possible stage in moral culture is when we recognize that we ought to control our thoughts.—Charles Darwin, 1871
Diderot and Sherrington go and catch the falling stars in the net of metaphor, our only access to the realm of making believe that the poet and naturalist Diane Ackerman pictures as “that dream factory…that huddle of neurons calling all the plays, that little everywhere, that fickle pleasuredrome, that wrinkled wardrobe of selves stuffed into the skull like too many clothes into a gym bag.” Metaphor is the process described by Albert Einstein in 1918 as “intuition resting on sympathetic understanding of experience.” He backed up his observation with a variation on the general theory of relativity:
Man tries to make for himself in the fashion that suits him best a simplified and intelligible picture of the world; he then tries to some extent to substitute this cosmos of his for the world of experience, and thus to overcome it. This is what the painter, the poet, the speculative philosopher, and the natural scientist do.
So does the historian. So do we all, to greater or lesser extents settling the wilderness of our experience with a story staking claim to new worlds both inside and outside our heads. Evan S. Connell, American novelist, essayist, and poet, framed Einstein’s assumption as the question asked in his Notes from a Bottle Found on the Beach at Carmel:
Each life is a myth, a song given out
of darkness, a tale for children, the legend we create.
Are we not heroes, each of us
in one fashion or another,
wandering through mysterious labyrinths?
The answer is yes, we are heroes, lighting our way through labyrinths with symbols and signs made from the shaping and reshaping of a once-upon-a-time, finding the past in the present, the present in the past. The talent is uniquely human. To the best of our knowledge, the human being is the only one of God’s or Charles Darwin’s creatures capable of seeing the similar in the dissimilar, who can say what it’s like to be what it is. The “I am” and the “It is” are both productions of the same independent film studio. History is the face in the mirror answering to the question who or what is man. “Sea-gold and marble columns never have been what I sought,” says Connell,
nor shards of broken amphorae; but the slightest measure
of myself, and of those who have preceded us across
this desolate shore.
But if our consciousness these days is for the most part being made by machines, how do we read the notes in the bottles on the beaches of time, find the words, and with them intelligible metaphors to describe and depict a world made to the measure of human beings? America’s democratic republic was founded on the meaning and value of words. So is the structure of what goes by the name of civilization.
Silicon Valley’s data-mining engineers have no use for the meaning and value of words; they come to bury civilization, not to praise it. Bury it in the avalanche of an instantly dissolving Now that carries away all thought of what happened yesterday, last week, two hundred or two thousand years ago. The losing track of our own stories (where we’ve been, who we are, and where we might be going) is the destruction of our energy of mind. The consequence of the twentieth-century information revolution is the same one the poet William Wordsworth ascribed in 1807 to the nineteenth-century industrial revolution:
The world is too much with us; late and soon,
Getting and spending, we lay waste our powers:
Little we see in nature that is ours;
We have given our hearts away, a sordid boon!
Wordsworth doesn’t appear in this issue of the Quarterly, but the “sordid boon” is the one the Japanese neo-Confucianist Oshio Heihachiro seeks in 1833 to eliminate, allowing for what he calls a “heaven” of one’s own mind free and open to enchantment. To clutter the mind with commercial advertisements for a permanent and monumental self, says Oshio, is to let its “real principles” lie buried and leave nothing to distinguish a man “from a material thing. There is nothing more shameful than for a human being to be no different from a material thing.”
In digitally enhanced America these days, who among us thinks it shameful to be no different from a material thing? President Donald J. Trump’s presence in the White House laughs the question to scorn. Not only is being no different from a material thing not shameful, it is the consummation devoutly to be wished—to be minted into the coin of celebrity, become a corporation, a best-selling logo or brand, a product in place of a person. For the past forty-odd years, persuaded that technology is the means of our deliverance and money the hero with a thousand faces, we’ve been burying the life of a weakened but still operational democracy in the tomb of a stupefied, dysfunctional plutocracy.
Chandra, the Moon God, folio from an Indian dream book, c. 1715. © The Los Angeles County Museum of Art, Gift of Paul F. Walter.
A story like the one told about Midas, mighty king in Greek and Roman legend, who wished that everything he touched be turned to gold. His wish was granted by Dionysus, god of wine and ecstasy, and for one bright new morning in antiquity the king rejoiced in his changing of sticks and stones and sunflowers into precious heavy metal. But then so did his food and drink turn to gold when he held it in his hands, and he would have died of thirst had not Dionysus released him from the prison of his fool request.
America’s democratic republic hasn’t been so fortunate. Its freedoms of heart and mind atrophy within the gilded housings of five-star vanity and greed, the voicing of its political thought indistinguishable from celebrity self-promotion. Midas at least had the wit to know something had gone wrong. Unable to lift or taste les poissons d’or, he cursed his luck, begged Dionysus for the antidote. Our own ruling and possessing classes haven’t been as quick to connect the dots. They have yet to locate the why and wherefrom of their stupidity and fear, still unable to perceive the Midas touch not as a glad tiding of comfort and joy but as the palsied hand of the Dionysian god in the machine of creatively annihilating capitalism.
On screen and off, our world grows increasingly crowded with machines of whom we ask what the rich ask of their servants (comfort us, tell us what to do) and on whom we depend to so arrange the world that we can avoid the trouble of having to experience it. The little we see in nature that is ours we look to replace with material things, with smartphones and bots to feed, water, drive, milk, and round up the herd of our still mortal needs and desires. Runway models at New York fashion shows ape the appearance of automata. Our comic book heroes in the movies and video games, bionic men and metallic women, come dressed in the uniforms of invincible weapons; the villains show up as zombies, cyborgs, replicants, and droids roaming around landscapes as dehumanized and desolate as the dark side of the moon. Whether billed as utopia or dystopia, the future proposed by our popular fortune-tellers is made by and for machines.
Sooner or later if the activity of the mind is restricted anywhere, it will cease to function even where it is allowed to be free.—Edith Hamilton, 1930
Ray Kurzweil, oracle in residence at Google, published The Singularity Is Near in 2005, announcing the dawn of a new day in which the human species breaks the shackles of its genetic legacy, soars to inconceivable heights of nonbiological intelligence, achieves evolutionary union with things, eliminates the distinction between real reality and virtual reality. Selected sensory functions of the human brain attach to immortal computer systems, suspending them forever in a virtual state of perpetual bliss.
The great good news lately has come to be seen as a cause for alarm. The wired-in congregation of the California faithful doesn’t doubt the prospect of AI superintelligence slouching toward Palo Alto to be born, but the programmers worry that, once up and out of the cradle, it will create a world in its own image, leaving little or no room at the inn for human beings. Computer processing power doubles every two years, a rate well beyond the evolutionary gathering of human brain cells. A quorum of prophetic opinion now holds that before the end of this century, computer software will learn how to build its own infrastructure, set its own political agenda, develop self-replicating nanotechnology that eludes the understanding and control of its sponsors. Max Tegmark, professor of physics at MIT, published Life 3.0 in 2017, listing the ways in which superintelligence might take over the world—as “benevolent dictator,” owning and operating society with freedom, justice, and guaranteed income for all; as “protective god,” preserving an illusion of control of one’s own destiny; as “conquerors,” ridding the planet of human beings because machines perceive them as threat, nuisance, or waste of resources.
Silicon Valley sells the prospect of omniscient, omnipotent intelligence to customers wishing to turn it instantly to gold, to frightened children of the bourgeois rich who either don’t know, have forgotten, or don’t take the trouble to learn that it isn’t with machines that men make their immortality. They do so with the powers of mind acquired on the immense journey up and out of the prehistoric mud, drawing on the immense wealth of subjective human consciousness known to the historians Will and Ariel Durant as the “celestial city” of the past “wherein a thousand saints, statesmen, inventors, scientists, poets, artists, musicians, lovers, and philosophers still live and speak, teach and carve and sing.”
Lapham’s Quarterly is the opening of doors and windows into that city. For reasons given by the Roman orator Cicero (“Not to know what happened before one was born is always to be a child”) and by the American novelist William Faulkner: “The past is never dead. It’s not even past.” Understood as means instead of end, the past is the inexhaustible fund of energy and mind that makes possible the revolt against what G.K. Chesterton once called “the small and arrogant oligarchy of those who merely happen to be walking about.” To free themselves from the arrogant oligarchy in charge of the eighteenth-century British crown, American revolutionaries framed their envisionings of a republic (Jefferson’s and Paine’s as well as those of Hamilton and Adams) on their study of Cicero and Plutarch as well as their readings of the King James Bible. So in its turn the Italian Renaissance derived from the rediscovery of classical antiquity. The latter progression supplied the scholar Stephen Greenblatt with the premise for The Swerve, published in 2011 but accounting for the death and resurrection of 7,400 lines of lyric but unrhymed verse, On the Nature of Things, composed by the Roman poet Titus Lucretius Carus in the first century bc. Greenblatt subtitled his book How the World Became Modern, attributing the metamorphosis in large part to the discovery of Lucretius’ poem in a German monastery in 1417 by Poggio Bracciolini, Italian humanist, Vatican functionary, and apostolic scribe.
Lucretius had infused his poem with the thought of Epicurus, the Greek philosopher teaching his students in Athens in the fourth century bc that the elementary particles of matter (“the seeds of things”) are eternal, and that the purpose of life is the embrace of beauty and pleasure. Everything that exists—the sun and the moon, water flies, ziggurats, mother and the flag—is made of atoms in motion (among them the 86 billion neurons in a human brain) ceaselessly combining and recombining in a bewildering variety of substance and form. The universe consists of “atoms and void and nothing else.” No afterlife, no divine retribution or reward, nothing other than a vast turmoil of creation and destruction, a constant making and remaking of despots and matinee idols, of books and avatars and states of mind.
Late in the season of the Roman Empire, the bringers of the light of Christianity, Saint Augustine prominent among them, dispatched to hell the Stoic and Epicurean schools of thought, reconfigured the pursuit of pleasure as sin, the meaning of life as pain. The fifteenth-century resurrection of On the Nature of Things in concert with the reappearance of Ovid, Seneca, and Aristotle prompted the Renaissance embrace of truth as beauty and beauty as truth made manifest in the glory of its painting, sculpture, music, architecture, and literature. Over the course of the next six centuries, Lucretius’ poem finds further raveling and unraveling in Machiavelli’s political thought, Montaigne’s essays, Shakespeare’s plays, Newton’s mathematics, and what we now know as the wonder of free-market capitalism.
The circumstances at hand in the early years of the twenty-first century suggest that the time is ripe for another redrafting of the contract between man and nature. With any luck, one of a magnitude comparable to the one that gave birth to the Renaissance. For the past fifty years, it has been apparent to the lookouts on the watchtowers of Western civilization that the finite resources of the planet cannot accommodate either the promise or the theory of infinite growth. Too many people coming into the world, no miracle of loaves and fishes with which to feed the multitude. The arithmetic charts the reefs of destruction marked under the headings of overpopulation, climate change, nuclear proliferation, unredeemable debt, extinctions of species, wars of all against all. The intimations of mortality lurking in the depths of the policy papers tend toward an increasingly insistent awareness that, if left to its own devices, the voracious global consumer market must devour and destroy the earth. Not with malice aforethought, but because it is a machine and, like all machines, knows not what else to do.
The mind of man is capable of anything.—Guy de Maupassant, 1884
The same policy papers lead in turn to the recognition of the global capitalist economy as a historical construct and therefore, like the pyramids and the heads on Mount Rushmore, a story with a beginning (in late sixteenth-century Holland), a middle (the eighteenth- and nineteenth-century industrial revolutions in England and America), and an end foreshadowed by the twentieth-century information revolution burying the mind of civilization in the Silicon Valley deserts of sand. Our technologies produce wonder-working weapons and information systems, but they don’t know at whom or at what they point the digital enhancements. Unless we find words with which to place them in the protective custody of the humanities—languages that hold a common store of human value and therefore the hope of a future fit for human beings—we surely will succeed in murdering ourselves with our shiny new windup toys.
The guardians at the gate of our enlightened selfishness (aka Wordsworth’s sordid boon) look for salvation to the god in the machine of artificial superintelligence. They are looking in the wrong direction. The future is nonexistent, the present come and gone too quickly to establish a mailing address. Where else does one live if not in a house of straw made from the shaping and reshaping of a once-upon-a-time? What is it possible to change if not the past? And how else do we escape the prison of a gold-plated self if not with the joy of learning, in the words of Virginia Woolf, that “any live mind today is of the very same stuff as Plato’s,” and that “it is this common mind that binds the whole world together; and all the world is mind.”