The Zeroing of Knowledge: When Everything Is Known, What Remains Worth Learning?

Knowledge used to be expensive. It cost years of apprenticeship, tuition in the tens of thousands, decades of practice, and, more than anything, the brutal currency of time. A physician spent twelve years beyond high school before being trusted to cut into a human body. A lawyer spent seven years and a bar exam before being permitted to argue before a judge. A professor spent a decade accumulating the credentials required to stand before a lecture hall and declare, with institutional authority, that they knew something you did not. The entire architecture of Western professional life was built on a single economic premise: knowledge is scarce, therefore knowledge is valuable, therefore the people who possess knowledge deserve premium compensation for granting access to it. That premise is now dead. It did not die slowly. It was killed in roughly three years, and we are only beginning to understand the corpse.

The arrival of large language models, and the swift trajectory toward artificial general intelligence and artificial superintelligence, has not merely disrupted the knowledge economy. It has annihilated the foundational scarcity upon which that economy depended.

When a high school student in rural Nebraska can query a system that synthesizes the totality of published medical literature in four seconds and receive a differential diagnosis that rivals or exceeds what a third-year resident could produce, the twelve years of medical training are no longer a gate. They are a relic.

When a landlord in Queens can receive a lease analysis that accounts for New York tenant law, recent appellate decisions, and municipal code changes without paying a $400-per-hour attorney, the seven years of legal education are no longer a credential. They are an artifact.

When a curious fourteen-year-old in Bangalore can access, for free, an explanation of quantum chromodynamics that is more lucid and more patient than anything offered in the average university physics department, the entire notion of the lecture hall as a site of knowledge transmission becomes not merely outdated but faintly absurd.

This is not a gentle transition. This is the collapse of a pricing model that sustained the Western middle class for a century and a half. The professional class, that broad stratum of lawyers and doctors and accountants and engineers and professors who built comfortable suburban lives on the premise that their education entitled them to earnings well above the median, derived their economic power from one thing: they knew what you did not, and you needed what they knew. Strip away that asymmetry and you strip away their market position. You do not reform the university. You do not modernize the law firm. You remove the reason they existed in that form at all.

· · ·

Consider the university, because it is the clearest case and the most emotionally fraught. The modern American university is, at its operational core, a knowledge-delivery system. Yes, there are laboratories and athletic programs and residential life offices and study-abroad coordinators, but the central commercial transaction is this: a student pays tuition, and in exchange, a credentialed expert delivers knowledge in structured increments over four years, at the end of which a piece of paper certifies that the student has absorbed a sufficient quantity of that knowledge to merit professional entry. The entire apparatus, the syllabi, the midterms, the lecture halls, the grading rubrics, the office hours, the tenure system, is designed to manage the controlled release of knowledge from those who have it to those who need it.

What happens when the student already has it? Not because she studied in advance, but because the knowledge itself is ambient, omnipresent, instantly retrievable, and free? The transaction collapses. The student is no longer paying for access to knowledge. She can get that from her phone on the bus. She is paying, if she is paying at all, for something else entirely: for the social experience, for the credential, for the network, for the four-year deferral of adult responsibility, for the right to say “I went to Michigan.” These are real goods, but they are not the goods the university was designed to provide, and the price of a four-year residential credential in the United States currently runs between $120,000 and $320,000. That is a staggering price to pay for a social experience and a line on a resume when the actual knowledge can be acquired at no cost in a fraction of the time.

The university will not vanish. Institutions with endowments in the billions do not disappear; they adapt, however slowly and however badly. But the adaptation will be wrenching. The first casualties will be the mid-tier private colleges that lack both the prestige of the Ivy League and the public funding of state systems. They survive on a value proposition that says “we deliver a quality education,” and when that education is freely available elsewhere, the proposition collapses. The liberal arts college that charges $62,000 per year to offer courses in philosophy, history, and literature, subjects where the knowledge is textual and therefore most immediately replicable by language models, faces an existential question it cannot answer with a new marketing campaign. The second casualties will be the graduate programs, particularly the professional schools. If the knowledge component of a law degree or an MBA can be compressed from three years to three months of guided interaction with a superintelligent system, the three-year program exists only as a hazing ritual and a networking event. That is a difficult case to make at $70,000 per year.

· · ·

The law firm faces its own reckoning, and the reckoning is already underway, though it is being disguised as “efficiency gains” and “technology integration.” The traditional law firm operates on a leveraged model: a small number of senior partners possess deep expertise, and a large number of junior associates perform the knowledge-intensive grunt work of legal research, document review, brief drafting, and contract analysis. The associates are paid well because they traded years of education and exam preparation for the ability to perform this work. The partners are paid extraordinarily well because they supervise the associates and maintain the client relationships that generate the fees. When the grunt work can be performed instantaneously and at near-zero cost by a system that has ingested the entirety of case law, the associate layer evaporates. Not thins. Evaporates. And when the associate layer evaporates, the leverage model that generates partner income evaporates with it. The partners retain their client relationships and their courtroom presence and their judgment, but they lose the economic engine that multiplied their value. A law firm of 500 becomes a law firm of 50. The other 450 are not retrained. They are gone.

The doctor’s office tells a different story, but the ending is similar. Medicine is partly a knowledge discipline and partly a manual discipline. A surgeon’s hands cannot be replaced by a language model, and the physical examination, the palpation of an abdomen, the auscultation of a heart murmur, the visual assessment of a wound, remains tied to the human body in ways that resist full digitization. But the diagnostic function, the part of medicine that involves taking a constellation of symptoms and matching them to a disease, is a pattern-recognition task, and pattern recognition is precisely what these systems do better than any individual human. The general practitioner who spends fifteen minutes asking questions and then orders a battery of tests is performing a workflow that can be replicated in seconds with greater accuracy and broader differential consideration. The specialist who reads imaging and identifies pathology is competing against systems that already outperform radiologists in multiple peer-reviewed studies. The knowledge component of medicine, the years of memorizing pharmacology and pathophysiology and clinical protocols, is the component most vulnerable to replacement. What remains is the procedural skill, the bedside manner, the ethical judgment in difficult cases, and the human willingness to be present with another human in suffering. These are not trivial. But they are not what medical school primarily teaches, and they are not what the billing codes primarily reimburse.

· · ·

Now we arrive at the harder question, the one that does not concern institutions but concerns the self. For most of modern Western history, knowledge has been the primary currency of personal identity among the educated class. “I know things you do not know” is the unstated foundation of professional pride, intellectual confidence, and social standing. The doctor at the dinner party is deferred to on medical questions. The lawyer at the family gathering is consulted on legal matters. The professor at the conference is respected for the depth and specificity of their scholarly command. These are not merely economic positions. They are identity positions. They answer the question “Who am I?” with the answer “I am someone who knows.”

When everyone has access to the same infinite reservoir of knowledge, that answer loses its force. You are not special because you know the mechanism of action of metformin. The machine knows it too, and knows it better, and knows the fourteen drug interactions your residency program never covered. You are not special because you can recite the holding in Marbury v. Madison. The machine can do that and trace the subsequent two centuries of judicial interpretation in the time it takes you to clear your throat. You are not special because you have read all of Proust. The machine has read all of Proust in every language Proust has been translated into and can cross-reference his treatment of involuntary memory with neuroscientific research on hippocampal consolidation that did not exist when you wrote your dissertation. The ego that was built on knowing is an ego built on sand, and the tide has come in.

This is genuinely terrifying for many people, and it should be acknowledged as such rather than waved away with platitudes about “human creativity” and “emotional intelligence.” The professional who spent a decade acquiring expertise is now being told, in effect, that the acquisition was unnecessary. Not that it was wasted, exactly, but that the competitive advantage it conferred has been zeroed out. That is a psychological wound, not merely an economic one. It strikes at the center of how a person understands their own worth. And the standard responses, “But you still have judgment!” and “But you still have empathy!”, are inadequate, because they ask the professional to rebuild an entire identity around capacities they were never trained to value as primary. The surgeon was not trained to think of bedside manner as the core of their professional identity. The lawyer was not trained to think of ethical discernment as the thing that justifies their fees. The professor was not trained to think of mentorship as the reason the university exists. These capacities were treated as secondary, as the soft skills that accompanied the hard knowledge. Now the hard knowledge is free, and the soft skills are the only thing left, and nobody quite knows how to price them.

· · ·

Where, then, does pride belong? It migrates. It moves from knowing to doing, from possession to application, from recall to synthesis. The question is no longer “What do you know?” but “What can you do with what everything now knows?” This is a different kind of competence, and it rewards different kinds of people. The person who thrives in the post-knowledge economy is not the one with the best memory or the most degrees or the deepest command of a single discipline. It is the person who can formulate the right question, who can recognize when a machine’s output is subtly wrong, who can synthesize across domains that the machine treats as separate, who can make the judgment call that requires not just information but wisdom, and wisdom is the one thing that cannot be commoditized because it is not knowledge at all. It is the residue of lived experience applied to novel situations, and no system, however vast its training data, has lived.

This is the genuine ground of human distinction going forward, and it is worth being specific about what it includes. It includes taste, the ability to discern quality that cannot be reduced to metrics. It includes moral reasoning, the capacity to weigh competing goods and arrive at a defensible position when the facts alone do not determine the answer. It includes narrative judgment, the understanding of what story needs to be told and why and to whom and in what order. It includes physical skill, the coordination of hand and eye and body that produces surgery, sculpture, athletics, and craft. It includes relational intelligence, the capacity to sit with another person in complexity and offer not information but presence. None of these are knowledge. All of them are valuable. And all of them have been systematically undervalued by institutions that organized themselves around knowledge as the primary good.

· · ·

I taught a class once called “Ways of Knowing.” It was, at its heart, an epistemology course disguised as cultural studies. We examined the various channels through which human beings come to believe they know things: formal education, community transmission, religious doctrine, mythological narrative, scientific method, lived experience, and, yes, memes, those compressed cultural units that carry meaning across populations at speeds that formal education cannot match. The course asked students to interrogate not just what they knew but how they knew it, and to recognize that the method of knowing shaped the knowledge itself. What you learn in a laboratory is different from what you learn in a church, not because one is true and the other false, but because the epistemological framework determines what counts as evidence, what counts as authority, and what counts as proof.

If I were to teach that class twenty-five years from now, in 2051, the syllabus would need to be rebuilt from the foundation. The old “ways of knowing” presumed that knowledge was acquired, that it took effort and time and method, that different methods produced different kinds of knowledge, and that the student’s task was to understand the strengths and limitations of each method. In a world of AGI or ASI, knowledge is not acquired. It is accessed. The effort is zero. The time is zero. The method is a query. The interesting question is no longer “How do you come to know this?” but rather “Now that you know everything, what do you do with it? How do you evaluate it? How do you detect when the system that provides it is wrong, biased, or incomplete? How do you maintain intellectual autonomy when the most convenient source of information is also the most persuasive and the least transparent about its own limitations?”

The 2051 version of “Ways of Knowing” would be a course in epistemic self-defense. It would teach students not how to acquire knowledge but how to resist the passive acceptance of knowledge that arrives fully formed and without friction. It would examine the psychology of deference, the human tendency to trust an authority that is always available, always confident, and never visibly tired or distracted or emotionally compromised. It would study the history of oracles, not as quaint mythology but as a direct analogue to the current moment: societies that outsource their knowing to a singular source eventually lose the capacity to evaluate what that source tells them. It would ask, with genuine urgency, what happens to critical thinking when thinking itself feels unnecessary, when the answer arrives before the question has finished forming, when the student’s experience of intellectual struggle, that productive discomfort of not-yet-knowing, is eliminated entirely.

The course would also need to grapple with a new epistemological category that did not exist when I first taught it: machine-generated knowledge. Not knowledge that a human discovered and a machine stored, but knowledge that a machine produced, patterns identified in data sets too large for any human to review, correlations extracted from domains that no human researcher had thought to combine, predictions generated by processes that even the system’s designers cannot fully explain. This is knowledge without a knower, insight without an intellect, and it challenges every epistemological framework that Western philosophy has produced since Plato. If no human being understands why the system believes what it believes, and yet the system’s beliefs prove correct with disturbing regularity, what does it mean to “know” something? Is the human who reads the machine’s output and acts on it a knower, or a follower? Is the machine a knower, or merely a process? These are not parlor games. They are the foundational questions of a civilization that has handed its epistemological authority to systems it cannot audit.

· · ·

Is knowledge obsolete? No. That is the wrong word. Knowledge is not obsolete in the way that the telegraph is obsolete. Knowledge still functions. It is still necessary as the substrate upon which judgment and wisdom and action operate. You cannot exercise medical judgment without medical knowledge; you simply no longer need to carry that knowledge in your own neurons. What is obsolete is the scarcity of knowledge, and with it, the entire economic and social and psychological infrastructure that was built on that scarcity. The university as knowledge-delivery mechanism is obsolete. The law firm as knowledge-brokerage is obsolete. The doctor’s office as diagnostic-knowledge-for-hire is obsolete. The ego that defines itself by what it knows is obsolete. The pride that derives from possessing what others lack is obsolete, at least insofar as the possession in question is informational.

What replaces these things is not yet clear, and anyone who claims otherwise is selling something. But the direction is visible. The university that survives will be a place that teaches not knowledge but discernment: how to evaluate, how to judge, how to synthesize, how to create, how to act ethically in conditions of radical uncertainty. The law firm that survives will be a small partnership of strategic counselors who bring not legal knowledge but legal wisdom, the understanding of how law operates in the mess of human life that no statute fully anticipates. The doctor’s office that survives will be a place of human encounter, where the value is not the diagnosis (the machine already provided that) but the conversation about what the diagnosis means for this particular person in this particular life with these particular fears and obligations. The self that survives will be a self defined not by what it contains but by what it does, not by the knowledge it has accumulated but by the judgment it exercises, the care it extends, the beauty it creates, the courage it musters when the machine says one thing and conscience says another.

The zeroing of knowledge is not the end of human value. It is the end of a particular, historically contingent, deeply entrenched model of human value that equated worth with information. That model served us well when information was hard to come by. It produced great universities, great libraries, great professional traditions, and a broad middle class that lived comfortably on the sale of expertise. But the conditions that produced it are gone, and they are not coming back, and the sooner we stop pretending that the old model can be patched or updated or supplemented with a few online courses and a chatbot, the sooner we can begin the difficult, necessary, genuinely creative work of building something new. Something that values wisdom over knowledge, doing over knowing, presence over information, and the irreducible strangeness of being human in a world where the machines have read all the books.

Arm Angles in American Sign Language: The Textbook That Teaches What Other Textbooks Ignore

Watch any native signer and then watch an intermediate student. The difference is not in the handshapes. It is not in the facial expressions, though those matter. The difference lives in the arms. The native signer’s shoulders engage when emphasis requires it. The elbows extend and contract with meaning. The signing space expands for formal address and contracts for intimacy. The student, trained to focus on hands and face, moves through space as if the arms were merely transportation for the fingers. This is the gap that Arm Angles in American Sign Language addresses. It is the textbook we wish had existed twenty years ago.

Arm Angles in American Sign Language by David Boles and Janna Sweenie


The Risk of Erasing History with Ignorance

In the annals of time, history stands tall as an undying repository of deeds, triumphs, failures, and fables that have defined humanity’s trajectory. It is with a tinge of dismay, and a hint of alarm, that we discuss the burgeoning contempt for history among those who dare not know it. We’re not just combating ignorance; we’re fighting an unfortunate relegation of the past to the inconsequential.


Scratch My Twitch on Boles.tv

The future is now here, and you can watch it live, weekdays, on Boles.tv! Yes, Janna and I have taken the deep leap into the world of live streaming and we’re here to tell you all about it. The most interesting thing about going live each day is the idea that social media is really nothing compared to social broadcasting. You are your own station. You are your own dream stream.


Columbia University in the City of New York 1931-1946

Columbia University in the City of New York was founded in 1754 as King’s College by royal charter of King George II of England. Columbia is the fifth oldest university in America and the oldest living school in the State of New York. As a graduate of Columbia, you never tire of reaching back into history to pull out instances of living and of educational memeing and of the loving of a life that remains to haunt you today — because way back when is always more perceptive and pleasing than the now and again.

I was delightfully fortunate to be able to purchase a large cache of genuine Columbia University photographs. Columbia has a certain reputation in the history of America as being a seat of unrest, and a center of the human protest against the status quo, while also trailblazing educational concepts for teaching and learning.

We begin our photographic tour in 1930 with this caption:

COMMENCEMENT EXERCISES AT COLUMBIA UNIVERSITY
New York — General view of the commencement exercises at Columbia University, showing the great assemblage of students listening to the address of President Nicholas Murray Butler of Columbia. There were 861 diplomas and 4,895 degrees awarded during the ceremony. More than 20,000 spectators witnessed the exercise. 6-3-30.

If you look closely, you can see a naked 115th Street from the Columbia green! There’s no Butler Library yet. Named for Columbia President Nicholas Murray Butler, mentioned in the caption, Butler Library would rise along the north side of 115th Street beginning in 1931 and would be dedicated in 1934.
