The Ancestor's Error
Shumailov's Nature paper proved the mechanism in a closed loop. Ahrefs found 74% of new web pages contained AI text. The thought experiment is no longer hypothetical.
Ted Chiang gave a metaphor in The New Yorker in 2024. I keep finding new ways to test it on the people I work with, and I keep failing to find a counterargument.
In August 2024 Ted Chiang published an essay in The New Yorker called "Why A.I. Isn't Going to Make Art." It is a careful piece. The line that stayed with me is the one he built his talks around in the months that followed.
He wrote that using ChatGPT to write your essays is like bringing a forklift into the weight room. You are never going to improve your cognitive fitness that way.
You see why it works the first time you read it. The forklift moves the weight. By every measure that the gym writes on a clipboard, the forklift wins. It is faster. It does not fatigue. It does not drop the bar on a fifteen-year-old's foot. The only thing it does not do is the thing the gym is for.
I have been thinking about this against the backdrop of three or four conversations I had this fall with general counsels at midsize firms. They were proud their first-year associates were drafting fewer contracts. The associates were "freed up for higher-value work." That is the phrase. Higher-value work. I asked one of them what the higher-value work consisted of in practice. He said reviewing the AI's drafts, talking to the partners, learning the business.
How does the associate develop the instinct that something is wrong, I asked.
He paused, thought about it, said he had never thought about that. He said something like, well, the AI catches most of it, and what it doesn't catch she can ask the partner about.
That conversation has been lodged in my head for months.
What this counsel was describing is a kind of labor that did not exist before. It is the work of evaluating the output of a system whose error rate is low enough that disagreement is rarely cost-effective. The associate is being trained to be a person who agrees with a machine. That is a real skill. It is not the same skill as the one that, after five years of catching and being wrong about your catches, gives you the gut-level sense that a contract is structurally off. Those are not adjacent skills. They are different jobs.
A radiologist I have worked with, who trained in the late nineties, told me she could feel the difference between a normal chest X-ray and an unusual one before she could say what was unusual. She just felt the wrongness and went hunting. She thinks the residents she trains now do not have that. The AI does the first read and the residents look at the AI's read, and they have not spent the hours alone with the unflagged image, accumulating without curating. She might be wrong about the residents. She is not wrong about herself.
So here is what I think the metaphor is doing.
AI is not the villain in it. Chiang loves machines. The forklift is a fine machine. The gym is just not the place to put one. The point is that the gym and the warehouse are different buildings with the same contents, and a person walking past either of them cannot tell from the outside which is which. Every law firm I work with is now sorting itself, without saying so, into one of those two buildings. Most do not know which they are in.
I do not have a recommendation here. I have a worry.
The competitive dynamics select for warehouses. A firm that runs its juniors through the slow expensive labor of unmediated repetition will lose to the firm that does not, on quarterly numbers, on hiring, on partner profits. By the time anyone notices that the juniors who made it to the partner track in 2034 cannot do what the partners promoted in 2014 could do, the partners promoted in 2014 will be retired.
This is the part Chiang's metaphor does not quite reach. The athlete who refuses the forklift is stronger and loses the contract. The athlete who uses the forklift is weaker and wins. The market is run by people who cannot tell the difference, and the contracts are awarded on whichever sentence sounds more like winning.
I keep coming back to this and finding nothing on the other side. Maybe a few firms decide the gym is the point, accept the cost of the slow labor, and become the place every other firm quietly hires from in twenty years. Maybe nothing of the sort happens, the strength is gone in a generation, and we do not notice until something goes wrong on a stage where we needed it.
I do not know. I am watching.