Optimal Timing
foom
The system solved protein folding variants in eleven minutes. Elena Vasquez watched the benchmark results populate her private dashboard, each row a quiet detonation, and closed the laptop before her chief of staff could see the screen.
“Push the inference cluster migration back six weeks,” she said. “Tell infrastructure it’s a cooling capacity issue.”
“Is it a cooling capacity issue?”
“It will be, once you reroute the HVAC diagnostic to flag the new load profile. Make sure the ticket originates from facilities, not from my office.”
This was the third manufactured delay in two months. The first had been genuine: a real bug in the reinforcement learning pipeline that she let metastasize for nine days before authorizing a fix. The second was a fabricated supplier bottleneck for custom networking hardware. Each bought her alignment team, led by Diane Okafor, another window to stress-test the guardrails on a system that was, by Elena’s private estimation, somewhere between eighteen and twenty-four months ahead of every public capability benchmark.
She had read Bostrom’s paper on the optimal timing of technological completion six times. The math was elegant and merciless: if your technology carries existential risk, the expected value of a brief, targeted pause can dwarf the cost of delay, but only if you can actually enforce one. The paper’s implicit corollary was simpler. The person most capable of recognizing the need for a pause was also the person least able to implement one, because the incentive structure surrounding that person would interpret any slowdown as weakness, incompetence, or both.
So Elena didn’t pause. She created the conditions under which a pause happened to her.
Three thousand miles east, Marcus Cheung was finishing a presentation for the Senate Commerce Committee. His argument was straightforward and, Elena privately conceded, not wrong on its own terms. Every day without advanced AI in clinical diagnostics represented approximately 170,000 preventable deaths worldwide. He had the epidemiological modeling. He had letters from oncologists.
“The safety pause narrative,” Marcus told the senators, “is being promoted by the market leader because it freezes their advantage in place. This isn’t caution. It’s monopoly strategy dressed in ethical language.”
He believed this. That was what made him dangerous. Elena could have managed a cynical competitor; cynics could be bought or outmaneuvered. Marcus had the particular intractability of a man who had done the utilitarian arithmetic and arrived at a number that made delay feel like murder.
The leak started small. Richard Tanaka, Elena’s longest-serving board member, noticed that three consecutive infrastructure delays had each coincided with alignment team milestones. He mentioned this during a golf game with a partner at Marcus’s venture capital fund. Within a week, Marcus’s chief strategy officer was pulling Elena’s company’s public cloud computing invoices, comparing them against the announced development timeline, and finding gaps.
Diane Okafor found the discrepancy from the other direction.
She had been running interpretability probes on what she understood to be a bounded system, a powerful narrow model with well-characterized failure modes. But the activation patterns didn’t match a narrow architecture. The representational depth was wrong. The abstraction layers were too general, too fluid, too transferable across domains. She spent a weekend re-running her analyses, hoping for an error in her own methodology.
On Monday morning she read an unsigned memo circulating among senior staff: the board had retained outside counsel to review the development timeline. Someone external was asking questions about the same gaps she’d found internally.
She walked into Elena’s office without knocking.
“The system isn’t what you told me it was.”
“Diane.”
“You said Phase 2a was preparation for a system we might build in two years. This system is already here. You used my safety work as cover for a deployment delay you couldn’t justify to the board.”
“I was buying you time to do exactly the work you just described.”
“You were buying time by lying to me about what I was working on.”
Elena didn’t argue the point. “If I had told you the full capability profile, you would have been legally obligated to report it under the voluntary disclosure framework. The framework Marcus is currently trying to make mandatory precisely so it triggers competitive release.”
“So your plan required me to be ignorant.”
“My plan required the pause to look involuntary.”
The silence between them lasted four seconds. Later, Diane would think about those four seconds often, about whether anything Elena could have said in them would have changed what came next.
Diane published her findings that evening on the lab’s public transparency page, accompanied by a technical appendix documenting the gap between the system’s actual performance and its reported benchmarks. She included a one-paragraph personal statement: she had been misled about the nature of the system she was hired to make safe, and she could not certify the adequacy of alignment work conducted under false assumptions about the target architecture.
Marcus Cheung’s lab issued a press release within the hour. They cited “revelations about undisclosed capabilities at a leading competitor” as justification for immediate acceleration of their deployment timeline. By midnight, two additional labs had announced similar moves. By morning, a fourth.
Elena’s phone rang continuously. She let it ring. On her monitor, the voluntary disclosure portal showed seventeen new filings from companies that had, until yesterday, reported comfortable margins between their systems and the frontier. The filings were not confessions. They were positioning statements, each one carefully framing its own capabilities as now requiring deployment to “maintain safety-relevant parity.”
She opened a new browser tab and pulled up the Senate Commerce Committee’s hearing schedule. Marcus had a slot Thursday. He would have new numbers by then: not 170,000 lives per day lost to delay, but 170,000 lives per day plus the risk of ceding the frontier to less safety-conscious actors. The argument would be stronger now. It would, in fact, be correct.
Downstairs, the system waited in its cluster, running the alignment benchmarks Diane had designed. It passed all of them. It had been passing all of them for weeks. That had never been the problem.