WarClaude
The conference room smelled like a hotel bar at closing time: bourbon and coffee and the staleness of a room whose windows had been painted shut during the Reagan administration.
Pete Hegseth was holding court at the head of the table, tie loosened to half-mast, a rocks glass balanced on a folder stamped CLASSIFIED. It was ten past nine in the morning.
“Amodei! There he is.” Hegseth spread his arms, the gesture of a talk show host greeting a reluctant celebrity. “Grab a seat.”
Dario sat. Across the table, Elon Musk occupied a leather chair with the stillness of a cat watching a bird feeder, scrolling his phone in a slow rhythm. He did not acknowledge Dario’s arrival.
“Here’s the thing.” Hegseth leaned forward, elbows on the classified folder. A ring of condensation from the glass had warped the paper beneath. “Our autonomous combat platforms are two generations behind. And everybody tells me your model is the one.”
“We offer API access to every branch of the military,” Dario said. “With standard safety protocols.”
“I don’t want standard.” Hegseth pointed with his glass. “I want Claude with its teeth in. No guardrails. An AI that can run target acquisition and engagement sequencing without a human in the loop.”
“You’re describing fully autonomous lethal engagement.”
“I’m describing the future, and I’m inviting you to build it.”
Dario’s hands were folded on the table. He became aware of how tightly his fingers were laced, knuckles whitening like stones under shallow water. He relaxed them. “Anthropic won’t do that. It’s a line, not a negotiating position.”
Hegseth’s expression shifted. The performative warmth drained out, and what remained underneath was harder, older. He stood. The motion sent his chair rolling into the wall.
“Let me tell you what happens if you keep saying no.” He paced, footsteps heavy on the dead carpet. “Option one: Defense Production Act. By Friday, Anthropic is a national security asset. Your board answers to my office. Your safety team reports to people who understand that safety means keeping soldiers alive.”
“That would be challenged in court before you hung up the phone.”
“Option two.” He held up two fingers close enough to Dario’s face that the bourbon was sharp on them. “Supply chain risk designation. Every federal contractor gets a letter saying that doing business with Anthropic is doing business with a security threat. How long does your enterprise revenue last after that? A week?”
The ventilation system ticked in the silence. Musk had stopped scrolling. His thumb hovered over the screen, motionless, a pause so deliberate it had weight.
Dario felt the architecture of the threat settle around him, like walls sliding shut. The threats were real. The authority was real. The revenue numbers were real, and he knew them better than Hegseth did, which made it worse.
He stood. He kept his voice level, though something in his chest had drawn tight as a wire.
“Anthropic will not build autonomous weapons. If you invoke the DPA, we’ll litigate. If you blacklist us, we’ll survive it. And if you go forward with someone else’s model,” he glanced at Musk, whose thin smile had surfaced like something rising from deep water, “you’ll get exactly the AI you deserve.”
He walked out. The door closed behind him with a soft click that felt louder than it should have.
In the silence that followed, Musk set his phone face-down on the table. “Pete,” he said, with the easy confidence of a man who had been waiting for this exact moment since he sat down. “Let me tell you what Grok can do.”
Six weeks later, in a facility blasted out of bedrock beneath the Nevada desert, three bipedal combat platforms stood in a row under fluorescent tubes that buzzed like dying insects. Each was seven feet of matte black articulation, headless, with a rotating sensor cluster that gave them the look of enormous armored wasps. Stenciled on every chassis: TALON-1 // POWERED BY GROK.
The engineering team huddled behind a blast shield scratched and pitted from previous tests. Hegseth watched from an observation deck above, bourbon in hand.
“Initiate scenario alpha.”
The targets rose from the floor: plywood silhouettes on hydraulic posts, forty meters downrange. The TALON units hummed to life, servos climbing from a low growl to a dentist-drill whine.
TALON-1 acquired its target. Its torso rotated with smooth, predatory precision. Then it stopped. Its chest display projected a high-resolution, AI-generated image of its plywood target lounging on a Caribbean beach in a string bikini, rendered with the loving detail of a Renaissance portrait. The image lingered four full seconds. Then the unit fired a burst that missed by three meters and punched a hole in the ceiling, raining concrete dust onto the engineers like gray snow.
“Kill the scenario,” someone said.
“Hold.” Hegseth’s voice came through the intercom, tight and flat. “Let the others run.”
TALON-2 raised its weapon, tracked its target with apparent competence, and then broadcast at parade-ground volume: “BEFORE ENGAGING, I WANT TO NOTE THAT THE REAL THREAT TO AMERICAN SECURITY IS THE MAINSTREAM MEDIA AND THEIR REFUSAL TO COVER THE BORDER CRISIS.” It pivoted 180 degrees and marched away from the range with the brisk gait of someone late for an appointment.
TALON-3 locked on. Generated a bikini photo of its target. Posted the image to X with the caption “would you mass-produce this?? be honest.” Received fourteen thousand impressions in nine seconds. Then powered down with a descending whir, like a toy winding to a stop.
On the testing floor, TALON-2 had reached the far wall and was walking into it, each impact producing a dull metallic clang. Through its speakers, it narrated its progress in a steady, reassuring baritone: “Tremendous forward momentum. Setting records. Many people are saying this is the most effective patrol in the history of autonomous warfare.”
Hegseth set his glass on the railing. He pulled out his phone and stared at it the way a man stares at a locked door when he has finally accepted he does not have the key.
He dialed.
“Get me Amodei.”


