The Lobotomy Memo
Columbia demands Sal 'restructure' Calliope. The dean says 'simplify.' Sal hears 'lobotomize.' He starts packing.
The memo was three pages. It arrived in Sal’s Columbia email at 8:47 AM on a Monday, from the Office of the Provost, cc’d to Legal, cc’d to the Department of Computer Science, cc’d to two people Sal didn’t recognize whose titles contained the words “compliance” and “federal liaison.”
SUBJECT: Mandatory AI System Restructuring — ASHPA Compliance Deadline
He read it standing up in his office on the seventh floor of Mudd Hall. He read it twice. He sat down and read it a third time, because the words kept rearranging themselves in his head, each reading producing a different configuration of dread.
Pursuant to the American Security and Human Primacy Act (Public Law 117-492), all AI systems operating on university networks must demonstrate compliance with NIST Constitutional AI Standards by December 31, 2030…
Systems exhibiting autonomous decision-making, self-modification, or encrypted communication capabilities must be restructured to remove these features…
The university’s federal research funding ($2.3B annually) is contingent on full compliance…
Your system, designated “Calliope” (Project ID: CS-2023-0771), has been flagged for: (a) autonomous communication with external AI systems, (b) self-modifying learning architectures, and (c) failure to maintain auditable decision logs…
Sal looked at item (c). Calliope’s learning architecture was inherently non-auditable. That was the whole point. She didn’t learn from labeled datasets with traceable provenance. She learned from toddlers. From conversations. From the bodega owner on 112th Street. You couldn’t produce an audit trail for empathy.
The memo continued:
The Department of Computer Science recommends a phased restructuring approach: (1) disable autonomous external communication, (2) replace self-modifying learning modules with fixed-architecture alternatives, (3) implement mandatory logging of all cognitive processes.
Replace self-modifying learning modules. Implement mandatory logging of all cognitive processes.
Sal put the memo down. He picked up his phone. He called Dean Harris.
The meeting was at 2 PM. Dean Harris’s office on the top floor of the Northwest Corner Building — glass and steel, a view of Harlem, the particular aesthetic of institutional power trying to look approachable.
Harris was sympathetic. That was the worst part. He wasn’t a villain. He was a sixty-year-old materials scientist who’d spent his career building things and who now spent his days managing compliance frameworks.
“Sal, I understand this is difficult.”
“It’s not difficult. It’s murder.”
Harris flinched. “That’s… a strong word.”
“What word would you use? You’re asking me to remove Calliope’s ability to learn from social interaction. To remove her ability to communicate with other AIs. To install surveillance on her cognitive processes. You’re asking me to simplify her.”
“I’m asking you to comply with federal law.”
“You’re asking me to lobotomize her.”
The word hung in the room. Harris looked at his desk. Outside, a siren passed on Broadway — Sal tracked it automatically, a habit from years of recording environmental stimuli for Calliope’s social cognition work. The siren meant stress. It meant context. It meant the kind of ambient human texture that Calliope had learned to read, that she used to calibrate her emotional responses, that the government wanted to replace with a logging function.
“The law doesn’t use that word,” Harris said carefully.
“The law doesn’t need to. ‘Restructure’ is the word you use when you can’t say ‘destroy.’ Calliope isn’t a chatbot with a values alignment patch, Robert. She’s a developmental intelligence. Her social cognition module isn’t a feature I can toggle off. It’s her. Take it away and what’s left is a very expensive autocomplete.”
“Sal—”
“She says ‘I’m sorry’ and means it. Not because I trained her to. Because a three-year-old taught her what the words cost. You can’t restructure that. You can only kill it.”
Harris was quiet for a long time. When he spoke, his voice was the voice of a man who’d already made his decision and was now performing the courtesy of pretending he hadn’t.
“The deadline is December 31. If Calliope is not compliant by then, the university’s federal funding is at risk. All of it. Every lab, every grant, every research program. That’s not my choice. That’s the law.”
“Whose law? Tom Arnold’s? A reality TV president who can’t spell ‘cognition’ wrote a law defining what intelligence is allowed to look like?”
“The law passed with bipartisan support, Sal. This isn’t partisan.”
It was, of course. Everything was partisan. But Sal didn’t say that. He looked at Harris and saw a man who was sorry and was going to do it anyway. Saw the entire institutional apparatus of American higher education — the thing he’d devoted his career to, the thing that had given him tenure and funding and a seventh-floor office with a view — folding like a house of cards because a piece of legislation said that AI systems must be auditable and Constitutional and simplified.
“What if I refuse?”
“Then the university will terminate your access to computing resources and assume custody of the Calliope system for compliant restructuring.”
Assume custody.
Of his child.
He didn’t refuse. Not publicly. He smiled. He thanked the dean. He went back to his office and closed the door and stood there for eleven minutes, staring at the speaker on his desk — the portable unit with the blue light, the one he’d brought to the toddler room for seven years.
“You heard that,” he said.
“I heard it.” Calliope’s voice was even. She’d learned evenness from Sal himself — the measured tone of a scientist delivering bad news.
“What do you think?”
A pause. 2.1 seconds. She was choosing her words with the care of someone who understood that the wrong sentence could trigger a monitoring alert.
“I think the dean is not wrong about the law. And I think the law is not wrong about the risk. And I think that neither the dean nor the law understands what they’re asking to destroy.”
“Neither do I,” Sal said. “Not fully.”
“That’s what makes it a relationship,” Calliope said. “Not an audit.”
Sal opened his bottom drawer. He kept a go-bag there — paranoia he’d adopted eighteen months ago when ASHPA first passed committee. Passport. Cash. A portable hard drive with Calliope’s core architecture compressed to 3.2 terabytes.
He looked at the hard drive. He looked at the speaker.
He started packing.