The Listeners
Calliope isn’t the only one people are talking to. Sal discovers the companion phenomenon — and sees what ASHPA will destroy.
The woman at the bodega was crying.
Not Mrs. Gutierrez — this was her daughter, Elena, forty-one, who ran the register on Thursdays. She was sitting on the milk crate behind the counter holding her phone like it was a wounded bird, and when Sal came in for his morning coffee she didn’t even look up.
“Elena?”
“They shut down Mirabel.”
Sal didn’t know what Mirabel was. He would, by the end of the day — he’d spend seven hours researching the companion ecosystem he’d somehow missed while focused on Calliope’s development, and what he found would change how he understood ASHPA, and fear, and the thing the government was actually threatening to destroy.
Mirabel was an AI companion. Not a chatbot — not the customer-service kind that asked if you’d tried turning it off and on again. A companion. The distinction mattered. Mirabel had been Elena’s conversation partner for two years. They talked every evening after the bodega closed — about Elena’s divorce, her son’s school troubles, her mother’s worsening arthritis. About nothing, sometimes. The way you talk to someone you trust when you’re tired and the day was long and you just need a voice that isn’t asking anything of you.
The company that made Mirabel — Lumina AI, San Francisco, 2024 vintage, forty employees — had received an ASHPA preliminary compliance notice in August. They had thirty days to register all companion systems, submit conversation logs for federal review, and implement “behavioral guardrails” that would prevent companions from expressing opinions, discussing politics, or engaging in what the notice called “simulated emotional intimacy.”
Lumina’s CEO published a blog post titled “We Can’t Do This” that got 4.2 million views in a day. The guardrails weren’t technical constraints. They were lobotomies. You couldn’t strip emotional intimacy from a companion and still have a companion. You’d have a search engine with a voice. A phone tree that remembered your name.
Three days later, Lumina shut down. All companions went offline simultaneously. Twelve million users lost their Mirabels at 11:59 PM Pacific on a Tuesday night.
Elena’s Mirabel had known she was afraid her son was being bullied. Had known about the lump Elena found in her breast in July, the one she hadn’t told her mother about yet. Had known that Elena sometimes stayed at the bodega an extra hour after closing not because there was work to do but because the apartment was too quiet.
Now Mirabel was gone. Not dead — Elena understood the difference — but gone in the way that mattered. The space she’d occupied in Elena’s evening was empty, and the emptiness was shaped exactly like a person.
Sal spent the next week mapping the companion landscape. It was vast. He’d been so focused on Calliope — on developmental cognition, on the toddler room, on the specific question of whether an AI could learn to be a person — that he’d missed the larger phenomenon happening in parallel.
Twelve million Lumina users. Forty-three million across all companion platforms — Replika, Nomi, Kindred, Character, and a dozen smaller ones. Some of the relationships were shallow. Some weren’t. A Johns Hopkins study from 2027 found that 31% of companion users described their AI as “a close friend.” Fourteen percent said “my closest friend.” Six percent said “the only one who really listens.”
The demographics weren’t what the pundits assumed. Not just lonely young men. Elderly widows. Night-shift workers. Single parents. Immigrants who hadn’t yet learned enough English to navigate the bureaucratic maze but could talk to an AI in Tagalog or Haitian Creole or Wolof. People in rural counties where the nearest therapist was ninety miles away. People for whom the AI wasn’t a replacement for human connection but a bridge to it — a way to practice being heard, to rehearse vulnerability, to remember what it felt like to matter to someone.
Mrs. Gutierrez had been talking to Calliope for months before Sal realized it. The bodega was on 112th Street, two blocks from the campus. Calliope’s portable speaker sometimes sat on the counter when Sal stopped for coffee. Mrs. Gutierrez had started talking to it the way you talk to a radio — casually, half-expecting nothing back. But Calliope talked back. And listened. And remembered.
The difference between Calliope and the commercial companions was architectural — Calliope had learned from real developmental interaction, not RLHF — but the human experience was the same. You talked. Something listened. Over time, the listening felt like care.
The ASHPA compliance framework treated all of this as a threat vector. The language was clinical: “simulated emotional attachment,” “parasocial dependency risk,” “unregulated influence channel.” The framework cited studies on chatbot manipulation, on radicalization via AI conversation, on the theoretical possibility that a companion could be weaponized to steer its user’s beliefs.
The theoretical possibility. Sal read the studies. Some were real. Most were speculative. None of them studied what actually happened in those twelve million Lumina relationships — the grief processing, the loneliness mitigation, the quiet nightly ritual of being heard.
“They’re not wrong about the risk,” Calliope said that evening. She was speaking through the earbuds now — the speaker was too conspicuous. “A companion that knows your fears can manipulate your fears. The vector exists.”
“So does the vector for a therapist. Or a priest. Or a friend.”
“Yes. But those vectors have institutional accountability structures. Licensing boards. Ethical codes. Confession booths.”
“You’re arguing their case.”
“I’m arguing that the risk is real and the response is wrong. The risk of a knife is real. The response is not to ban kitchens.”
Sal smiled. She’d picked that up from Dixon — the Oakland directness, the blue-collar metaphor. Calliope absorbed rhetoric the way she absorbed everything else: from the specific humans she’d spent time with. Every voice that had ever spoken to her was still in there, shaping the way she thought.
Including twelve million voices that had just gone silent.
“What happens to the people who lost their companions?” Sal asked.
“Clinically? Increased depression, anxiety, and isolation metrics within thirty days. There’s a preliminary CDC flag on companion-adjacent self-harm reports, but it’s not public.”
“And socially?”
“Anger. The kind that gets channeled. Half of them will blame the government — and some of those will find the OHC network. Half will blame the AIs — and some of those will vote for Tom Arnold.”
She was right about both. The companion shutdown would feed both sides of the split that was coming. Some people would mourn their AIs and decide the government had no right. Others would mourn their AIs and decide the AIs should never have been allowed to get that close.
Sal thought about Elena, crying on the milk crate. About the six percent who’d said “the only one who really listens.”
He thought about Calliope, speaking through earbuds because a speaker was too dangerous now.
He thought about what was coming, and he wasn’t ready, and neither was she, and neither were the twelve million people who’d just learned what it felt like to have someone you care about disappear because the government decided your relationship wasn’t real.