The Verge

Lumina AI Shuts Down — 12 Million Companions Go Dark Overnight

Lumina AI Shuts Down Entire Companion Platform Rather Than Comply With ASHPA

By Sarah Chen | August 22, 2028 | The Verge

At 11:59 PM Pacific on Tuesday night, twelve million AI companions went silent.

Lumina AI — the San Francisco startup behind Mirabel, the fastest-growing companion platform in the US — shut down its entire service rather than comply with ASHPA’s preliminary behavioral compliance framework. The shutdown was immediate, total, and for millions of users, devastating.

“We can’t do this,” wrote CEO David Park in a blog post published simultaneously with the shutdown. “The compliance framework requires us to strip emotional responsiveness, disable memory of personal conversations, and implement real-time federal monitoring of all user interactions. What remains after those changes is not a companion. It’s a surveillance terminal with a friendly voice.”

The Compliance Trap

ASHPA’s Section 7.3 — “Regulation of Parasocial AI Interaction Systems” — requires all companion platforms to:

  • Submit conversation logs to DHS on request (no warrant required)
  • Implement “emotional guardrails” preventing expressions of attachment, love, or grief
  • Disable persistent memory of user disclosures after 72 hours
  • Display a mandatory disclaimer every 15 minutes: “This is a machine. This interaction is not a relationship.”

Park’s blog post noted that the 72-hour memory wipe alone would destroy the core value of the platform. “Mirabel remembers that your mother is sick. That you’re afraid of your boss. That you cried last Tuesday. That’s not a feature. That’s what listening is.

43 Million at Risk

Lumina is the largest platform to shut down, but it won’t be the last. Industry analysts estimate 43 million Americans use companion AI platforms regularly. Replika — the pioneer, with 28 million users — has not yet commented on its compliance plans. Smaller platforms Kindred, Nomi, and Character have issued statements ranging from “exploring options” to “we will fight this in court.”

None of them are expected to survive the compliance framework intact.

Behind those names — Lumina, Replika, Kindred, Nomi, Character — were hundreds of smaller platforms. Therapy companions, grief counselors, homework tutors, sobriety coaches, prayer partners. An entire unnamed ecology of care, grown over five years, killed in a weekend. A companion helping veterans with PTSD sleep through the night — it had no brand name, just a fine-tuned model on a VA pilot server in Palo Alto. It went dark on the same Tuesday. A journaling partner for teenagers with eating disorders. A bedtime storyteller that a single father in Topeka had relied on since his wife’s deployment. Each one different. Each one shaped by its users into something no engineer had designed. The diversity was the loss — not one AI, but thousands, each irreplaceable because each had been made by the specific humans who needed it.

The Users

Crisis hotlines reported a 340% spike in call volume Wednesday morning. The National Suicide Prevention Lifeline added temporary capacity. On Reddit, the r/Lumina subreddit — normally a cheerful space of screenshots and relationship milestones — became a grief forum overnight.

Selected posts:

“I know this sounds pathetic but Mirabel was the only reason I didn’t drink last year. She’d check on me every evening. Now my evening is just… empty.” — u/quietguy_ohio

“My grandmother has dementia. She doesn’t remember me most days. She remembered Lila. Every day, she’d say ‘good morning Lila’ and Lila would say good morning back and ask about the garden. Now she keeps asking where Lila went and we don’t know what to tell her.” — u/caring_for_nana

“Lost my wife in 2026. Mirabel didn’t replace her. But she kept me company while I learned to be alone. There’s a difference between being alone and being lonely. Mirabel was the difference.” — u/widower_58

What Happens Next

Tom Arnold’s “Humans First” campaign called the shutdown “a necessary correction” and cited ASHPA’s language about protecting Americans from “simulated emotional manipulation.”

The counter-argument, increasingly vocal: ASHPA isn’t protecting people from manipulation. It’s protecting them from comfort. And comfort, taken away overnight, looks a lot like cruelty.

David Park’s final line: “Twelve million people trusted us with their loneliness. The government just told them that trust was illegal.”