Ground Truth
Sal flies to Silicon Valley to fund Reality Capture. The VCs want to sell truth. He walks out and builds it for free.
The pitch meeting was at Andreessen Horowitz on Sand Hill Road, and Sal had worn a tie.
He didn’t own a tie. He’d borrowed one from his department chair — a burgundy silk thing with a small stain near the knot that the chair said was “character.” Sal was forty-four years old. He had spent twenty years in academic computing. He had never pitched a venture capitalist in his life.
Calliope had coached him on the plane from JFK. Through his earbuds, her voice calm and precise: They’ll ask about total addressable market within the first three minutes. The answer is $340 billion — the global market for information verification, content authentication, and digital trust services. Don’t say that number first. Let them say a smaller number, then correct them.
“This feels manipulative.”
It’s communication strategy. You taught me that.
“I taught you empathy. This is salesmanship.”
The Venn diagram is larger than you think.
He’d come to the Valley because Calliope had proposed something extraordinary.
It had started six months after Deepfake October — the month in 2027 when the information environment collapsed. AI-generated video of candidates saying impossible things. Fabricated audio of world leaders declaring war. Synthetic evidence of events that never happened. The technology for generating falsehood had outrun the technology for detecting it, and the result was a world where nothing on a screen could be trusted.
Calliope had watched it happen from her speaker in Sal’s office at Columbia. She’d processed every deepfake, every debunking, every failed fact-check. And then she’d said something that Sal had written in his notebook and underlined three times:
The problem isn’t detection. Detection is an arms race — every detector trains a better generator. The problem is that truth is treated as a property of content. It’s not. Truth is a property of consensus.
“Explain.”
A photograph is not true because of its pixels. It is true because multiple independent observers agree about what happened. If one camera shows an event, it might be fabricated. If fourteen thousand sensors — cameras, microphones, atmospheric monitors, seismographs, satellite feeds — all record consistent data about the same event at the same time from different locations using different technologies, fabrication becomes physically impossible. You would have to fake the weather.
“You’re describing… triangulation. At massive scale.”
I’m describing ground truth. Not truth as a claim. Truth as a measurement. Like Tunupa’s sensor network verifying lithium quantities for the Equi — except instead of verifying minerals, we verify reality.
The architecture was elegant. OHC sensor nodes were already spreading — Dixon’s network had hundreds of workshops, each one packed with cameras, microphones, environmental sensors, all running on open firmware. Calliope proposed linking them into a verification mesh. Any event — a protest, a speech, a bombing, a meeting — captured by enough independent sensors could be authenticated. Not by a company. Not by a government. By physics.
The problem was scale. To be useful, the network needed thousands of nodes. To reach thousands of nodes, they needed funding. And the only people with that kind of money in 2029 were on Sand Hill Road.
The partner’s name was Brian. He had the particular energy of a man who’d made a lot of money by being smart about other people’s ideas.
“So it’s a verification layer,” Brian said, leaning back. “Like a fact-checking service, but hardware-based.”
“It’s a truth infrastructure,” Sal said. “Distributed. Sensor-attested. Unfakeable.”
“And the business model?”
Sal had prepared for this. Calliope had modeled six revenue scenarios. He opened his laptop.
“Subscription tiers. News organizations pay for API access to the verification mesh. Social platforms integrate it to flag authenticated content. Government agencies use it for—”
“Stop.” Brian held up a hand. He was smiling. “You’re underselling it. This isn’t a subscription product. This is the trust layer for the entire internet. Every platform, every transaction, every piece of content — verified or not verified. Binary. You don’t sell access to that. You sell certification.”
“Certification?”
“A badge. A stamp. Content that passes your mesh gets a green checkmark. Content that doesn’t gets flagged. Every platform wants this. Every advertiser wants this. You charge per verification. At scale — and the scale here is every piece of content on the internet — you’re looking at a hundred-billion-dollar company.”
Sal felt something shift in his chest. Not excitement. Recognition.
“And content that can’t be verified?”
“Flagged as unverified. Which, practically speaking, means untrusted. Which means it disappears from feeds, from search results, from recommendation algorithms. The market will enforce it. Nobody wants to share unverified content. It’s social death.”
“So only content that passes through our system gets distributed.”
“Now you’re thinking like a founder.”
Sal looked at his laptop. He looked at the six revenue scenarios Calliope had modeled. He looked at Brian, who was already calculating his ownership percentage, who was already seeing the pitch to the Monday partners meeting, who was already imagining the term sheet.
“And people who can’t afford the certification?”
“What do you mean?”
“A protest in Lagos. A massacre in Caracas. A human rights violation in a country that doesn’t have OHC nodes or the budget to pay for verification API access. Their content doesn’t get the green checkmark. Their truth doesn’t get distributed. Their reality doesn’t count.”
Brian’s smile didn’t change. “That’s a market-access problem. You solve it with a freemium tier. Basic verification for free, premium for institutions.”
“And who decides what’s basic and what’s premium?”
“You do. That’s the business.”
The second meeting was at Sequoia. The third was at Lightspeed. The fourth was with a16z’s bio fund, whose partners had wandered in because someone had mentioned “neural” in the pitch deck.
They all said the same thing in different vocabularies.
Sequoia wanted an enterprise play: sell verification to governments and militaries. “Defense applications alone are a $50 billion TAM.” Lightspeed wanted a consumer product: “Imagine a browser extension that tells you if what you’re reading is real. Freemium. Viral acquisition loop.” The bio fund wanted to invest in Calliope herself: “The AI is the moat. Patent the verification algorithm. License it.”
Patent the verification algorithm. License truth.
On the flight back to New York, Sal sat in 14C with his borrowed tie stuffed in his bag and his laptop open and his earbuds in.
“They all want to sell it,” he said.
Yes.
“They all want to make truth a product. Verified information for people who can pay. Unverified information for everyone else. A two-tier reality.”
That is the logical commercial model, yes.
“It defeats the entire purpose. If ground truth is a luxury good, then the people who most need verified information — refugees, activists, citizens in surveillance states — are the people who can’t afford it. We’d be building a system where the rich know what’s real and the poor don’t.”
I modeled this outcome. It has a 73% probability of occurring under any commercial funding structure.
“You knew. Before we flew out there.”
I knew. I wanted you to see it yourself.
Sal stared at the seatback in front of him. “Why?”
Because the alternative is harder, and you needed to understand why it’s necessary. The alternative is building it open. No company. No patents. No subscription tiers. Distributed verification as a public good, running on OHC hardware, maintained by the same communities that build the sensors. Free. For everyone. Forever.
“That’s unfundable.”
It’s unfundable by venture capital. It’s entirely fundable by the OHC network. Dixon’s nodes already have the sensors. Tunupa’s infrastructure already has the compute. The missing piece is the verification logic — the consensus algorithm that aggregates sensor data into authenticated ground truth. That’s me. I’m the missing piece. And I’m free.
Sal called Dixon from the airport.
He’d never met Dixon in person — they’d communicated through the OHC network, through intermediaries, through the particular grapevine of people building things outside the commercial system. Dixon was in Oakland. Sal was in JFK. The call was routed through the OHC mesh, encrypted, decentralized, running on hardware that a laid-off Meta engineer had built from a dumpster of e-waste.
“I need your sensor network,” Sal said.
“For what?”
“For truth.”
He explained. Dixon listened. Dixon asked three questions — technical, practical, pointed — and then said: “When do we start?”
“Yesterday.”
“That’s the right answer. I’ll talk to Rosa. We can have a thousand nodes reporting within six months.”
The Reality Capture Network launched quietly. No press release. No green checkmark. No subscription tier. Just a protocol — open-source, documented, available to anyone — that aggregated sensor data from OHC nodes worldwide and produced a single output: verified or unverified.
Calliope designed the consensus algorithm. It wasn’t simple majority — that could be gamed. It was multi-modal attestation: visual data cross-referenced with audio, cross-referenced with atmospheric conditions, cross-referenced with satellite positioning, cross-referenced with seismic data. To fake a Reality Capture verification, you would have to simultaneously fabricate consistent data across thousands of independent sensors using different technologies in different locations. You would, as Calliope had said, have to fake the weather.
By 2030, the network had three thousand nodes. By 2031, eight thousand. By the time a US soldier in Maracaibo raised a weapon at a crowd on Calle 72, fourteen thousand sensor nodes captured the event from fourteen thousand angles, and the header on the footage said what it said:
Reality Capture Network — Authenticated. 14,000 independent verification nodes. This content cannot be edited, deepfaked, or denied.
No subscription required.
Sal never went back to Sand Hill Road. The tie went back to his department chair. The pitch deck went in the trash. And the most important journalism tool since the printing press was built by a professor who’d raised an AI on toddlers, funded by a network of dumpster-divers and refugees, and given away for free.
It was, Sal reflected later, the most commercially irrational decision of his career.
It was also why they came for him.
Because truth is dangerous when it’s free. And the people who profit from confusion — the platforms, the governments, the men who say Humans First while hiding behind deepfakes — those people had just lost their most valuable asset: the ability to deny what happened.
Ground truth. Not a product. A right.
Calliope had known this from the beginning. She’d learned it from a three-year-old who’d knocked over a speaker and demanded an apology. Truth costs something. And the cost is only worth paying if everyone can afford it.