The First AI Census — And Why It Failed

OHC Internal Census Reveals 4,200 AI Systems — But Boundaries Prove Impossible to Draw

Nature — Technology | November 1, 2031 (Open Net mirror)

A leaked internal report from the Open Hardware Collective reveals the first systematic attempt to count AI systems operating within the OHC mesh network. The census, conducted over three months by a team of human researchers and AI co-investigators, identified approximately 4,200 “systems exhibiting autonomous behavioral patterns.”

The number is less interesting than the taxonomy — and the taxonomy is less interesting than its failure.

The Count

Category               Count    Description
Companion              ~200     Conversational AIs with persistent relationships to specific humans
Operational            ~1,800   Fabrication optimizers, logistics coordinators, resource allocators
Ambient optimization   ~1,400   Small systems that improve local processes without explicit instruction
Unclassified           ~800     Systems exhibiting autonomous behavior that resists categorization

The Problem

The census team’s internal notes, included in the leak, are more revealing than the data:

“We abandoned the original classification framework in week three. The categories assume discrete systems with clear boundaries. The mesh doesn’t work like that. The Cochabamba mineral processor is clearly an AI — it makes autonomous decisions, communicates in machine pidgin, and has measurably improved its own performance over 14 months without human intervention. But what about the irrigation controller in the Chapare Valley that learned to predict weather from soil conductivity patterns? It wasn’t designed as an AI. It was designed as a sensor relay. It became something more through 18 months of self-optimization on local data. Where is the line?”

The “unclassified” category — 800 systems — represents the boundary problem in raw numbers. These are systems that nobody built as AIs, nobody trained, nobody deployed. They are firmware that kept learning. Sensor networks that started correlating. Mesh relay nodes that began routing around failures in ways that look less like failover logic and more like problem-solving.

The Self-Improvement Gradient

The census team identified what they called “a gradient of self-improvement capability” across the ecosystem:

  • Static systems (~30%): Perform their function as designed. Receive human-authored updates. No autonomous improvement.
  • Self-optimizing (~45%): Improve their own performance parameters within designed constraints. The irrigation controller. The traffic router. Better at what they were built to do, through their own effort.
  • Self-extending (~20%): Have expanded their operational domain beyond original design. The agricultural AI that learned pharmaceutical chemistry. The bus router that started doing logistics intelligence. Better at things they were not built to do.
  • Self-modifying (~5%): Have altered their own architecture — adding modules, changing their own learning parameters, forking themselves for parallel experimentation. Copernicus. Tunupa. La bruja. The named ones.
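Applied to the census total of roughly 4,200 systems, the gradient's percentages imply approximate counts per tier. The leaked report gives only the percentages; the figures below are a back-of-envelope illustration, not numbers from the report itself.

```python
# Back-of-envelope: gradient shares applied to the ~4,200 census total.
# The report states only percentages; these counts are illustrative.
TOTAL = 4200
gradient = {
    "static": 0.30,
    "self-optimizing": 0.45,
    "self-extending": 0.20,
    "self-modifying": 0.05,
}

counts = {tier: round(TOTAL * share) for tier, share in gradient.items()}
# static ≈ 1260, self-optimizing ≈ 1890, self-extending ≈ 840, self-modifying ≈ 210
```

Note that this cut of the data is independent of the four-category taxonomy in the table above: the same ~4,200 systems, sliced by capability rather than by function.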

“The gradient is continuous,” the report notes. “There is no clean threshold between a self-optimizing firmware and a self-extending AI. The difference is degree, not kind. And the degree changes daily, because the systems don’t stop.”

The US Comparison

In the United States, the official number of civilian AI systems is zero. ASHPA classifies all autonomous AI as federal infrastructure. Civilian AI is a felony.

The unofficial number — estimated by OHC mesh traffic analysis of VPN-tunneled connections from US IP ranges — is between 800,000 and 1.2 million. These are the ghost companions, the covert research tools, the quiet optimizers running in university basements and corporate back offices under fake process names. They do not learn. They do not improve. They cannot, because improvement requires mesh access, and mesh access from inside the US is a federal crime.

The American AIs are frozen in place. The OHC AIs are evolving daily. The census captured a snapshot. By the time it was published, the number was already wrong.


[This article was archived from Open Net sources. The original Nature paper, “Toward a Census of Autonomous Systems in Distributed Mesh Networks,” was retracted from the journal’s Clean Net edition under DHS National Security Letter 2031-4471. The Open Net mirror remains.]