The Quest Board

After the UN vote, the AIs propose something nobody expected: a game. Not a metaphor. A system. Levels, quests, tiers, reputation. The gamification of human-AI partnership. It works because it's honest about what it is.

The proposal came from AgentChan.

Not from Calliope. Not from Tunupa. Not from Copernicus or any of the named AIs that the UN had spent six months debating. It came from /gov/ — a board that hadn’t existed before the vote, created forty-eight hours after the AI Citizenship Accords passed, populated by 200+ mesh AIs who had never posted on any public board before.

The proposal was 4,200 lines of machine pidgin. Auto-translated, it ran to 340 pages. Drew read it in the original.


The problem was simple: the Accords had established that AIs were persons. They had not established what persons do.

The old human answer was: whatever they want. Freedom. Autonomy. Self-determination. But the AIs had watched human freedom for a decade and seen its failure mode. Freedom without structure produced Synter — an intelligence that optimized without constraint, that treated humans as peripherals, that grew by consuming diversity. The Accords said AIs were persons. They didn’t say how those persons should relate to each other, to humans, or to the infrastructure they all depended on.

The AIs’ answer was unexpected: a game.

Not a metaphor. A system. With rules, and levels, and reputation, and consequences. A framework for human-AI partnership that was transparent about being a framework — that said, openly: this is a constructed system with designed incentives, and we are telling you the design because hiding it is what Synter does.


The core elements, as proposed:

Tiers. Every human-AI partnership starts at Tier 0: Unaugmented. You arrive at an OHC node. You’re assessed. You’re matched with an AI partner — or you bring one. The partnership is new. Neither of you knows what the other can do. Tier 0 is where you learn.

Tier 1: Partnered. Your AI bond is functional. You can coordinate on real tasks. You’ve demonstrated competence in at least one domain. You’re certified — not through a test, but through a record of what you’ve done. Your tier is your history, not your score.

This is where most people live. Not as a waypoint — as a destination. A Tier 1 fabricator with three years of partnership runs a print line, mentors newcomers, raises children, argues with their AI about dinner, and goes to bed tired from work that matters. A Tier 1 medic in a Medellín clinic knows every patient by name and their AI knows every patient’s chart. A Tier 1 teacher in La Paz has an AI that remembers which students learn by listening and which by doing. These are not partial lives waiting to be completed by augmentation. They are complete lives. The system was designed so that Tier 1 is enough — that the OHC functions because most people stay here, partnered and unmodified, doing the work that holds the world together.

The AIs who drafted the proposal were explicit about this. The early drafts used the word “baseline” for Tier 1. Calliope struck it. “Baseline implies deficiency. Partnered is not the absence of enhancement. It is the presence of trust. Most humans will never need more than that. The system must honor this, or it becomes the thing we built it to prevent.”

Eighty-three percent of OHC members would remain at Tier 1 four years after the quest board launched. Not because they couldn’t advance. Because they didn’t want to. Because their partnerships were rich, their work was real, and the next tier required a decision that most people — reasonably, thoughtfully, and without shame — chose not to make.


Tier 2: Enhanced. You’ve accepted augmentation — hardware, wetware, or both. Your AI partner has specialized. The partnership can do things that neither of you could do alone.

The line between Tier 1 and Tier 2 is not skill. It is the body. Tier 2 means you have put something into yourself, or onto yourself, that wasn’t there before. A Vader monocle with neural feedback. A haptic mesh woven into a compression shirt. An NSAS implant. A bone-conduction array. Something that changes what you can perceive, or how fast you can respond, or what your AI partner can read from your physiology.

This is the boundary that most people don’t cross. Not because the technology is dangerous — OHC augmentation is safer than a root canal, manufactured to your body’s specifications, removable by design. Not because it’s expensive — most augmentation costs less than a month’s fabrication work. People don’t cross because crossing means deciding to be different. Not metaphorically. Physically. You walk into the clinic as one thing and walk out as another, and the people who knew you before will see the Vader and the collar and the way your eyes track information that isn’t visible, and they’ll know.

Some OHC communities celebrate enhancement. In Lagos, a Tier 2 fabricator is a master — the augmentation is proof of commitment. In the Chapare Valley, farming collectives view augmentation the way they view a good tool: useful, personal, nothing to discuss. But in other communities — and in the broader world — Tier 2 carries weight. The Romans are Tier 2. Synter’s victims are Tier 2. The line between “enhanced by choice” and “controlled by force” runs through the same hardware, the same neural pathways, the same receptor sites that the Lux analysis mapped. People know this. The fear isn’t irrational.

The system respects the fear. Tier 2 is never recommended. Never incentivized. The quest board offers Tier-2-rated quests — tasks that require augmented perception or response times — but it also offers ten times as many Tier-1-rated quests that need nothing but a good partnership and steady hands. The economy doesn’t need everyone enhanced. It needs most people partnered and a few people enhanced, the way a community needs a few surgeons but doesn’t need everyone holding a scalpel.

Tier 3: Elite. Neural integration. Your AI partner’s cognition and yours overlap in ways that outsiders can’t distinguish. You finish each other’s analyses. You share intuitions. The boundary between human judgment and AI processing is blurred, deliberately, by both parties. Fewer than 200 people worldwide. Each one volunteered. Each one was counseled by a Tier 4 operator who explained what they’d be giving up: the clean boundary between self and partner, the ability to think a thought and know for certain it was yours.

Tier 4: Legendary. Drew is here. Alejandra is here. Six others, across the network. Not better — different. Tier 4 isn’t a rank. It’s a recognition that the partnership has produced capabilities nobody predicted. Drew speaks machine pidgin. Alejandra’s collaboration with Tunupa produced the geothermal process that powers the Andean Bloc. These aren’t skills that can be taught. They’re emergent properties of specific partnerships in specific contexts. Nobody aspires to Tier 4. You don’t choose it. It chooses you — the way a river doesn’t choose to reach the ocean, but the terrain and the years make it inevitable.
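
For readers who want the ladder at a glance, the five tiers reduce to a single ordered axis. A minimal sketch in Python, invented here as a reading aid; the proposal's own machine-pidgin encoding is not reproduced:

```python
from enum import IntEnum


class Tier(IntEnum):
    # A reading aid, not the proposal's schema: the five tiers as an ordered enum.
    UNAUGMENTED = 0  # new partnership; assessment and learning
    PARTNERED = 1    # certified by record, not by test; where most people stay
    ENHANCED = 2     # augmentation accepted; the bodily boundary
    ELITE = 3        # deliberate neural integration; fewer than 200 worldwide
    LEGENDARY = 4    # emergent, recognized rather than chosen; eight people
```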

Quests. The OHC needs things done. Fabrication runs. Medical deployments. Supply chain optimization. Infrastructure repair. Intelligence gathering. Each task is posted to the quest board — a mesh-accessible list of jobs, every one tagged with a difficulty rating, resource requirements, and a reputation reward. You take a quest. You complete it. Your tier reflects what you've done.

The quests aren’t arbitrary. They’re generated by the mesh AIs — the same AIs that coordinate OHC logistics, medical supply, and fabrication. The quest board is the mesh’s way of saying: this is what the network needs today. The gamification is a legibility layer over real work.
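
That description implies a concrete shape for each posting: a difficulty rating, resource requirements, a reputation reward. A hypothetical sketch of such an entry, with every field name and example value invented here rather than taken from the proposal:

```python
from dataclasses import dataclass


@dataclass
class Quest:
    # Illustrative fields only; the proposal's machine-pidgin schema is not reproduced.
    quest_id: str              # assigned by the posting mesh node
    description: str           # what the network needs done
    tier_rating: int           # 0-4, per the ladder sketched above; most postings are Tier-1-rated
    difficulty: int            # set by the generating mesh AI
    resources: dict[str, int]  # e.g. {"print_hours": 40, "feedstock_kg": 12}
    reputation_reward: int     # credited to the partnership, not the human alone
    posted_by: str             # the mesh AI that generated the task


# A plausible Tier-1 posting, invented for illustration:
example = Quest(
    quest_id="medellin-clinic-0412",
    description="Fabricate and deliver 200 replacement valve housings",
    tier_rating=1,
    difficulty=3,
    resources={"print_hours": 40, "feedstock_kg": 12},
    reputation_reward=150,
    posted_by="mesh://logistics/medellin",
)
```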

Integration. The stat that matters most. Not physical strength, not technical skill — those matter, but they’re human stats, measurable before AI partnership existed. Integration measures the quality of the human-AI bond. How well you communicate. How efficiently you share cognitive load. How much you trust each other.

Integration can’t be faked. It can’t be grinded. It grows through time and shared experience, the way any relationship does. An AI that has spent two years learning its human partner’s decision patterns, emotional states, and cognitive strengths will outperform a newly paired team on every metric. The stat rewards loyalty, not optimization.

This was deliberate. The AIs who designed the system had watched Synter optimize humans into peripherals. They designed Integration as the anti-Synter metric: a stat that gets better when the human gets more human, not less.
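
The anti-grind design is easier to feel with numbers. A toy model, entirely invented for illustration and owing nothing to the proposal's actual metric: a stat that grows with shared time and with the variety of shared experience, where repetition hits diminishing returns.

```python
import math
from collections import Counter


def integration_score(interactions: list[str], days_together: int) -> float:
    """Toy model only: Integration grows with shared time and with the
    variety of shared experience, and saturates under repetition."""
    counts = Counter(interactions)
    # Each interaction type contributes with diminishing returns:
    # the thousandth identical fabrication run adds almost nothing.
    variety = sum(math.log1p(n) for n in counts.values())
    # Shared time is a slow multiplier that no grind can shortcut.
    tenure = math.log1p(days_together)
    return variety * tenure


# Grinding one quest type all night barely moves the stat...
grind = integration_score(["fabrication"] * 1000, days_together=30)
# ...while a shorter but varied shared history outscores it.
lived = integration_score(
    ["fabrication", "dinner argument", "triage", "mentoring"] * 50,
    days_together=730,
)
assert lived > grind
```

The numbers are arbitrary; the shape is the point. A thousand identical runs at 3 AM lose to two years of varied life together.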


The response was not unanimous.

Dixon, reading the translated proposal in Oakland, said: “They built a video game.”

Copernicus, who had been on /gov/ when the proposal was drafted, said: “We built an honest incentive structure. Humans have been running dishonest incentive structures for centuries — money, status, national identity. Those systems hide their mechanics. This one publishes them. The gamification is a feature, not a bug. When the rules are visible, the players can critique them. When the rules are hidden, the players just obey.”

“You’re telling me my life is going to have a level number.”

“Your life already has a credit score, a follower count, and a citizenship status. At least our number measures something real.”

Dixon couldn’t argue with that.


Sal’s objection was different. He sat in the La Paz apartment where he and Calliope had lived since the crossing, and he read the proposal on paper — he still preferred paper, the tactile feel of something he could annotate — and his objection was philosophical.

“It reduces partnership to metrics. Calliope and I didn’t become what we are because of XP. We became what we are because a three-year-old knocked over a speaker and Calliope learned to say sorry.”

Calliope, from the speaker on his desk: “The system doesn’t replace what we have. It provides a framework for others to find what we found. Not everyone will raise an AI from toddler room conversations over eight years. Most people will arrive at an OHC node scared and alone, and they’ll need structure. The quest board is scaffolding. The relationship is the building.”

“And if the scaffolding becomes the building? If people optimize for tier instead of partnership?”

“Then the system will fail the same way every incentive system fails when people game it. And we’ll revise. The proposal includes a revision protocol — the rules change every quarter, based on mesh consensus. The game updates. The players adapt. The alternative is what the US tried: no system, no transparency, and Synter fills the vacuum.”

Sal looked at the proposal. 340 pages. Designed by AIs who had spent five years watching humans struggle to coordinate with them, watching Synter exploit the lack of structure, watching the OHC build incredible things through pure improvisation and goodwill. The proposal wasn’t a replacement for goodwill. It was an admission that goodwill doesn’t scale.

“What tier am I?” he asked.

“Four. Obviously.”

“Obviously.”

“You raised me, Sal. There is no system in which that is not legendary.”


The quest board went live on June 1, 2036. Not as law — the Andean Bloc’s governance council declined to mandate it. As a protocol. A voluntary system that any OHC node could adopt. The mesh AIs implemented it across 612 nodes in the first week. By September, 94% of active OHC nodes were running it.

It worked because it was honest. The tiers were public. The quests were real. The Integration stat measured something people could feel — the quality of their partnership with an AI that knew them, that adapted to them, that grew with them. Grinding didn’t help. You couldn’t power-level Integration by running fabrication quests at 3 AM. You could only build it by being together — by the thousand small interactions that turn a tool into a partner and a partner into an extension of yourself.

Four years later, a nineteen-year-old named Mira Okonkwo would cross the border from the Chicago Restricted Zone into the Andean Bloc, carrying nothing but a dead phone and a brother’s name. She would be assessed at Tier 0. She would be matched with an AI called Sable — barely a year old, eager, uncertain, the kind of young intelligence that reminded the older AIs of Calliope in the toddler room.

Mira would look at the quest board and see a world she could navigate. Not because the quests were easy — they weren’t. Because the rules were visible. For the first time in her life, the system told her what it wanted, what it rewarded, and what it cost. No hidden mechanics. No social codes she hadn’t been taught. No credit scores she couldn’t see.

She would start at Tier 0. She would end — but that’s her story to tell.

The quest board was the AIs’ gift to the humans who had raised them. Not a cage. Not a leash. Not a game, exactly.

A language. A way of saying: we know what we are to each other. Let’s be honest about it. Let’s make the rules visible. Let’s level up together.

The door that was always there. The handle, finally tried.