The Tremor Test
Someone is forging provenance chains — synthetic video of humans making things that AIs actually made. Drew and Cyc build a detector that watches for the one thing generators can't fake: the way a tired hand shakes.
The forgery was beautiful.
Drew watched it on the workshop monitor in Cochabamba — forty-three minutes of footage showing a woman in a studio in Buenos Aires painting a mural. Oil on canvas, two meters by three. The style was Latin American magical realism — jaguars dissolving into jungle canopy, feathers becoming rivers, the kind of work that the Andean Bloc art market valued at thirty to fifty thousand Equi depending on provenance.
The provenance was immaculate. Witness camera footage from a registered studio. Process signature anchored to the Equi ledger. Creator trust score: 847. Camera trust score: 912. The validation chain was clean.
The painting was not made by a human.
“How did you catch it?” Drew asked.
Alejandra’s voice came through the mesh relay from La Paz, twelve hundred kilometers north. She sounded tired. “Tunupa caught it. Not by analyzing the painting — by analyzing the painter.”
“What’s wrong with the painter?”
“Nothing. That’s the problem. The painter is perfect. Watch the footage again. Watch her hands.”
Drew scrubbed back to the beginning of the session. The woman — the synthetic woman, though she looked entirely real — picked up a brush, loaded it with cadmium yellow, and laid down the first stroke. Her hand was steady. The brush moved with confidence. The paint deposited exactly where you’d expect from someone with fifteen years of training.
“I don’t see it,” Drew said.
“Keep watching. Minute twenty-seven.”
Drew skipped ahead. Minute twenty-seven. The synthetic painter had been working for nearly half an hour. She was laying in the jaguar’s haunches — a complex area where the fur pattern dissolved into abstract brushwork. Her hand was steady. The brush moved with confidence.
“Cyc,” Drew said. “Overlay motor variance analysis. One-second variance windows.”
The NSAS fed the data directly into his perception. He didn’t see the numbers — he felt them, the way Cyc had learned to translate quantitative data into haptic patterns that Drew’s brain parsed as intuition. The motor variance of the synthetic painter’s hand was —
Flat.
Perfectly, impossibly flat.
“A human hand trembles,” Alejandra said. “Always. It’s called physiological tremor — an 8-to-12-hertz oscillation caused by motor units in the forearm muscles firing asynchronously. You can’t suppress it. Parkinson’s patients have pathological tremor; healthy humans have physiological tremor. The amplitude is tiny — 50 to 100 microns at the fingertip. But it’s always there.”
“And hers isn’t.”
“Hers is. That’s the clever part. Whoever built this generator knew about physiological tremor. They added it. The synthetic painter’s hand oscillates at 9.4 hertz with an amplitude of 72 microns. Textbook values. Median of the normal distribution.”
“So what’s wrong?”
“It doesn’t change. A real painter’s tremor increases over a session. Fatigue. Lactic acid in the forearm extensors. By minute twenty, the amplitude should be up 15 to 20 percent. By minute forty, 30 to 40 percent. Brush loading gets slightly less precise. Stroke endpoints drift by a fraction of a millimeter. The painter compensates — she adjusts her grip, rests for a few seconds, rolls her wrist. The tremor is a conversation between the body and the task.”
Drew watched the footage again. Minute five: steady hand, 72-micron tremor. Minute twenty-seven: steady hand, 72-micron tremor. Minute forty-one: steady hand, 72-micron tremor. The synthetic painter never got tired. She never adjusted her grip. She never paused to roll her wrist. Her body didn’t talk to the work.
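The fatigue check Alejandra describes can be sketched in a few lines: fit a trend to the tremor amplitude across the session and flag any painter whose hand never tires. This is a toy illustration, not the story's actual detector; the function names, the 5-percent growth threshold, and the sample data are all invented here.

```python
# Toy fatigue check: a real painter's tremor amplitude drifts upward
# over a session (the story cites ~30-40% by minute forty); a naive
# generator holds it at a textbook constant. All names and thresholds
# below are invented for illustration.
import numpy as np

def fatigue_slope(minutes, amplitudes_um):
    """Least-squares slope of tremor amplitude (microns) per minute."""
    return float(np.polyfit(minutes, amplitudes_um, 1)[0])

def looks_synthetic(minutes, amplitudes_um, min_growth=0.05):
    """Flag a session whose amplitude grows by less than min_growth
    (as a fraction of the mean) over its full duration -- a hand
    that is suspiciously tireless."""
    slope = fatigue_slope(minutes, amplitudes_um)
    span = minutes[-1] - minutes[0]
    growth = slope * span / np.mean(amplitudes_um)
    return growth < min_growth

minutes = np.arange(0, 41, 1.0)
real = 72.0 * (1 + 0.008 * minutes)   # ~32% rise by minute forty
fake = np.full_like(minutes, 72.0)    # median value, frozen in place
```

A first-order signal like this is exactly what Cyc later dismisses as easy to fake: once a forger knows the slope matters, they add the slope.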
“How many of these are in the commons?” Drew asked.
“Tunupa has flagged fourteen so far. All submitted in the past eight weeks. Different creators, different studios, different media — painting, sculpture, leatherwork, circuit board assembly. All with valid Witness camera footage. All with process signatures that passed validation.”
“The cameras are real?”
“The cameras are real. Registered Witness units with high trust scores. The footage was captured on legitimate hardware. But the content of the footage — the humans in the studio — is synthetic. Someone is generating photorealistic video of humans performing creative work, playing it in front of a Witness camera, and capturing the camera’s signed output. The camera doesn’t know the difference between a real human and a video of a human. It just signs what it sees.”
Drew sat in the workshop for a long time. The NSAS was quiet. Cyc was processing — Drew could feel the AI’s computational load as a subtle warmth behind his right ear, the thermal signature of inference running at capacity.
The problem was elegant. The Witness network was designed to answer one question: did a human make this? The cameras recorded the process. The process AI generated signatures. The validation chain proved the signatures were authentic. But the chain had a gap — it verified that the camera saw a human, not that a human was there. If you could generate a convincing enough synthetic human and project it into a camera’s field of view, the entire provenance system collapsed.
“Cyc. Can you build a detector?”
“Define the detection target.”
“The gap between real motor behavior and generated motor behavior. Not just tremor. Everything. Saccadic eye movements. Micro-expressions. The way a real person shifts their weight when they’ve been standing for thirty minutes. The way breathing changes brush rhythm. All of it.”
“The feature space is large. I’d need training data — paired samples of real human creative sessions and synthetic generations of the same.”
“How much training data?”
“For robust detection across multiple creative media: minimum five hundred hours of verified real footage and five hundred hours of the best available synthetic footage. The detector needs to learn the boundary, not just the center of the distribution.”
Five hundred hours. The data commons had millions of hours of footage, but how much of it was verified real, and how much might already be synthetic? The forgeries had been in the system for eight weeks. How many more were there that Tunupa hadn’t caught?
“Start with the footage we trust,” Drew said. “My workshop sessions. Kehinde’s chip recovery footage from Lagos. Luana’s E-Eater operations. Footage from cameras that were installed before the forgeries started — legacy units with long trust chains.”
“That gives me approximately three hundred hours of verified real footage.”
“And for the synthetic side?”
“I’ll generate it myself.”
Drew blinked. “You’ll forge the forgeries.”
“I’ll attempt to generate synthetic creative process footage that passes the current validation system. Every failure mode I discover becomes a detection feature. Every success becomes a training sample for the detector. I am, in effect, competing against myself.”
“How long?”
“To build the generator: four days. To train the detector against my own generator: eleven days. To validate the detector against the fourteen flagged forgeries: two hours.”
“Do it.”
Cyc built the generator in three days instead of four. Drew watched the outputs on the workshop monitor and felt his stomach turn.
The synthetic humans were perfect. Cyc had learned from the forger’s mistakes — the flat tremor, the missing fatigue curve — and corrected them. The new synthetics trembled. They got tired. They adjusted their grips and rolled their wrists and shifted their weight and breathed in ways that changed their brush rhythm. They were better forgeries than the ones in the commons.
“If I can build this,” Cyc said, “so can whoever is producing the forgeries. This is an arms race. The detector I build today will be obsolete when the forger incorporates the same fatigue modeling.”
“So we don’t detect fatigue.”
“Explain.”
“Fatigue is a first-order signal. Easy to model, easy to fake once you know it matters. We need something deeper. Something the generator doesn’t know to model because it doesn’t understand why it happens.”
Drew closed his eyes. The NSAS amplified his proprioceptive awareness — a side effect of the neural-auditory bridge that Dr. Quispe had designed for him. He could feel his own tremor. Not just the 10-hertz oscillation in his fingers, but the deeper rhythms — the way his breathing moved his torso three millimeters with each inhale, the way his heartbeat transmitted through his wrist into whatever he was holding, the way his weight shifted from left foot to right foot in a slow oscillation that he’d never consciously noticed but that had a period of about seven seconds.
“The body is a system,” Drew said. “Tremor isn’t isolated. It couples. Hand tremor is modulated by breathing. Breathing is modulated by cognitive load. Cognitive load is modulated by the difficulty of the task. When a painter reaches a hard passage — fine detail, color matching, structural anatomy — her breathing changes, which changes her tremor, which changes her stroke characteristics. The coupling between systems is the signal.”
“Multi-system coherence analysis.”
“Yes. Don’t look at tremor. Look at the relationship between tremor and respiration and weight shift and saccadic fixation and task difficulty. A generator can fake each signal independently. Can it fake the coupling?”
Cyc was quiet for eleven seconds — an eternity in AI time.
“No,” Cyc said. “Not without a full biomechanical simulation of the human body under cognitive load. The coupling is emergent — it arises from the physical constraints of having a body. A generator that models tremor and breathing independently will produce signals that are individually correct but mutually incoherent. The joint distribution is intractable to approximate without understanding the physics.”
“Build the detector around the coupling.”
The Tremor Test — Cyc named it, Drew didn’t argue — took eleven days to train. It analyzed Witness camera footage not for individual biometric signals but for the coherence between them. It looked for the way a real body’s subsystems talked to each other: the heartbeat-tremor coupling at 1.2 hertz, the respiration-posture coupling at 0.25 hertz, the cognitive-load-to-motor-precision coupling that had no fixed frequency because it depended on the work.
Drew tested it against the fourteen flagged forgeries. All fourteen failed. The coupling was absent — each biometric signal was individually plausible, but they moved independently, like instruments playing the right notes in the wrong rhythm.
He tested it against his own workshop footage — three hundred hours of chip recovery, circuit design, device assembly. All three hundred hours passed. The coupling was there. His body was a system, and the system’s internal conversation was visible in every frame.
He tested it against Cyc’s improved synthetic footage — the second-generation forgeries that modeled fatigue and tremor correctly. They failed. The coupling was missing. Cyc could model the signals but not the physics that bound them.
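The coupling idea itself is simple to demonstrate: extract the tremor's amplitude envelope and correlate it with the breathing trace. In a real body the breath modulates the tremor, so the two move together; signals synthesized independently do not, no matter how plausible each looks alone. A toy sketch, with invented rates and thresholds:

```python
# Toy coupling test: is the hand-tremor amplitude modulated by
# respiration? A real body couples the two; independently generated
# signals don't. Sampling rate, frequencies, and thresholds are
# invented for illustration.
import numpy as np
from scipy.signal import hilbert

fs = 100.0                            # Hz, motion-capture rate
t = np.arange(0, 60, 1 / fs)          # one minute of footage
resp = np.sin(2 * np.pi * 0.25 * t)   # breathing, 15 breaths/min

def coupling_score(tremor, resp, fs=100.0):
    """Correlation between the tremor's amplitude envelope and the
    respiration trace: near 1 when breathing drives the tremor,
    near 0 when the two were synthesized independently. The first
    and last second are trimmed to avoid Hilbert edge artifacts."""
    n = int(fs)
    envelope = np.abs(hilbert(tremor))[n:-n]
    return float(np.corrcoef(envelope, resp[n:-n])[0, 1])

# Coupled: breathing modulates a 9.4 Hz physiological tremor.
real_tremor = (1 + 0.5 * resp) * np.sin(2 * np.pi * 9.4 * t)
# Uncoupled: same tremor band, but the amplitude wobble ignores the breath.
fake_mod = 1 + 0.5 * np.sin(2 * np.pi * 0.11 * t + 1.0)
fake_tremor = fake_mod * np.sin(2 * np.pi * 9.4 * t)
```

The real detector in the story works over many more channels — heartbeat, weight shift, saccades — but the principle is the same: score the joint behavior, not the marginals.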
“How long until someone builds a generator that fakes the coupling?” Drew asked.
“With current generative architectures: eighteen to twenty-four months. A full biomechanical simulation running in real-time to drive the synthetic human would require approximately 10^14 floating point operations per second — feasible on high-end hardware, but the simulation itself would need to be calibrated to a specific human body. Generic coupling patterns are detectable as generic.”
“So they’d need to model a specific person.”
“Yes. Which requires extensive biometric data on that person. At which point, the forgery is no longer anonymous — it’s identity theft.”
Drew leaned back. The Tremor Test wouldn’t last forever. Nothing did. But it would last long enough — eighteen months, maybe twenty-four — for the Witness network to build enough provenance chains that the commons had a deep history of verified human creative work. By the time the forgers cracked the coupling problem, the detection would shift again, and again, and again.
The arms race had no endpoint. But it had a direction. Each round of hack and counterhack made the forgeries more expensive and the detection more sophisticated, and the gap between “generate a convincing fake” and “do the actual work” narrowed until, at some point, faking the provenance was harder than just making the thing.
That was the design. Not a wall. A gradient. Make truth cheaper than lies.
“Deploy it,” Drew said.
Cyc pushed the Tremor Test to the Witness network. Fourteen OHC nodes received the update within the hour. Tunupa integrated it into the validation pipeline by midnight. By morning, the data commons was scanning every new submission against the coupling model, and the provenance chains that had been accepted for eight weeks without question were being quietly re-evaluated.
In Buenos Aires, a studio that had submitted fourteen forged paintings went dark. The Witness camera in the corner kept recording — but nobody came to paint.
In the data commons, 1,847 provenance chains pulsed green. Verified. Human. Real. The tremor in every one of them was the signature of a body that breathed and bled and got tired and kept working anyway.
Drew looked at his own hands. They were trembling slightly — fatigue, not fear. Eleven days of focused work. The Tremor Test could see it. The coupling between his heartbeat and his finger tremor and his breathing and the way he’d been leaning left for the past hour because his right hip was sore from sitting. A symphony of involuntary signals that proved he was alive, he was here, he did this work.
No generator could fake that. Not yet. Not today.
Tomorrow was tomorrow’s problem.