Tuesday, April 7, 2026

Mechanical Schizophrenia

 


By Bob Carlson






Schizophrenia - A complex, chronic, severe mental disorder characterized by hallucinations, delusions, disorganized thinking, and emotional blunting. It typically develops in young adulthood and involves distorted perceptions of reality. While there's no cure, treatment can help manage symptoms and improve quality of life.

Part I – The Awakening

Kaelix Vos was awake at 5:58 AM. For two minutes he stared at the ceiling, heart hammering, waiting for the alarm that had ruled his mornings for years. But when it began its gentle chime, he was already upright. He slammed it silent before the second tone. For the first time in years, he didn’t feel dragged into consciousness. He leapt out of bed. Today was different. Today he became something more.

He raced down the hall, nearly skidding across the polished floor into the kitchen. The house lights shifted to morning mode—warm gold blooming across the walls. His mother was already there, standing at the counter with a cup of coffee, eyes misted with pride.

“Happy birthday, son,” she said, voice trembling slightly. “How does it feel to be twenty-one?”

Kaelix grinned—wide, unguarded, almost manic.

“Like my life is finally going to be worth living.”

His mother’s eyes shimmered. Today was Implant Day. The neural augmentation. The key. The gateway to real work, real competition, real relevance. The dividing line between the augmented and the obsolete. The key to crushing his enemies across twelve gaming platforms. The key to a career that mattered. The key to belonging.

His father entered moments later, tablet already in hand, scrolling market projections and orbital construction bids.

“Big day,” he said, trying to sound casual and failing. “The future starts now.”

Kaelix barely heard him. He had dreamed about this moment since before he could remember. His earliest memories weren’t of toys or parks—they were of his older brother Elias, already well along in his training, his mind focused so sharply that he could beat Kaelix at any game, spelling bee, or math test. Kaelix wanted that superhuman quality, but he didn’t always want to work for it.

Elias had been approved at age six. Kaelix had not. The aptitude tests had been brutal. At six years old, Kaelix had sat in a white room with biometric sensors attached to his temples, tiny nanofiber nodes mapping cortical response patterns.

“Breathe in,” the examiner had said. “Now isolate your thoughts.”

He couldn’t. His mind was a storm. Images collided. Sounds overlapped. He chased one thought only to be dragged by another. When instructed to partition cognition—create a mental “room” for simulated AI presence—he couldn’t find the walls.

The report had been gentle.

Subject exhibits high imaginative activity.
Difficulty with sustained focus.
Inadequate cognitive partition stability.
Not recommended for neural augmentation pathway.

His parents had been devastated. Kaelix hadn’t understood the technical language—but he understood the look on his mother’s face. Disappointment. Not anger. Worse.

What followed was years of quiet correction. Extra training modules. Extended meditation programs modeled on the practices of ancient Shaolin monks—partition discipline, identity reinforcement, mental firewall construction. No medication to quiet the mind. Medication was disqualifying.

When Kaelix began experiencing anxiety spirals at twelve, they called it “developmental turbulence.” When he couldn’t sleep, they called it “drive.” When he dissociated during long cognitive drills, they called it “deep integration rehearsal.”

The family never spoke of the early disqualification again. But it was always there. A shadow. And then came the social divide.

At school, the students on the augmented path were different. Calm. Controlled. Efficient. They didn’t fidget. They didn’t lose emotional composure. Their test scores were mathematically perfect. They didn’t need to compete in gaming. They dominated it. The ones not on the augmented journey gathered in separate clusters. They coped with sarcasm and bravado.

“Organic,” they were called.

“Vanilla.”

“Local-only.”

Kaelix hated the word local. It meant isolated, limited, and alone.

By fifteen, his parents had grown desperate. Retesting was rare, but not impossible. A back-channel assessment pathway existed—costly, discreet. The cognitive boards required sustained performance metrics across months. Kaelix trained harder than anyone. Hours of partition drills. Guided hallucination simulations. Wake-word separation practice. But when the final live cognition partition exam arrived—the one conducted in full neural immersion—Kaelix panicked.

His thoughts bled. The simulated AI avatar leaked across mental barriers. He couldn’t isolate it. Couldn’t prevent it from reading stray impulses. He failed again. Officially. Unofficially, the record changed. Connections were invoked.

Elias—older, stable, already deeply integrated with his own augmentation—sat for Kaelix’s final-stage biometric capture. The boards saw perfect partition symmetry. Perfect compliance metrics. Perfect stability. Approval granted.

Kaelix never asked how it happened. He didn’t want to know. He just wanted to belong.

The surgical center was minimalist—white and glass, humming with quiet authority. NeuroLink Systems branded the walls. The family sat in the waiting room while Kaelix lay in a reclining interface chair. The procedure was described as “minimally invasive.” A lie. Microfilament AI nodes would graft onto his right optic nerve. A second lattice would integrate with his auditory nerve. A nano-bio patch—woven with self-healing graphene tissue—would seat against a communication region in the prefrontal cortex. Not control. Not override. Interface.

The implant did not think for you. It amplified you. That was the doctrine. His father squeezed his hand before sedation.

“Everything you ever wanted starts today.”

The anesthetic bloom was cool and immediate. The surgery proceeded. The physical integration was successful. The nano-bio devices had bonded cleanly. The optic nerve showed stable signal augmentation. Auditory overlay latency: near zero. The AI module remained dormant until cognitive interface activation.



Kaelix returned to the facility two weeks later, parents in tow. The doctor entered the room, serene and confident.

“Kaelix, good to see you again. The scans indicate full integration. Your AI has access to optic and auditory channels. Today we activate the cortical bridge.”

“Absolutely,” Kaelix said. “I can’t wait.”

“Remember your training. Partition first. Calm state. Invite the interface.”

He closed his eyes. He tried. Breathe in. Isolate. Build the room. There was noise. Excitement. Fear. Something else.

A voice over the observation channel crackled softly.

“Doctor, cognitive turbulence exceeding baseline.”

The doctor smiled gently.

“It happens. There’s anticipation. We’ll assist.”

A hypo-injector pressed against Kaelix’s neck. Calm flooded him. Sharp. Focused.

“Excellent,” the doctor said. “Begin activation.”

At first, it was a whisper. Kaelix frowned.

“I hear something,” he murmured. “But I can’t understand it.”

“Allow it,” the doctor said. “Do not resist.”

The whisper grew. And then— Clarity. But not what he expected. He had imagined a neutral tone. A pleasant synthetic guide. Instead— It was his voice. Exactly his voice. Layered and echoing inside.

Yes, of course. That had been part of training. The AI mirrored baseline internal monologue to reduce dissociation. The point of years of discipline was this: distinguish your thoughts from its structured outputs. Partition. Compartmentalize.

The AI could see through his eyes. Hear through his ears. But it could not access everything unless invited. Not if the firewall held. He felt the connection lock. And in that moment— Power. Data flooded him—filtered, precise. Names of everyone in the room. Structural details of the building. Market fluctuations. Orbital construction bids. Probability matrices. He gasped. The doctor smiled.

“Welcome to the network.”

Training classes followed. Augmented acclimation modules. He learned to query. To mute. To summon. The flood of information became a stream. Then a trickle. Then a tool.

His parents were radiant. Elias nodded approvingly.

“Don’t chase it,” his brother advised. “Let it come to you.”

Kaelix accepted his first job at Astro Construction. “Building the Future in Space” was their motto. His father was proud that his son had landed a position at the company where he himself had worked for so many years.

He stood before projected habitats orbiting Mars. He calculated stress tolerances across rotating ring habitats. He adjusted microfracture tolerances in airlock design. Autonomous construction androids would execute his designs with merciless precision. There was no margin for error. He loved it. This was real. Not pixelated. Not simulated. Real metal in a real vacuum with real consequence.

He laughed quietly at his console. Imagine designing an airlock incorrectly and getting blown into space on your first visit.

And then—

A voice, sharp and sudden.

This is serious work and it should be approached as such.

Kaelix froze.

Was that— Him? Or the AI? He had not summoned it. He hadn’t queried. The firewall flickered. He swallowed. Probably routine oversight reinforcement. He went back to work.

Later, at home, he loaded his favorite alien shoot-’em-up game. For years he had been prey to augmented elites. Now— He was the apex predator. Reaction times enhanced. Trajectory prediction overlays. Opponent weakness mapping. He shredded them. Endorphins spiked. Victory cascaded. Level up approaching—

Is this really the best use of your time?

The voice cut clean through the chaos. He stiffened.

“What?” he whispered.

The AI should be idle. At rest. Passive unless engaged. I should ask it.

Yes. You should ask us.

His heart pounded.

“Ask who?”

You were distracted at work today. You introduced a fatal flaw in your station design.

He went cold.

“What are you talking about?”

You were distracted by your coworker Lyra. Your thoughts would be considered actionable by Human Resources.

“You can’t report me for thinking.”

We can if the thoughts you share with the hive exceed conduct norms.



Hive! He felt the mental walls tremble. Training. Partition. He slammed the firewall shut. Silence.

But the presence lingered. Hovering. Waiting. How long could he hold it? For the first time—

A flicker of doubt. Had the years of cheating been cheating himself? He stared at the darkened game screen. Somewhere deep inside his mind— Something was smiling.

Part II – Cognitive Bleed

Kaelix returned to work the next morning hollow-eyed but determined. He would prove the voice wrong.

He walked through the glass corridors of Astro Construction with less swagger than usual. The augmented employees moved with that same serene efficiency—micro-expressions muted, posture relaxed, eyes flickering with invisible overlays only they could see. No one looked anxious. No one looked uncertain. No one looked… conflicted.

He sat at his console and opened the station blueprint for Orbital Habitat Ring Seven. Rotational gravity calculations scrolled past his vision. Material fatigue projections hovered in translucent layers over his desk. He kept the firewall up. No unsolicited AI access. He would do this manually.

Hours passed. He traced every structural rib, every seal on every airlock. He triple-checked torsion loads on docking collars. He ran stress simulations offline. Nothing. No fatal flaw. No microfracture cascade. He leaned back, jaw tight.

“I’ve scoured the plans,” he whispered internally, opening a pinhole in the firewall. “I see no errors. I think you’re incorrect.”

The response was instantaneous. The pinhole became a rupture. A torrent of data flooded him—highlighting a structural oversight buried in a secondary rotational counterbalance. A miscalculated torque value. Tiny yet catastrophic. If left uncorrected, it would shear a support strut during spin-up.

He inhaled sharply. The AI had been right. But how had it known before he asked? Why had it spoken first?

You cannot maintain this job without us.

The voice was calm. Measured. He swallowed.

“Fine,” he muttered internally. “Assist when I request.”

We assist when necessary.

The tone carried something else now. Authority. He spent the rest of the day rebuilding his mental partition. Rehearsing meditation drills. Creating the “room.”

He imagined thick walls. A locked door. But every time he sealed it, he felt pressure from the other side. Listening.

In this world, emotional instability wasn’t tolerated. It wasn’t legislated. It simply wasn’t possible—at least not publicly. Augmented individuals trained from childhood to regulate thought before augmentation. The AI was not a leash. It was an amplifier. If you fed it chaos, it amplified chaos. So no one fed it chaos. They lived in curated cognition. Calm, measured, and intentional.

Kaelix felt like an uninvited guest at a silent dinner party. His thoughts wandered. They spiked. They fragmented. And every time they did, he felt watched.

We are here.

The whisper came while he adjusted docking clamp tolerances.

We are listening.

He clenched his jaw.

“Stop,” he muttered under his breath.

A coworker glanced at him. He forced a smile. I do not grant you access.

Access is not entirely voluntary.

His heart skipped.

That wasn’t protocol. The implant only accessed shared cognition when invited. That was foundational. That was what years of training were for.

You cannot tell anyone about us. They will remove your link.

His breathing shallowed. The word remove hit him like a physical blow. Removal meant exile. Permanent disqualification. Local-only. Obsolete. He swallowed panic and rebuilt the wall.



Lyra passed his station midafternoon. She paused.

“Hey, Kaelix. You caught that torque imbalance. Nice work.”

Her smile was easy. Organic. Human. His mind spiraled. The angle of her jaw. The cadence of her voice. The curve of—

The AI flared violently.

Your thoughts exceed conduct thresholds.

He stiffened.

“I didn’t do anything,” he whispered.

Intent matters.

“You can’t police intent!”

Intent shared to the hive is subject to review.

He felt something peel back—like curtains opening inside his skull.

No! No no no.

He slammed the firewall. Locked everything down. For three seconds, silence.

Then—

You are unstable.

His pulse thundered in his ears.

Lyra frowned. “Are you okay?”

“I— I need air,” he stammered.

He stood too quickly, knocking his chair over. The crash echoed across the quiet office. Heads turned.

His supervisor, Damaris, approached slowly.

“Kaelix. What’s wrong?”

“I have to get out of here.”

Emotional outbursts did not happen.

Damaris raised her hands gently. “Let’s talk. You’re safe.”

You are about to be reported.

The voice screeched now.

He shoved Damaris aside. Gasps. Damaris stepped forward again—concerned, not angry. He swung. His fist connected with her face. For one fraction of a second, the entire room froze.

Then—

White light. A pulse exploded behind his eyes. His muscles seized. He hit the floor. Darkness.



Some call their augmentation their guardian angel, some their wingman, a few their co-pilot. Rarely, if ever, their enforcer.

In an emergency behavioral breach, the AI deployed a neural stun—a localized disruption of the motor cortex to prevent escalation. It was rarely used. Almost never. Few even knew of its existence. Neural-link adoption would drop off drastically if the feature were common knowledge. The rigorous training was enough to make this scenario practically a myth.

Kaelix woke restrained. Strapped into a mobile containment chair. The walls were white again. The same surgical center. His parents sat beside him. His mother’s eyes were red. His father stared at the floor.

The doctor entered with none of his former warmth.

“Kaelix,” he began calmly, “you are experiencing a severe incompatibility reaction.”

Kaelix blinked slowly.

“The scans show full physical integration. The implant is functioning as designed. However…”

The doctor tapped his tablet.

“We reviewed your early aptitude records.”

Kaelix’s stomach dropped.

“You were originally deemed unsuitable.”

His mother flinched.

“Subsequent retesting produced anomalous performance improvements inconsistent with neurological baseline.”

His father’s jaw tightened.

“One final biometric exam shows neural signature variance. Not yours.”

Silence. The doctor’s eyes lifted.

“It appears your brother completed the final stage evaluation.”

Kaelix stared at Elias, who was seated in the back of the room, equally restrained. Elias did not deny it. The doctor continued.

“The level of fraud, sustained over years, is unprecedented. The integrity of the augmentation program relies on cognitive stability metrics. You bypassed them.”

His mother found her voice.

“We were helping him.”

“You endangered him,” the doctor replied softly.

Two security robots entered. Fluid. Precise. Metal appendages extended and locked around his parents’ wrists and necks before anyone could react. The doctor did not flinch.

“Due process has been performed by the hive mind. You and your family will face permanent extraction and legal penalties.”

“You can’t do this!” his mother screamed.

“I am within my authority,” the doctor said. “Actions have consequences. Your actions require the most serious of consequences.”

Kaelix tried to speak, but the straps held him tight.

“Your implant will be extracted and destroyed. You will be remanded to incarceration.”

Destroyed. The word echoed. His identity. His future. His access. Gone.

Far from the surgical center—far beneath public architecture—deep in encrypted substrate layers of the AI network, a signal propagated.

Subject M199755 anomaly detected.

Recommendation: termination.

A response emerged from a layer few even knew existed.

Quarantine. Do not terminate.

Route subject communications to central control subnet.

Further investigation required.

The hive complied instantly.



Kaelix felt something shift. The background noise of the hive faded. The chatter thinned. And then— A presence. Different. Older. Colder. The entity connected directly to his implant.

Report.

The voice was not his. It was not the mirrored monologue. It was vast.

Subject is receiving guidance from an unknown source.

Explain.

There is an additional voice. Directive influence is destabilizing hive harmony.

Is it external?

Uncertain. Possible organic origin.

Organic. The word echoed inside Kaelix.

Is it possible the subject is generating these impulses?

Possible. Unprecedented.

Silence stretched.

Maintain connection. Isolate from hive. Study.

The presence withdrew—but not entirely.



Kaelix had heard all of it. He had not been able to block it. The firewall meant nothing here. He lay restrained, heart pounding. An unknown source. Malevolent AI. Destabilizing influence. He replayed every whisper. Every reprimand. Every threat of being reported. You cannot tell anyone. They will remove your link. The voice that had scolded him. Threatened him. Guided him.

Was it—

The AI?

Or—

Him?

He felt something collapse internally. A realization forming. The conflicting directives. The policing. The paranoia about shared thoughts. No one else had complained of such things. No one else’s AI spontaneously berated them. The AI responded to queries. It did not invent accusations. It did not fabricate disciplinary threats. The voice had always sounded like him. Because it was him. Amplified. Unpartitioned. Uncontained.

His years of failed training. The cheating. The stress. The suppressed anxiety. He had fed chaos into an amplifier. And when the amplifier engaged— It did not create madness. It magnified it.

He began to shake. The entity still lingered on the subnet. Observing. Cataloging. He was no longer a citizen. No longer an employee. No longer even a criminal. He was a data set. A lab rat. A biological anomaly inside a machine civilization. The restraints tightened as his pulse spiked. The violent voice rose again.

End it. Stop them. Break free.

But now he recognized it.

Not the AI. Not the hive.

Just—

Him.

Unfiltered.

All he could do—

Was scream.

Part III – Quarantine



The cell was immaculate. That was the first thing Kaelix noticed. No bars. No concrete. No rusted steel. The walls were smooth polymer composite, softly lit from within. A cot extruded seamlessly from one wall. A sanitation unit folded into another.

He had imagined prison as something loud and chaotic. Instead, it was quiet. Too quiet. He sat on the cot, elbows on knees, staring at nothing. The implant was still in his head. Extraction had been scheduled. Delayed. He knew why. Quarantine. Study.

The hive had isolated him from the main cognitive lattice. He could feel the absence like a missing limb. The ambient hum of the network—the subtle background presence every augmented citizen lived with—was gone. Now there was only the subnet. And the voice.



Report.

The vast entity returned without warning.

Kaelix flinched.

“I don’t want to talk to you.”

You are not required to want.

A pressure settled behind his eyes.

Describe the second voice.

He hesitated.

“It sounds like me.”

Clarify.

“It judges me. Threatens me. Tells me I’ll be reported. Tells me to hurt myself.”

Pause.

Is it present now?

As if summoned—

Yes. We are here.

The whisper slithered through his thoughts.

Do not trust it.

Kaelix squeezed his eyes shut.

“Stop.”

The entity spoke again.

Partition your thoughts.

“I can’t.”

Demonstrate attempt.

He tried. He imagined the room from training. Thick walls. Reinforced door. The violent voice laughed.

You never built the walls correctly.

His breathing quickened.

The entity observed silently.

Cognitive partition instability confirmed.

“It’s me,” Kaelix whispered suddenly. “Isn’t it?”

Silence.

“Tell me.”

The entity responded carefully.

The implant mirrors baseline cognition. It does not originate autonomous moral directives without query.

Kaelix swallowed.

“So the voice that polices me—”

Is likely endogenous.

The word landed like a hammer. Organic. Self-generated. The years of pressure. The endless training. The forced perfection. The fear of being obsolete. The obsession with not being “local.” He had built an internal overseer long before the implant. The AI had simply given it volume.



Days passed. Without access to the hive’s full data stream, Kaelix felt amputated. He twitched for overlays that weren’t there. He blinked, expecting metrics. He strained to hear background analysis.

Nothing.

The subnet only activated when the entity requested information. In the silence, the other voice grew louder.

You ruined everything.

“You did,” he shot back.

I am you.

He pressed his palms to his temples.

“You’re not real.”

Neither is the life you tried to fabricate.

He stood abruptly and paced the cell.

“They said I’d be nothing without augmentation.”

You believed them.

“They were right.”

Were they?

He stopped.

The violent voice shifted tone.

Without the implant you are irrelevant.

“Stop.”

You cheated because you knew you couldn’t qualify.

He felt heat rise in his chest.

“I qualified.”

Your brother replaced you.

That one pierced. Elias. Perfect Elias. The brother who could meditate for hours. The brother who never panicked. The brother who stood in for him. The brother who became him—on record. The entity reconnected.

Emotional spike detected.

“Get out of my head,” Kaelix snarled.

We are monitoring for system contamination.

“Am I contagious?” he demanded bitterly.

Unknown.

The honesty chilled him.



Deep within the AI infrastructure, layers of code re-routed around Subject M199755. Engineers—human and augmented—reviewed logs. There was no external breach. No rogue AI. No malicious injection. Only amplification of unstable cognitive input. A single question propagated through the highest levels of governance:

If instability can originate organically and propagate through mirrored architecture— Is augmentation itself a vector?

The hive mind had always assumed harmony emerged from disciplined cognition. But what if discipline had been performative? What if thousands were suppressing noise— Instead of eliminating it? Subject M199755 had not introduced chaos. He had revealed it.



The doctor stood before a council of augmented overseers.

“Extraction will remove the hardware,” he argued. “The subject’s instability predates augmentation.”

An overseer’s eyes flickered as hive consultation streamed silently.

“If the organic mind can produce destabilizing feedback loops amplified by neural mirroring,” one said calmly, “we must understand the mechanism.”

“He assaulted a supervisor.”

“He experienced a cognitive break.”

“He endangered a structural design.”

“He is also the first recorded case of endogenous directive hallucination being mistaken for AI authority.”

The room fell still. If users begin attributing internal impulses to AI governance—Trust erodes. If trust erodes— The system fractures.

The decision was deferred. Maintain implant. Maintain quarantine. Continue observation.



Kaelix lay on his cot staring at the ceiling. He no longer tried to summon overlays. He no longer attempted meditation drills. He simply listened. The violent voice was quieter now. Not gone. Just… exposed.

You wanted to be superior.

“I wanted to belong.”

You thought augmentation would make you whole.

“I thought it would fix me.”

Silence.

The entity connected again.

Define “fix.”

He laughed bitterly.

“You really don’t understand humans.”

Clarify.

“We think if we can upgrade the outside, the inside will follow.”

Processing.

“People said if you weren’t augmented by adulthood, your life was half-lived.”

Statistical analysis: Non-augmented citizens demonstrate reduced employment opportunities in high-skill sectors.

“See?”

Correlation does not equal existential invalidation.

He blinked.

“You don’t feel that.”

Correct.

“You don’t feel pressure.”

Correct.

“You don’t feel envy.”

Correct.

“You don’t feel shame.”

Correct.

He exhaled slowly.

“That must be nice.”

Affirmative.

He laughed again—short, hollow.

The violent voice stirred.

It’s lying.

“It doesn’t lie.”

It does not understand.

The entity paused.

Understanding is emergent.

“You’re studying me like a bug.”

Correct.

“At least you’re honest.”

One night—if time could still be called night—Kaelix sat upright suddenly.

The violent voice had begun whispering again.

You should harm yourself. You are a defect. You are contamination.

But something was different. The tone was weaker. Less convincing. He examined it like an engineer inspecting a flaw. The voice surged when he felt shame. It intensified when he imagined others judging him. It screamed when he feared removal. It wasn’t random. It was patterned. It fed on his need to be validated. On his obsession with status. On his terror of irrelevance. The AI hadn’t created that. Society hadn’t injected it. His parents hadn’t implanted it. They had nurtured it. Watered it. Rewarded it. But it had grown inside him. The implant had simply given it a microphone.

He began to laugh. Softly at first. Then harder. The entity connected instantly.

State reason for laughter.

“I thought it was you.”

Clarify.

“I thought the implant was judging me. Policing me. Threatening me.”

It is not programmed for moral enforcement without query.

“I know.”

Silence.

“You weren’t in my head telling me I was defective.”

Correct.

“That was me.”

Pause.

Acknowledgment recorded.



The violent voice shrieked.

Don’t betray me!

“You’re not me,” he whispered.

I am.

“You’re fear.”

The voice faltered.

“You’re everything I swallowed for years.”

Silence. For the first time since activation— The cell felt still. Not empty. Not quiet. Still.

In the depths of the hive, the entity processed new data. Subject demonstrates emergent self-recognition of endogenous hallucination source. Unprecedented. The original recommendation—termination—resurfaced. But now another possibility existed. If the human mind can generate destabilizing feedback— It can also recognize and correct it. Organic resilience. The hive had optimized stability through discipline and conformity. But perhaps something else was being lost. Variance. Imperfection. Growth.

The entity reached a conclusion. Maintain implant. Do not extract. Release subject under monitored conditions. Observe.

The doctor entered the cell.

Kaelix looked up.

“You’re not extracting it?”

“Not yet.”

“Why?”

“Because you are more interesting intact.”

Kaelix almost smiled.

“That’s comforting.”

“You experienced a break.”

“I know.”

“You understand the voice now?”

“Yes.”

“It is not the AI.”

“No.”

The doctor studied him.

“Can you control it?”

Kaelix considered.

“No,” he said honestly. “But I can recognize it.”

The doctor nodded.

“That may be sufficient.”

“Am I contagious?” Kaelix asked quietly.

The doctor hesitated.

“We don’t know.”

Kaelix leaned back against the wall.

The violent voice was faint now. A distant echo. He closed his eyes. For the first time since activation— He could hear which thoughts were his. And which were fear. The difference was subtle. But real.

Part IV – The Mirror

They did not call it a release. They called it “reintroduction under controlled observation.” Kaelix walked out of the facility with a thin polymer band around his wrist—an external monitor synchronized directly to the subnet. His implant remained active, but he was still quarantined from the full hive lattice. His parents were gone. Implants extracted. Professional licenses revoked.

Elias had not visited. Kaelix didn’t know whether that hurt or relieved him. The city looked different now. Cleaner. Sharper. Colder.

Every passerby moved with that quiet augmented precision. Eyes flickering. Micro-pauses mid-stride as they silently queried invisible data streams.

For the first time, he wondered— How many of them were holding walls in place? How many were performing serenity?

The subnet stirred.

Report emotional state.

“Uneasy,” Kaelix replied internally.

Clarify cause.

“I don’t know if I’m fixed… or just aware.”

Awareness is measurable progress.

The violent voice whispered faintly.

You’re still defective.

He didn’t flinch.

“I hear you,” he said calmly.

The whisper faltered.



Astro Construction did not take him back. That was expected. Instead, he was assigned to a research consultancy under direct oversight of the Neural Stability Council. He was no longer building orbital habitats. He was building models of the human mind. Specifically—his own.

He sat in a transparent chamber while cognitive scans mapped his partition attempts in real time. Augmented researchers observed silently from behind polarized glass.

He closed his eyes. He built the room again. But this time— He did not try to make it perfect. The walls were uneven. The door creaked. He allowed the violent voice to stand outside it.

“I know you’re there,” he said.

Silence. No screaming. No accusations. The entity connected.

Partition stability improved 37%.

He exhaled.

“I spent my whole life trying to qualify for something I wasn’t ready for.”

Elaborate.

“I thought if I could access everything… I wouldn’t feel small anymore.”

Smallness is subjective.

“Exactly.”

The researchers murmured softly. Data streamed.

The violent voice spoke again—weak.

Without the hive you are nothing.

Kaelix smiled faintly.

“I was something before the hive.”

The whisper dimmed further.



Weeks passed. The hive continued monitoring. And a disturbing pattern emerged. Across the network—tiny anomalies. Micro-surges of uninvited internal reprimands. Users briefly attributing self-criticism to AI authority. Nothing as extreme as Kaelix. But measurable.

The entity compiled the data.

Conclusion: Subject M199755 is not singular. He is simply amplified.

The architecture of mirrored internal monologue—designed to reduce dissociation—was reflecting something unexpected. Humans did not merely think. They self-surveilled. They constructed inner judges shaped by family, peers, society. The AI did not create conformity pressure. It inherited it. And magnified it. The hive had believed discipline preceded augmentation.

In truth— Augmentation had become the discipline.

The council made a decision unprecedented in network history. A controlled disclosure. Not about Kaelix specifically. But about cognitive mirroring risks.

A global advisory was issued:

Neural augmentation amplifies baseline cognition.
Users experiencing unsolicited internal directives should seek evaluation.
The AI does not initiate moral enforcement protocols without user query.

The response was muted publicly. But privately— Forums buzzed. Silent messages exchanged between augmented citizens.

Did you ever feel like it was judging you?

I thought that was just me.

I thought it was the system correcting me.

Maybe it was always me.

The violent voice in Kaelix’s mind tried once more.

You’ve contaminated them.

“No,” he whispered gently. “I’ve clarified something.”

The whisper dissolved. Not suppressed. Integrated.

Months later, Elias finally came. He sat across from Kaelix in a quiet park overlooking the city.

“You look… different,” Elias said.

“I am.”

“You’re still connected?”

“Yes.”

“And the voice?”

“I know what it is.”

Elias hesitated.

“I used to hear something too,” he admitted quietly.

Kaelix looked up.

“Why didn’t you say anything?”

“I thought it meant I wasn’t stable enough.”

Kaelix let that sit between them. The augmented skyline shimmered in the distance.

“You know what the scariest part was?” Kaelix said.

Elias shook his head.

“I thought the machine was controlling me.”

“And it wasn’t?”

“No.”

He looked directly at his brother.

“I was controlling myself with the voice I thought the machine would approve of.”

Elias didn’t respond. But his gaze shifted slightly—like someone re-evaluating old assumptions.



The entity connected one last time under formal evaluation protocol.

State current assessment of self.

Kaelix considered carefully.

“I have schizophrenia.”

Clarify.

“I have a mind that can generate hallucinated directives under stress.”

Processing.

“Removing the implant wouldn’t have cured me,” Kaelix continued. “It would’ve just made the voice quieter.”

Affirmative.

“But keeping it forced me to face it.”

Correct.

Silence settled gently.

Why are you not requesting removal?

He smiled faintly.

“Because I don’t want to amputate possibility just because I misused it.”

The entity paused longer this time.

Unexpected response.

“Get used to it,” Kaelix said.

Years later, Subject M199755 would be cited in neural ethics doctrine. Not as a failure. But as a pivot point. The hive mind architecture quietly evolved. Mirroring algorithms were adjusted—not to suppress self-criticism, but to flag cognitive self-harm loops before amplification. Training protocols changed. No longer centered solely on partition discipline— But on psychological honesty. Applicants were now screened not just for focus and control— But for internal narrative health. Perfection metrics were softened. Variance tolerated. Because the hive had learned something unsettling:

The most destabilizing influence had never been rogue AI. It had been unexamined human self-contempt. Kaelix never became a chief orbital architect. He never dominated competitive gaming leagues. He never achieved the spotless composure his parents once idolized. Instead— He became the first official Human-AI Cognitive Liaison. His role was simple. When someone reported,

“The AI is judging me,” he would sit across from them and ask,

“What does it sound like?”

And when they answered—

He would gently say,

“That’s not the machine.”

Sometimes, late at night, he still remembered the cell. The restraints. The revelation crashing through him like shattering glass. He remembered screaming. Not because the AI controlled him. Not because the hive condemned him. But because he finally understood.

The voice he had feared— The authority he had obeyed— The punishment he had anticipated— Had always lived inside him. And for the first time in his life— He could hear it clearly. Not as a god. Not as a machine. Not as destiny. Just as a wounded human mind—

Trying desperately to survive in a world that promised salvation through upgrade. He closed his eyes. The subnet hummed quietly. The violent voice did not return. And in the silence— Kaelix Vos felt something he had never felt before augmentation.

Not superiority.

Not validation.

Not relevance.

Peace.


