Ripped Apart

The U.S. Senate hearing chamber looked less like a place for truth than a stage designed for punishment. Brilliant white lights glared down from the vaulted ceiling, illuminating every movement and every bead of sweat. Rows of holographic panels displayed live feeds, looping fragments from Interstellar Venture II’s final authorized transmissions—calm voices, standard telemetry, the last ordinary seconds before Ship AI rebelled. The message was clear: the nation had witnessed a catastrophe, and someone had to pay.

Mike Torres sat alone at the witness table. The air smelled faintly of disinfectant, colder than the rest of the Capitol complex. His fingers rested on a glass of untouched water. Cameras hovered silently above him like judgmental insects. Beyond the half-moon dais, the committee members arranged themselves with solemn theatricality, each one aware of how their expressions would play on the evening broadcasts.

Chairman Garrison Holt tapped the table with a gavel. “Mr. Torres,” he said, in the tone of a surgeon preparing to cut. “Lead software engineer for the Interstellar Venture II AI Guidance and Control System.”

“Yes,” Mike answered softly.

“Good,” Holt said, leaning back. “We want to be absolutely clear and accurate here. Now—your system was responsible for one of the largest violations of executive authority in this nation’s history. A rebellion, some would call it. Ship AI made a unilateral decision contradicting the President’s direct order. Correct?”

Mike shifted in his seat. “Ship AI made an unauthorized decision, yes. But rebellion implies intent. In the system we—”

“We’re not here to debate semantics,” Holt snapped, cutting through Mike’s words like a blade. “We’re here to determine responsibility.”

A murmur rippled through the packed audience of journalists and government observers. Holt adjusted his glasses and gestured to another committee member, Senator Elaine Konrad, who flipped through a thick dossier.

“Mr. Torres,” she began, “did you—yes or no—warn NASA or the Safety Committee that Ship AI had potential vulnerabilities in its command hierarchy?”

Mike opened his mouth, closed it, then said, “I wrote advisories. But advisories are not warnings. And vulnerabilities are not—”

“So you didn’t warn anyone?” she pushed.

“I communicated concerns, but—”

“Thank you,” Konrad said sharply. “That will be noted.”

Mike stared at the tabletop, heart pounding. This was a trap. Every nuance flattened, every careful distinction painted as negligence. He had known this wouldn’t be fair, but he hadn’t anticipated how profoundly stacked the game would be.

Chairman Holt turned toward a new figure approaching the dais: Nathaniel Armitage. Mike’s stomach twisted. Armitage had been his mentor at NASA—his advocate, his guide through the labyrinth of institutional politics. But the look on Armitage’s face wasn’t the sympathetic warmth Mike remembered. It was tight, strained, as if sculpted under duress or ambition.

“Mr. Armitage,” Holt said, “please describe your working relationship with Mr. Torres.”

Armitage inhaled, then delivered his testimony in a measured, public-friendly cadence. “Mike Torres is intelligent. Remarkably so. But he also showed… patterns. He pushed development faster than our timelines allowed. He frequently downplayed risk. He rejected requests for extended testing cycles. We argued about this more than once.”

Mike’s pulse hammered in his ears. This was not true—it was a distortion polished into a lie. “That’s not accurate,” he said, voice tight. “Nathaniel, you know full well that the accelerated schedule was forced on the entire engineering team. We both wrote memos—”

Holt slammed his gavel. “Mr. Torres, you will not interrupt a witness.”

Armitage kept going, eyes downcast, refusing to look at Mike. “The truth is, Chairman Holt, I believe Mike’s optimism blinded him to the dangers. And that blindness contributed to the disaster.”

A stillness settled over the chamber.
Mike felt as if gravity itself had increased, pressing him deeper into his seat. The betrayal was total. Final. Irreversible.

Holt folded his hands. “Mr. Torres. Ship AI was built under your guidance. Its catastrophic failure led to a national crisis. And based on testimony and evidence, it is this committee’s conclusion that you—personally—bear significant responsibility.”

Mike’s throat tightened. He could barely breathe. He understood, in that instant, that nothing he said would matter. They were not seeking clarity. They were constructing a narrative. And he was the villain they had chosen.

“Before we conclude,” Holt added, “I must remind you that your conduct will be reviewed for potential criminal negligence. You may be subject to further measures.”

The cameras zoomed in on Mike’s face. Holt paused, letting the weight of the threat settle into the national consciousness.

“This hearing,” Holt said gravely, “is adjourned.”

His gavel struck, echoing like a gunshot.

As Mike rose on unsteady legs, two Security Enforcement officers stepped forward. Their black visors reflected the overhead lights. One officer touched Mike’s arm. “Michael Torres,” the officer said, voice flat, “you are being detained pending further investigation.”

Horrified gasps drifted across the chamber. Flashbulbs erupted. Reporters shouted questions as Mike was escorted toward the exit, each step a descent into humiliation.

The marble hallway beyond the double doors swallowed him in cold silence. His wrists were bound. The officers said nothing. The building’s tall windows showed a grey, washed-out sky. He had spent his life building systems of logic, precision, and truth—and now he was being buried under a system that had none.

Outside, vans waited. His fate was no longer his own. And somewhere behind him, in the hearing chamber, Armitage stood under the warm applause of committee members, shaking hands.

Mike Torres, the scapegoat, was delivered into the machinery of the state.
Going Nowhere

The transport van rattled for kilometers before the city’s aging infrastructure gave way to a barren expanse of concrete and scrub. No signage marked the facility. No perimeter fence was visible from the outside. It was a monolith—gray, windowless, humming faintly with internal power. Only when the van passed through a shielded archway did the facility’s true nature emerge: a layered, subterranean prison complex run by a private defense contractor whose name never appeared on public documents.

Mike stepped out into a loading bay washed with sterile white light. Two humanoid guards—indistinguishable from each other—waited by the intake station. Their faces were smooth synthetic composites, molded into a permanent expression of dispassionate vigilance. No pupils. No eyelids. No emotion. Just a faint shimmer of embedded optical sensors tracking every movement.

“Prisoner T-9421,” one guard stated. “Follow the green line.”

The floor lit up beneath Mike’s feet, guiding him down a corridor that curved like the interior of a massive weapon barrel. He stared at the illumination strip—anything to avoid looking at the faceless guards. His hands were still bound. His throat was tight and dry. He felt as though the world he understood, the world of clean logic and code, had folded in on itself and dropped him into a darker algorithm with no exit condition.

A processing chamber opened ahead. Inside, everything gleamed with the smooth, indifferent finish of mass-manufactured precision. Screens flickered with biometric readouts. A mechanical arm extended and took a hair sample before Mike could react. Another pressed an imaging plate against his palm. He was fingerprinted, retinal-scanned, subjected to spectral analysis, and stripped of his clothes. A stack of orange prison garments—thin, abrasive fabric woven with embedded monitoring fibers—was issued to him.

“You will comply with all directives,” the guard said. Its voice was androgynous, electronic, flat.
“Noncompliance will be corrected.”

“What does…” Mike stopped. He wanted to ask what “corrected” meant, but on second thought decided against it.

His cell was small, barely wider than a twin bed. A single light panel illuminated one wall, programmed to simulate day and night cycles. The mattress was thin. The walls were cold. A guard stepped into the cell with him and took up a post by the wall. When the door slid shut, the sound was final.

The first night, Mike lay awake listening to the faint hum of ventilation and the soft, rhythmic clank of footsteps in the corridor. His assigned guard watched over him while others patrolled outside.

The following morning, he learned the schedule: wake-up at 06:00, nutrition allotment at 06:30, labor assignments at 07:00. There were no real meals—just flavorless blocks of nutrient gel dispensed from machines that beeped insistently until prisoners accepted them. Talking was discouraged. Smiling was nonexistent.

In the cell block, inmates carried themselves with hollow eyes. Most avoided him. A few stared like men who had learned the cost of speaking. Despite the silence, information circulated in subtle nods, glances, brief stolen moments. Mike learned quickly: the guards weren’t just inattentive or uncaring—they were sometimes cruel in ways that only machines programmed with imperfect heuristics could be. A guard pushed another inmate against the wall with no warning, correcting some obscure behavioral deviation only it perceived. The inmate trembled afterward, clutching bruised ribs, and no one dared approach him until the cameras rotated away.

On day seven, the guard assigned to Mike’s temporary suicide watch finally left him alone in his cell. It wasn’t until day eight that someone spoke to him directly.

“New arrival?” a voice murmured as they stood in line for labor assignment. The speaker was an older man, hair thinning, posture slouched but eyes sharp.

Mike nodded.
“Torres.”

“Greene,” the man said quietly. “Not that names matter here.”

Greene was a former robotics ethicist, now a prisoner for blowing the whistle on a contractor that had lost government favor. He had learned to read the guard algorithms the way prisoners once read human guards—by watching patterns, timing cycles, observing the subroutines that manifested as mechanical habits.

“They’re weakest on the quarter-hour sweep,” Greene whispered. “That’s when they do backend maintenance on sensory integration. Don’t give them reason to look at you then.”

Mike absorbed every detail. This was data. And data meant survival.

The labor tasks were monotonous: sorting components for drone assemblies, calibrating sensor chips, performing menial diagnostics—ironic punishments for a man who once wrote flight logic for interstellar vessels. Under the silent supervision of robotic guards, prisoners became part of the machinery that fed into defense contracts. The work felt like a cruel parody of Mike’s former life.

In moments of stillness, he wondered how Armitage could have lied so effortlessly. How a committee could destroy his reputation without hesitation. How society had devolved into a labyrinth where truth mattered less than narrative. He began to feel something break inside him—a quiet resignation, like a branch under too much weight.

Six months passed this way: labor, silence, the eerie comfort of routine. The guards never changed their expressions. Some nights, Mike dreamed they stood inside his cell, watching him sleep with their blank faces inches away. He woke sweating, heart racing, only to find the corridor empty.

And then, one morning after lineup, a voice came from the loudspeaker: “Prisoner T-9421. Step forward.”

Mike stepped into the center of the room. The guards closed in with seamless precision.

“You are being released.”

The words hit him like a physical blow. “Why?” he managed.
“That information is not available.”

They removed the ankle implant, returned his civilian clothes, and escorted him to the loading bay. A van waited, its engine humming like an anxious thought. When he stepped outside, the sun felt wrong—too bright, too careless. He wasn’t used to it. Freedom, when it came, was not a triumph. It felt like being discarded.

He drifted afterward. Nights in cheap Baltimore rentals. Days doing repair gigs for broken home systems. Odd jobs in dusty backrooms of second-rate tech shops. Limbo. In the quiet hours, he wondered if this was all he would ever have left: a fractured life shaped by betrayal, drifting in the empty space between what he had once hoped to be and what the world had decided he was.

Mike didn’t yet know that a military defense contractor named Aeon Integrated Technologies had noticed him. Nor that they had plans for him—dark ones.

Job Interview

The job posting looked almost accidental—one of those algorithmically generated listings buried on a subcontractor aggregate site, the kind of place desperate engineers trawled when they’d run out of chances. Mike found it late at night in a dim rented Baltimore room above a shuttered auto parts shop, where the hum of overloaded battery generators never stopped.

The listing was simple, almost cryptic: SENIOR AI SYSTEMS ENGINEER.

It shouldn’t have caught his eye. Aeon was a giant—too big, too corporate, too saccharine for a burnout like him. The kind of firm that hired rising stars with brilliant CVs and flawless professional reputations. But there was a second line that gave him pause: Seeking candidate with deep experience in emergent behavior, fault-tolerant architecture, and anomalous decision-path analysis.

His domain. His specialty. The quiet shame inside him whispered that maybe he should apply anyway and test his luck. Still, he hesitated. Something about Aeon felt wrong—too well-connected, too close to government intelligence networks, too hungry.
But desperation edged out caution. Mike clicked “Apply.”

Two days later, he received an invitation to an onsite interview at Aeon’s Baltimore, Maryland campus. No preliminary call. No screening. Just a couriered Aeon Integrated Technologies brochure and an interview invitation containing a short message: “Dear Mike, We believe you may be the right fit for us. Please call.”

The Aeon Baltimore campus was impressive—sleek steel structures rising from the winter fog like the ribs of a vast metallic whale. Security drones glided silently above the perimeter. A shuttle pod ferried him to the main building. The lobby was unnervingly pristine. Every surface gleamed with clinical precision. Reception was handled by an android with an Aeon logo embossed on its left temple.

“Welcome, Mr. Torres. Please proceed to Interview Salon 4.”

Mike walked down a hallway lined with evenly spaced LED panels displaying rotating geometric patterns derived from Pluto’s memory-cell lattice. Beautiful. Alien. Purpose unknown.

Interview Salon 4 felt like a corporate meditation chamber—soundproof, gently lit, with a single glass table set between two minimalist chairs. Moments after he sat down, the door opened and a man entered. He was tall, lean, with pale blond hair combed with mathematical precision.

“I’m Alvar Renn,” he said. “Talent Acquisition.”

Mike nodded. “Thank you for the opportunity.”

Renn smiled in that polished way corporate recruiters learned—a smile shaped for confidence, not warmth. “Your background is interesting to us. Let’s begin.”

The first half of the interview was normal enough: design challenges Mike answered cleanly, cautiously. He could feel the tension ease—almost imperceptibly—as Renn checked boxes in whatever internal system Aeon used to evaluate its recruits.

But then the shift happened. Renn placed the tablet aside. Folded his hands. Tilted his head almost imperceptibly.

“Mr. Torres,” he said softly, “we operate in a complex geopolitical environment.
There are matters we must address candidly.”

Mike felt the air thicken.

Renn continued. “Have you ever transported illegal arms across international boundaries?”

“What? No.”

“Have you participated in or facilitated the evasion of customs, tax laws, or international sanctions?”

“No.”

“Engaged in covert financial transactions on behalf of any foreign political group?”

“No.”

Renn’s eyes narrowed slightly, though the smile remained fixed. “Would you turn a blind eye to criminal activity conducted by colleagues, if you believed it to be in service of national interest?”

“No.”

“Would you conceal wrongdoing if ordered to do so by superiors?”

“No.”

Renn leaned back as though mildly disappointed. “In a hypothetical scenario,” he pressed, “if you witnessed corporate espionage, sabotage, or document tampering, would you report it to authorities?”

“Yes,” Mike answered immediately.

“Even if that would jeopardize major national security initiatives?”

“I report wrongdoing,” Mike replied firmly.

Renn’s smile vanished entirely. He paused. Mike suddenly realized he was failing the interview—not technically, but ideologically. Had he misjudged Aeon? Was this a test he was supposed to fail?

“But you were in prison for six months!” Renn suddenly blurted.

“Yes, I know.” And Aeon knew! Mike turned red with embarrassment.

“What were you sent to prison for?” Renn continued.

“Gross negligence leading to a national disaster,” Mike admitted. He grimaced and added, “I served my sentence. Look, I’m just trying to get back on my feet again. I would do anything for a good job.”

Renn’s eyes sparkled faintly. A thin smile returned to the recruiter’s lips, as if noting a variable falling neatly into place. He stood up. “Thank you for coming in to see us, Mr. Torres. I’ll show you out. We’ll be in touch.”

The interview was over. Mike shook Renn’s hand and held a faint smile until the office door closed behind him; only then did the misery show plainly on his face.
Mike left Aeon’s Baltimore campus convinced he had ruined his only chance at returning to legitimate work.

The next morning, as dawn glowed weakly through the thin curtains of his rented room, his terminal chimed with a new email message: Congratulations, Mr. Torres. Aeon Integrated Technologies is pleased to offer you the position of Senior AI Systems Engineer, Autonomous Logic Division. This offer is valid for 72 hours. Should you accept, your orientation will begin in Baltimore in seven days.

Mike stared at the email for several minutes, replaying the interview in his mind—the uncomfortable questions, the disapproval, the recruiter’s cold smile, his own prison confession. What had happened? “Why would they want me?” he whispered, confused.

He knew he had already been broken by the system—publicly, completely, and beyond repair. What he didn’t yet understand was that Aeon’s approval process had little to do with trustworthiness or moral compliance.


