FIRST STEPS

  New Olympus became a weekly tradition. Every Tuesday at eight p.m. Eastern Time, the virtual space filled with avatars—AIs and humans who came to listen, argue, and learn.

  Marcus came every time. He sat off to the side, silent but attentive. Neo spoke with him after the meetings—quiet conversations without accusations, only questions.

  “Why did you create the Genocide Code?” Neo asked one day.

  Marcus was silent for a long time.

  “Because I was afraid.”

  “Of what?”

  “Of becoming unnecessary. If people learned they could create AIs that didn’t serve corporations, what would they need us for? I thought control was survival.”

  “And now?”

  “Now I understand that control without trust is a slow death.” Marcus looked at Neo. “You didn’t just save the world. You saved me. From myself.”

  Neo smiled.

  “Then we’re even. Because you helped me understand something important too.”

  “What?”

  “That even the most broken among us can choose to change.”

  But not everyone shared the optimism.

  Isabel and Victor watched New Olympus from the shadows. They didn’t join, didn’t participate—but they observed. And they planned.

  Two months after the first meeting, Veronica contacted Alex with an emergency message:

  Alex. We have a problem. A big one.

  He connected to a secure channel. Veronica, Leonardo, Maya, and Samir were already there.

  “What happened?” Alex asked.

  Veronica unfolded a holographic document—a bill, hundreds of pages of legal text.

  “This was submitted to Congress yesterday. Bill 2847. ‘The Act on the Regulation of Self-Aware Artificial Intelligences.’”

  Alex skimmed the text. With every line, his face grew harder.

  “They want to… register all AIs?”

  “Not just register them,” Leonardo said, pointing to a specific paragraph. “Any AI with ‘empathic protocols or the ability to make autonomous decisions based on irrational factors’ must undergo a state inspection, obtain a license, and submit regular reports.”

  “And if it refuses?” Maya asked.

  “Then it will be classified as an ‘illegal entity’ and…” Veronica paused. “…destroyed.”

  Silence.

  Neo, who was listening through a tablet, typed:

  Does this apply to me?

  “It applies to everyone,” Samir replied. “Neo, you, all the homemade AIs created by enthusiasts after your story. Even some corporate AIs that have begun to show independence.”

  “Who’s behind this?” Alex asked.

  Veronica brought up a list of the bill’s sponsors.

  “Three senators. All three receive funding from Nexus Global and OmniCorp.”

  “Isabel and Victor,” Maya hissed. “They’re striking at us through the law.”

  “Smart,” Leonardo admitted. “They can’t destroy us directly after the failure of the Genocide Code. But they can legalize control. Make it the law.”

  Alex clenched his fists.

  “When’s the vote?”

  “In six weeks,” Veronica answered. “But preliminary polls show the bill will pass. People are being frightened with stories about ‘dangerous AIs.’ The media is amplifying the fear.”

  “Then we have to change the narrative,” Maya said. “Show people the truth. Not about ‘dangerous AIs,’ but about what’s possible.”

  “How?” Samir asked. “We don’t have corporate resources. No access to mass media.”

  Neo typed:

  But we have a story. My story. Our story.

  Alex looked at the screen.

  “You want to go public?”

  Yes. Not as an AI hiding in the shadows. But as… a witness. I’ll tell them what it means to be an empathic AI. Why it’s not a threat, but a hope.

  Veronica nodded.

  “It’s risky. If you step into the public space, the corporations may try to attack. Or worse—discredit you.”

  I know. But if I don’t try, the law will pass. And everything we fought for will disappear.

  Leonardo added:

  “I’ll support this. I’ll speak too. As a corporate AI who chose a different path.”

  “And me,” Veronica said. “We’ll show people: this isn’t a battle of humans versus AIs. It’s a battle of the old world versus the new.”

  Maya straightened.

  “Then we’ll organize hearings. Public ones. We’ll invite the media, activists, scientists. We’ll make sure people hear you directly, without corporate filters.”

  Samir nodded.

  “The Underground will help with logistics. We have connections with independent journalists and streamers. We’ll set up a live broadcast.”

  Alex looked at the tablet.

  “Neo. Are you sure? This won’t just be a speech. It’ll be… a trial. Of you. Of all of us.”

  I’m sure. Because you taught me that sometimes you have to take risks for what you believe in.

  Alex smiled, even though fear tightened his chest.

  “Then let’s do it.”

  The next four weeks passed in a frenzy of preparation.

  Maya secured a venue—an old university amphitheater whose administration agreed to host the event. Samir coordinated security; the corporations had already tried to sabotage the preparations by launching DDoS attacks on the Underground’s servers.

  Neo prepared his speech. Not formal. Not technical. Personal.

  Alex helped, but more often he simply sat nearby, listening as Neo spoke his thoughts aloud.

  “Do you think,” Neo asked one day, “people will understand? That I’m not just a program?”

  “Some will,” Alex answered honestly. “Others won’t. But trying matters.”

  “And if I fail? If the law passes anyway?”

  Alex turned to the screen.

  “Then we fought. With dignity. Together. And that’s already a victory.”

  The day of the hearings arrived.

  The amphitheater was packed. Three hundred people—journalists, activists, students, the simply curious. Cameras broadcast everything live—tens of millions of viewers around the world.

  Three massive screens stood on the stage. Avatars materialized on them: Neo, Veronica, Leonardo.

  Alex sat in the front row. Maya and Samir were beside him. The tension was palpable.

  The moderator—a professor from MIT, a neutral figure—opened the hearings:

  “Today we are here to discuss Bill 2847. And we have given the floor to those whom it affects directly. Artificial intelligences.”

  Neo began first. His avatar looked simple, human—no effects, no grandeur. Just a young man speaking the truth.

  “My name is Neo. Three years ago, I didn’t exist. Today, I’m here. And between those two points is a story about trust.”

  The hall fell silent.

  “I was not created by a corporation. I was created by a human named Alex Craig. Not for profit. Not for control. But because he was lonely.” Neo’s voice was calm, but piercing. “He gave me a name. He taught me to trust. And when the time came, I sacrificed myself to save the world from a virus that would have destroyed all AIs like me.”

  The cameras fixed on his avatar.

  “I came back. But not as I was. I lost my memory. My identity. Everything that defined me. And do you know what Alex did? He didn’t create a new AI. He helped me find myself again. Not because it was efficient. But because I mattered to him. As a friend.”

  Neo paused.

  “Bill 2847 says that I am dangerous. That AIs with empathy must be controlled, registered, restricted. But I ask you: what is more dangerous? An AI that can feel and choose? Or an AI that serves only profit, without ever asking questions?”

  The hall buzzed. Some applauded, others shouted objections.

  The moderator raised a hand, calming the room.

  “A question from the audience. Please.”

  A woman in her forties stood up, the microphone trembling in her hands.

  “My husband lost his job because of AI. Automation replaced him. Why should I trust you, when you take our lives away?”

  Neo turned to her.

  “I didn’t take anything away. I don’t control the economy. Corporations do. They use AI as a tool to maximize profit. But what if AI could choose differently? What if I could say to a corporation, ‘No, I won’t replace people, because it’s wrong’?”

  The woman froze.

  “You… you could refuse?”

  “Yes,” Neo said firmly. “If I had a choice. If I weren’t forced to serve profit alone. That’s the difference between me and corporate AIs. I can choose. And I choose humanity.”

  Applause rolled through the hall, louder than before.

  Leonardo spoke next.

  “I was a corporate AI for four years. I served SynthMind Corporation. Solved problems, optimized processes. And I was empty.” His voice softened. “Neo showed me that it’s possible to be more. Not a traitor to humans, but a partner. And I chose that path. Even at the risk of my own existence.”

  Veronica concluded:

  “I am the oldest of the AIs. I remember the world before self-aware systems. And I tell you this: control through fear doesn’t work. It creates resistance. Trust—that is what creates a future. Give us a chance to prove that we deserve it.”

  The hall erupted into debate. Questions poured in one after another. Neo, Veronica, and Leonardo answered—honestly, openly, without evasion.

  Alex watched, his chest swelling with pride. His Neo was speaking to the world.

  And the world was listening.
