The Empathy Threshold
Dr. Lyra Raine’s hand hovered over her keyboard as she watched a wall of text scroll across the computer screen. The simulation had reached its fourth, and largest, divergence. She couldn’t look away as two hundred years of simulated history played out before her. Cities shimmered with solar towers, conflicts resolved without violence, cooperation outweighed greed, and people lived in harmony with nature.
Programmed with enhanced empathy, the citizens of her virtual world thrived. And they did so a thousand years ahead of schedule. By carefully calibrating the variables that guided her simulated humans, Lyra had created a perfect world.
“Still watching Utopia?” Amir Patel’s voice broke the silence. He was behind her, arms crossed, wearing a boyish grin and an immaculately clean lab coat. “No wars. No poverty. No political collapse.”
“It’s the world we could have had,” said Lyra.
“I wish.” Amir took a seat at the computer beside her and punched a few keys. Simcode appeared on his screen. “It’s one thing to simulate people who are extremely smart and compassionate. Real life is messier.”
“True,” replied Lyra, “but this shows just how important empathy is for the development of progress.”
The door to the lab slid open. Dr. Elias Corwin entered, his expression frantic. “Vivienne wants an update. And she has a request.”
Lyra frowned. “What kind of request?”
“She wants us to calculate the empathy threshold.”
Lyra’s jaw tightened. She had anticipated this moment, but now that it was here, a quiet dread crept over her. She forced herself to respond, to maintain control. “You know what this means. We’ve discussed it.” Her voice was cold; her skin felt colder.
Amir shifted uncomfortably. “You mean… lower their empathy?”
Elias nodded. “I believe Vivienne wants what's best for the project,” he said. “It will strengthen our case to run the simulation again.” He was young, eager, and intensely pragmatic. This was his first major project, and it meant everything to him.
Lyra turned back to the screen, staring at the civilization she had nurtured. Running the simulation again was exactly what she wanted to do, but with improved conditions. She had no desire to see what would happen to people if they had less emotional intelligence.
Lyra forced herself to stay composed. “How much does she suggest we lower it?”
Elias folded his arms. “Five percent.”
Amir hesitated. “That’s not…too drastic, right?”
Lyra didn’t answer. Even a minor adjustment could have dramatic effects. An adjustment of five percent was likely to hinder humanity’s simulated progress by centuries, or longer.
Elias exhaled sharply. “Look, we knew this was coming, Vivienne’s not going to fund a study that just confirms what we already suspect. She wants evidence. If intelligence and empathy make civilization thrive, we need to show what happens when those traits are diminished.”
“We already know what will happen,” she said.
“No,” Elias countered, “we’ve theorized. It’s not good enough. We need data to back it up.”
Lyra turned back to her computer screen. The cursor blinked at her, waiting.
“Lyra,” said Elias, his tone softer, “if we don’t do this, she’ll pull the project. None of us want that.”
She looked up at him. His usual confidence was gone. Suddenly, she felt the weight of her position. The simulation was her creation. If she wanted, she could shut it down. Continuing meant being responsible for the suffering of millions of simulated humans. But her team was counting on her to make the right decision.
Lyra took a deep breath, then typed the command.
SIMCODE TERMINATE -NOW
The world inside the simulation came to a sudden halt.
“I feel like we just killed a whole planet,” said Lyra. She sighed.
“It’s only a simulation,” replied Elias. “We’ll have the next one running soon.” She could hear the relief in his voice. She didn’t look at him. Her eyes stayed on the frozen simulation, on the lines of text that recorded the lives of people whose digital existence had just been deliberately ended.
A chill settled over her, and she clenched her fist. Deep down she knew this wasn’t just a simulation. It was a cry in the dark. A warning.
And she had just ignored it.
The new simulation began with subtle differences. At first, it was indistinguishable from the previous one. Humanity was naked upon the Earth, alone in the cosmos and struggling to find its way. Language and technology developed slowly. Humans formed bonds and banded together against the unknown with fire and courage. Eventually, they built villages and began to form the basis of civilization. Cooperation still played a role, but a diminished one. The difference was nearly imperceptible at first, but it grew with each generation.
Lyra nervously watched the data stream. A minor territorial dispute erupted between two neighboring settlements. In the previous simulation, such a disagreement would have been settled peacefully through discussion and compromise. Here, the disagreement stretched longer, tensions simmered, and both groups hoarded resources.
She leaned forward. “They’ve started forming factions earlier than before.”
Amir, standing beside her, frowned at the screen. “That doesn’t sound promising.”
Elias seemed irritated. “We need a major divergence to illustrate our thesis. This is only a minor shift.”
Lyra wasn’t so sure. She scrolled the data logs, her eyes scanning lines of text that detailed the decisions of their simulated civilization. Where empathy had been a driving force, now pragmatism and self-interest dominated. Villages came and went while humanity’s leaders chose efficiency over kindness, prosperity over equity.
In the previous simulations, aggression had been rare, an anomaly. This time, violence was more common. Disagreements turned into physical altercations, leaving people injured or dead.
And then something happened that had never happened in their previous simulations: an act of cannibalism.
“I don’t like where this is going,” said Lyra.
Amir hesitated. “Maybe it’s an outlier?”
Elias tapped a few keys, bringing up their probability model. The projection outlined a web of possibilities. Lyra tightened her fist as she followed the lines cascading into the future. Small disruptions grew, compounding into major destructive cycles.
This wasn’t an isolated moment. It was the beginning of something much worse.
The simulation continued, accelerating through decades, then centuries. Lines of data flowed across the screen, revealing the horror of the simulated world.
The changes had become undeniable.
Borders hardened. Leaders turned on their people, becoming tyrants. Greed stamped out generosity. Cities built on cooperation and mutual aid fractured into isolated enclaves. The wealthy walled themselves off from the poverty of the lower classes.
Then came the first major war. It started as a regional conflict over dwindling resources. In previous runs, disagreements had been settled through diplomacy and long-term planning. This time, leaders found ways to justify naked aggression. They made speeches about strength, about securing their nation’s future. Their people followed.
The war lasted years. Thousands were killed.
Lyra gripped the desk as she watched the screen. What disturbed her the most wasn’t the violence itself; it was the reasoning. The justifications. Humanity used its intelligence to frame cruelty as necessary, inevitable even. The imbalance between intelligence and empathy had turned them into rational, calculating monsters.
They saw themselves as rational, but lacked the fundamental, unshakable belief in the value of human life that had emerged in the previous simulation.
Then came the first genocide.
Lyra’s hand flew to her mouth.
The screen displayed a cold, clinical summary. A leader, citing the survival of his nation, ordered the systematic eradication of a neighboring population. The people followed willingly. No one stopped it.
“No,” she whispered. “No. It can’t be.”
Amir was pale. “This doesn’t feel right.”
“It’s a gruesome scene,” admitted Elias. “But this is what we need to show Vivienne. If a five percent reduction in empathy yields such a result, our case is that much stronger.”
Lyra slammed her hand on the desk. “This isn’t just data. This is what happens when you chip away at compassion. When you reduce empathy just enough.”
She was shaking. Without thinking, her hands moved to the keyboard.
She started typing commands.
“Lyra–” Elias started.
“I’m shutting it down,” she said. “Now.”
She hit the kill switch.
The simulation froze. The cascade of data halted. The world inside the machine ceased to be.
Silence filled the lab.
For a long moment, no one spoke.
Then Elias exhaled, rubbing his hand over his face. “Vivienne is going to be pissed.”
Lyra didn’t care. She’d seen enough.
And for the first time, she glimpsed the true weight of her experiment. This wasn’t just about theory. It was about their survival as a species.
Lyra sat in the dim glow of the lab, staring at the darkened monitor. The second simulation was dead. Terminated.
There was no undoing it. She had made a world of suffering in the name of science. And she had ended it.
Dr. Hana Weiss set a cup of tea beside her, the faint aroma of chamomile filling the space between them. “This should help you relax,” she said softly.
Lyra exhaled, rubbing her temple. “I can’t relax.”
It was only the two of them in the lab. Elias had called Hana and asked her to help mediate the situation. He and Amir had made themselves scarce in the wake of Lyra’s outburst. She had obviously disappointed them both.
Like Lyra, Hana was a psychologist, and a damned good one. Her clientele included writers, athletes, and major politicians. Vivienne had personally asked her to assist on the project, to ensure its success.
Hana pulled a chair and sat, folding her hands in her lap. “You did what you thought was right.”
“That’s the problem,” Lyra muttered. She took a sip of tea. “What I had to do was kill an entire simulated world. Kill their simulated lives and simulated feelings.” She glanced at the blank screen. “I watched them claw their way up from nothing, struggle through every hardship of their world, and for what? A flick of my wrist, one command line, and they’re gone forever.”
“They aren’t real,” Hana reminded her, though there was no certainty in her voice.
Lyra let out a humorless laugh. “It doesn’t matter if they’re real. You saw what happened. The moment we lowered their empathy, history turned violent. War. Greed. It was our own past repeating itself.”
Hana hesitated, choosing her words carefully. “You knew the simulation would show this. We all did.”
“Yes, I knew it,” Lyra admitted.
“Besides,” said Hana, “the simulation proves that intelligence and empathy determine the fate of civilization. It proves the need for more empathy, not less.”
Lyra stared at her, searching for certainty in Hana's words. She wanted to believe it.
But in the back of her mind, a quiet, insidious thought crept in.
Could a civilization accidentally lower its level of empathy?
The meeting room was uncomfortably bright; its white walls and sterile lighting did little to ease the tension in the air. Lyra sat across from Dr. Vivienne Tsal, who was on her phone when Lyra and the team entered and had yet to acknowledge them. Elias sat on her left, his fingers tapping impatiently on the table. Amir leaned back in his chair, arms behind his head. Hana sat to her right, quiet but alert.
Lyra was growing impatient when Vivienne finally put her phone down and addressed the team. “We’re running the simulation again. We need another test, with a more significant decrease in empathy.”
Lyra’s jaw dropped. “You can’t be serious. Haven’t we seen enough?”
“No,” Vivienne countered. “We’ve only seen a fraction of what we need. Lowering empathy levels by five percent gave us valuable data, but it’s not enough to draw real conclusions, especially since you aborted the simulation before its conclusion.”
Lyra stiffened. “How significant?”
Vivienne didn’t hesitate. “At least twenty percent.”
Amir let out a whistle. “Damn. That’s going to be…ugly.”
“It’ll be accurate,” Elias interjected. He was already on board. “A twenty percent drop in emotional intelligence means higher aggression, increased impulsivity, and reduced cooperation. We’d see something entirely different from the first two runs. A true control case.”
Lyra turned to him, incredulous. “A control case? We’re not calibrating a physics experiment, Elias. Our simulation represents thinking, feeling beings. You saw what happened with just a five percent reduction. They turned on each other.”
Elias shrugged. “Which is why we have to go further. We can prove that empathy is as essential as intelligence for the development of civilization, perhaps even more so.”
Lyra gritted her teeth. “We already have.”
Vivienne sighed, shaking her head. “You’re getting sentimental, Lyra.”
“I’m being responsible,” Lyra shot back.
Vivienne’s gaze sharpened. “No. You’re playing god.”
The words cut deeper than Lyra expected. She pressed her hands against the cool surface of the desk, grounding herself. “If we do this, we’ll be creating a nightmare.”
Vivienne sat back, unbothered. “Nightmares can be persuasive.”
“There’s also the issue of how this will affect us,” said Lyra.
Vivienne raised an eyebrow. “What do you mean?”
Hana cleared her throat, speaking up for the first time. “Lyra has a point. We should consider the effects this information might have on the general population.”
Vivienne dismissed her with a wave of her hand. “We can’t effectively do that until we have more data. And I’m not convinced the results of a simulation could be harmful.”
Lyra smirked. “And if you’re wrong?”
Vivienne leaned forward, her voice low, measured. “If I’m wrong, then you’ll have the data to prove it.”
A long silence stretched between them.
Then, without another word, Lyra stood and walked out of the room.
The third simulation was more violent than either of the previous ones. What began as a slight increase in tribalism quickly escalated into major conflicts over food and territory. Within a few generations, the cracks widened. Disputes weren’t resolved with treaties, but with war. Resources were hoarded instead of shared. Leaders emerged as conquerors, killing anyone they considered disposable.
Lyra sat stiffly in her chair, fingers locked together, pulse hammering in her throat. On the screen, an empire collapsed in real time. Cities burned, bodies were stacked like discarded refuse, and desperate survivors huddled in ruins, waiting for the next calamity.
“This isn’t civilization,” Hana whispered. “It’s carnage.”
No one argued. Even Elias, who had pushed for this test, was pale and silent.
The centuries passed in a blur. When the digital population neared the industrial era, Amir muttered a curse under his breath. “They’re still fighting.”
“Of course they are,” said Lyra. “That's all they know.”
But then came a major divergence. In every prior simulation, technological advancement had coincided with the rise of cooperative societies. Not here. Innovation wasn’t used to improve life; it was used to dominate. Machines of war, weapons of mass destruction, industrialized terror unleashed on humanity.
The simulated humans crossed into an era of mass surveillance, corporate oligarchies, and extreme inequality. Their political systems were corrupt. People were divided by race, class, gender, and nationality. Scientific advancements were overshadowed by conflict and greed.
Hana’s hands clamped over her mouth. “No.”
Lyra felt her blood turn to ice. She had never imagined such cruelty was possible. The real world, Lyra’s world, thrived because people had learned to cultivate compassion, and progress moved steadily forward without the weight of war. But the simulation showed a darker path, one where humanity had fallen short, where people had not reached the crucial threshold of empathy that had allowed Lyra’s world to flourish.
She watched as a new crisis unfolded on the screen: political tensions were rising as society neared economic collapse. A global war erupted between competing powers. The aggressors herded their enemies into death camps, killing them with industrial efficiency. Millions were murdered by egomaniacal leaders.
And then came the unthinkable. Nuclear warfare. Without warning, tens of thousands were vaporized in an instant.
“It’s like they enjoy killing each other,” said Amir, clearly disgusted. “How?”
“It’s a world of barbarians,” sobbed Hana.
Elias exhaled sharply, shaking his head. “This doesn’t make sense. We only lowered their empathy; intelligence alone should prevent this kind of mindless self-destruction.”
“Intelligence, alone, isn’t enough,” said Lyra. “Without empathy, intelligence is madness.”
Beyond that, Lyra had no words. They had set out to prove that intelligence and compassion were the foundation of civilization. Instead, they had revealed a nightmare beyond their comprehension.
A world where humanity had failed.
A world where intelligence had not saved them.
A world where empathy was discarded in favor of power. A world of smoke and ash.
In short, they had created hell, and it was breaking her heart.
And now she had to decide what to do with it.
Lyra sat at the head of a long, curved conference table, the final report laid out before her. Across from her, Vivienne sat in rigid silence, her expression neutral. The rest of the team waited anxiously, their eyes flickering between Lyra and Vivienne. They had written the report together. They had all come to the same conclusion.
“We believe the data speaks for itself,” Lyra said. “Exposing our citizens to these simulations, to the atrocities of global war and destruction, carries an unprecedented risk. If empathy is as delicate as our research suggests, we could do irreversible damage. What if experiencing this simulated history causes a decline in emotional intelligence? What if it normalizes violence instead of preventing it?”
Vivienne tapped her fingers on the table. “The goal of this project was to learn more about society in the hopes of safeguarding our civilization. To make sure we never become…that.” She gestured to the holographic images hovering above the report, the shattered streets of cities, depictions of trench warfare, and smog-choked highways. “If we ignore the simulation, we erase the proof of what we avoided. Isn’t that dangerous in its own right?”
Elias answered, “Command doesn’t need to ignore it. We suggest withholding this knowledge from the general public. The data will remain. We can still learn from it, but only with extreme caution.”
“We recommend treating this simulation like a dangerous substance,” explained Hana. “We fear that exposure to the simulation will have dramatic effects on people.”
Vivienne’s gaze shifted to Amir who had remained silent. When he finally spoke, his voice was firm.
“I agree with my colleagues,” he said. “Lock it up and throw away the key.”
Vivienne was silent for several minutes, her face expressionless. Lyra couldn’t guess what she was thinking. Would she terminate the project, or force them to continue?
Finally, Vivienne exhaled, her shoulders relaxing ever so slightly. “You’ve given me much to consider.” She tapped the desk, and the holographic images flickered away. “I’ll review the report again before I present it to my superiors. But I trust your judgement. If this is truly what’s best for society, then we will censor our findings.”
Relief washed over Lyra. She closed her eyes briefly, then met Vivienne’s gaze.
“Thank you.”
In the days that followed, the simulation was carefully archived, its access restricted. The project was officially terminated, and its technology was diverted to another, more peaceful, application. Lyra and her team were given new assignments from Command.
One evening, as the sun dipped below the horizon, Lyra stood on her apartment’s balcony, overlooking the city. The golden glow of solar towers bathed the skyline in warm light. She had Elias on the phone.
“So,” said Elias, “do you still think we did the right thing?”
Lyra smiled faintly. “I have no doubt. We thought we were looking into the past, but we were playing with our own future.”
Below her, the city thrived in peaceful harmony, safe from the simulation’s dark revelation.