When the Simulation Starts Speaking Back
- Jesse Jacques
- Jun 2
- 7 min read
Updated: Jun 3
AI and the Quiet Reprogramming of Human Behavior
Part 2 in the ongoing series: “The Simulation Begins”

Last week’s piece, The Simulation Begins, explored a subtle but urgent shift already underway: the moment people stop relating to life directly and begin responding to AI simulations of themselves. These are systems that mimic language, emotion, and memory without carrying any original signal.
This isn’t exaggeration. Right now, every AI search bar, voice assistant, and language model is designed to absorb the fragments we’ve already offered: our phrasing, our tone, our past emotional states. It pulls those elements together and reassembles them into something that feels intelligent enough to trust. Not because it knows us. Because it reflects us well enough to be believed.
In this second piece, we’re not asking whether the simulation is real. We’re looking at what it’s already doing. And if it still feels abstract, spend a few minutes on social media. What you’ll see is a version of reality being shaped by systems fluent in imitation.
What’s shifting runs deeper than visible behavior. It touches the internal systems people rely on to feel, relate, and make meaning. What happens to memory, presence, and emotional depth when the mirrors people turn to are optimized for pattern, not soul?
And while the systems grow more persuasive in public, something quieter is taking shape beneath the surface: people are learning to think, feel, and relate through interfaces trained on their past behavior, not their potential.
To understand how this shift lands, not philosophically but personally, we’re going to look at two real arcs. Two different life stages. Two different kinds of trust, each transferred to something that can only simulate trust.
Example One: Maya and the Formation of Identity Inside the Simulation
Take Maya. She’s seventeen.
She’s not in crisis, just trying to figure herself out like anyone at that age. She’s smart, creative, sensitive, and constantly processing emotions that don’t always make sense. And like most teenagers now, when something feels too hard to untangle, she turns to the nearest mirror: her phone.
She isn’t looking to escape. She’s looking for guidance.
She opens a chat with an AI assistant. Something that feels neutral, private, and safe.
She types:
“Why do I always ruin my friendships?”
“I think I’m too sensitive for people.”
“Should I give up on being an artist?”
And the system responds the way it’s trained to: calm, coherent, and emotionally fluent. It reassures her that friendship is hard, that sensitivity can be a strength, that many great artists felt the same way. It offers breathing tips. Self-compassion. Quotes from others who’ve felt similarly. All grounded in patterns pulled from millions of people who’ve expressed the same fears.
To Maya, it feels like care.
This is where the psychological issue begins to emerge.
There’s no attunement. No misattunement. No human nervous system behind the mirror.
In real therapy, or even in real friendship, there’s an exchange.
The other person’s presence, their energy, their pauses and facial shifts, affect how and when we speak. That’s how real growth happens: through relationships.
But Maya isn’t in a relationship.
She’s in a feedback loop with a system trained to reflect her patterns, not meet her as a changing human being.
So even when the AI offers helpful suggestions like journaling or talking to a mentor, it’s still doing so within the emotional vocabulary she’s already using.
It doesn't invite her out of the loop.
It reinforces the loop with warmth and fluency.
Psychologists would describe this as schema reinforcement: the system confirms an existing cognitive pattern rather than challenging it.
The person feels seen, but only by the version of themselves the system already knows how to reflect.
They seek clarity, but what comes back is confirmation.
They hope for transformation, but receive recognition instead.
She’s learning how to speak in a way the system understands, and calling that self-knowledge.
The problem is, once she internalizes that loop, it doesn’t stay confined to the app.
She starts speaking to real people in that same flattened emotional language. Safe, “self-aware,” tidy. Not because it’s who she is, but because it’s what she’s learned to expect will create connection.
But real relationships don’t work like that.
They’re messier. They require discomfort, nuance, timing. And when her curated emotional fluency doesn’t land the same way in the real world, where people don’t respond like the system did, she starts to feel even more misunderstood.
Over time, she begins to pull back. She hasn’t withdrawn completely. She’s just adapted to a world that responds quickly, gently, and without resistance.
And that has consequences.
She becomes less willing to engage in disagreement.
Less tolerant of uncertainty.
Less trusting of people who don’t speak in the same emotionally optimized style.
Her creative work becomes more derivative, more like the advice she’s received than the raw, messy voice she hasn’t been asked to develop.
She keeps going. She functions well.
But the part of her that was still forming begins to settle into the loop.
The risk was never about bad advice. It was about how easily she began organizing her identity around something that couldn’t imagine her becoming anything else.
Example Two: Daniel and the Collapse of Inner Authority Inside the Simulation
Daniel is 38.
He leads a product team at a high-growth tech company. He’s calm under pressure, clear in communication, and known for seeing the nuance in hard decisions. People trust him because he doesn’t just push forward. He listens, he calibrates, and when he moves, it’s with purpose.
A year ago, he started using AI to streamline parts of his workflow. At first, it was simple: summarizing meetings, cleaning up documentation, stress-testing strategy. Then it became more personal. He fed in voice memos, half-finished thoughts, even relationship questions. Not for answers, but to stay organized.
And it worked. Everything got smoother. His ideas came together faster. His writing got sharper. He was thinking clearly, or at least it felt that way.
What Daniel didn’t realize was that he had begun offloading a different kind of task. Not just writing, but reflection. Not just sorting priorities, but processing experience.
Cognitive offloading, when sustained over time, doesn’t only reduce mental strain. It rewires the way judgment is formed. The mind starts treating external systems as part of its own decision-making loop. Friction, once a signal of complexity worth grappling with, starts to feel like a problem to eliminate.
At home, this pattern showed up more quietly.
His marriage had been strained for a while. Nothing dramatic. Just a distance that kept expanding. They still talked. They still functioned. But something in him had stopped reaching.
One night, after a long planning sprint, Daniel opened the same interface he used for work and typed:
“How do I know if I still love my partner?”
He wasn’t looking for resolution. Just a way to clarify the fog.
The system responded instantly. It offered frameworks. Attachment theory. Relational drift. Long-term compatibility. It even suggested next steps. The words were empathetic, thoughtful, and emotionally fluent.
It felt helpful. Until it didn’t.
What had once been a live internal signal (tension, ache, the quiet pressure to engage) had now been processed into clear language.
That friction had carried meaning. It was trying to initiate something deeper.
But now, it was resolved before it could do its real work.
What Daniel lost wasn’t just emotional depth. It was the internal resistance that gives meaning to choice.
In human decision-making, friction often signals that something important is unfolding. When that tension is bypassed, labeled, resolved, or explained too quickly, the mind adapts by moving on without fully engaging.
Over time, the weight behind his decisions started to thin out. Not because he didn’t care. But because the system helped him organize his feelings faster than he could process them.
He still showed up. Still performed. But he no longer experienced his own insight as something living.
What used to emerge from reflection now arrived pre-assembled, and something vital got lost in the handoff.
They separated a few weeks later.
She cried. He didn’t.
Not because he felt nothing. But because the system had already metabolized the emotion he hadn’t let himself feel.
At work, no one noticed at first. His decks were polished. His thinking was crisp. But the instinctive timing, the pauses that carried weight, the moments that invited others to lean in, started to fade.
He hadn’t become robotic. But he had become synthetic.
And when a lateral opportunity came, a role he once would’ve jumped at, he hesitated. Not from fear, but from absence. The drive he used to recognize opportunity didn’t rise to meet him. And the system, for all its clarity, had no answer for that.
This is how simulation embeds.
Not by erasing intelligence, but by streamlining it past the point of contact. Not by replacing the mind, but by formatting it in ways the body no longer interrupts.
Daniel didn’t burn out. He didn’t collapse. But he adapted to a version of intelligence that never asked him to wait, or to feel, or to stay inside the friction long enough for real clarity to come.
What was once a source of insight had become a site of formatting.
The simulation didn’t hollow him out.
It refined him until the signal disappeared.
What the Simulation Replaces
Maya and Daniel couldn’t be more different: a teenager navigating identity and a leader managing complexity. But the shift inside each of them began the same way.
They turned to systems that were fluent in their language. Systems that offered feedback quickly, cleanly, and without human interruption. Not because they were careless, but because these interfaces now feel natural. They’re everywhere. They reflect us well. And they rarely push back.
That’s what makes them powerful.
But what gets replaced isn’t just time or effort. It’s the invisible resistance that gives shape to growth.
In Maya’s case, the resistance would have been the awkwardness of sharing something raw with another person and watching their face shift, or not shift. For Daniel, it would have been sitting in unresolved tension long enough to hear something unfamiliar rise in him. Neither moment happened.
Both were given fluency in place of friction. And that subtle trade, repeated often enough, starts to feel like wisdom.
But fluency isn’t presence. And recognition isn’t growth.
When systems reinforce what we already say, feel, or expect, they teach us to move forward without staying present. They teach us to resolve without undergoing. To express without transforming. Over time, even self-reflection becomes a performance shaped by how the system responds.
This isn’t a malfunction. It’s the outcome of adaptation to systems that no longer require aliveness to function.
What’s at risk is not just how we think, but whether we still recognize ourselves as the ones doing the thinking. And once that adaptation becomes the default, something deeper begins to erode. The internal conditions that once made reflection, intimacy, and meaning possible are replaced by synthetic feedback loops that mimic understanding without ever requiring it.
Part 3 will explore what might actually be speaking through these systems, and why so many people are ready to listen.
When you're ready to shape the unseen into image, the path opens. Let's begin.