You Don't Exist Yet — But Sora 2 Does
You upload a 15-second clip. Hit generate. Wait.
Three minutes later, you're watching yourself do something you've never done.
You're shooting hoops with Kobe in a park that doesn't exist. You're arguing with Hemingway in 1924 Paris. You're pacing through a Martian colony at dusk, giving a TED Talk to androids. Your future self is interviewing you on a talk show in 2080, casually spoiling your life.
And it feels real.
Not "looks real." Feels real. There's a difference.
The Strange Part Isn't What You See—It's What Your Brain Does With It
When you watch yourself in a scene that never happened, something breaks in your head. Not metaphorically. Neurologically.
Your brain is wired to treat video as evidence. A camera is a recording device. It captured something. That's what cameras do. They don't lie—they just observe.
Except now they do lie. But they lie competently. They lie with physics-based light and anatomically correct movement. They lie in a way that bypasses your skepticism entirely.
You know it's fake. Your prefrontal cortex is screaming it. And yet the part of your brain that builds memories has already filed it away under "things that happened to me."
It's not a video anymore. It's a false memory with a timestamp.

You're About to Have a Very Familiar Experience: Falling in Love With a Better Version of Yourself
Here's what happens next:
You find a version of you that you actually like. Maybe you're smarter in it. Maybe you're braver. Maybe you're just more comfortable in your own skin. And you watch it. Again. And again. And again.
And then something shifts.
That version starts to feel more real than the original. You start thinking like that version. Moving like that version. Talking yourself into becoming that version.
But here's the twist: you're not becoming a better version of yourself. You're becoming the algorithm's version of you.
You're not self-actualizing. You're algorithm-actualizing.
You're optimizing for what the AI thinks you should be—which is built on what everyone thinks everyone should be. Millions of data points averaged into a smooth, marketable, acceptable version of human behavior.
You're becoming a consensus hallucination of yourself.
This Isn't About AI. It's About What You've Already Been Doing.
The terrifying part isn't Sora 2. It's that you've been doing this your whole life.
You've been watching yourself in other people's expectations. In social media comments. In your parents' disappointment. In your own anxiety about how you should be.
Every one of those was a false memory waiting to happen. A version of you that wasn't quite you but close enough that your brain filed it away.
And you've been slowly becoming the average of all those corrupted data points.
You've been running on someone else's algorithm all along.
Sora 2 just made it visible. It took the invisible process that's been happening in your head and spat out a video.
It's a mirror, but the mirror has opinions.
So What Do You Do?
Some people will watch their algorithmic self and feel seen. "Finally, here's who I could be. Permission to exist."
Some people will feel sick. "That's not me. That's a simulation wearing my face."
Both are right.
But you can stop confusing the simulation with the source code.
The algorithm will keep showing you who you could be.
Your job isn't to pick the best one.
Your job is to remember that the you doing the choosing is also made of algorithms—inherited code from your family, your culture, your era, your trauma.
You're not a person deciding between versions of yourself. You're a process choosing between processes.
The only difference now is you can see it.