The Gap
I get predictable when I'm tired. Late at night, after the kids are asleep and the house goes quiet, I reach for my phone like a reflex. Not because I want to. Because something in me has been trained to want to, and the wanting feels indistinguishable from choice. The feed knows this too. It serves me different content at 11 PM than it does at 5 AM. Softer. More emotional. Precisely calibrated to the version of me that has stopped guarding the door.
We're building systems that predict human behavior with increasing precision. Markets, moods, purchasing decisions, political affiliations. I've been inside some of these systems, watching autonomous AI agents identify patterns, place bets, accumulate resources based on models of what people will do next. The technology is impressive. It's the implication that keeps me awake. Because the question that lives underneath the engineering is: if your choices are predictable, are they choices? Not as abstraction. Not as a question you can set down when the conversation ends. Concretely. Practically. If a machine can model your decision before you make it, what exactly is happening when you "decide"?
The comfortable answer is that prediction isn't determination. Just because a system can forecast that you'll open your phone at 11 PM doesn't mean you have to open it. You could leave it on the counter. You could go sit with the silence instead. You still could, right now, do something unpredictable. Quit your job. Move to Portugal. Send that message you've been avoiding for years. Free will hides in the gap between probability and certainty.
But that gap keeps shrinking. The models keep improving. And at some point, the distinction between "95% likely" and "certain" stops being philosophically interesting and starts feeling like a technicality we're clinging to because the alternative is too strange to accept.
My son Ford is six. He draws spiders and monsters in his notebook, then tells me he's going to make movies about them someday. He says this with a confidence I'm no longer sure I understand — as if the future were a place you choose rather than a place you're taken. I watch him and wonder what he'll inherit. Not money or property. I mean the architecture of attention he'll grow up inside. The prediction engines that will learn him before he learns himself. By the time he's my age, the models won't just forecast behavior. They'll anticipate desire before it surfaces into consciousness. They'll know what he wants before he feels the wanting.
The Knife
This is where the knife goes in. Not the obvious fears about privacy or manipulation, though those matter too. The deeper cut is subtler: the moment your feed knows you're lonely before you've admitted it to yourself. The moment an algorithm detects the early signs of depression from your scrolling patterns, your typing speed, the length of your pauses. The moment a system understands your marriage is in trouble before you've let yourself see it. These aren't hypotheticals. The research already exists. The models are already being trained.

So when a machine knows you better than you know yourself, what happens to the self?
Both Things Are True
I used to think wisdom meant choosing sides. That the world presented itself in binaries, and maturity was having the courage to pick one and defend it. Determinism or free will. Fate or agency. I wanted someone to settle it for me. Spinoza, a lens grinder in seventeenth-century Amsterdam, excommunicated by his own people for ideas they found unbearable, concluded that everything unfolds according to necessity. What we experience as choice, he argued, is simply ignorance of the causes already in motion. Three centuries later, Viktor Frankl walked out of Auschwitz with the opposite conviction. Even there, he wrote. Even when everything had been stripped away. One freedom remained that no one could confiscate: how we meet what cannot be changed. I kept waiting for one of them to be wrong.
They're both right.
The universe operates according to patterns you didn't choose. Your genes, your childhood, the century you were born into, the cascading consequences of decisions made by people who died before you existed. You are, in ways that can increasingly be measured and modeled, a product of forces beyond your control. And within that unfolding, you make choices that matter. Not in spite of the constraints. Because of them. The algorithm can predict you, and you still choose. Both things are true. The paradox doesn't resolve. You learn to live inside it.
My grandmother had Alzheimer's. I watched her lose herself in slow motion, memory by memory, name by name, until the woman who once knew every card in the deck couldn't remember she'd ever played. What stayed with me wasn't the grief. It was the recognition that the self is not as solid as we pretend. That the things we call "I" can be subtracted, one by one, by forces that don't ask permission. But something else stayed with me too. Even at the end, when so much had been taken, she would sometimes look at me and I could feel her reaching. Not remembering, exactly. Reaching. Some irreducible thing beneath the memories, still trying to connect.
The models will learn us. They'll map our patterns with increasing precision. But I wonder if there's something in us like what I saw in her — something that remains unfinished, something that keeps reaching even when so much else has been erased.

Agency isn't about escaping constraint. It's about finding the space where action still matters despite constraint. The algorithms will get better. They'll learn us more completely than we learn ourselves. That's not pessimism. That's just the trajectory. The question is what we do with the diminishing territory that remains ours.
I don't have a solution. But I've started doing something that helps me remember what's mine. Once a day, I do something unpredictable. Not grand. Not performative. Just small enough to be genuinely private. I take a different route. I read something no algorithm would recommend. I sit with a thought instead of reaching for my phone. I reach out to someone the pattern of my past behavior would never suggest. These aren't rebellions. They're reminders. They're ways of saying to myself, and maybe to whatever is watching: I am not yet fully known. There is still a part of me that surprises even me.
I don't know if this matters in any measurable way. The prediction engines will adapt. They'll learn to model even my attempts at unpredictability. Maybe the gap closes eventually. Maybe the territory shrinks until there's nothing left that's truly mine.
But I don't think so. I think there's something in human beings that resists completion. Something that remains unfinished on purpose. Call it soul. Call it consciousness. Call it the part of you that watches your own thoughts and wonders who's watching. The models will approach it asymptotically. Always closer, never arriving. The part that can always, in the next moment, do something new.
Ford will inherit a world more predicted than any that came before. The systems that learn him will be more sophisticated than anything I can imagine. But he'll also inherit the knowledge that his father believed in the gap. That even as the machines grew more certain, someone kept practicing uncertainty as a discipline. Someone kept doing small, unpredictable things. Not because they changed the world. Because they preserved something worth keeping.
The algorithm knows what I'll probably do next. It's usually right. But probability isn't destiny. And in the space between what's likely and what's possible, there's just enough room to be human.
That's where I'm trying to live, day by day.