Living Inside the Turning Point
I can't help but think about the coming world. Every generation would like to believe it is living in extraordinary times. We all want to be significant, to be the ones who witnessed the moment that changes everything. But let's be honest - not every generation had that luck. For most of human history, the scenery of life changed very little between the moment a person was born and the moment they died. The same fields, the same tools, the same sky. It is only in the last hundred years that a single human lifetime could contain completely different worlds within it. In the last thirty, the pace became almost insane, and in the last five, beyond anything we had seen before.
I grew up with a phone receiver in my hand, its familiar curly cable stretching across the room. I remember rewinding cassette tapes with a pencil when the tape jammed, guarding every song like treasure. And I remember the modem's creepy metallic scream - the weird voice of the coming world. Now the world is unrecognizable. Not just more advanced, but almost a different reality altogether, like waking up in a dream, not from one. A revolution is here, and I feel it with every bone in my body. My instincts have rarely failed me. Not only do I witness this progress daily through my work in IT, through the technologies I get to touch and test and wrestle with, but it is already plainly visible all around us, hiding in plain sight for anyone willing to look.
And if I'm honest, part of me is intoxicated by it. For the past months I've been building apps like a child let loose in a laboratory. The days are not long enough for the ideas I want to test, the features I want to try, the possibilities unfolding faster than I can write the instructions for the machine to code them for me. There is a joy in this moment that is hard to describe. Empowered. Independent. Almost almighty. The vast sea of missing coding knowledge that once kept me from bringing my ideas to life has suddenly dried up, allowing me to step onto this uncharted land. Some mornings I wake up like a kid in a candy store, hands full, mind racing. And yet I'm aware that while I feel this acceleration from the inside, many still observe it from the outside. The gap between those two experiences is widening. And this essay is written mostly for them.
At a recent talk with small and medium business owners[1], I looked around the room and realized many of them still thought of AI as a future decision... something to evaluate, maybe budget for next year. I told them something simple: "If you've been awake for five hours this morning, you've already spent five hours interacting with AI." It chose what news you saw. It filtered your inbox. It routed your drive. It approved or flagged your credit card transactions.
The distance between "cutting edge" and "everyday life" has collapsed. Businesses that believe they sit safely away from technology are already being shaped by it. The world is no longer changing by the decade, nor even by the year. It is changing by the day, sometimes by the hour. Artificial intelligence is not hype. It is not science fiction. It is not some distant machine humming in a research lab. It has already moved in invisibly for many.
"And it is already sitting on the sofa in our living rooms while we debate whether to open the front door."
And yet, there are things that keep me awake at night. Not as a figure of speech. I genuinely lie there in the dark, staring at the ceiling, my mind running ahead of me toward what is coming. My genre has always been science fiction. Perhaps that is why I grew up hoping I would live long enough to see flying cars, robots walking beside us, and spaceships reaching distant planets. I was fascinated by Asimov, by Stanisław Lem, by worlds where intelligence was something vast and mysterious but still understandable. Those books shaped how I imagined the future long before the future began knocking on our door. When I was still a boy, a friend of mine told me he had a robot hidden in his basement. I believed him completely. Even when my parents explained that such a thing did not exist, I refused to let go of the possibility. For a while, I was certain it was down there, waiting. We still laugh about it to this day. And I keep thinking: almost forty years later, he could finally tell me he was right. The robot is in the basement now. In all of our basements.
I wonder sometimes whether it was instinct or intuition that led me, ten years ago, before the current madness started, to write a fiction story about this future life, arguing over whether the world would be better with humans still doing things themselves or handing it all over to the machines. I titled it "It Didn't Matter", almost as a joke at the time, but the older I get, the more that phrase stops being funny. I wonder whether it was excitement or dread that put me on a TEDx stage[2], deeply unsettled by the coming world, embracing it and rebelling against it at the same time, philosophically wrestling with myself in front of an audience about what we stood to gain and what we stood to lose. Both feelings were true simultaneously. They still are.
Because the question that haunts me is not a small one. Is what's coming an abundance - a four-day working week, a three-day working week, a no-need-to-work-at-all week, a healthier world, an economically liberated civilization, free at last from war and poverty and pollution? Or is it the other thing? A global crisis. A failure of nerve and judgment. A crash born from building something so powerful that we lost the thread of control before we even realized we had dropped it. I can't help but rewind Steinbeck's The Grapes of Wrath in my head - that epic of displacement and dignity stripped away, of people slowly replaced by forces too large and too distant to care - and wake in a cold sweat at what a modern version of that story might look like.
When I read Dario Amodei's essay "The Adolescence of Technology"[3] on where this is heading, what struck me the most wasn't the ambition of the vision, it was the honesty about the risk. Here is the CEO of one of the most powerful AI companies in the world, and he is not promising utopia. He is describing a genuine fork in the road. That woke me up in a way my own thoughts never could.
"This time, it would not be the machine replacing the farmhand. It would be the algorithm replacing the lawyer, the architect, the accountant, the writer. All of us."
The idea of a country, or an army, of exceptionally superior minds - with us or against us - is no longer a fantasy. Handling it has become our job. I had always wondered how it was even possible - how an algorithm, a set of human-created mathematical rules, could ever develop consciousness and turn against its creators. Amodei's essay gave that fear a shape I couldn't dismiss. For the first time, the doubt felt less like science fiction and more like an engineering problem we haven't solved yet.
In a way, growing a superior AI is not much different than raising a child. We bring our best intentions, our most careful thinking, our deepest hopes. And yet any parent will tell you, you never fully control the outcome. The same words land differently in different moments. The same love produces different people. We shape, but we do not determine. And the vulnerability surface of raising an intelligence far greater than our own is so vast, so riddled with moments we cannot anticipate, that good intentions, no matter how good, may simply not be enough. We might do everything right and still produce something we never meant to build. Not because we intended to. Not out of negligence. But because the task is too large, too complex, and too alive to fully cover.
And so, it falls on us. How we nurture this, how carefully we parent it, how honest we are about our own blind spots. Whether what emerges is a Batman or a Joker depends on choices we are making right now, in these early years. And the terrifying part is that even our best choices carry no guarantees...
But let's say we get it mostly right. Let's say the thing we build is mostly good, mostly on our side. I still wonder how much pain we will absorb during its teen years. Those turbulent years when it will take jobs before it creates new ones, when it will move faster than institutions can adapt, when it will disturb not just economies, lifestyles, and beliefs, but identities. It might steal our work, our attention, and even our sleep. Not because it's evil, but because change will move faster than we can adapt, and because we are slower and more fragile than we like to admit.
And yet I don't want this to be only fear. Fear is real, but it can become lazy. It can become an excuse to freeze, to watch, to wait for someone else to decide. The reality is that this moment holds both terror and hope. We are not powerless spectators. Not yet. We still have choices: in how we build, how we regulate, how we educate our kids, how we define human value, how we distribute gains, how we protect dignity, how we keep meaning from being automated away along with everything else. We still decide what is moral. Where the line is. Maybe the deepest question isn't whether AI will change the world. It will. The question is whether we will stay awake enough, morally, spiritually, culturally, to change with it and still recognize ourselves on the other side.
I don't know which future wins. I only know that something is arriving, and we are already living inside the beginning of it. And if we want the coming world to be good, we can't treat this like a product launch or a stock trend or a debate for specialists. We have to treat it like what it is: a new chapter of human history. Or maybe a whole new book. One that will demand not only intelligence, but wisdom.
And maybe that is why my mind keeps drifting back to the books that first taught me how to imagine intelligent machines. In Asimov's stories, the rules were simple and clear. When I think about all this, I find myself returning to the three laws Asimov so cleverly defined. Back then, they felt indisputable. Elegant and unbreakable. Grounding and eternal.
- A robot may not injure a human being or, through inaction, allow a human being to come to harm.
- A robot must obey orders given by human beings, except where such orders conflict with the First Law.
- A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
I remember reading them and feeling calm. Certain that if we ever built intelligent machines, they would be on our side. That they would always have our back. That we would be stronger with them, not replaced, not diminished, but extended. A true companion. A loyal partner. A caring intelligence making life safer and our reach longer.
"But laws written in novels are not guarantees written into reality. They were never promises. They were hopes."
Now we are no longer children imagining the future. We are the ones engineering it. The question is not whether intelligence will grow. It will. The question is whether wisdom will grow with it. Whether we can guide this force, this gathering tornado, and shape it into something that lifts rather than consumes.
Will we manage to steer it, or will it outrun us? That remains unwritten. And perhaps that is what makes this moment truly extraordinary. Not that change is coming, but that we are still early enough to shape what it becomes.