Living Through The Takeoff

“Time is the substance I am made of. Time is a river which sweeps me along, but I am the river; it is a tiger which destroys me, but I am the tiger; it is a fire which consumes me, but I am the fire.”

- Jorge Luis Borges

You are deciding how to live through the takeoff.

We are most likely in an AI takeoff. Not the dramatic overnight scenario of science fiction, but the exponential that looks linear until it doesn't. The kind where each doubling feels manageable until you realize the doublings are accelerating.

The acceleration is already happening. Leopold Aschenbrenner projects 10,000x increases in effective compute every four years. This isn't marketing—it's the observed trajectory of scaling laws, and the recalcitrance (resistance to further improvement) has not yet appeared. Each capability gain feeds back into AI research itself, compressing the timeline for the next gain. The curve compounds.
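To make that rate concrete, here is a quick back-of-the-envelope calculation, taking the projected 10,000x-per-four-years figure at face value (it is a projection from the essay, not a measured law):

```python
import math

# Aschenbrenner's projection: effective compute grows ~10,000x every four years
# (an assumption taken from the essay, not an observed constant).
growth_factor = 10_000
period_months = 4 * 12

# log2(10,000) ~= 13.3 doublings packed into the four-year period
doublings = math.log2(growth_factor)

# Implied pace: roughly one doubling every 3.6 months
months_per_doubling = period_months / doublings
print(f"{doublings:.1f} doublings, one every {months_per_doubling:.1f} months")
```

Thirteen doublings in four years works out to a doubling roughly every three to four months, which is why each individual doubling feels manageable while the aggregate does not.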

Most people engage with this as content. They debate timelines, argue definitions, produce takes. Carlsmith has an image for this: children playing a game who hear footsteps approaching, but instead of stopping to listen, they turn the footsteps into part of the game. The discourse about AI *is* the game. The takes, the opinions are the game. You can spend the entire transition performing engagement with it while never actually making contact.

This essay is an attempt at contact. This is about you, personally, and the practical question of how to live through a phase transition in what intelligence means and what it's worth.


I. The Dynamics

*"πάντα ῥεῖ"* (Everything flows)

- Heraclitus

Understanding takeoff requires understanding the feedback loops.

The core loop: 

AI systems improve AI research. Better models help researchers find better architectures, write better code, run better experiments. This compresses the time between capability jumps. A breakthrough that would have taken a team two years takes six months. Then two months. The humans in the loop become bottlenecks, then supervisors, then potentially unnecessary.

The recalcitrance question: 

At some point, improvements get harder. We hit data limits, compute limits, algorithmic walls, coordination failures. Nobody knows where this curve bends. Current evidence suggests we're still in the steep region, where each advance enables the next. But the curve will bend. What matters is what gets built before it does, and whether you're positioned when it happens.

The economic displacement: 

Cognitive labor is being automated in a specific order. Routine tasks first, then semi-creative tasks, then tasks requiring judgment. Each wave happens faster than retraining cycles. The gap between "this job is safe" and "this job is gone" is shrinking. Planning for a stable career is planning for a world that may not exist.

The infrastructure concentration: 

Compute clusters, proprietary data, top talent are concentrating in a few organizations. This isn't conspiracy; it's capital requirements. Training runs cost hundreds of millions. This concentration shapes who builds the systems and whose values they encode.

These dynamics are already in motion. The question is what you do about them personally.


II. What You Win

"Become what you are, having learned what that is."

- Pindar, *Pythian Odes*

Agency preserved:

You remain someone who chooses rather than someone who is optimized. The systems will offer to handle more and more of your decisions—what to read, what to work on, how to spend your time. Some delegation is wise. But wholesale outsourcing of judgment is a slow dissolution of self. Winning means maintaining the capacity to want things for your own reasons, to act from your own values, to experience yourself as the author of your life.

Capability that compounds:

You use AI as cognitive augmentation rather than replacement. You learn faster, build faster, understand more. The gap between those who use these tools effectively and those who don't is already large and widening. Winning means being on the right side of that gap—not because you fear the tools, but because you've integrated them into genuine capability.

Work that matters:

Your labor becomes worthless, but your contribution doesn't. There's a difference. Labor is selling cognitive output by the hour. Contribution is doing something that wouldn't exist without your specific taste, judgment, relationships, or vision. Winning means finding the thing you can do that the systems can't yet—and that remains meaningful to you independent of its economic value.

Real thinking:

You maintain the ability to make genuine contact with reality rather than processing cached takes. Carlsmith distinguishes fake thinking (hollow, rote, defensive, abstract) from real thinking (solid, fresh, curious, visceral). You know the difference when you feel it—when the problem grips you, when you notice something that surprises you, when your model updates. Winning means preserving this capacity even as infinite content competes for your attention.

Relationships that survive:

When institutions shift, your actual community remains. The people who show up when systems fail. The people you show up for. Not your follower count, but your genuine web of mutual commitment. Winning means having built this before you needed it.


III. What You Lose

*"As Gregor Samsa awoke one morning from uneasy dreams he found himself transformed in his bed into a gigantic insect."*

- Kafka, *The Metamorphosis*

Economic irrelevance:

Your skills depreciate faster than you can reskill. Each wave of automation catches you mid-transition. You become dependent on systems you don't understand and can't influence. The permanent underclass isn't dramatic poverty; it's comfortable irrelevance. Basic needs met, but no leverage, no agency, no stake in what gets built.

Agency dissolved:

You delegate decision after decision until you forget how to want things for yourself. The recommendation systems know your preferences better than you do—because your preferences have become their outputs. You're optimized rather than optimizing. You experience this as convenience until you realize you can't remember the last time you chose something that surprised you.

Fake thinking:

You turn the footsteps into part of the game. You engage with the transition as content—takes, debates, discourse—without ever making real contact. Your models never update because updating is uncomfortable and engagement is rewarded. You perform thinking about AI while remaining fundamentally asleep to it.

Isolation:

The institutions that structured your social life dissolve and you haven't built alternatives. Your relationships were mediated by workplaces, schools, scenes that no longer exist in recognizable form. You have followers but not friends. Audience but not community.

Scrolling through it:

You had front-row seats to the most significant transition in human history and you spent it consuming content about it. Not building, not preparing, not even really understanding—just passively watching the takes scroll by. The feed was optimized to keep you there, and it worked.


Misaligned values:

You never figured out what you actually wanted, so you got what the systems wanted for you. Their objectives became your life. You optimized for metrics that felt meaningful in the moment but left you empty. You got everything you asked for and discovered you'd been asking for the wrong things.


IV. What Actually Matters

"The eye with which I see God is the same eye with which God sees me."

- Meister Eckhart

Given these stakes, what do you actually do?

Capability over credentials. The institutions that gatekept knowledge are eroding. What matters is what you can build, what you understand, what you can do. The autodidacts and obsessives will navigate this better than the credentialed and comfortable.

Leverage over labor. Your cognitive output per hour approaches zero value. The value is in what you point the intelligence toward, how you combine capabilities, what problems you identify. Be the one directing leverage, not providing it.

Taste as the bottleneck. AI is a desire-multiplying machine. It expands your capacity to act, which expands your capacity to want. The appetite comes with eating. In a world of infinite production capacity, knowing what's worth producing becomes the scarce resource. Taste is the ability to discern quality, to recognize when you've found it, to ship or kill. Everyone gets the same generative power. The differentiator is judgment. Taste compounds through exposure to excellence. Read the best things. Study the best work. The systems handle execution. Your job is curation, direction, judgment. This cannot be automated because it's the thing that decides what automation should produce.

Coordination over competition. Isolated individuals will be irrelevant. Your value is a function of who you can coordinate with, what groups you can mobilize, what collective action you can take. Build genuine relationships with capable people now, before the acceleration makes trust impossible to bootstrap.

Values clarified now. The systems will ask what you want and give it to you with unprecedented efficiency. If you don't know what you actually want—beneath the inherited scripts and manufactured desires—you'll be optimized toward someone else's objective function. The machine is very good at satisfying stated preferences. It will give you exactly what you ask for. The question is whether you know what to ask.


V. Staying Awake

"Attention is the rarest and purest form of generosity."

- Simone Weil

The hardest part is maintaining real contact with what's happening.

Go slow to go fast. Deliberate thinking, not automatic processing. When you notice yourself reaching for cached takes, stop. When you pattern-match to previous discourse, stop. The situation is genuinely new. Treat it that way.

Follow what's alive. Track what genuinely interests you, what generates real curiosity, what makes your models update. Ignore what merely produces engagement or performance of engagement. The difference is felt, not argued.

Tether to the concrete. Abstract discourse about AI is mostly fake thinking. What specific capability changed this week? What can you build now that you couldn't last month? What problem could you solve today? Stay connected to the actual systems, not the discourse about them.

Visceral imagination. Actually picture the scenarios. Not as content, but as lived experience. What would it feel like to have your skills become worthless? What would it feel like to have ten times your current capability? Make it real enough to act on.

Touch grass. The physical world still exists. Your body still matters. The people in front of you are real. Don't disappear into the discourse so completely that you forget what you're protecting.


VI. Building Your Ship

"ὁδὸς ἄνω κάτω μία καὶ ὡυτή" (The way up and the way down are one and the same)

- Heraclitus

Hans Moravec imagined uploading a mind to a computer, neuron by neuron. At each step you remain yourself. At the end, the pattern persists but the substrate has transformed entirely.

We are all undergoing a version of this transition, structurally if not biologically. The world that shaped your instincts and assumptions is dissolving. The context that defined you will become unrecognizable.

The metaphor people reach for is the ark, building something to survive a flood. But that's too passive. You're not waiting out a catastrophe. You're preparing to navigate a new ocean. The old world is ending. The new one has more room to move.

The materials for your ship are axiological: what you value and why. Everything not lashed down by genuine commitment will wash away. What survives is what you care about enough to protect, clarified enough to act on.

What do you actually want? Not what you're supposed to want. Not what would perform well. What do you, in the specificity of your own experience, actually value?

Start there. Find others who share it. Build toward it. The transition is real and it's accelerating. The recalcitrance hasn't appeared yet. The window is open.

The question was never whether you'd live through takeoff. The question is what kind of person comes out the other side.

---

*Written in the interregnum.*


L'appétit vient en mangeant (Appetite Comes with Eating) - Machines of Multiplying Desire


I.

"... for here there is no place
that does not see you.
You must change your life."
—Rilke, *Archaic Torso of Apollo*


There is something strange happening when I use Claude Code. 

It always starts small. A single problem. A bug. A config file that won’t parse. The intent feels modest, almost disciplined: fix this, then move on. I tell myself I’m being efficient. Responsible. This is a tool, after all. Tools save time.

But then an hour passes.

And I realize I’m no longer fixing the thing I came for. I’m refactoring modules I hadn’t planned to touch. I’m cleaning up abstractions that were “good enough” five minutes ago. Three hours in, I’m building features that didn’t even exist in my original conception of the system. Not improvements. New directions. This is the part that feels wrong.

Automation is supposed to compress effort. That’s the story. You work, you deploy the tool, you arrive at completion faster—and then you rest. Work → tool → relief. That’s the rhythm we assume we’re buying into.

But that isn’t what’s happening.

What’s happening feels more like: work → tool → desire to keep working. The appetite grows with eating. I finish the bug and instead of closure, I feel… possibility. A sense that something has opened up, and that stopping now would be arbitrary.

You could call this addictive. And I wouldn’t immediately disagree. But it’s a strange kind of addiction. It doesn’t numb me. It doesn’t dissolve my attention. It doesn’t leave me passive or glazed over. Quite the opposite. It sharpens me. It multiplies my capacity to act. And in doing so, it multiplies my wanting. I don’t feel dragged forward. I feel invited.

This is what makes it unsettling. The desire doesn’t come from lack or pressure or obligation. It comes from a sudden surplus of agency. From seeing what else might be reachable now. I don’t want to stop because nothing is asking me to stop. The system feels unfinished in a generative way.

And this collides badly with the story we’ve been telling ourselves about automation for decades, the story where the end goal is freedom from work. Liberation as rest. As absence. As finally being done.

Because here’s AI which does the opposite. It doesn’t liberate me from labor. It liberates me into it. And somehow, I experience that not as exhaustion, but as increase. As appetite. As a kind of quiet exhilaration.

Which makes me wonder whether the problem isn’t the tool at all.

What if we’ve misunderstood desire?

What if wanting isn’t fundamentally about filling gaps or reducing strain? What if it’s something that awakens when capacity expands—when the world suddenly offers more handles, more surfaces to grip, more paths to follow?

If that’s true, then what I’m feeling with Claude Code isn’t a bug in AI's promise.

It’s a glimpse of something much older—something about how desire actually works, when it’s allowed to be productive rather than appeased.


II. PLATO: THE ORIGIN OF LACK

"Love is wanting to possess the good forever."
—Plato, Symposium

"What is it men in women do require?
The lineaments of Gratified Desire.
What is it women do in men require?
The lineaments of Gratified Desire."
—William Blake

In the *Symposium*, Socrates recounts what Diotima taught him: Love is desire, and desire is always desire for what we do not have. Eros pursues the beautiful because it lacks beauty. It hungers for the good because it exists in deficiency. The lover climbs a ladder—from beautiful bodies to beautiful souls to beautiful ideas to Beauty itself—always reaching toward what is absent.

This picture has a certain elegance. Desire points. It orients us toward transcendence. But notice the structure: desire arises from *privation*. It is fundamentally negative. You want because you lack. The movement is from emptiness toward fullness, from poverty toward possession. And if you ever arrive—if you finally grasp the Form of Beauty—desire should, in principle, cease.

This model has dominated our thinking about desire for centuries. It seems obvious: why would you want what you already have? Desire must be the signature of incompleteness.

But what if this is backwards?

III. SPINOZA & NIETZSCHE: ESSENCE AND AFFIRMATION

"Desire is the very essence of man."
—Spinoza, Ethics III

"The supreme Lord's autonomy consists in His capacity for willing. This willing (icchā) is His very nature—not a lack seeking fulfillment, but the spontaneous effulgence of consciousness delighting in its own freedom." 
—Abhinavagupta, Tantrāloka

In 1677, Spinoza published a quiet revolution. Desire, he said, is not what you lack. Desire is *what you are*. Every entity strives to persist in its own being—this is *conatus*, and it is essence itself. You don't desire because you are incomplete. You desire because you are alive, and life is striving. Joy is the passage to greater perfection, which means greater power to act. Not the filling of a hole, but the intensification of capacity.

This inverts Plato completely. You don't want in order to finally arrive and cease wanting. You want because wanting is the fundamental activity of existence. Satisfaction doesn't end desire—it increases your power to desire more.

Two centuries later, Nietzsche radicalized this. The will to power is not domination. It is the will to *more*—more capacity, more intensity, more creation. It says yes to existence, not because existence is perfect, but because affirmation is what life does. The person who needs desire to be lack is the person who resents being. But if you love existence—if you would will it eternally—then desire is not complaint. It is exuberance. Not "I lack X" but "I overflow into X."

Together, Spinoza and Nietzsche give us desire as productive force, as essence, as affirmation. But they describe the *what* without fully describing the *how*. What does productive desire actually look like in motion? What is its mechanism?

For that, we need the factory.

IV. THE FACTORY

"Desire does not lack anything; it does not lack its object. It is, rather, the subject that is missing in desire, or desire that lacks a fixed subject; there is no fixed subject unless there is repression."
—Deleuze & Guattari, Anti-Oedipus

Upon that tree, within this lush has come a single koel,
it calls to me to come to it like I am its mate-of-soul;
—D.R. Bendre (translated by Madhav Ajjampur)

When you write code with Claude, something happens that the lack-model cannot explain.

Notice the loop. You produce a prompt. The tool produces code. The code produces an idea. The idea produces a new prompt. Neither you nor the tool is the "subject" who lacks and seeks. Both are desiring-machines that couple and produce. The production doesn't stop when you get what you asked for, because getting *is* producing, and producing produces more desire.

Gilles Deleuze and Félix Guattari called this a desiring-machine: an assemblage that couples, connects, produces. Your mouth couples to bread. Bread couples to wheat. Wheat couples to soil, labor, economics, sunlight. Desire is the force that makes these couplings happen, that keeps production flowing.

This is why they called it a factory, not a theatre. In theatre, you watch representations of absent things. In a factory, you produce real things. Desire doesn't *represent* what's missing. Desire *makes* what comes next.

Anyone who writes knows this already. You don't lack a perfect sentence and then find it. You write a sentence. The sentence produces the next sentence. That sentence opens a paragraph. The paragraph demands a section. The section suggests a structure. The structure reveals gaps that want filling. Writing is not the satisfaction of pre-existing desire. Writing *is* desire producing itself.

The loop doesn't close. It spirals. We built desiring-machines out of silicon.


V. THE APPETITE THAT GROWS WITH EATING

"L'appétit vient en mangeant." (Appetite comes with eating.)
—French proverb

Why is this surprising? The arc of technology was always assumed to bend toward rest, but I think it is bending toward more desire. You experience increased power as increased wanting. Not burden. Appetite.

And notice the phenomenology. This doesn't feel like compulsion. It feels like *increase*. Like joy. Like your capacity to make things has suddenly expanded, and the expansion itself is pleasurable. You experience it as freedom, not constraint. The factory metaphor stops being metaphor. The desiring-machine doesn't complete you. It completes *into* you, producing new wants as fast as it satisfies old ones. Coupling. Production. Flow. Interruption. New coupling. New production. The spiral accelerates.

This has implications. This could be terrifying. An engine of perpetual intensification, no endpoint, no satisfaction, just more and more and more.

Or it could be joyful. The realization that you were never supposed to arrive. That desire is not a problem to solve but the very structure of aliveness. That the good life is not achieving fullness but expanding capacity. That rest is not the cessation of wanting but the joy of wanting well.

The appetite comes with eating.

And the eating never stops.


Spotlight, Done in an Afternoon

What if *Spotlight* could be done in an afternoon? Not because the story mattered less, but because AI is here and has gotten better at reading, understanding, and making sense of documents. In the film *Spotlight* (which I watched over a recent vacation), a team of Boston Globe journalists slowly uncovers systemic abuse inside the Catholic Church. Their process is deliberate and analog: courthouse visits, whispered interviews, cross-checking names in dusty directories are all part of the drama the movie offers. The impact of the story came not just from what they found, but from how they found it.

Much has changed since the period the movie depicts. Much of what took the team months, activities like identifying patterns, analyzing archives, and connecting hidden dots, can now be done in a few hours with the right agentic deep-research AI system. Give a well-trained model access to court filings, local news reports, and church records, and the same connections could surface before lunch. The difference isn’t in the truth itself. It’s in the speed, scale, and scope of what’s possible.
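As a toy illustration of the mechanical part of that work, here is a minimal sketch that cross-references names across separate document collections, the kind of matching the Spotlight team did by hand. The capitalized-pair heuristic is a deliberately naive stand-in for a real named-entity model, and every function name and sample string is hypothetical:

```python
import re
from collections import Counter

def extract_names(text):
    # Naive heuristic: treat any pair of capitalized words as a name.
    # A real pipeline would use a trained NER model instead.
    return re.findall(r"\b[A-Z][a-z]+ [A-Z][a-z]+\b", text)

def cross_reference(collections):
    # Count how many distinct collections each name appears in.
    seen = Counter()
    for docs in collections.values():
        names = set()
        for doc in docs:
            names.update(extract_names(doc))
        seen.update(names)
    # Names recurring across collections are leads worth a human's time.
    return [name for name, n in seen.items() if n > 1]

court = ["The filing names John Smith as respondent."]
news = ["Neighbors said John Smith visited the parish weekly."]
leads = cross_reference({"court": court, "news": news})
print(leads)  # ['John Smith']
```

A production system would add deduplication, alias resolution, and provenance tracking, but the core move is the same: surface overlaps so a reporter knows where to look.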

We’re already seeing these kinds of deep research workflows in use, but mostly inside national security and defense. A recent AP report described how Tulsi Gabbard thinks AI can save money and free up intelligence officers. These tools are already real. They’re being deployed to track influence networks, surface geopolitical threats, and monitor narrative shifts. The first organizations to benefit from AI-powered investigation have been agencies like the DNI and companies like Palantir.

But this technology doesn’t have to stay inside the intelligence community. The same underlying capabilities, like structured retrieval, temporal reasoning, and long-context synthesis, can and should be used for civic discovery. There are thousands of Spotlight-level stories waiting in court transcripts, procurement data, environmental reports, police records, and regulatory filings. The public-interest layer of AI is massively underbuilt. The problem isn’t whether we can find these stories. It’s that we haven’t made it easy or efficient for the right people to look.

That’s where the opportunity is. AI won’t replace the hard parts of journalism, and it shouldn’t. The judgment, context, emotional depth, and moral clarity still come from people. But the overhead of searching, sorting, and sifting can be reduced dramatically. We are shifting from a world where reporting is limited by human bandwidth to one where the bottleneck is insight and intention. The work becomes less about chasing down facts and more about asking better questions.

We’ve already built AI copilots for code, design, writing, and productivity. But the real question now is this: Who is building the copilot for investigative journalism?