On AI, Jobs, and Why We’re Asking the Wrong Questions
Everyone’s panicking about AI taking our jobs. Artists worry it’ll replace creativity. Developers wonder if coding will become obsolete. Writers fear they’ll be automated away.
My two cents on the matter: we’re just not having the right conversation.
The narrative that we’re all fed is: “AI is coming for your job. AI is coming for your art. AI is coming for your creativity. AI is coming for your livelihood.”
This perspective comes from my experience building a decentralized identity system, writing publicly about Python, and using AI as a tool in my own work.
AI doesn’t replace creative work. It changes what “creative work” means.
When calculators were invented, the world didn’t banish mathematicians. It shifted from needing human computers to needing people who could solve different problems. The creative labor moved from exact calculation to application.
When photography was invented, painters didn’t become extinct. Painting moved away from photographic documentation and toward expression, interpretation, and abstraction.
AI is doing the same thing. It’s not killing creativity as a whole; it’s killing certain types of creativity and making space for ones we haven’t even thought of yet.
Learning Technical Skills Isn’t About Competing With AI
And that’s precisely where the fallacy begins. The idea is that, with the rise of AI, learning to code or acquiring any form of technical skill is really about “staying ahead of AI.” That’s not really the point, though.
Learning to code, or acquiring any technical skill, is really about understanding the infrastructure of the world that’s being built. When I learned Python, for instance, I was not learning how to write better than any AI. I was learning how to think: how to understand the systems being built, how to follow the flow of data, and how to determine whether a problem needed a loop, a function, or a complete architectural overhaul.
AI can write code, yes. But it can’t, at least not yet, understand the context in which you need a solution to a specific problem. The point isn’t really the code itself. The point is knowing what to ask for.
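To make that judgment concrete, here’s a toy sketch of my own (not code from the article’s project): the same small problem solved at two levels of abstraction in Python. Neither answer is wrong; knowing which one a situation calls for is exactly the kind of thinking described above.

```python
# The same problem, two ways: sum the squares of the even numbers in a list.

def total_even_squares_loop(numbers):
    """Explicit loop: every step is visible, easy to inspect and debug."""
    total = 0
    for n in numbers:
        if n % 2 == 0:
            total += n * n
    return total

def total_even_squares_expr(numbers):
    """One expression: concise once you trust the idiom."""
    return sum(n * n for n in numbers if n % 2 == 0)

print(total_even_squares_loop([1, 2, 3, 4]))  # 20
print(total_even_squares_expr([1, 2, 3, 4]))  # 20
```

A beginner reaches for the loop; an experienced reader may prefer the expression; a large codebase might demand neither and a redesign instead. That choice is context, and context is what you bring.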
Building Publicly Is the New Literacy
The unspoken shift in the way we build things is verification. In an age where an AI can produce almost anything, be it code, art, words, or credentials, how do you actually prove you created something? How do you actually prove you have skills? How do you actually distinguish between understanding something and just regurgitating it?
You build publicly.
My journey to learning Python, however disorganized on Medium, is proof that I’m actually learning. My DID project on GitHub is proof that I can actually build something.
AI can replicate results. It can’t replicate process.
Building publicly is about creating a verifiable process to prove you’ve been on a journey. In an age dominated by AI, process is now worth more than product.
The Real Question: What Can Only Humans Do?
From my own experience building a decentralized identity system, here are some things that humans are uniquely good at:
- Asking questions that no one else would think to ask
- Combining ideas from completely unrelated fields
- Caring about something without a clear return on investment
- Creating for the sake of interest, rather than for maximum efficiency
- Making mistakes and learning from them out loud, in public
AI learns from what already exists. Humans create what does not exist yet. When I decided to build a DID system on a Chromebook, it wasn’t because it was optimal or efficient. It was something no AI would have suggested. It was mine. It was born from a mix of constraints, curiosity, and stubbornness. That is something AI would not replicate: not because it lacks the raw capability, but because it lacks the stubborn unreasonableness that defines human creativity.
How Humans Evolve: We Build the Infrastructure for What’s Next
Every technology shift creates its own set of infrastructure requirements:
The printing press created the need for libraries, copyright laws, and literacy programs.
The internet created the need for web standards, cybersecurity, and digital literacy programs.
Now, AI is driving the need for:
- Verification infrastructures: how do we verify human involvement?
- Collaboration infrastructures: how do we collaborate with AI?
- New forms of value: what’s valuable if AI can do everything?
- Ethical infrastructures: who owns AI-created content, and who’s liable for its misuse?
These are the opportunities that create jobs. Not by competing against AI. By building a world that includes both humans and AI. And that’s impossible to do unless you have a technical foundation. Learn technical skills. Not to compete. To build.
The Contrarian Conclusion
AI is not here to take over our work as creators. It’s pushing us toward a more profound, more extensive concept of what “work as a creator” can really entail. The individuals who will prosper are not the ones with the most incisive prompts or the most sophisticated AI tools. The individuals who will prosper are the ones who:
- know the systems they are working within
- build things in the open, so they can be verified
- pose inquiries that the AI itself would not think to pose
- build the infrastructure for problems that have not even been conceived
Yes, learn to code. Learn to build. Learn to work with AI.
Not because you are competing with it.
But because you are designing the world that the AI lives within.
And that’s a job only humans can do.
Contrarian Thoughts was originally published in Coinmonks on Medium.
