
Mar 17, 2025
First Impressions: AI, Expertise, and My HBS Journey into Future Worth

Humans & AI: Rethinking Career Worth in an Age of Digital Clones
If you've talked to me in the past two weeks, there's a high likelihood I've mentioned my "Navigating Your Worth" class at HBS. It's basically hijacked my brain. Despite spending the last few years immersed in the future of work, AI, and career navigation (both in my full-time job and part-time master's), this class has still managed to shake my foundations about what career development even means in an AI-first world.
And we're only one class in! On the one hand, I'm being a bit dramatic. On the other hand, look at this syllabus and tell me you aren't a little flabbergasted.
It perfectly encapsulates the weird paradox we're all facing: AI is simultaneously making some jobs obsolete while dramatically enhancing the value of others. I've been sitting with this tension for weeks now, and it's forcing me to reimagine not just how I work, but how I conceptualize my professional value entirely.
Take Klarna, for example. The Swedish fintech company we studied reduced their workforce by 40% after implementing AI, while increasing compensation by 28% for those who remained. I'll admit, I have some lingering doubts about the full "AI-first" reality of Klarna's transformation as presented in the case. How much is genuine AI transformation versus clever PR wrapped around old-fashioned cost-cutting? As one tech commentator aptly put it in a tweet I saw recently, there's often a fine line between "AI-driven efficiency" and "we fired a bunch of people." But regardless of Klarna's specific reality, the broader trend is undeniable.
The Great Unbundling of Expertise
Here's what's blowing my mind: what you know is becoming increasingly detachable from who you are.
The traditional career model assumed your expertise was embodied — physically tethered to you showing up at work. But AI is rapidly shifting us to what my professors are calling "codified capabilities" — expertise that can be duplicated, enhanced, and recombined without your physical presence. The image of white-haired artisans watching their craft become automated by power looms (shared during our professor's brilliant historical overview of technological revolutions) is haunting me.
This isn't hypothetical. Klarna's CEO Sebastian Siemiatkowski put it bluntly: "If individuals don't produce and repeatedly prove themselves unable to improve, we don't know what else to do except dismiss them... They earn an unfair share of the financial rewards."
Harsh? Maybe. Reality? Increasingly.
But this unbundling cuts both ways. If your expertise can be separated from you, it can also be scaled in ways previously unimaginable. The challenge is ensuring you're capturing that value rather than simply surrendering it.
What AI Can't (Yet) Replace
In class discussions, we keep circling around what AI genuinely struggles with. The list gets shorter every week.
"AI can't be creative" — [Claude creates a novel in seconds]
"AI can't show empathy" — [Healthcare bot shows more patience than human doctors]
"AI can't think strategically" — [AI agents outperform humans in complex negotiation tasks]
I initially found this terrifying. But then something clicked during our last discussion on "AI-first companies" — what if the most valuable skill isn't what AI can't do, but rather knowing how to extend your capabilities through AI?
Our professor challenged us to think about ourselves not as "knowledge workers" but as "capability architects" — people who design systems that combine human judgment with AI execution to create exponentially more value than either could alone.
The Career Development Shift
So what does this mean for how we approach career growth?
Traditional career development focused on vertical progression — deepening expertise in a narrowing specialization. This made sense in a world where expertise was embodied. The more specialized you became, the more irreplaceable you were.
But in a world where expertise can be codified and cloned, perhaps the most valuable people will be horizontal integrators — those who can synthesize across domains, direct ensembles of AI tools, and recognize novel applications for existing capabilities.
As one of my classmates pointed out: "The world doesn't need more people who understand machine learning algorithms. It needs people who understand both ML capabilities AND healthcare delivery challenges AND regulatory frameworks."
The Ownership Question
The thorniest issue we're wrestling with is ownership. If your expertise gets codified into an AI system, who owns that? If you train an AI assistant based on your decision-making patterns, and then leave your company, should they get to keep your digital twin?
One proposed framework we discussed was "data as labor" — the idea that the data exhaust from your work should be treated as a form of labor that deserves compensation. This flips the current model where companies implicitly own all the data generated through your work.
Some forward-thinking companies are experimenting with models where employees retain partial ownership rights to AI systems trained on their expertise. Others are creating internal markets where employees can license their "digital twins" across projects.
This isn't just an academic debate — it fundamentally reshapes the employer-employee relationship and potentially creates new power dynamics. And the pace at which these questions are emerging is breathtaking. As we discussed in class, the consumer-led adoption rate of AI is unprecedented in technological history — this isn't the gradual rollout we saw with personal computers or even smartphones.
Navigating My Own Worth
So where does this leave me (and you)? I'm working through this in real-time, but a few principles are emerging:
Identify your codifiable vs. non-codifiable expertise
Some aspects of what you do can be easily taught to an AI. Others — particularly those involving contextual judgment, relationship building, or creative synthesis — remain more distinctly human. Knowing which is which helps you focus your energy.
Build your AI ensemble
Instead of fearing being replaced, experiment with creating AI "clones" that can handle specific aspects of your work. This allows you to focus on higher-level tasks while expanding your impact.
Develop clear frameworks for value sharing
As you contribute to training systems that learn from your expertise, establish clear guidelines around ownership, compensation, and ongoing rights.
Cultivate interdisciplinary fluency
The most valuable people will be those who can translate between domains, identifying novel applications for existing capabilities.
I don't know exactly where this is all heading, but I'm increasingly convinced that those who thrive won't be those who compete against AI, but those who become masterful at composing with it.
This is why I've been conducting an independent study into how individual knowledge workers are adopting AI in their workflows. What I'm finding is fascinating — while organization-wide AI adoption remains patchy at best, individuals are racing ahead, finding creative ways to integrate these tools into their daily work. The efficiencies are happening at the personal level first, often invisible to the broader organization.
My homework? Start prototyping my own AI clones — mini versions of my expertise in career coaching and job search strategy that can scale beyond what I could do alone. I'm both terrified and exhilarated by the possibilities. And maybe it's finally time to learn poker — a classmate's metaphor about successful people being those who can update their probabilities in real-time has stuck with me.
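The poker metaphor actually has a precise mathematical form: Bayes' rule. As a minimal sketch (the scenario and every number here are invented purely for illustration), this is what "updating your probabilities in real time" looks like when new evidence arrives:

```python
def bayes_update(prior, likelihood, evidence_prob):
    """Return the posterior probability via Bayes' rule:
    P(H | E) = P(E | H) * P(H) / P(E)."""
    return likelihood * prior / evidence_prob

# Toy belief: my skill stays hard to automate (prior = 0.7).
prior = 0.7
# New evidence: an AI tool just replicated part of the skill.
# Assume that's twice as likely if the skill is automatable.
p_evidence_if_durable = 0.3
p_evidence_if_automatable = 0.6
# Total probability of seeing this evidence under either case.
p_evidence = p_evidence_if_durable * prior + p_evidence_if_automatable * (1 - prior)
posterior = bayes_update(prior, p_evidence_if_durable, p_evidence)
print(round(posterior, 2))  # → 0.54
```

One piece of evidence drops the belief from 70% to about 54% — not a collapse, just a recalibration, which is exactly the habit the metaphor is pointing at.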
As Sebastian Siemiatkowski noted, "AI is capable of doing all our jobs, my own included. Because our work is simply reasoning combined with knowledge/experience. And the most critical breakthrough, reasoning, is behind us."
If that's true, then the real question isn't whether AI can do your job, but whether you're prepared to evolve your conception of what your job even is. And perhaps most importantly, can you develop what one of my classmates called an "immunity to uncertainty"? Because one thing is certain — this is all moving so quickly that most organizations (and certainly most higher education institutions) will be far behind those individuals who can move quickly, think ahead, and thrive in ambiguity.
The Klarna model of fewer employees with higher compensation might be the best-case scenario in some ways. But it raises tough questions: How quickly can even top performers upskill to stay ahead of the curve? And if the competition for fewer positions intensifies, will companies continue to pay premium rates, or will we see a race to the bottom?
Curious to hear your thoughts on this. How are you thinking about your own expertise in an AI world? And are you seeing signs of this shift in your own organization?