
The Three Ways to Fire a Human Being (And What They Reveal About 2026)

James Huang · May 13, 2026 · 9 min read

I was having breakfast in a Shibuya café last Tuesday when a friend texted me a screenshot. It was an email. Subject line: "Your Position Has Been Eliminated." Sent at 6:00 AM Pacific Time. The body was three sentences long. No name. No phone number. Just: "Today is your last day. Your system access has been revoked. Do not attempt to enter the building."

My friend had worked at Oracle for eleven years. He found out he was unemployed the same way he found out about Amazon package deliveries.

That was Oracle's method. Surgical. Instant. Dehumanized by design.

But here's what keeps me up at night: Oracle wasn't dying. They made $3.7 billion in profit last quarter. They fired him not because they couldn't afford him, but because they needed his salary to buy Nvidia chips. The person was liquidated to fund the machine.

And he's not alone. We're watching the largest corporations in the world experiment with different flavors of the same thing: how to remove humans from the balance sheet with maximum efficiency and minimum legal exposure. If you want to understand what's coming for the rest of us, you need to study their methods.

Method One: The Execution (Oracle)

Oracle didn't bother with theater. On March 31st, they sent automated emails to roughly 30,000 people—18% of their global workforce. No manager conversations. No HR exit interviews. Just instant deactivation. Keycards stopped working. Laptops were bricked remotely. Entire departments vanished between coffee and lunch.

The brutality was the point. When you're trying to extract $10 billion from payroll to pour into AI infrastructure, you don't have time for empathy. You need speed. You need the remaining employees to understand, viscerally, that the old social contract is ash.

My friend told me the weirdest part wasn't the email. It was that his Slack was still showing him as "active" for three hours after he was fired, because the automation hadn't synced with the status system yet. He was a ghost in the machine, technically online but legally nonexistent.

Method Two: The Gentle Euthanasia (Microsoft)

Three weeks later, Microsoft took the opposite approach. They called it the "Rule of 70." If your age plus your years of service equals 70 or more, you were eligible for a "voluntary buyout." Generous severance. Accelerated stock vesting. Extended healthcare. The Chief People Officer wrote a memo about "taking your next steps at your own pace."

It sounds humane. It sounds like Microsoft cares.

But look at the math. This targets roughly 8,750 senior employees—the veterans who built the Windows era, the cloud migration, the enterprise sales engine. These are brilliant people in their fifties who remember when the company had 10,000 employees instead of 200,000. They're also incredibly expensive. Decades of raises and stock grants have made them luxury items on a balance sheet that needs to fund $145 billion in AI data centers.

The "voluntary" framing is legal architecture, not kindness. Firing older workers at scale invites age discrimination lawsuits. A "voluntary buyout" neutralizes that threat. The employees "choose" to leave. The company avoids litigation. The headlines read "Microsoft Offers Generous Exit Packages" instead of "Microsoft Purges Veterans."

I know someone who took the package. He's 54, spent 19 years at Microsoft, helped build Azure's early infrastructure. He told me the worst part wasn't leaving. It was realizing he'd spent two decades becoming an expert in systems that the company no longer considers strategic. His institutional knowledge was valuable right up until the moment it wasn't.

Method Three: The Gamified Purge (Meta)

Then there's Meta, which managed to make layoffs feel like a competitive video game.

In April, an internal leaderboard leaked. Employees had created it themselves—called "Claudeonomics"—ranking all 85,000 Meta staff by how many AI tokens they burned per month. The top users consumed $1.4 million worth of compute. It started as a joke, then became a survival metric.

Because Meta's HR had already mandated that "AI-driven impact" was a core performance dimension. If you weren't using AI tools, you weren't getting promoted. If you weren't "tokenmaxxing"—burning through API calls to look productive—you were falling behind. Employees started running prompts just to inflate their numbers, like office workers leaving lights on to fake overtime.

Then the purge came. 8,000 jobs cut. 6,000 open roles frozen. The criterion wasn't tenure or loyalty or even raw output. It was algorithmic leverage. Could you do the work of three people with an AI copilot? If yes, you stayed. If no, you were optimized out.

A friend who survived told me the psychological damage was worse than the layoffs. "Everyone's running scared, but not of being fired. We're scared of not being useful enough to the algorithm. It's like living in a company where your humanity is a bug, not a feature."

The Ghost of 1998

Watching these three methods unfold, I kept thinking about something my uncle told me. He was a factory worker in Northeast China in the late 1990s. State-owned enterprise. Iron rice bowl. Lifetime employment guaranteed by the Communist Party.

Then in 1998, the bowl shattered. Between 1998 and 2000, China laid off up to 9 million state workers per year. My uncle got a meeting with his manager, was handed a meager lump sum—"buying out his seniority" (買斷工齡)—and told to find his own way. He was 47. He'd spent 25 years mastering lathe operation in a factory that was now bankrupt.

At first glance, Microsoft's "Rule of 70" looks identical. Both involve shedding masses of 40-to-50-year-old veterans who gave their prime years to a single institution. Both use polite language to mask brutal economics.

But my uncle and my Microsoft friend are living through fundamentally different tragedies.

My uncle was thrown overboard because the ship was sinking. The state factories were bloated, inefficient, losing money to market competition. The entire economic model had failed. His layoff was an act of salvage—ugly, desperate, but defensive.

My Microsoft friend was pushed off a ship that's sailing faster than ever. Microsoft's stock is at all-time highs. They're spending $145 billion on AI infrastructure because they have too much money, not too little. His layoff wasn't defensive. It was offensive. He was liquidated to buy ammunition for a war he didn't know he was fighting.

As Larry Ellison put it with characteristic charm: "AI coding tools allow engineering teams to deliver more complete solutions with fewer people." Translation: we don't need as many of you anymore, and we'd rather spend your salary on silicon.

The Architecture of Disposability

What connects all three methods—Oracle's execution, Microsoft's gentle euthanasia, Meta's gamified purge—is a shared architectural assumption: human tenure is no longer an asset. It's a depreciating liability.

For decades, corporate loyalty was supposed to be a two-way street. You gave the company your best years; the company gave you stability, progression, and a soft landing. That equation is being rewritten in real time.

Your twenty years of institutional knowledge? Misaligned with the AI era. Your deep relationships with clients? Replaceable by an agent that never sleeps. Your mastery of internal processes? Those processes are being automated.

The only variable that matters now is: how much of your job can only be done by a human, and how much of that human value can an AI replicate for 2% of your cost?

If the answer tilts toward the machine, you're not being "laid off" in the traditional sense. You're being converted into capital. Your salary becomes GPU budget. Your benefits package becomes training compute. You are being liquidated, not fired.

What I'm Actually Telling My Friends

I've stopped giving the standard career advice. "Upskill" feels insulting when the skill you spent a decade building was deprecated in a model release. "Network" feels hollow when everyone in your network is also updating their LinkedIn to "Open to Work."

Instead, I've been asking people one question: What part of your job, if you described it honestly, would make an AI say "I can't do that"?

Not "what's hard for AI"—because AI can do most hard things. What's impossible for AI? What requires your specific scar tissue, your particular weirdness, your accountability for a decision that could cost someone millions?

For my Oracle friend, the answer came slowly. He wasn't just a database administrator. He was the person who, at 2 AM during a production outage, could look at a dashboard and feel that something was wrong before the metrics showed it. He could smell a cascading failure. That intuition—built from eleven years of near-misses and 3 AM war rooms—isn't in any training data. It's in his nervous system.

He's consulting now. Not because he learned a new skill, but because he finally understood what his old skill actually was: pattern recognition from damage. The machines have the patterns. He has the damage.

The Honest Ending

I don't have a happy conclusion. I don't think 2026 is going to be a year where everyone finds their place in the new economy. I think a lot of people are going to discover that their "safe" careers were built on assumptions that expired quietly while they were working.

Oracle's method tells us that speed matters more than dignity. Microsoft's method tells us that politeness is just litigation avoidance. Meta's method tells us that your human value is being measured in API calls.

And the 1998 parallel tells us that this isn't a recession. It's a restructuring. The ship isn't sinking. It's just decided it doesn't need a crew anymore.

The question isn't whether you'll get an email at 6:00 AM. The question is whether you've built something that can't be sent in an email.

— James, Mercury Technology Solutions, Tokyo, May 2026

Company Culture · Leadership · HR Technology · Digital Transformation · Crisis Management · Team Management

Originally published on MTS Blog & Research