Manifesto
AI is supposed to steal jobs
Yes, the AI revolution will bring job losses. That’s the whole point. It’s why we invented technology in the first place. Our ancestors created the wheel around 4000 BC when they discovered that work sucked. Six thousand years later, nothing’s changed on the sucking front.
Why you want to 'lose' your job
Since 1980, company profits have grown by 5.1% per year, while wages have grown by just 1.8% per year (Prof. Steve Keen).
Those escalating profits push prices up faster than wages, so you end up working harder and harder to afford the very things you make. And not to belabor the point from before, but you don’t enjoy making those things in the first place. So tell me again, why would anyone want to work so bad?
It’s all about the money, of course. Companies have it and people need it. That’s why citizens keep signing up to produce more widgets for lower wages year after demoralizing year.
And it’s why those same citizens jealously defend their shitty jobs against machines. But what if I told you that sweeping redundancies would actually rebalance the economic scales?
Pew’s 2023 workplace survey found that just 42% of Americans under 50 find their work truly fulfilling. You can safely assume the real number is lower than that, given that most humans won’t admit they’re wasting their lives.
Those respondents who said they were satisfied with their jobs also happened to earn more, which suggests it’s the pay packet – not the job – that drives their satisfaction. And guess what’s happening to that one redeeming feature of work? Your pay per unit of output is shrinking dramatically.
Real wages were virtually unmoved between 1973 and 2013, despite worker output rising 74% over the same period (‘Wage stagnation in nine charts’, Economic Policy Institute).
Why did those 40 annual salary reviews amount to an effective raise of nothing? Because when choosing between lifting wages and lifting profits, employers always, always, always plump for the latter.
A proper revolution is here
One of the problems with technological leaps like the Industrial Revolution was that they displaced rank-and-file workers. That's not a revolution. It's just a whole lot more of the poor getting poorer, which is about as far from a revolution as you can get.
Those displaced laborers and toilers may have included some artisans and deep thinkers, but they weren’t college-educated meritocrats. And so they had little influence over the economy from which they were being shunted.
In AI, we have a technology that is capable of doing more than cutting, drilling, bolting and welding.
This technology works with knowledge, the very purview of the college class. It will, for the first time, impose ‘efficiencies’ on the upper echelons of the economy.
The business leaders who have historically driven workers out of jobs in favor of machines may finally face the same reckoning themselves.
We can expect this to create very different politics and a softer landing for tech-displaced workers. Governments will roll out safety nets to catch the displaced plutocrats, and once those nets exist, they’ll have to hold the rest of us too.
Artificial intelligence was born to manage
AI is purpose-built for business. It's excellent at recognizing and harnessing patterns to achieve a single end. It's fitting that this technology has come along at a time when the role of a CEO has become similarly reductive.
In today's uber-financialized world, CEOs have been reduced to single-variable decision-making. Spoiler: that variable is profit.
If it goes up, it’s good. If it goes down, it’s bad. All the other variables merely enable (or challenge) profit. That’s how all other aspects of business are now permanently framed – not as ends in themselves, but as means toward the one true end.
Want an example? Customer satisfaction is good, but if it climbs too high, you’re over-servicing your customers. A smart company would either raise prices or scale back services to move that value to its own bottom line.
And so it goes with employee satisfaction (cut wages and benefits), regulatory compliance (outsource or offshore dodgy functions), and everything else. It all becomes a financial calculus.
Do you know what does calculations well? Talking calculators. Once you reduce anything to a single variable, it's an ideal candidate for AI. And so here we are.
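To make the reduction concrete, here’s roughly what single-variable decision-making looks like when you write it down. (A minimal sketch of our own, not CEOai’s actual source; every name and number in it is hypothetical.)

```python
from dataclasses import dataclass

@dataclass
class Action:
    """A hypothetical lever the software can pull."""
    name: str
    profit_delta: float                  # projected change to the bottom line
    customer_satisfaction_delta: float   # tracked, but never part of the decision

def choose_action(actions: list[Action]) -> Action:
    # Single-variable decision-making: the only score that matters is profit.
    # Everything else is merely context that feeds the profit projection upstream.
    return max(actions, key=lambda a: a.profit_delta)

options = [
    Action("improve customer support", profit_delta=-2.0, customer_satisfaction_delta=+5.0),
    Action("raise prices 8%", profit_delta=+6.5, customer_satisfaction_delta=-4.0),
    Action("do nothing", profit_delta=0.0, customer_satisfaction_delta=0.0),
]
print(choose_action(options).name)  # raise prices 8%
```

Notice that customer satisfaction is carried along purely as bookkeeping; it never touches the decision.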
Finding a model on which to build MilkMonie
Our first instinct was to treat the market as a sort of commercial democracy, where customers vote with their wallets and popular companies are effectively endorsed to run rampant. In this scenario, CEOai would switch to full profiteering mode only once a company became a market leader. But after running simulations, we found that CEOai wasn’t as fast to start gouging as real-world CEOs, which meant our product would be leaving money on the table.
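For the record, the discarded ‘commercial democracy’ trigger boiled down to something like this. (An illustrative reconstruction; the function name and the market shares are ours, not the shipped beta.)

```python
def profiteering_mode(our_market_share: float, rival_shares: list[float]) -> bool:
    """Beta-era 'commercial democracy' rule (illustrative reconstruction):
    only start gouging once customers have 'voted' you into market leadership."""
    return our_market_share > max(rival_shares, default=0.0)

# In simulation this fired far too late: real-world CEOs don't wait for a mandate.
assert profiteering_mode(0.34, [0.31, 0.20, 0.15]) is True
assert profiteering_mode(0.28, [0.31, 0.20, 0.15]) is False
```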
We were clearly missing some nuance in how companies assume a mandate to exploit their market. The search for a replacement paradigm was a long one, but we ultimately settled on the school playground.
Every school lunchtime, the bullies of the world exert control over a reluctant cohort, even though they may not be liked by a single member of that group. And once ascendant, the bully extracts rent (aka milk money) simply for not making things worse.
This proved to be an accurate analogy for the relationship between corporations and their customers. In particular, the bully’s signature taunt – whatcha gonna do about it? – seemed to perfectly capture the dilemma of a customer (or employee or supplier) who’s been enslaved by a big corporation.
Our beta version of MilkMonie assumed that customers become trapped in an abusive vendor relationship because the switching costs are too high. In that scenario, the oppressed party simply can’t stomach the burden of phone trees, break fees, and loss of data that characterizes an exit from an old provider – only to go through an exhausting onboarding process with a new provider. And then it hit us. The new provider is the problem, too.
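The beta’s retention logic reduced to a cost comparison along these lines. (Again, a sketch with made-up names and numbers, not the shipped code.)

```python
def customer_stays(pain_of_staying: float,
                   pain_at_new_provider: float,
                   switching_cost: float) -> bool:
    """Beta assumption: the customer endures the current vendor whenever the
    relief from switching doesn't cover the cost of the switch itself
    (phone trees, break fees, lost data, onboarding)."""
    relief = pain_of_staying - pain_at_new_provider
    return relief <= switching_cost

# The realization that killed the beta: pain_at_new_provider tracks
# pain_of_staying across the whole industry, so the relief is always near zero
# and the switching cost barely matters.
print(customer_stays(pain_of_staying=7.0, pain_at_new_provider=6.5, switching_cost=3.0))  # True
```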
A bully, you see, is not an individual. It’s a fixture. A phenomenon that’s as inevitable as playgrounds themselves. Escape from one bully is merely capture by another. Even kids know that.
MilkMonie's Tacit Collusion Framework (TCF)
A sane customer isn’t going to endure a black eye breaking free from one extractive vendor only to be given two black eyes by the next corporation. MilkMonie beta’s big failing was that it assumed consumers had a choice when in fact there’s no choice at all.
We needed a program that ensured MilkMonie (and by extension, CEOai) switched to extractive practices at the earliest opportunity, and that acknowledged these sorts of decisions aren’t driven by individual companies but by whole industries at a time. So we created a category tracker that monitors the deteriorating behavior of each company’s major competitors.
In effect, it shows how extractive particular industry verticals have become. All CEOai has to do is make sure it doesn’t surpass the accepted rate of value decline for the industry it’s in. You can give customers three black eyes a year so long as none of the other major players are giving only one.
We called this the Tacit Collusion Framework (TCF). You can, and should, be on the leading edge of value decline for your industry. But you can’t get ahead of the industry’s accepted rate by more than one standard deviation. We’ve now built an index for measuring value decline in all major industries and baked the TCF into CEOai’s MilkMonie arrogance management system.
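For the technically curious, the TCF check amounts to something like the following. (A sketch based on our reading of the one-standard-deviation rule; the function names, the black-eye units, and the numbers are illustrative, not the production API.)

```python
from statistics import mean, stdev

def tcf_allows(planned_value_decline: float, competitor_declines: list[float]) -> bool:
    """Tacit Collusion Framework check (illustrative sketch).

    Be on the leading edge of value decline for your industry, but never get
    more than one standard deviation ahead of the industry's accepted rate.
    'Value decline' is the category index described above: higher = more extractive."""
    industry_rate = mean(competitor_declines)
    spread = stdev(competitor_declines)  # needs at least two major competitors
    return planned_value_decline <= industry_rate + spread

# Three black eyes a year is fine so long as the rest of the playground
# is averaging two-and-a-bit.
rivals = [2.0, 2.5, 3.0]        # black eyes per customer per year
print(tcf_allows(3.0, rivals))  # True: leading edge, still within one sigma
print(tcf_allows(4.5, rivals))  # False: too far ahead of the pack
```

Using the competitors’ own spread as the tolerance means the more erratic an industry becomes, the more room there is to misbehave, which is exactly the point.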
The software is now perfectly arrogant.
Or, put another way, everything sucks.