Are Our Tech Masters Really Free AI Hammer Distributors?
Handshake, or punch in the face?
Our Silicon Valley Overlords (SVOs) encourage us to seek conflict rather than empathy, but this fights against human nature.
As Rutger Bregman documents in his ‘hopeful history’, Humankind, ours is a sociable, cooperative, empathetic species. Humans have evolved to help each other, not beat each other up.
Bregman challenges an array of long-held beliefs about man’s savage nature, the kind of ‘nature red in tooth and claw’ narratives that cast Homo sapiens as Lords of the Flies, with only a thin veneer of civilisation restraining us from our perma-violent instincts.
He points out that most of these go-to dystopias, like Lord of the Flies, are fictional. Bregman also demonstrates that much of the evidence we think exists for our dystopian self-image has turned out to be fiction. Humankind comprehensively demolishes the scientific basis of the Stanford Guards v. Prisoners Experiment and Stanley Milgram’s Electric Shock Machine studies.
What should surprise us is that we believed them at all. The notion that humans are hard-wired to harm each other defies our everyday experience.
When we meet a stranger at a bus-stop, or a party, or at a coffee-break, our instinct is to seek areas of mutual interest, not conflict. If we stumble across an area of contention, we tend to let it lie, and find something we hold in common. When we part, we shake hands, rather than punch each other in the face.
Is AI Good or Evil?
AI models that are optimised for consensus, like the pol.is mediating interface that Taiwan is increasingly using to govern itself, consistently produce results of 90/10 rather than 50/50, i.e. we agree on most things.
When, as with pol.is, software has no ‘Reply’ option, we’re obliged to address the message, not the messenger, to deal in ad rem rather than ad hominem discourse.
Given the depth of our collaborative reflexes, overcoming humanity’s empathetic instincts may well be our SVOs’ most remarkable achievement.
The Key Performance Indicators against which Facebook’s software engineers’ bonuses are assessed, however, include optimising for conflict. Quite literally, the closer the algorithm cleaves to a 50/50 split on any given issue, the more money the coder earns.
This makes capitalist sense. More conflict = more eyeballs = more advertising space = more money.
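To make the mechanism concrete, here is a toy sketch (entirely hypothetical, not Facebook’s actual code or metric) of a scoring function that rewards content the closer it splits opinion 50/50, and penalises consensus:

```python
# Hypothetical illustration of a conflict-optimising metric.
# The function name and formula are our own invention for clarity.

def conflict_score(agree: int, disagree: int) -> float:
    """Score peaks at 1.0 for a 50/50 split, falls to 0.0 at unanimity."""
    total = agree + disagree
    if total == 0:
        return 0.0
    p = agree / total
    # 4p(1-p) is maximised at p = 0.5 and zero at p = 0 or p = 1
    return 4 * p * (1 - p)

# A divisive post outscores a consensus post:
print(conflict_score(50, 50))   # 1.0
print(conflict_score(90, 10))   # 0.36
```

Under any such metric, an engineer maximising their bonus is pulled towards the 50/50 end of the scale, and away from the 90/10 consensus that pol.is-style tools surface.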
If an engineer makes Facebook more money, he or she increases the size of Facebook’s pie. It follows that they deserve a few more crumbs. The more widgets Facebook helps sell, the more crap goes in the sky.
It’s sad, but true, that the smartest people from around the planet are hired by our SVOs to Sell Us More Shit. Buying stuff increases rather than diminishes the amount of atmospheric carbon.
Bemoan this fact if you wish – we defer to no-one in our bemoaning – but if you want to take any action to reduce carbon, it’s critical to understand this mechanism.
Because it is just a mechanism. AI is just technology. There’s nothing intrinsically evil, or for that matter carbon-increasing, about AI, social media or behavioural psychology. That depends on how these tools are applied.
AI and effective climate activism?
To repeat our favoured analogy, you can use a hammer to tap in the last hand-hewn peg to complete a sustainable eco-lodge, or to smash in a stranger’s skull. It’s just a hammer.
Our SVOs happen to optimise these tools for selling advertising space. Their entire ‘free’ infrastructure is based on the assumption that everyone else using it shares their mercantile motivation.
In this sense, Facebook, Google, Twitter, TikTok et al are Free Hammer Distributors. When our SVOs are FHDs, it would be churlish not to use their remarkable free infrastructure.
And that’s what See Through News is doing. For more details and examples, see pretty much every other article on this website…