
We Are All The Algorithms Now

Andrew Sullivan wrote a fascinating article that touches on something I’ve understood for a long time: the algorithm of whatever social network you’re on has twisted your perspective on the world around you. You think you’re in control (because of course an algorithm can’t control your life), but look at the things that have your attention. It’s feeding you more of the same and stripping out the context and counterpoints. “Alternative views, unpleasant facts, discomforting arguments, contextualizing statistics, are, with ever-greater efficiency, filtered out of what our eyes can see and our minds absorb,” Sullivan says. If it doesn’t fit our narrative, then it must be wrong.

This has been a fascinating year for our country. It’s hard for me to write anything, because I look around and see everyone’s sensitivities raised to the highest level. We’ve all become so sensitive to anything that doesn’t agree with the way we think the world should be. We’ve lost all respect for the other side’s argument. It’s filtered out now, so we hardly see it, and we’ve magnified our viewpoints so much that there is no other way but our own. We’ve become the algorithm.


Andrew Sullivan:

Attention, they meticulously found, is correlated with emotional intensity, outrage, shock and provocation. Give artificial intelligence this simple knowledge about what distracts and compels humans, let the algorithms do their work, and the profits snowball. The cumulative effect — and it’s always in the same incendiary direction — is mass detachment from reality, and immersion in tribal fever.

With each passing second online, news stories, graphic videos, incendiary quotes, and outrages demonstrate their stunning utility to advertisers as attention seizers, are endlessly tweaked and finessed by AI to be even more effective, and thereby prime our brains for more of the same. They literally restructure our minds. They pickle us in propaganda. They use sophisticated psychological models to trap, beguile, outrage, and prompt us to seek more of the same.

Alternative views, unpleasant facts, discomforting arguments, contextualizing statistics, are, with ever-greater efficiency, filtered out of what our eyes can see and our minds absorb. And what we therefore believe becomes more fixed, axiomatic, self-reinforcing, and self-affirming. We become siloed into two affective tribes, with dehumanization of each other deepening with every news cycle.

You don’t go down a rabbit-hole; your mind increasingly is the rabbit hole — rewired that way by algorithmic practice. And you cannot get out, unless you fight the algorithms to a draw, or manage to exert superhuman discipline and end social media use altogether.

But the thing about algorithms and artificial intelligence is that they don’t rest, they have no human flaws, they exploit every weakness we have, and have already taken over. This is not a future dystopia in which some kind of AI robot takes power and kills us all. It is a dystopia already here — burrowed into our minds, literally disabling the basic mental tools required for democracy to work at all.

If you watch video after video of excessive police force against suspects, for example, and your viewing habits are then reinforced by algorithms so you see no countervailing examples, your view about the prevalence of such excessive force will change, regardless of objective reality.

