
At the Mercy of the Algorithm

Early in my TikTok usage the app started showing me videos of people trying to break into hotel rooms. There were a startling number of videos of frightened women watching someone slide a hook on a long piece of wire under their door and try to undo the lock. Initially I was gripped by a sort of horrified fascination, but then I managed to reboot my critical thinking and consider why these women were filming the process rather than calling hotel security, or the police. The endgame appeared to be selling various devices to keep you safe when travelling. Note that the endgame was not to keep me safe, or to make me feel safe; it was to exploit my fears and sell me stuff I don’t need.

It was a weird rabbit hole to go down, and it led to me changing my viewing habits to make sure I don’t watch anything I don’t actively choose to watch. Now my viewing is almost exclusively comedians, snippets of old TV shows I am fond of, like The West Wing, and cats doing funny things. Also quite a lot of David Tennant and Michael Sheen, and, obviously, anything involving Hannah Waddingham.

When TikTok tries to show me anything else I skip straight past it. It’s a positive experience overall. But it wouldn’t be if I let TikTok determine my viewing. I would be down a frightened rabbit hole where danger lurks around every corner. TikTok wants me to watch more TikToks, and the most effective way to get me to do that is to stoke my fear and rage.

That’s obviously terrible for my mental health.

In the recent flurry of political grandstanding about banning social media for children under 16, we seem to be taking for granted the idea that social media has to behave the way it does and be what it is. That these dark and frightening rabbit holes are a necessary part of the social media experience. An unavoidable consequence of the technology.

They are not.

Bluesky (at present) has a chronological timeline. You follow people (or look at topic- or interest-based feeds such as science, education, or sport), and you see what they have posted. It doesn’t grab you by the fear and try to drag you deeper into the morass. It shows you what you ask to see.

TikTok, YouTube, Facebook, Instagram, Threads, Twitter, you name it – they could all do that. In some cases I force them to by creating my own lists and only looking at those, but I keep having to wrest back control. They really don’t want me to control my own viewing. They want me to see what they tell me to see. They want to control my attention, and manipulate my emotions, to make their systems more profitable.
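Stripped to its essentials, the difference is just a choice of sort order, not some unavoidable property of the technology. Here is a minimal sketch in Python (the Post fields, the following list, and the predicted_engagement score are invented for illustration, not any platform’s actual code):

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    author: str
    text: str
    posted_at: datetime
    predicted_engagement: float  # the platform's guess at how long this will hold your attention

def chronological_feed(posts, following):
    """Show only posts from accounts the user chose to follow, newest first."""
    chosen = [p for p in posts if p.author in following]
    return sorted(chosen, key=lambda p: p.posted_at, reverse=True)

def engagement_feed(posts):
    """Show whatever is predicted to keep the user watching longest,
    whether or not they asked to see it."""
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)
```

A chronological feed only needs the list of people you follow and a timestamp; it’s the engagement feed that needs the profiling, the prediction model, and the machinery that has learned what frightens and enrages you.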

What if we said no?

What if we outlawed those engagement algorithms?

What if we insisted that we all get to choose what we see?

I dare say social media companies would find ways around it – circumventing laws is one of the tech industry’s favourite pastimes, after all, whether they are labour laws, privacy laws, environmental laws, or anything else designed to prioritise people’s wellbeing over industry profits. But we have governments for a reason. In theory (I admit, often not in practice), the job of government is to maximise the wellbeing of its people, its lands, and its flora and fauna. Not to maximise the profits of any company, social media or otherwise.

Whether it’s fear, racism, misogyny, conspiracy theories, climate denialism, or anti-vax rhetoric, I don’t want it shoved down my throat. I don’t want to be dragged down into the mud. I want to use social media to connect with my friends, to meet like-minded people, and to learn.

The EU has shown that governments can enact effective privacy laws, such as the GDPR. It’s not perfect, but nor has it brought about the techpocalypse that tech companies tried to persuade us it would. So why not double down on those privacy laws (which we don’t have in Australia, by the way), and add in some consumer wellbeing laws? Why not build a regulatory framework where personal and societal harms are unacceptable outputs of technology?

I’m not suggesting that regulation would be easy. But I didn’t start ADSEI because it would be easy. Sometimes we have to do the hard things in order to make progress.
