When Elon Musk arrived with his sink, I departed. Late last year I finally made the break from Twitter. It had become so full of hatred, bigotry, vitriol, culture warring, trolling and general bad vibes. It left no room for nuance, and the more extreme the sentiment, the further it travelled. Musk’s track record signalled that this would only get worse – if he didn’t simply destroy the platform first.
He misused Twitter to manipulate share prices, to promote damaging and offensive conspiracy theories, and to accuse a man of being a paedophile on no evidence whatsoever – because the man dared to turn down his publicity-stunt offer of a submarine to rescue children stranded in a flooded cave in Thailand. Now that he’s running the platform, he has dismissed key staff, haemorrhaged many more, and even mocked one of his employees over his disability.
Musk is just the most prominent and divisive in a line of social media and other big tech leaders who have built or bought companies the size of countries – companies that increasingly look too big to care. That’s why they now face growing government regulation, legal challenges and consumer pressure.
The coroner at the inquest into the 2017 death of teenager Molly Russell issued a range of recommendations last September, from reviewing algorithms to creating separate platforms for children to new legislation. The NSPCC called his ruling a “big tobacco moment” for social media. It’s just the latest in a long list of serious concerns raised by consumers, regulators and politicians about the behaviour of big tech, and social media in particular.
The UK’s Online Safety Bill is held up but still expected to become law this autumn. The European Union’s Digital Services Act will soon enforce online safety measures in the EU. Meanwhile the US, which is home to so much big tech, is lagging behind. But Paul Dempsey considers the likely impact of the Council for Responsible Social Media, a cross-party group of leaders that will try to balance freedom of speech with concerns about privacy, minors, national security and more.
Why can’t these big tech giants, with all their nation-like resources, formidable engineering teams and cutting-edge artificial intelligence, do a better job of policing their own platforms? In this issue, we look at the scale of the problem and the ways in which technology could help – if the will to use it is there. In addition, Chris Edwards looks ahead to what Web 3.0 will mean for today’s Web 2.0 tech giants.
Governments are also getting twitchy about national security. They’re concerned about the effects of platforms like TikTok on children, but they’re suspicious too about its relationship with the Chinese government and how the app works – at least enough to stop their own people using it on work devices. Hilary Lamb sets about answering ten key questions around TikTok.
Leaving Twitter was not that hard for me; I’m far from addicted to social media. In fact, mine is the generation that feels guilty about using social media too little rather than too much. Most of what I did post shifted to the magazine’s account, which persists. For fun I moved to Instagram, and for work to LinkedIn – a far more civilised and indeed responsive environment. My posts on Twitter had become ever more infrequent. I won’t be missed there, and my departure won’t bring the platform down. Though sometimes it looks as if its new CEO is having a go at that himself.