Why escape X (and other toxic platforms)?

Let’s take back control of our digital spaces.

Social media platforms derive their power from convention: we want to be there because many others are. Now that X/Twitter is being leveraged by Elon Musk for political purposes, it has become dangerous for democracies. It’s time to change this convention.

Changing a convention on a global scale is a huge undertaking. By definition, no one can do it alone. When Swedes switched their traffic to the right-hand side of the road in 1967, everyone changed direction on the same day – Dagen H – with the support of the authorities.

The election of Donald Trump was a wake-up call and created a unique opportunity to make our digital Dagen H: a global shift in convention toward using social networks that don’t promote misinformation, polarize society, manipulate global opinion and drive destructive over-consumption.

Black box algorithms Rise of far-right Disinformation Over-consumption

Algorithms: What content do you want to see?

X aims to show you more of the content Elon Musk and his paymasters want you to see, and less of the content you actually want to see. While social media algorithms were initially designed to help users find content they were interested in, over time, ad-based lock-in platforms such as X, Meta’s platforms, TikTok and Snapchat pivot towards their real customers – advertisers, both commercial and political. This means they show you less and less content from accounts you follow – just enough to keep you coming back in the hope of staying in touch with what you care about – and more and more content that they want you to see. They collect data on you to make sure they succeed, over time, at steering you towards that content. As a user, you have no choice over the algorithm on lock-in platforms. The algorithm is a black box, determined by the owner of the platform, and can be changed at any time, with no transparency, in the interests of advertisers and owners like Elon Musk and Mark Zuckerberg.

In contrast, Fediverse platforms built on ActivityPub (Mastodon, Pixelfed, PeerTube), and AT Protocol platforms like Bluesky, show you content from accounts you follow, with no ads or pushed content. If you choose to browse an algorithmic feed to discover new accounts or content on these platforms, you get to choose which algorithms or feeds to use. You can even use tools to design your own algorithm, or use algorithms designed by people you trust. Full information is available on how the algorithms are designed.

In alternative social media, like the Fediverse, you are not locked in to any platform. It’s like email: no one ever claimed that because you use Gmail and someone else uses Yahoo Mail, you can’t email each other. Of course you can, because both services speak the same email protocol (called SMTP). The Fediverse works the same way for social media: you don’t have to limit your followings and followers to one platform, because the platforms all speak the same social media protocol (ActivityPub).
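To make the email analogy concrete, here is a minimal sketch (not any platform’s official code) of the first step a Fediverse server takes when you look up an account on another server: building the standard WebFinger discovery URL from a handle. The example account name is hypothetical.

```python
# Minimal sketch of Fediverse account discovery (WebFinger, RFC 7033).
# Given a handle like '@user@server.example', any ActivityPub server
# answers the same well-known endpoint with a JSON document pointing
# at the account's profile -- regardless of which software runs it.

def webfinger_url(handle: str) -> str:
    """Build the WebFinger discovery URL for a Fediverse handle."""
    user, _, server = handle.lstrip("@").partition("@")
    if not user or not server:
        raise ValueError(f"not a valid Fediverse handle: {handle!r}")
    return (f"https://{server}/.well-known/webfinger"
            f"?resource=acct:{user}@{server}")

# Hypothetical example account:
print(webfinger_url("@alice@mastodon.social"))
```

Because this discovery step is standardised, a Pixelfed user can follow a Mastodon account just by entering the handle – no single company controls the lookup.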

Why don’t X and other platforms allow this choice and control? Because they want to lock you in. Their business model is structured around making it difficult to leave, with the threat of losing your followers, followings and content if you do – so they can decide what you see, and make money from it. Fortunately, there are tools to help you escape (see the how page). And once you escape X and other lock-in social media and move to the Fediverse, you won’t have to escape again, because the Fediverse is not designed to lock you in in the first place.

Want to read more? Here are some links


Rise of the far right: Toxic social media promoting hate and interfering in elections

Although the algorithms are not transparent, researchers have found ways of verifying what most of us already know: toxic social media platforms like X are promoting hate and the rise of the far right.

If you are not far-right yourself, you will find your content becoming less and less visible on X and similar platforms, even to your own followers – and your followers are forced to scroll through toxic content to reach your posts.

By asking people to follow you on X, or putting X icons on your website or in your email footers, you are doing free marketing for X – and therefore indirectly aiding the rise of the far right, by pushing people into platforms where they will be manipulated over time into hate and far-right politics.

Read more through the links below.

Europe:

United States:


Disinformation: Fact checking and moderation being abandoned on toxic social media

Firstly, toxic platforms have no real interest in moderation or fact-checking. Their sole goal is to increase engagement and screen time on their platforms. Sensationalist content grabs attention: it is much easier to grab people’s attention again and again with sensation than it is to sustain their attention with quality, fact-checked content. More in this podcast: Chris Hayes on the attention economy. Moderation, fact-checking, calm debate and reasoned argument go against the grain of the business model of toxic social media.

Secondly, there have been attempts to force social media platforms to moderate and fact-check through regulation, but these efforts are largely failing. The European Digital Services Act is widely regarded as the most comprehensive effort at regulating social media. Some argue that it could be effective if it were better enforced. Others argue, however, that if algorithms are black boxes and moderation decisions are not transparent, there is no way of really knowing whether platforms are adhering to the rules, and therefore no way of enforcing them. You will find both opinions in this webinar: ‘Hostile Takeover? – How tech billionaires are undermining our democracies & what the EU can do’. Attempting to impose responsible behavior on companies without algorithm or moderation transparency may be futile: platforms can appear to follow the letter of the law while circumventing its spirit. Other platforms don’t even try to follow the law – they simply pay the fines (small change to them) or argue that they are “too big to fail”. For as long as government institutions, politicians and the media remain wedded to X, there is some truth to that – our institutions need to escape X before allowing X to fail.

More on this in this talk by Everton Zanella Alvarenga, who started this petition for European institutions to leave X.

Of course, moderation presents a challenge even in the Fediverse. But the decentralisation of the Fediverse means that each server is responsible for its own moderation. This makes for more human moderation decisions and clearer lines of responsibility. Many Fediverse servers are run by not-for-profits. Users can report content to the moderation team, and moderators can then remove users or block other servers, in line with the server’s policy. If users don’t like the moderation of their home server, they can switch to another server without losing their followers and followings. They can even volunteer for their own server’s moderation team.


Over-consumption – concerned about its destructive effects on people and planet? X and ad-based social media drive it

X and similar platforms, like Facebook, Instagram, Snapchat, TikTok and YouTube, make their money through ads and sponsored content.

Does advertising work? We may think, as individuals, that we are immune to advertising – that we can just ignore the ads – but why would consumer goods advertisers spend such huge amounts of money on advertising if it didn’t work? Evidence shows advertising contributes to climate breakdown (Greenpeace article). Highly sophisticated targeted advertising – tracking people and leveraging more data about them than they have about themselves – is even more successful than traditional advertising at increasing consumption. As well as nudging individuals into particular decisions, advertising influences what is considered “normal” and “acceptable” in society. People are influenced not only by seeing an ad and buying a product straight afterwards, but by the perception, drawn from persistent advertising, that “everyone else is doing it”. This shapes people’s conversations, which in turn drives increased consumption and further shifts what is considered “normal”, in a vicious circle.

How does ad-based social media contribute to unhealthy consumer culture? The main purpose of ad-based social media is to entice people to spend more and more, to a large degree on products and services that are unhealthy and destructive for people and planet. Platforms like Snapchat aim to get young people hooked, generating insecurities about appearance and status that prompt more and more spending on image-based, wasteful, throw-away consumer goods which never quite fill that insecurity hole.

They make no secret of this. Here is content from Snapchat’s own website:

The effects of advertising junk food have been well documented (see the book Ultra-Processed People by Chris van Tulleken, or listen to this webinar by researcher Dr Mimi Tatlow-Golden). The decades-long campaigns to ban tobacco advertising eventually succeeded in reducing smoking rates, showing the power of advertising. Continuous advertising of high-fossil-fuel products, such as SUVs and cheap flights, changes the norms in society and promotes more fossil fuel consumption, as detailed in the book Badvertising.

Of course, digital advertising and surveillance capitalism are about more than just social media – they are about the internet generally. However, social media like X, TikTok, Snapchat, Facebook, Instagram and YouTube – designed to keep users “engaged” and to sell manipulation-as-a-service of users seeking connection and information – play a significant role.

How are Fediverse platforms and Bluesky different?

Fediverse platforms such as Mastodon (microblogging), Pixelfed (image-focused) and PeerTube (video-focused) do not contain ads and are not funded by ads. Most are run by not-for-profit social enterprises, funded by donations or contributions from users or social-mission-driven funders. The software is open source and can be used by anyone who sets up a server. Server running costs are not very high, so many servers are run, quite effectively, on small budgets, by volunteers or a small staff.

Bluesky is a for-profit corporation. However, the design of its algorithm-choice model means it would be very difficult for it to generate revenue from ads even if it wanted to – given a choice between an ad-based feed and an ad-free one, most people will choose the ad-free one. Bluesky’s long-term business model remains to be seen, but it looks unlikely to be based on ads.


Convinced of the “why”? Move on to the “how”!