I first joined Facebook in 2009 at age 10, when it was a very different platform from the one it is today. The news feed was reverse chronological, showing posts from my friends, family and the pages I had found and followed. By October 2009, Facebook had begun developing its algorithm: the set of calculations that determines what content appears on users’ news feeds and in what order. The role of the algorithm is to push content that optimises engagement, keeping users on the platform for as long as possible to increase advertising revenue.
Facebook and Instagram are centralised social media platforms owned by a single entity, Meta, which has control over each platform’s content and its users’ data. Meta tracks the posts you like, comment on and spend the most time looking at. It then pushes similar content to your feed indefinitely. Posts that draw on primal human emotions like anger and fear consistently engage users for prolonged periods of time.[1] The moral outrage that such posts can induce has led to an increase in extremist content appearing on mainstream feeds. For example, if you engage with content about homemaking, the algorithm will try to push you toward highly engaging topics like the anti-vaccine movement. As soon as a user shows interest in one conspiracy theory, the algorithm will push more radicalising disinformation towards them. A lack of media literacy leads many vulnerable users down hateful paths that often glorify white supremacy and misogyny. An inquiry into the 2019 Christchurch mosque attacks, which Australian terrorist Brenton Tarrant livestreamed on Facebook, found that both Facebook’s and YouTube’s algorithms were crucial to Tarrant’s radicalisation.[2]
The tendency of Meta’s algorithm to push hateful content has resulted in the company’s complicity in genocidal acts around the world, including the genocide of the Rohingya Muslims in Myanmar.[3] A consistent flow of disinformation, anti-Muslim hate and users’ dependence on Facebook for news led to vigilante lynchings of the country’s minority.[4] Similar hate crimes and lynchings have taken place in Sri Lanka, Germany, Indonesia, India and Mexico, all with documented and consistent ties to extremist and false claims made on Facebook.[5] One study, by Karsten Müller and Carlo Schwarz, analysed every anti-refugee attack in Germany over a two-year span and found that towns with higher Facebook usage reliably experienced more attacks on refugees.[6] No such correlation was found with general internet usage. Despite the role that algorithm-driven disinformation and hate speech has played in off-screen violence, Meta’s CEO Mark Zuckerberg announced in January this year that the company will end its fact-checking program and loosen its hate speech rules.
Considering all of this, continuing to engage with Meta does not align with my values. However, a fifteen-year reliance on these platforms for social validation and connection makes leaving them a daunting and, honestly, life-altering task. As an emerging artist, Instagram is a primary way I engage with contemporary art. It also serves as a free, easy-to-update portfolio of my artistic practice. Still, I am left wondering: at what cost do I engage with these platforms? While I do not pay with my wallet, I pay with my time, attention and personal data, and, when my engagement trains a morally defective algorithm, I pay with my servitude. Without legislation to hold these corporations accountable, it is crucial that consumers are realistic about the role engagement with these platforms will play in the liberation of marginalised voices. A powerful realisation that may come from boycotting is that companies often manipulate us into making choices against our own interests. Many people will spend hours on Instagram despite being aware of the negative psychological and cognitive impacts.[7] This is not the consumers’ fault, as social media has been constructed to keep users indefinitely addicted; attempting to moderate our time spent on algorithmically powered platforms is a losing battle.[8]
Social media is not inherently harmful, but algorithmic feeds are. The leading alternative platform for photo sharing is Pixelfed. Pixelfed is decentralised: rather than being run by a single company, it spreads its data across many independently operated servers. This protects the privacy of its users while ensuring that there is no central point of content control. Pixelfed is maintained by a community of volunteer programmers who follow a code of conduct that emphasises empathy, inclusion and a focus on community needs. Its starkest difference from Instagram is the absence of any algorithm; your feed on Pixelfed is purely chronological. This will undoubtedly make your time on the platform, to put it bluntly, less interesting in comparison to Instagram (or Facebook, TikTok, YouTube and X). It is this deficit that cements my optimism about the possibility of any ethical social media at all. The platform functions in a similar way to old-school Instagram. You can post photos, videos and 24-hour stories. You can also download your entire Instagram profile and upload it to your Pixelfed profile with the dates intact. Funding for this platform is crowd-sourced (ad-free), eliminating biases that are instilled in profit-driven corporations. Pixelfed meets artists’ needs for an accessible, social media-based portfolio, but its audience is still small. To make migration to alternative platforms worthwhile, enough users in a social circle need to migrate. If you aren’t ready to leave Instagram, having an account on Pixelfed as well at least gives people in your community the option to delete their accounts with Meta.
If you are concerned about the global state of democracy, a constant flow of enraging information, regardless of its accuracy, will not help. Engaging with algorithms that want you on their platforms indefinitely is counterproductive to any liberatory cause, and the negative cognitive effects of prolonged social media usage will not radicalise you into action. For fifteen years I have been training, and trained by, dangerous algorithms made by multi-billion-dollar companies. Companies that, despite knowing the effect these algorithms have, will not change them. The less we engage with these platforms, the less we teach them how to control us. The less we engage with these platforms, the more we engage with each other. The revolution will not be posted on Instagram, because centralised social media companies do not want a revolution.