How Not to Be a Paranoia Peer Group
Nothing makes a group stick together more than a threat. The menace can be real, or it can be fake – we’ve seen plenty of fake ones proliferate since the dawn of social media. More weirdness ensued when the issue of fake threats (now labeled “conspiracy theories” and “misinformation”) was itself made into a menace.
At some point, a double-fake threat even got thrown into the mix, adding another layer of absurdity to the online weirdness. Various media outlets have included a parody conspiracy theory in their articles on misinformation, and YouTubers have devoted their time to mocking or debunking it. A two-minute Google search reveals that “Finland doesn’t exist” was created as a joke, but that didn’t prevent the anti-conspiracy finger-wagging.
Whether real, fake, or double-fake, a threat seems to be the fuel that keeps an online group going. Internet pioneer Jaron Lanier calls groups of people who cluster on social media around fighting a menace “paranoia peer groups.”
You might ask: how is it paranoia if the threat is real? Judging by his book Ten Arguments for Deleting Your Social Media Accounts Right Now, I’d say he meant that even a real menace can become grotesquely amplified by social media algorithms, making a person preoccupied with it in an unhealthy way.
Lanier calls the paranoia-inducing computer programs BUMMER, an ingenious acronym for “Behaviors of Users Modified, and Made into an Empire for Rent.” The BUMMER machines are Facebook, Google, Twitter, Instagram, YouTube, and TikTok – all companies that modify the behavior of their users and sell this capability to the highest bidders.
When we feel in danger, our senses are on high alert. If we learned about the threat from social media, we will logically spend time there monitoring whether the threat is imminent – frequently checking “the socials” and trying to prevent the dangerous thing from happening by posting, retweeting, and commenting. Not coincidentally, it is exactly this kind of “engagement” that brings social media companies money.
The importance of you and me writing “clown world” under outrageously idiotic posts becomes apparent from The Facebook Files. According to Facebook’s internal documents leaked in 2021, the firm faced a scare in 2017 – “engagement” on the platform was going down. People weren’t liking, sharing, commenting or posting updates about their lives as much as before.
“It's just horrific content. It's severed heads. It's horrible.”
As the Wall Street Journal explained, although the time Facebook users spent on the platform was stable, their passive scrolling worried the management. According to reporter Keach Hagey, they were afraid the social network was “turning people into zombies who were just passively sitting there spacing out, watching Facebook but not doing anything,” which could mean “they would sort of snap out of it and then stop using Facebook.”
At that point, the company overhauled its “video-heavy” algorithm and made a major change to what users saw in their newsfeeds. Not only would it show people less content by “professional posters” and more by their “friends,” it would also highlight the most engaging stuff – measured, for example, by the number of comments under a post.
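To make the mechanics concrete, here is a minimal sketch of what engagement-based feed ranking looks like in principle. Everything in it – the weights, the field names, the friend boost – is an invented placeholder of mine; Facebook’s actual scoring system has never been published in this form.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    author_is_friend: bool  # a “friend” vs. a “professional poster”
    likes: int
    comments: int
    shares: int

def engagement_score(post: Post) -> float:
    # Hypothetical weights: active reactions (comments, shares) count far
    # more than passive likes, echoing press descriptions of the
    # “meaningful social interactions” ranking change.
    score = 1.0 * post.likes + 15.0 * post.comments + 30.0 * post.shares
    if post.author_is_friend:
        score *= 2.0  # boost “friends” over “professional posters”
    return score

def rank_feed(posts: list[Post]) -> list[Post]:
    # The newsfeed is just the candidate posts sorted by score, highest first.
    return sorted(posts, key=engagement_score, reverse=True)
```

The point of the sketch is structural: whatever provokes the most comments and shares rises to the top, regardless of whether people are reacting with joy or with fury.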
Facebook whistleblower Frances Haugen told the Wall Street Journal in 2021 that, as part of the now-defunct Civic Integrity Team, she would go to meetings called Virality Review. There, people would pull up the top 10 posts that had gone viral in the last week in areas marked as at risk of genocide, explaining why each of them was a success.
“It's just horrific content. It's severed heads. It's horrible,” Haugen told the WSJ. As is apparent from her explanation, the gruesome posts were a direct result of Facebook’s algorithm favoring the most “engaging” content – in other words, content that provokes intense anger, fear, and outrage.
"..it is apparent that by tweaking their algorithm, Facebook has the power to dial up or down the intensity of movements."
The reason she made her inside information public was a feeling of hopelessness. She saw that Facebook management was aware that their BUMMERs were destabilizing the world, and she didn’t see any meaningful effort to change them.
As Haugen revealed to the WSJ, most of the proposed changes were killed when the company realized they would curb engagement: “There are many internal documents that talk about (…) the trade-offs that people are willing to accept. If you had to decide between 1% fewer sessions and 10 or 20% more misinformation, Facebook is consistently saying 1% of sessions is worth 10% misinformation.” In social media speak, a “session” happens anytime you interact with a platform – you open your Twitter app, log in to Facebook, read your notifications on TikTok, or comment on a YouTube video.
From Haugen’s account, it is apparent that by tweaking its algorithm, Facebook (and probably other social media platforms as well) has the power to dial up or down the intensity of movements. For example, she revealed that before the 2020 US presidential election, Facebook put measures in place to avoid riots so that the election would run smoothly.
“These are things around like, how reactive is the platform? Is it viral? Those things about ranking, right? Some of those signals that make it easier for angry things to go out, they got tamped down a little bit for the election because they didn't want to have riots at the election.”
However, because these algorithm tweaks “made Facebook a little slower,” which lowered engagement and profit, in Haugen’s words, they “turned off all those safety mechanisms after or they went back to their old settings after the election.” Then, when the Capitol riots happened, Facebook staff turned the mechanisms back on.
As Haugen explained, she saw proof that the algorithm tweaks had a direct impact on the size of movements: “there was documentation that a lot of the Stop the Steal groups and all those things, they grew so fast because of choices Facebook made to prioritize growth over safety.”
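Haugen’s description of the election-time tamp-down suggests these levers are, in the end, just configuration. What follows is a speculative sketch of what such a safety toggle could look like; the parameter names and numbers are entirely my invention, since the real system is not public.

```python
# Invented parameters, for illustration only: Haugen described these levers
# in general terms (“how reactive is the platform? Is it viral?”);
# the real names and values are not public.
DEFAULT_WEIGHTS = {
    "reshare_boost": 30.0,      # how strongly a share amplifies a post
    "comment_boost": 15.0,
    "reshare_chain_cap": None,  # no limit on reshare-of-reshare depth
}

def election_safety_mode(weights: dict) -> dict:
    # Tamp down the signals that “make it easier for angry things to go
    # out”: weaken reshare amplification and cap long viral chains.
    safe = dict(weights)
    safe["reshare_boost"] = 10.0
    safe["reshare_chain_cap"] = 2  # stop reshare chains after two hops
    return safe

def revert_to_defaults() -> dict:
    # Switch back to the engagement-maximizing settings – per Haugen,
    # what Facebook did once the election had passed.
    return dict(DEFAULT_WEIGHTS)
```

If movement growth can hinge on a handful of such settings, as Haugen’s documents suggest, then the size of a movement is partly a product decision.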
TERF World
What does it mean for the radical feminist movement to depend on social networks that thrive on conflict, and whose staff secretly tamper with the growth of movements? The implications are nearly endless. We have all witnessed algorithm-amplified infighting and the divisions caused by such public skirmishes. We know what it’s like for our movement to be tampered with – by silencing us on social media: banning, shadow-banning, or making our posts invisible through opaque processes.
But we haven’t yet talked about how these algorithms are changing our personalities. Above, I mentioned the importance of paranoia for social media engagement. Speaking of paranoia in the context of feminism is tricky – women have been accused of irrational fear for centuries simply for raising concerns about our rights, or for being in any way “difficult” to people in power.
To this day, we face mass disbelief and are blamed for causing “moral panic” because we fight for the sex-based rights of women and for child safeguarding.
In such a climate, it can become nearly impossible to discern when a fear is truly irrational. However, for the sake of our own mental health, and the sustainability of our movement, we should try. Alas, no radical feminist is immune to the impact of social media, and our movement is just as susceptible to becoming a “paranoia peer group” as any other.
"In such a climate, it can become nearly impossible to discern when a fear is truly irrational."
One example of a fake radical feminist scare spurred by social media is something I’ll call the “Primark Porn Hoax” (PPH). In October, radical feminist Twitter gave rise to a claim that there exists a category on PornHub and other adult video sites that contains secret footage of women and girls undressing in Primark changing rooms.
According to my colleague Andreia Nobre, who first alerted me to the case, PPH started when a Twitter user tagged the store and wrote:
“Hello @Primark, are you aware that your stance on changing room security in regards to safeguarding women and girls has now pretty much given you your own porn category on every single porn video site? Do you think you should be doing something to protect your customers?”
In the following tweet, he explained that he blamed Primark’s unisex changing room policies for giving rise to the issue and claimed that the porn category included “videos of women unknowingly being filmed, and some of males masturbating in female changing rooms.”
As is often the case with social media hoaxes, this claim, retweeted 800 times, can be easily debunked. Although a category called “Primark” really does exist on big sites such as PornHub or YouPorn, the few videos under it don’t contain anyone being unknowingly filmed, nor men masturbating. The only way a person could claim otherwise is if they judged by the titles alone without watching the videos. In reality, after you force yourself to click play, you find out the “male masturbating” is a female and the “women unknowingly filmed” are porn actresses acting out staged scenes.
I don’t accuse the author of these tweets of willfully misleading the public – I have zero idea about his motivation and no clue whether he is a real person or a bot. But, supposing the user is real and he based his tweet on video titles only, it is a perfect example of a paranoia peer group-induced hoax.
Most causes promoted on social media have probably spawned hoaxes. The anti-conspiracy movement has done it; the trans activist movement is itself based on a false claim - so why does it matter that the radical feminist one has its own misinformation? Well, it matters to me, as I care about women’s rights and the rights of children.
"BUMMER has trained us to be on high alert, and in such a state, we consider critical thinking too slow.."
To that, you may answer that “a few hoaxes are irrelevant in the grand scheme of things. We must stop the mutilation of children and the erosion of women’s rights.” Further, you might argue that radical feminist “engagement” on social media is the only thing that can stop scary stuff from happening. We have seen that our Twitter backlash can contribute to changes for the better, especially in the UK. We may, in your opinion, need more such “engagement,” not less.
A year ago, I would have wholeheartedly agreed with this point of view, but now I’m not so sure. Hoaxes, damaging as they are, aside: working in radical feminist publishing, I saw first-hand what BUMMER can cause. I’ve seen our quality, uplifting, inspiring, and smart articles get 100 views, while our most gruesome, outrageous, depressing stuff amassed a thousand times more. I saw some of my colleagues use, or ponder using, unethical means to raise views. I saw some of them excuse bullying by people who can bring thousands of retweets. I’ve seen a worship of numbers – of likes, views, comments (ratios), and of course, retweets.
I am not immune to being in awe of a 10,000-strong retweet count. It does bring an issue into the spotlight. But there’s a downside. Let’s say that a post retweeted 100,000 times about children’s “top surgeries,” or unnecessary double mastectomies, would help stop that practice in the UK. On the surface, this seems great. However, if such a major change could be the result of a social media engagement number, the reasons for concern outweigh the reasons for celebration.
"..when political decisions are determined by social media engagement numbers, we’ll see more bad actors trying to manipulate the numbers."
What happens if, after such a change, a post with a counter-message goes viral? Then, what if the BUMMER machines highlight and help normalize even more gruesome practices than unnecessary mastectomies?
Besides, when political decisions are determined by social media engagement numbers, we’ll see more bad actors trying to manipulate the numbers. Whether it’s social media staff tinkering with algorithms, or people employing bots, buying paid advertising, or amassing views by giving BUMMER what it wants – endless conflict, paranoia, and partisanship. Women are not special, and bad actors can also be radical feminists.
So, how not to be a Paranoia Peer Group? We could try to fact-check social media claims before we share them with others. However, I don’t think this is realistic. BUMMER has trained us to be on high alert, and in such a state, we consider critical thinking too slow. If a huge fish is lunging towards you at sea, you don’t study whether it’s a shark – you get out of the water.
The only remedy I see right now is ditching the BUMMER machines altogether – especially Facebook, Twitter, TikTok, and Instagram. By continuing our social media activism, we may avert one danger, but we’re also going to be used by algorithms to create a thousand more.
While your social media absence might seem like a drop in the ocean, believe me, the social media companies care, especially if you live in the US or Canada. As we’ve seen, Facebook management is more interested in making money than in preventing social unrest. By choosing to abstain, you may motivate the BUMMER machines to reform.
And, some time after quitting, you may realize that you’re less anxious, depressed, or agitated; that your focus is better; and that you’re able to read books again – and to determine which issues really need urgent attention.