This article somewhat disrupted my editorial plan, which usually revolves around the FEDIVERSE and the apps and services that live there. But this topic couldn't be ignored: not so much because of the ethical and moral issues (which people far more qualified than I am have already covered), but because it helps us really understand how social media algorithms work.
I think by now most of us have heard about the Facebook group "My Wife." I found out about it online, and among all the comments on the topic, one from a LinkedIn user really stood out to me, so I asked the author's permission to include the post in my article.
Here it is:
Author: Andrea Antelao – link to the post: https://www.linkedin.com/posts/andrea-antelao_facebook-32mila-uomini-che-condividono-foto-activity-7363555131201642497-xUfu?utm_source=share&utm_medium=member_desktop&rcm=ACoAAAKL4ScBHHNjQ3CVuCT6jMlJVsooEQKqQl0
Let's set this specific case aside for a moment (I won't add my own comments, partly because I'm not sure I could hold myself back). If you use Facebook or other social media, you've probably seen posts that spark endless debates: comments on comments, sometimes heated arguments that go well beyond good manners and civil tone, even escalating into threats. Yet no one seems to care or intervene, and that isn't always an accident or an oversight. The algorithms that keep these platforms alive (especially those of Meta, Facebook's parent company, but not only theirs) have one clear goal: to maximize engagement.
Why? What’s it for?
The answer is simple: the more discussion a post generates, the longer it stays visible; the longer it stays visible, the more time people spend on it; and the more time people spend on it, the more data Facebook collects about their habits, their political, religious, and social beliefs, and the way they communicate. This huge amount of data feeds targeted advertising, the platform's real financial lifeblood, and can be sold or shared with commercial partners.
For advertisers, posts with lots of likes, shares, and comments (and especially those that ignite flame wars or heated debates) are a godsend, because they attract attention and free visibility. Slowing down or censoring these dynamics would go against the platform's economic interest.
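To make the incentive concrete, here is a deliberately toy sketch in Python of what an engagement-weighted ranking could look like. Everything in it is invented for illustration: the fields, the weights, and the scoring function have nothing to do with Meta's actual (and secret) systems. The only point is to show what happens when the objective rewards raw interaction.

```python
# A deliberately toy model of engagement-weighted feed ranking.
# Hypothetical: field names and weights are made up for illustration;
# this does NOT reflect any real platform's code.

from dataclasses import dataclass

@dataclass
class Post:
    likes: int
    shares: int
    comments: int
    reply_depth: int   # how long the comment threads run
    reports: int       # user reports for abuse

def engagement_score(p: Post) -> float:
    """Score a post purely by how much interaction it generates.

    Note what's missing: nothing here distinguishes a friendly
    discussion from a flame war, and reports barely dent the score.
    """
    return (
        1.0 * p.likes
        + 3.0 * p.shares
        + 5.0 * p.comments      # comments keep people on the post
        + 8.0 * p.reply_depth   # arguments keep them there longest
        - 0.5 * p.reports       # a token penalty, easily outweighed
    )

# A heated, heavily reported post can still outrank a harmless one:
flame_war = Post(likes=200, shares=50, comments=900, reply_depth=40, reports=120)
nice_art  = Post(likes=2000, shares=100, comments=50, reply_depth=3, reports=0)

print(engagement_score(flame_war))  # 5110.0
print(engagement_score(nice_art))   # 2574.0
```

With an objective like this, a flame war reported by 120 users still outranks a well-liked but quiet post. As long as visibility is a function of interaction alone, "pausing" the noisy content means giving up score, time on platform, and ad inventory.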
Something similar happened with the infamous "My Wife" group on Facebook. Created in 2019 and reactivated in May 2025, for months it hosted photos of women posted without their consent, causing outrage and serious psychological harm to the victims. Despite the reports and the plainly illegal nature of the content, Meta only shut the group down after the formal intervention of the Cyber Police in August 2025.
So how could such a group stay active for months?
The excuse "we didn't notice" just doesn't hold up. Try posting Botticelli's Venus, a famous artwork depicting a semi-nude woman, and Facebook will probably take it down fast.
So how could they fail to "notice" the photos circulating in that group for months (from May to August)? I believe they preferred to look away, because the group generated a huge volume of comments and engagement and drew thousands of members. The slow response highlights a known problem: social platforms, while quick to censor harmless content (as we've seen many times in recent years), are sometimes far more tolerant of posts and groups that make noise.
There are many cases where "innocent" content gets penalized quickly. Take the story of Jago (Jacopo Cardillo), a sculptor from Frosinone whose works are displayed in the courtyard of Italy's Chamber of Deputies and other famous public places. His sculptures depict bodies, sometimes nude or semi-nude. Jago says his Facebook bans are practically systematic and that his appeals go nowhere; he stays "off" for weeks or months without any review. Jago doesn't make noise: even with his many followers, he can't compete with a shameful group like "My Wife."
Do I hate algorithms? Yes. With all my heart. I don't mind people promoting their products or services, I don't mind ads, and I can even accept some targeting. For instance, if I run a book blog, I get why I'd see ads for writing courses. But this wild profiling, which would rather ignore violence, abuse, and crime just to get to know me better? No, that I can't stand.
They say Meta and the other big tech companies seek the "right balance" between censoring innocent content and moderating harmful content. Well, I don't see much effort. The group we're talking about was reported by dozens of people, and every one of them was told, "It doesn't break any rules." Only after a well-known person reported it to the Cyber Police, and the police ordered it shut, did Meta act.
Now Meta risks some fines, but how much did it earn in the months it kept the group alive despite the reports?
To be clear, this isn't a trial of Meta alone; they all do this. Remember that awful car accident caused by some barely-of-age kids filming YouTube challenges? After a child died, those videos stayed up for another 7-8 days, racking up millions of views and ad revenue, until the police ordered Google to take them down. Who knows how long they would have stayed up without that. Sure, Google then declared that it doesn't tolerate such videos, just as Meta now flaunts its quick responses.
People say you can't fight this, that even if I leave social media these toxic dynamics will keep thriving. They say, "It's the algorithm, baby…"
Well, I'm not giving up. For those who stay (probably holding their noses), at least never stop reporting!
PHOTO CREDITS: Pixabay.com + LinkedIn (with the permission of the post's author)