What’s the deal with Meta’s “Threads, an Instagram app”?

Mark Zuckerberg’s Meta has seen a gap in the market, and it is exploiting it as you would expect a shrewd businessman like Zuckerberg to do.
Twitter under Elon Musk has lurched from drama to disaster. CCDH documented the influx of hate and misinformation on the platform as Musk threw open the doors to previously banned users. Our research made the front page of the New York Times and was cited around the world. This spooked Twitter’s advertisers, who found their ads next to neo-Nazis, white supremacists, misogynists, and spreaders of dangerous conspiracy theories.
Mark Zuckerberg’s new text-based conversation app, “Threads”, seeks to exploit this chaos at Twitter and has garnered nearly 100 million users (as we write) since its launch.
But Mark Zuckerberg’s track record on tackling hate and misinformation on his platforms is just as bad as Elon Musk’s recent failures.
Meta’s failures
CCDH’s own research has shown repeated failures by Meta to tackle antisemitism and Islamophobia, to deal with DM abuse on Instagram, and to protect children from abuse in the metaverse. Mark Zuckerberg’s apps have left a trail of hate and misinformation in their wake.
- In July 2021 we showed that Meta’s platforms fail to act on anti-Jewish hate: Facebook failed to act on 89.1% of reports, and Instagram on 81.2%.
- In April 2022 we showed how Facebook and Instagram failed to act on 89% of content promoting the “Great Replacement” conspiracy theory, which inspired the terrorists behind the Christchurch mosque attacks in 2019 and the Tree of Life synagogue shooting in 2018.
- Our report ‘Hidden Hate’ exposed how Instagram failed to act on 90% of abuse sent to women via direct messages (DMs). Instagram failed to act on 9 in 10 violent threats over DM reported using its tools and failed to act on any image-based sexual abuse within 48 hours.
- When we went into Meta’s new flagship VR product ‘Horizon Worlds’, we found that bullying, sexual harassment of minors, and harmful content are rife. Reporting incidents to Meta did not seem to trigger any meaningful response.
Meta’s repeated failure to tackle hate and misinformation can, depressingly, often be linked back to protecting the bottom line. Controversial content drives engagement, engagement drives time on the platform, and that is attention Mark Zuckerberg can sell to advertisers. Facebook whistleblower Frances Haugen told Congress that Facebook repeatedly chose to maximize online engagement instead of minimizing harm to users, even when those interactions exacerbated issues like addiction, bullying, and eating disorders.
And while Threads currently carries no ads, the platform has been designed from the ground up to add to Meta’s data-harvesting capabilities.

So it may not be long before we see the well-trodden path of turning a blind eye to hateful or misleading content on Threads, as long as it turns a quick buck.
Threads is a new app, so you’d hope that ‘safety by design’ has been the cornerstone of its creation and that users will not have to run a gauntlet of hate and misinformation. CCDH will be keeping a watchful eye on Meta’s new app, and will hold it accountable for any failures to deal with online hate and misinformation.