
Former Twitter Trust & Safety co-lead Aaron Rodericks joins Bluesky team

Emerging decentralized social network and X rival Bluesky has just landed a notable former Twitter leader as its new Head of Trust and Safety. On Wednesday, the company announced it has appointed Aaron Rodericks, who most recently co-led the Trust and Safety team at Twitter, to the new position. It’s an indication that the network intends to approach trust and safety much as Twitter did before Musk’s takeover, if not better. Bluesky says Rodericks will lead the moderation team that provides 24/7 coverage to uphold the Bluesky Community Guidelines, and it promises that reports are reviewed within 24 hours. “Aaron’s expertise in trust & safety at global scale brings invaluable experience to our moderation team.”

“Content Moderation’s Fate Lies in the Hands of the Supreme Court… or Does It?”

The Supreme Court could decide the future of content moderation, or it could punt. The court is considering the fate of two state laws that limit how social media companies can moderate the content on their platforms. Both laws were crafted by Republican lawmakers to punish social media companies for their perceived anti-conservative bias. “Supreme Court cases can fizzle in this way, much to the frustration, in most cases, of the other parties,” Barrett said. “It’s clear that the Supreme Court needs to update its First Amendment jurisprudence to take into account this vast technological change,” Barrett said. “… The Supreme Court often lags behind society in dealing with these kinds of things, and now it’s time to deal with it.”

Bluesky Opens Federation, Letting Users Host Their Own Servers

Social network Bluesky, a competitor to X, Threads, Mastodon, and others, is opening up federation following its public launch earlier this month. The move will allow anyone to run their own server that connects to Bluesky’s network, hosting their own data and accounts and setting their own rules. The upheaval at Twitter after Musk’s takeover sent some former users in search of more sustainable alternatives, like Mastodon and Bluesky. This model is similar to Mastodon’s, but Bluesky is built on a newer social networking protocol, the AT Protocol, whereas Mastodon and many other networks today use ActivityPub. “After this initial phase, we’ll open up federation to people looking to run larger servers with many users,” the company says.
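To make the self-hosting idea concrete, here is a minimal sketch of how a client can discover which server hosts a given account using the AT Protocol’s public identity endpoints. It assumes a did:plc identity and uses the handle bsky.app purely as an example; the specific endpoints shown (com.atproto.identity.resolveHandle and a plc.directory lookup) belong to the protocol’s identity layer rather than to any one provider.

```python
# Minimal sketch: where does an AT Protocol account live?
# Resolve a handle to a DID, then read the DID document from the PLC
# directory to find the Personal Data Server (PDS) hosting the account.
import requests

HANDLE = "bsky.app"  # example handle; any Bluesky handle works

# 1. Resolve the human-readable handle to a stable DID.
resp = requests.get(
    "https://bsky.social/xrpc/com.atproto.identity.resolveHandle",
    params={"handle": HANDLE},
    timeout=10,
)
did = resp.json()["did"]

# 2. Fetch the DID document, which lists the server hosting the repo.
doc = requests.get(f"https://plc.directory/{did}", timeout=10).json()
pds = next(
    s["serviceEndpoint"]
    for s in doc.get("service", [])
    if s.get("id") == "#atproto_pds"
)

print(f"{HANDLE} -> {did}")
print(f"hosted on: {pds}")
```

An account on a self-hosted server resolved the same way would simply point at its owner’s own PDS rather than one of Bluesky’s, which is the practical meaning of the federation opening described above.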

Meta’s Oversight Board Expands Its Scope to Threads

Meta’s external advisory group, the Oversight Board, announced today that it is expanding its scope to Threads, alongside Facebook and Instagram, to scrutinize Meta’s content moderation decisions. This means that if users on Threads are unsatisfied with Meta’s decisions on issues like content or account takedowns, they can appeal to the Oversight Board. “Having independent accountability early on for a new app such as Threads is vitally important,” Oversight Board co-chair Helle Thorning-Schmidt said in a statement. Mark Zuckerberg first formally floated the idea of an independent oversight board in 2018. The Oversight Board has ruled on some important cases over the years.

Substack Refuses to Take Action Against Nazi Content, Risking Further Consequences

Substack has industry-leading newsletter tools and a platform that independent writers flock to, but its recent content moderation missteps could prove costly. Last year, Substack CEO Chris Best failed to articulate responses to straightforward questions about content moderation from The Verge’s editor-in-chief, Nilay Patel. The interview came as Substack launched Notes, its own Twitter (now X)-like microblogging platform. Substack authors are at a crossroads. In the ongoing fallout, another wave of disillusioned authors is contemplating jumping ship from Substack, substantial readerships in tow. It’s unfortunate that Substack’s writers and readers now have to grapple with yet another form of avoidable precarity in the publishing world.

Y Combinator-Backed Intrinsic Builds Infrastructure for Trust and Safety Teams

“Intrinsic is a fully customizable AI content moderation platform,” Mellata said. Intrinsic, he explained, lets customers “ask” it about mistakes it makes in content moderation decisions and offers explanations of its reasoning. The platform also hosts manual review and labeling tools that allow customers to fine-tune moderation models on their own data. “Most conventional trust and safety solutions aren’t flexible and weren’t built to evolve with abuse,” Mellata said. “The broader slowdown in tech is driving more interest in automation for trust and safety, which places Intrinsic in a unique position,” Mellata said.
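Intrinsic’s actual API is not detailed here, so the following is only a hypothetical sketch of what an “explainable” moderation call of the kind Mellata describes might look like; the endpoint, credential, and field names are invented for illustration and are not Intrinsic’s.

```python
# Hypothetical sketch only: the URL, key, and response fields below are
# placeholders, not Intrinsic's real API. The point is the shape of an
# explainable moderation call: a decision plus the reasoning behind it.
import requests

API_URL = "https://moderation.example.com/v1/moderate"  # placeholder endpoint
API_KEY = "YOUR_API_KEY"  # placeholder credential


def moderate(text: str, policy: str = "default") -> dict:
    """Ask the (hypothetical) service to classify a piece of content."""
    resp = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"content": text, "policy": policy, "explain": True},
        timeout=10,
    )
    resp.raise_for_status()
    # e.g. {"decision": "flag", "labels": ["harassment"], "reasoning": "..."}
    return resp.json()


result = moderate("example user post")
print(result["decision"], "-", result["reasoning"])
```

In a setup like this, the reasoning string is what a human reviewer would audit when “asking” the system about a mistaken decision, and the labeled outcome of that review is the kind of data a customer could then use to fine-tune the model.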

Elon Musk’s X Under EU Investigation over Illegal Content, Moderation, Transparency, and Deceptive Design

Elon Musk’s X marks the spot of the first confirmed investigation opened by the European Union under its rebooted digital rulebook, the Digital Services Act (DSA). The Commission’s earlier actions focused on concerns about the spread of illegal content and disinformation related to the Israel-Hamas war. The Commission’s official scrutiny of X could therefore have real-world implications for how the platform operates sooner rather than later. However, the Commission evidently doubts that X has gone far enough on the transparency front to meet the DSA’s bar. The investigation may also test Musk’s mettle for what could be an expensive head-on clash with EU regulators.