Emerging decentralized social network and X rival Bluesky has just landed a notable former Twitter leader as its new Head of Trust and Safety.
On Wednesday, the company announced it has appointed Aaron Rodericks, who most recently co-led the Trust and Safety team at Twitter, to this new position.
It’s an indication that the network will approach trust and safety similarly to, if not better than, the way Twitter once did before Musk’s takeover.
Bluesky says Rodericks will lead the moderation team that provides 24/7 coverage to uphold the Bluesky Community Guidelines and promises reports are reviewed in under 24 hours.
“Aaron’s expertise in trust & safety at global scale brings invaluable experience to our moderation team.”
The Supreme Court could decide the future of content moderation — or it could punt
The Supreme Court is considering the fate of two state laws that limit how social media companies can moderate the content on their platforms.
The two laws were both crafted by Republican lawmakers to punish social media companies for their perceived anti-conservative bias.
“Supreme Court cases can fizzle in this way, much to the frustration, in most cases, of the other parties,” Barrett said.
“It’s clear that the Supreme Court needs to update its First Amendment jurisprudence to take into account this vast technological change,” Barrett said.
“… The Supreme Court often lags behind society in dealing with these kinds of things, and now it’s time to deal with it.”
Instead, Automattic CEO Matt Mullenweg is arguing with Tumblr users over an individual content moderation decision, which has sparked communitywide outcry and accusations of transphobia.
Elon Musk’s tenure at Twitter (now X) aside, it’s uncommon to see the CEOs of social platforms comment directly on individual content moderation decisions.
But no one on the trust and safety team was reassigned, so these moderation decisions likely weren’t impacted by the company shake-up.
However, Tumblr has a bad track record for content moderation decisions, especially those involving trans people.
“We did have an external contract moderator last year that was making transphobic moderation (and also selling moderation, criminally),” Mullenweg wrote on his blog.
Social network Bluesky, a competitor to X, Threads, Mastodon and others, is opening its doors: the company announced today that it is opening up federation, following its public launch earlier this month.
The move will allow anyone to run their own server that connects to Bluesky’s network, so they can host their own data and accounts, and make their own rules.
That sent some former Twitter users in search of alternatives that were more sustainable, like Mastodon and Bluesky.
While this model is similar to Mastodon, Bluesky uses a newer social networking protocol, the AT Protocol, while Mastodon and many other networks today use ActivityPub.
“After this initial phase, we’ll open up federation to people looking to run larger servers with many users,” it says.
Meta’s external advisory group, the Oversight Board, announced today that it is expanding its scope to cover Threads, in addition to Facebook and Instagram, in scrutinizing Meta’s content moderation decisions.
This means that if users on Threads are dissatisfied with Meta’s decisions on issues like content or account takedowns, they can appeal to the Oversight Board.
“Having independent accountability early on for a new app such as Threads is vitally important,” Oversight Board co-chair Helle Thorning-Schmidt said in a statement.
In 2018, Mark Zuckerberg first formally floated the idea of an independent oversight board.
The Oversight Board has ruled on some important cases over the years.
Substack has industry-leading newsletter tools and a platform that independent writers flock to, but its recent content moderation missteps could prove costly.
Early last year, Substack CEO Chris Best failed to articulate responses to straightforward questions about content moderation from The Verge’s Editor-in-Chief, Nilay Patel.
The interview came as Substack launched its own Twitter (now X)-like microblogging social platform, known as Notes.
Substack authors are at a crossroads
In the ongoing Substack fallout, another wave of disillusioned authors is contemplating jumping ship from the platform, substantial readerships in tow.
It’s unfortunate that Substack’s writers and readers now have to grapple with yet another form of avoidable precarity in the publishing world.
“Intrinsic is a fully customizable AI content moderation platform,” Mellata said.
Intrinsic, he explained, lets customers “ask” it about mistakes it makes in content moderation decisions, and it offers explanations of its reasoning.
The platform also hosts manual review and labeling tools that allow customers to fine-tune moderation models on their own data.
“Most conventional trust and safety solutions aren’t flexible and weren’t built to evolve with abuse,” Mellata said.
“The broader slowdown in tech is driving more interest in automation for trust and safety, which places Intrinsic in a unique position,” Mellata said.
Elon Musk’s X marks the spot of the first confirmed investigation opened by the European Union under its rebooted digital rulebook, the Digital Services Act (DSA).
Its earlier actions were focused on concerns about the spread of illegal content and disinformation related to the Israel-Hamas war.
So the Commission’s official scrutiny of X could have real world implications for how the platform operates sooner rather than later.
However, the Commission evidently has doubts that X has gone far enough on the transparency front to meet the DSA’s bar.
The investigation may also test Musk’s mettle for what could be an expensive head-on clash with EU regulators.
Meta’s moderation contractor has been restrained from offering moderation services to the social media giant, and it is unclear who is currently reviewing its platforms in sub-Saharan Africa. This lack of oversight…
Today, Facebook announced that it is introducing more comment moderation tools and controls to make it easier for creators to manage conversations on the social network. These new tools will…