Facebook is becoming a bit more like Nextdoor in an effort to boost its groups feature. The only problem is that Facebook appears to be borrowing one of Nextdoor’s more controversial concepts: giving more power to community moderators.

On Wednesday, the company announced a major expansion of the powers of its groups’ community moderators. Administrators can now do a number of new things, like automatically blocking certain people from commenting in conversations based on factors such as how long they’ve been a member of the group. Facebook says the new tools are meant to help “admins play a key role in helping maintain a safe and healthy culture.” The changes are part of Facebook’s broader shift toward relying more on unpaid community admins, who get special privileges in exchange for managing the conversation in individual groups.

There are other new powers now at admins’ disposal, like an AI-powered alert that flags “contentious and unhealthy” conversations, and new summaries that moderators can use to review any member’s activity in a particular group. When asked whether the new features were inspired by Nextdoor’s moderation system, Facebook spokesperson Leonard Lam said, “Our product team regularly talks to our admin community to better understand their needs, and the features we announced today reflect direct feedback that we’ve gotten from them.”

The approach largely resembles the way Nextdoor, the neighborhood-based social network, has handled moderation for years. The trouble is that Nextdoor’s model hasn’t really worked. Its communities are plagued by a haphazard approach to misinformation and complaints of toxic fights between members, along with accusations of biased and inconsistent community moderators.

Maybe things will work out differently for Facebook.

