Facebook announced on the morning of March 17 that it will increase penalties for Facebook groups and their members that violate its rules, and will make changes to reduce the visibility of harmful content in groups.

The company says it will now exclude civic and political groups in markets outside the United States from recommendations, and will further limit the reach of groups and members that continue to violate its rules.

The company has been cracking down on groups that create and share harmful, biased, or dangerous content, but many of these efforts have been slow and ineffective.

Prior to the last US election, Facebook introduced a set of new rules aimed at penalizing users who violated its community standards and spread disinformation through Facebook groups. These rules place a great deal of responsibility on groups themselves and punish the individuals who break them.

Facebook has also stopped recommending health groups, in order to direct users to official sources of health information, including information about the coronavirus.

In January 2021, Facebook took even more significant steps against potentially dangerous groups. Following the January 6, 2021 riot at the US Capitol, it announced that it would exclude civic groups, political groups, and newly created groups in the United States from its recommendations.

The company had previously temporarily restricted these groups ahead of US elections.

As the WSJ reported when the policy was made permanent, Facebook’s internal research had found that Facebook groups in the United States polarized users and fueled the calls for violence that spread after the election.

Researchers pointed out that about 70% of the top 100 most active public Facebook groups in the United States had problems with hate speech, misinformation, bullying, or harassment that should have disqualified them from being recommended. That finding appears to have prompted the January 2021 crackdown.

Then, on March 17, Facebook applied the same policy not only to users in the United States but to users globally.

This means that civic and political groups, as well as health-related groups, will no longer appear as “recommended” when users around the world browse Facebook.

But keep in mind that recommendations are just one of many ways users can find Facebook groups. Users can also find them through search, links posted by other users, invitations, friends’ private messages, and more.

Facebook has also stated that if a group violates its rules, it will show that group less often in recommendations. This is a rank-down penalty of the kind Facebook often uses to reduce the visibility of content in the News Feed.
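Mechanically, a rank-down penalty of this kind can be thought of as a demotion multiplier applied during feed ranking. The sketch below is purely illustrative; the function names, scores, and the 0.5 multiplier are assumptions for the sake of the example, not Facebook’s actual ranking system.

```python
# Illustrative only: a demotion multiplier applied to a base relevance score.
# The 0.5 factor and all data below are hypothetical, not Facebook's values.

def ranked_feed(posts, demoted_groups, demotion_factor=0.5):
    """Sort posts by score, scaling down the score of posts from demoted groups."""
    def effective_score(post):
        score = post["score"]
        if post["group"] in demoted_groups:
            score *= demotion_factor  # the rank-down penalty
        return score
    return sorted(posts, key=effective_score, reverse=True)

posts = [
    {"id": 1, "group": "gardening", "score": 0.6},
    {"id": 2, "group": "rule_breakers", "score": 0.9},  # flagged group
]
feed = ranked_feed(posts, demoted_groups={"rule_breakers"})
# The flagged group's post (0.9 -> 0.45) now ranks below the other (0.6).
print([p["id"] for p in feed])  # [1, 2]
```

The content is never deleted under this penalty; it simply loses the ranking contest more often, which is why demoted groups can still be reached directly.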

Facebook is also strengthening penalties for rule-violating groups and their individual members through a range of enforcement measures.

[Image: warning message shown before joining a group that has violated Facebook’s community standards]

For example, if you try to join a group that has violated Facebook’s community standards, a warning message (see figure above) will alert you to the group’s violations, giving you a chance to reconsider joining.

Groups that violate the rules will have their invite notifications restricted, and current members will see less of the group’s content in their News Feed, with that content ranked lower. These groups will also be demoted in Facebook’s recommendations.

If a group takes in a large number of members who have violated Facebook policy, or who belonged to other groups removed for violating community standards, the group itself will temporarily be required to approve all posts by its members. And if admins and moderators repeatedly approve content that breaks the rules, Facebook will remove the entire group.
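The trigger described above behaves like a threshold rule over a group’s membership. As a rough sketch of what such logic might look like (the threshold value and member fields here are invented for illustration, not Facebook’s actual criteria):

```python
# Hypothetical sketch of the membership-based trigger described above.
# The 20% threshold and member fields are invented, not Facebook's criteria.

def requires_post_approval(members, violator_share_threshold=0.2):
    """Return True if enough members either have policy strikes or
    belonged to groups previously removed for policy violations."""
    if not members:
        return False
    flagged = sum(
        1 for m in members
        if m["policy_strikes"] > 0 or m["was_in_removed_group"]
    )
    return flagged / len(members) >= violator_share_threshold

members = [
    {"policy_strikes": 0, "was_in_removed_group": False},
    {"policy_strikes": 2, "was_in_removed_group": False},
    {"policy_strikes": 0, "was_in_removed_group": True},
    {"policy_strikes": 0, "was_in_removed_group": False},
]
print(requires_post_approval(members))  # True: 2 of 4 members are flagged
```

Note that a rule like this penalizes the group as a whole for the history of its members, which is exactly the shift in responsibility the policy is aiming for.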

The purpose of this rule is to prevent banned groups from reconstituting and repeating their abuses.

The last change announced today applies to group members.

Members who repeatedly violate the rules within a Facebook group will be temporarily blocked from posting or commenting in the group, and will not be able to invite others to groups or create new ones. According to Facebook, this measure aims to reduce the reach of bad actors.

The new policy gives Facebook a more transparent way to document the bad behavior of a group in the lead-up to its eventual removal. This kind of “paper trail” also helps fend off the allegations of bias that can arise when Facebook takes enforcement action.

The social network is often criticized by right-wing commentators who believe it is biased against conservatives.

But the problem with these policies is that they amount to a series of temporary punishments for those who break Facebook’s rules. It’s not much different from what users jokingly call “Facebook jail.”

When an individual or Facebook page violates Facebook’s community standards, they are temporarily barred from interacting on the site or from using certain features. Facebook is now recreating this formula, with some modifications, for groups and their members.

There are other problems. For one, these rules depend on Facebook’s enforcement, and it’s unclear how effective that enforcement is. For another, they largely ignore search, which is an important way to find groups. Facebook claims that ranking lower-quality groups further down in search results will solve the problem, but the results of that effort so far are clearly uneven.

Facebook made a sweeping announcement banning QAnon content across its platform during the fall 2020 crackdown on misinformation, yet it is still possible to search for QAnon-adjacent groups: ones that avoid the QAnon name but embrace QAnon-style “patriot” themes and conspiracy theories.

In a similar vein, a search for terms such as “antivax” or “covid hoax” surfaces groups that present themselves as opposing only mRNA vaccines rather than as broadly anti-vaccine.

Such searches also lead users to problematic groups, including one called “parents who oppose vaccines” and another for “people who dislike vaccines” that advocates spreading the “truth” about vaccines (we confirmed this on March 16, before Facebook’s announcement).

Obviously, these are not official medical sources and are not recommended under Facebook’s policy, but they can easily be found through Facebook search. The company is, however, taking stronger steps against misinformation about the new coronavirus and vaccines.

According to the company, it will remove pages, groups, and accounts that have repeatedly shared false claims, and will otherwise rank them lower.

As a reminder, Facebook has plenty of powerful technical means to block access to content.

For example, the company banned “stop the steal” conspiracy content after the US election. Even now, a search for groups with the keyword “stop the steal” returns a blank page stating that no results were found.


Facebook completely blocks “stop the steal”.

So why do banned topics like “QAnon” appear in search results?

Why do searches for “covid hoax” return results? (See below.)

[Image: Facebook search results for “covid hoax”]

If Facebook wanted to expand its list of problematic search terms and return a blank page for other types of harmful content, it could. Likewise, if it wanted to maintain a block list of URLs known to spread disinformation, it could do that too.

That would prevent users from resharing posts containing those links. Facebook could also make such posts private by default, or flag users who repeatedly or seriously violate the rules as no longer able to post publicly.
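A URL block list of that kind is straightforward to implement. As a minimal sketch, assuming a simple domain block list (the domains, regex, and function names below are hypothetical examples, not an actual Facebook block list or API):

```python
# Minimal sketch of a reshare filter based on a block list of domains
# known to spread disinformation. The domains below are placeholders.
import re
from urllib.parse import urlparse

BLOCKED_DOMAINS = {"disinfo.example", "hoax.example"}

def extract_urls(text):
    """Find http(s) URLs in post text with a simple regex."""
    return re.findall(r"https?://\S+", text)

def can_reshare(post_text):
    """Block resharing if the post links to any blocklisted domain
    (or a subdomain of one)."""
    for url in extract_urls(post_text):
        domain = urlparse(url).hostname or ""
        if domain in BLOCKED_DOMAINS or domain.endswith(
            tuple("." + d for d in BLOCKED_DOMAINS)
        ):
            return False
    return True

print(can_reshare("Check this out: https://disinfo.example/article"))  # False
print(can_reshare("Cat photos: https://pics.example/cat"))             # True
```

Exact-match filtering like this is trivially cheap at scale, which is the article’s point: the hard part is deciding what goes on the list, not enforcing it.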

In other words, if Facebook really wanted to make a big impact on the disinformation, abuse, bias, and other harmful content spread on its platform, there is much more it could do.

Nonetheless, as announced today, it is sticking to temporary punishments, reserved solely for “repeated” violations. The penalties may be stronger than before, but they are not enough.
