Facebook announced changes to the way it handles content that goes against its policies. Starting soon, people who manage a Page will see a new feature that shows when Facebook removes content for violating its Community Standards.
Facebook will also take action against Pages that have posted items rated false by a third-party fact-checking service.
Facebook further added: “We are updating our recidivism policy to better prevent those who have had Pages removed for violating our Community Standards from using duplicate Pages to continue the same activity. We’ll begin enforcing this policy in the weeks ahead.”
To address these issues, Facebook is introducing a new “Page Quality” tab on Facebook Pages, which will let Page managers know which content has been removed for violating the Community Standards and which content was rated “False,” “False Headline,” or “Mixture” by third-party fact-checkers.
This section will show the reason content was removed, such as for “hate speech, graphic violence, harassment and bullying, regulated goods, nudity or sexual activity,” or for expressing “support or praise” of people and events that are not allowed on Facebook.
Facebook notes that while the tab provides greater insight into content that was removed or demoted, it is not a comprehensive accounting of all policy violations; for example, removals for things like spam, clickbait, or intellectual-property violations will not be shown at this time.