Facebook intends to penalize users who frequently share misinformation. The company has added new notices warning users that repeatedly sharing fake news can result in "your posts falling down in the news feed so other people are less likely to see them."
Until now, the company's policy has been to downgrade individual posts flagged by fact checkers in the app. However, posts can go viral before they are reviewed. With the change, Facebook will warn users directly about the consequences of sharing misinformation.
Pages found to be repeat offenders will display pop-up notices when users attempt to follow them. Individuals who share incorrect information will receive notifications that, as a result, their posts will be less visible in the news feed. The notifications will also include a link to the fact check of the post in question.
The update arrives to reinforce the company's policy after another year in which Facebook has struggled to control misinformation about the coronavirus pandemic. "Whether it's fake or misleading content about Covid-19 and vaccines, climate change, elections or other topics, we're ensuring that fewer people see incorrect information in our apps," the company wrote in a blog post.
Facebook did not say, however, how many flagged posts it would take for a user's reach to be reduced. Researchers who study disinformation say the same individuals are often behind the most viral false claims.