Facebook said today that in March 2020, after its independent fact-checkers flagged false information, it displayed warning labels on roughly 40 million posts related to COVID-19. Facebook works with more than 60 fact-checking organizations covering more than 50 languages worldwide; in March it added eight new partners, extending the program to a dozen more countries and regions.
These fact-checkers find and flag potentially false news shared on Facebook, and Facebook applies warning labels to add context. According to the social network, when users see these labels, they do not go on to view the original content about 95% of the time. Once a false story has been flagged in the system, Facebook can detect it again as it continues to be reshared.
Facebook said today it has so far removed hundreds of thousands of false posts that could lead to physical harm, including claims that drinking bleach can cure the virus and theories that social distancing does not help prevent the spread of the disease. Now, Facebook says it shows a new message to people who liked, reacted to, or commented on harmful COVID-19 misinformation that it has since removed.
Mark Zuckerberg, Facebook's chief executive, said today that if a piece of content contains harmful misinformation that could cause imminent physical harm, Facebook will remove it. For other misinformation, once fact-checkers rate it as false, the company reduces its distribution and applies warning labels that give users more context.