
Facebook says the COVID-19 pandemic has made it harder to moderate content on its platform

via:cnBeta.COM     time:2020/8/12 8:30:50     readed:105

According to CNET, Facebook said on Tuesday that the COVID-19 pandemic has affected the work of employees who review posts on the social network for violations of its rules against promoting suicide or self-harm. The pandemic has also affected staff who monitor Facebook-owned Instagram for child nudity and sexual exploitation.


Facebook said in a blog post that from April to June, the company took less action on such objectionable content because most content reviewers were working from home. Users also could not always appeal content review decisions.


Facebook relies on a combination of human reviewers and technology to flag offensive content. But some content is tricky to moderate, including posts related to suicide and sexual exploitation, so Facebook relies more on human reviewers to make those decisions. The company has faced criticism and lawsuits from content moderators who say they developed PTSD symptoms after repeatedly reviewing violent images.

Guy Rosen, who oversees Facebook's work on safety and integrity, told reporters on a conference call that content about suicide and child nudity cannot be reviewed from home because of its graphic nature. That is very challenging for content reviewers because family members may be around them when they work from home.

"We want to ensure that reviews are conducted in a more controlled environment, and that's why we started bringing a small number of safety reviewers back to the office," he said.

Facebook is also using AI to rank content by its likelihood of being harmful and to flag which posts human reviewers need to look at first. Rosen said the company has always prioritized reviewing live videos, but that ordinary posts in which users suggest suicide are also placed at the top of the queue.

Facebook said that because of the impact of the COVID-19 pandemic, it was unable to determine the prevalence of violent and graphic content, as well as adult nudity and sexual activity, on its platform in the second quarter. Facebook regularly publishes quarterly reports on how it enforces its rules.

Facebook has also come under fire for allegedly not doing enough to combat hate speech, a problem that sparked an advertising boycott in July. On Monday, NBC News reported that an internal investigation found thousands of groups and pages on Facebook that supported the QAnon conspiracy theory.

Monika Bickert, who oversees Facebook's content policy, said Facebook had removed QAnon groups and pages that used fake accounts or whose content violated the social network's rules. "We will continue to look for other ways to make sure that we are dealing with this content appropriately," Bickert said.

Facebook said it took action on 22.5 million pieces of content that violated its hate speech rules in the second quarter, up from 9.6 million in the first quarter. Facebook attributed the jump to its use of automation technology, which helps the company proactively detect hate speech. The company said its proactive detection rate for hate speech on Facebook rose from 89% in the first quarter to 95% in the second.

Facebook said proactive detection of hate speech on Instagram increased from 45% to 84% over the same period. Instagram took action on 808,900 violations of its hate speech rules in the first quarter, jumping to 3.3 million in the second quarter.

Facebook also took action on 8.7 million pieces of terrorist content in the second quarter, up from 6.3 million in the first quarter. The company said independent auditors would review the metrics Facebook uses to enforce its community standards, and that it hopes the audit will take place in 2021.
