
Apology Machine Zuckerberg: Facebook's Crisis Continues

via: Sina Technology (新浪科技)     time: 2019/3/16 17:02:47     read: 315

Leinwand's conclusion was that Facebook needed to reexamine its passive attitude toward false information. Before the Sri Lanka incident, Facebook had long tolerated fake news and misinformation. As Leinwand put it, "For a private company, it is really difficult to determine the truth of a message."

But as Leinwand investigated the Sri Lanka incident, she came to see that tolerance of fake news as alarming. Starting that summer, Facebook began removing user posts in high-risk countries, including Sri Lanka, but only when a post was reported by a local nonprofit organization and could "immediately trigger violence." In June, when a familiar series of rumors surfaced, the new system seemed to work. Leinwand said she was delighted: it suggested that Facebook was capable of managing its own platform.

Was it, though? It has been nearly a year since Facebook was revealed to have shared data on tens of millions of users with Cambridge Analytica. The revelation triggered an investigation by the US Department of Justice, followed by a grand jury inquiry. A privacy breach is not as grave as racist violence, but the scandal marked a significant shift in the public's perception of Facebook's influence.

Facebook's business relies on filtering content and showing users what they are likely to enjoy, which often has the side effect of amplifying fake news and extremism. Facebook made Leinwand and other executives available for interviews with Bloomberg Businessweek to defend the company and argue that it has made progress.

Unfortunately, the review system they describe is slow and resource-intensive, still relying on a thin layer of human reviewers and software. Facebook could pay reviewers more, hire more of them, or set stricter rules about what users may post, but any of these would cut into the company's revenue and profits. Rather than adopt such measures, the company has grown accustomed to writing rules only after problems arise. Those rules help, but critics argue Facebook should act far more proactively.

Today, Facebook is governed by a 27-page document called the Community Standards, first made public in 2018. Its rules are strikingly specific: for example, tutorials on making explosives are banned unless they serve a scientific or educational purpose. Likewise, images showing a "visible anus" or "fully nude close-up of buttocks" may not be posted, unless the image is superimposed onto a public figure, in which case it may be allowed as commentary.

The specificity of these standards may seem absurd. But Facebook executives say the document is an attempt to solve the site's worst problems in a scalable, systematic way. In other words, the rules must be general enough to apply anywhere in the world, yet clear enough that any low-paid worker at one of Facebook's content review centers in the Philippines, Ireland, and elsewhere can decide within seconds how to handle a flagged post. The working conditions of Facebook's 15,000 reviewers, employees and contractors alike, have been controversial. In February, the technology site The Verge reported that American reviewers earn as little as $28,800 a year while routinely reviewing images and videos containing violence, pornography, and hate speech; some develop post-traumatic stress disorder. Facebook responded that it is auditing its contracting vendors and will work closely with them to maintain higher standards and pay.

On the day of the interview, Bickert was managing Facebook's response to a mass shooting that had occurred the previous day in Annapolis, Maryland. As the attack unfolded, Bickert told content reviewers to watch for posts praising the gunman and to block fake accounts registered by opportunists in the name of the gunman or his victims. Afterward, her team removed the gunman's profile and converted the victims' profiles into what the company calls memorialized accounts: pages identical to regular Facebook pages except that the word "Remembering" appears above the name of the deceased.

A crisis like this erupts almost every week. "It's not just shootings," Bickert said. "It could be a plane crash; we have to find out who was on board and whether it was a terrorist attack. It could also be a protest, a claim that someone was victimized, and so on."

Those are the simple cases, where the line between good and bad is clear-cut and Facebook has a ready response process. On her laptop, Bickert showed a slide from the company's Community Standards team, which gathers every other Thursday morning to write new rules. About 80 employees take part, in person or virtually. The slide showed that on one Thursday last year, the team discussed how to handle #MeToo posts in which women named the men who had harassed them; if the accusations were false, such posts could amount to harassment of innocent men. At the same meeting, the company weighed the viral stunts popular with younger users, such as the condom snorting challenge and the hot pepper challenge, which young people attempt, or pretend to attempt, for attention. If these challenges can cause harm, should Facebook stop people from promoting such posts?

In December, after months of discussion, Facebook added new rules: #MeToo accusations are fine as long as they do not encourage retaliation, and challenge posts are fine as long as they do not encourage self-harm. "These problems are not black and white," Bickert said.

At congressional hearings and elsewhere, Facebook has settled on a standard response to criticism of its content decisions. Asked about content the Community Standards already ban, executives assure the public that such content is "not allowed" or "prohibited." Where no rule exists, Facebook explains that it is working on the problem, that it failed to catch it in time, and that it accepts responsibility. The company has said some version of "we failed to catch it in time" about both Russian interference in the 2016 US presidential election and the ethnic violence in Sri Lanka. But "failing to catch the problem in time" can also be read as a euphemism for deliberately ignoring a problem until the complaints become impossible to dismiss.

Kelly began paying attention to the issue while working at a consulting firm that helped Purdue Pharma remove listings for counterfeit drugs. When Kelly flagged such images, many technology companies, including Alibaba, Craigslist, and eBay, would quickly agree to take them down. Facebook and its Instagram, she said, were the exceptions.

Kelly now lives in the Bay Area, where she often runs into Facebook executives at parties and raises these issues with them. Since 2013, she has made a habit of spending a few minutes at a time searching Facebook and Instagram for drug-sales posts and reporting them. Usually, she said, she receives an automated reply dismissing the report; sometimes she gets no reply at all. At the time, technology companies argued at conferences and in research reports for stricter enforcement against drug sales on the anonymous dark web. But Kelly gradually became convinced that most illegal purchases actually happen on the open web, on social media and other online marketplaces. "Lives are disappearing, and Facebook doesn't care," she said.

Since then, Kelly has followed news reports from Kentucky, Ohio, and West Virginia, where deaths from opioid overdoses have begun to decline this year. Some articles speculated that the cause might be more community treatment centers or mental health resources. Kelly has a different theory: "One thing that really changed is that the relevant hashtags no longer exist."

Even so, Facebook's drug problem persists. In September, the Washington Post described Instagram as "a relatively large open marketplace for advertising illegal drugs." In response, Bickert published a blog post explaining that Facebook had blocked hundreds of hashtags and drug-related posts and was developing computer-vision technology to better detect posts tied to drug sales. She also offered the predictable line: "Our services never allow such content."

As Facebook's chief executive, chairman, founder, and controlling shareholder, Zuckerberg often faces public doubt about whether he should hold near-absolute power over the company's products. On content review, at least, he has tried to renounce that power. "I've increasingly come to believe that Facebook should not make so many important decisions about free expression and safety on our own," he wrote in November. He said the company would establish an "independent body" to make final decisions, in a "transparent and binding" way, about whether contested content stays on Facebook. The proposed body would include 40 "experts in various fields," Zuckerberg wrote. "Just as our board of directors is accountable to our shareholders, this body would be focused only on our community."

At the same time, Zuckerberg wants computers to take over as much of the review work as possible. Bickert said content reviewers are currently training the machines: "The reviewer says, 'Good, machine, this choice is correct, this is hate speech,' or, 'Oh no, machine, you chose wrong this time, this is not hate speech.'" Facebook says that of the hateful comments reviewers flag, 52% are now identified first by algorithms. The company also says it will use AI to try to remove drug-related posts.
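To make the feedback loop Bickert describes concrete, here is a minimal sketch in Python of how reviewer verdicts could be folded back into a text classifier. Everything in it, including the scikit-learn pipeline, the toy seed data, and the review_and_update helper, is an illustrative assumption, not Facebook's actual system.

```python
# Minimal human-in-the-loop sketch: a reviewer confirms or corrects the
# model's call, and the verdict is folded back in as a training label.
# Illustrative only; not Facebook's actual pipeline.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import SGDClassifier

# Toy seed data a reviewer has already labeled (1 = hate speech, 0 = not).
texts = ["<slur-laden threat>", "happy birthday!", "<violent threat>", "nice photo"]
labels = [1, 0, 1, 0]

vectorizer = TfidfVectorizer()
X = vectorizer.fit_transform(texts)

# log_loss gives probabilistic outputs and supports incremental updates.
model = SGDClassifier(loss="log_loss")
model.partial_fit(X, labels, classes=[0, 1])

def review_and_update(post: str, human_verdict: int) -> None:
    """Show the machine's guess, then nudge the model toward the human label."""
    x = vectorizer.transform([post])
    guess = int(model.predict(x)[0])
    outcome = "correct" if guess == human_verdict else "wrong"
    print(f"machine said {guess} ({outcome}); reviewer says {human_verdict}")
    model.partial_fit(x, [human_verdict])  # the verdict becomes training data

review_and_update("<another violent threat>", 1)
```

The design point is incremental updating: each human verdict becomes a new labeled example, which is how the share of posts the algorithm catches first, the 52% figure above, can grow over time.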

Another way to shift responsibility is to push users toward the company's new encrypted messaging service, designed to make conversations invisible to anyone outside them, including Facebook. "I believe a privacy-focused communications platform will become even more important than today's open platforms," Zuckerberg wrote in a blog post on March 6. Some saw this as a victory for the critics Facebook has faced since the Cambridge Analytica scandal, but even Zuckerberg concedes it is a trade-off: the change may make terrorists, drug dealers, and other bad actors harder to stop. Facebook's WhatsApp already uses similar encryption. In India last year, false WhatsApp rumors about child kidnappers spread panic through villages and led some villagers to lynch innocent people. WhatsApp cannot delete content; it can only limit how many times a message is forwarded. According to the BBC, "WhatsApp lynching" incidents have continued since.
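Because end-to-end encryption leaves WhatsApp unable to read or delete message bodies, its main lever is metadata such as forward counts. Below is a hypothetical sketch of a client-side forward cap; the Message type, the forward helper, and the limit of five chats (the cap WhatsApp announced for India) are assumptions for illustration, not WhatsApp's actual code.

```python
# Hypothetical sketch of a client-side forward limit: the server cannot
# inspect encrypted content, so the client caps how widely any one
# message can be forwarded. Illustrative names and structure throughout.
from dataclasses import dataclass

FORWARD_LIMIT = 5  # chats per forward action (WhatsApp's announced cap in India)

@dataclass
class Message:
    ciphertext: bytes         # opaque to the server
    times_forwarded: int = 0  # metadata the client can see and count

def forward(msg: Message, chats: list[str]) -> list[str]:
    """Forward to at most FORWARD_LIMIT chats; return those actually sent."""
    allowed = chats[:FORWARD_LIMIT]
    msg.times_forwarded += len(allowed)
    return allowed

msg = Message(ciphertext=b"...")
sent = forward(msg, ["chat%d" % i for i in range(8)])
print(len(sent))  # 5: the remaining 3 chats are refused
```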

Still, Facebook executives are trying to reassure the world. Bickert traveled to Sri Lanka last September to meet with 60 civil society groups and hear their concerns about fake accounts. There was plenty to discuss. "Their community standards are very specific, for instance explicitly stating that certain bad content must be removed immediately. But some content does its damage gradually, over a long period," said Sanjana Hattotuwa, a senior researcher at the Centre for Policy Alternatives in Sri Lanka.

More recently, Hattotuwa's team warned in a blog post that Facebook's relationship with the Sri Lankan government is too cozy, with the company courting officials who have been accused of spreading false information for political ends. The post cited an official tweet in which Ankhi Das, a Facebook public policy director, presented a painting by a local artist to Mahinda Rajapaksa, some of whose supporters are believed to have incited anti-Muslim riots. According to Facebook, the gift was "public art" with no cash value, and the company has presented paintings to other Sri Lankan leaders as well.

Given the scandals engulfing Facebook, Alex asked Grande what the job would really involve. Alex recalled asking her: "Do you need someone to make long-term change, or someone to change the channel?" He said Grande told him no candidate had ever asked that question before. She paused for a moment and continued:

"I think everyone wants to be an idealist. But if we just keep reacting to the negativity over and over, no one will care about all the good things we do." Facebook says Grande remembers the conversation differently. Either way, Alex turned down Facebook's job offer.
