
Mark Zuckerberg says social networks should not be fact-checking political speech

  • Facebook CEO Mark Zuckerberg told CNBC he does not think social networks should be fact-checking what politicians post.
  • “Political speech is one of the most sensitive parts in a democracy, and people should be able to see what politicians say,” Zuckerberg said in an interview. 
  • The company, however, does have lines that no one, including politicians, is allowed to cross, Zuckerberg said.


Facebook CEO Mark Zuckerberg told CNBC he does not think social networks should be fact-checking what politicians post.

Zuckerberg’s comment came after “Squawk Box” co-host Andrew Ross Sorkin asked him for his thoughts on Twitter’s decision to start fact-checking the tweets of President Donald Trump.

Twitter’s move came on Tuesday after Trump tweeted that mail-in ballots would be “substantially fraudulent.” Earlier Tuesday, Twitter declined to censor or warn users after Trump tweeted baseless claims that MSNBC host Joe Scarborough should be investigated for the death of his former staffer. 

“I don’t think that Facebook or internet platforms in general should be arbiters of truth,” Zuckerberg told Sorkin in an interview that aired Thursday morning. “Political speech is one of the most sensitive parts in a democracy, and people should be able to see what politicians say.”

Although Facebook does use independent fact-checkers who review content on its social networks, the point of the fact-checkers is to “really catch the worst of the worst stuff,” Zuckerberg said. 

“The point of that program isn’t to try to parse words on is something slightly true or false,” he said. “In terms of political speech, again, I think you want to give broad deference to the political process and political speech.”

Facebook announced in October that it would allow politicians to run ads on the social network, even if they include misinformation.

The company, however, does have lines that no one, including politicians, is allowed to cross, Zuckerberg said. No one is allowed to use Facebook to cause violence or harm themselves, or to post misinformation that could lead to voter suppression, Zuckerberg said.

“There are clear lines that map to specific harms and damage that can be done where we take down the content,” he said. “But overall, including compared to some of the other companies, we try to be more on the side of giving people a voice and free expression.”

Author: Shakir Essa


Facebook reportedly had evidence that its algorithms were dividing people, but top executives killed or weakened proposed solutions

  • Facebook’s internal research found that its products encouraged polarization, but Mark Zuckerberg and other top executives rejected ideas aimed at fixing the problem, The Wall Street Journal reported.
  • One report concluded that Facebook’s algorithms “exploit the human brain’s attraction to divisiveness,” according to The Journal.
  • Zuckerberg and Facebook’s policy chief, Joel Kaplan, repeatedly nixed proposed solutions because they feared appearing biased against conservatives or simply lost interest in solving the problem, The Journal reported.

Facebook had evidence that its algorithms encourage polarization and “exploit the human brain’s attraction to divisiveness,” but top executives including CEO Mark Zuckerberg killed or weakened proposed solutions, The Wall Street Journal reported on Tuesday.

The effort to better understand Facebook’s effect on users’ behavior was a response to the Cambridge Analytica scandal, and its internal researchers determined that, contrary to the company’s mission of connecting the world, its products were having the opposite effect, according to the newspaper.

One 2016 report found that “64% of all extremist group joins are due to our recommendation tools,” with most people joining at the suggestion of Facebook’s “Groups You Should Join” and “Discover” algorithms. “Our recommendation systems grow the problem,” the researchers said, according to The Journal.

The Journal reported that Facebook teams pitched multiple fixes, including limiting the spread of information from groups’ most hyperactive and hyperpartisan users, suggesting a wider variety of groups than users might normally encounter, and creating subgroups for heated debates to prevent them from derailing entire groups.

But these proposals were often dismissed or significantly diluted by Zuckerberg and Facebook’s policy chief, Joel Kaplan, according to the newspaper, which reported that Zuckerberg eventually lost interest in trying to address the polarization problem and was concerned about the potential to limit user growth.

In response to the pitch about limiting the spread of hyperactive users’ posts, Zuckerberg agreed to a diluted version and asked the team to not bring something like that to him again, The Journal said.

The company’s researchers also determined that because of a larger presence of far-right accounts and pages publishing content on Facebook, any changes — including apolitical tweaks, like reducing clickbait — would have disproportionately affected conservatives.

That worried Kaplan, who previously halted a project called “Common Ground” that aimed to encourage healthier political discourse on the platform.

Ultimately, many of the efforts weren’t incorporated into Facebook’s products, with managers telling employees in September 2018 that the company was pivoting “away from societal good to individual value,” according to The Journal.

“We’ve learned a lot since 2016 and are not the same company today,” a Facebook spokeswoman told the paper. “We’ve built a robust integrity team, strengthened our policies and practices to limit harmful content, and used research to understand our platform’s impact on society so we continue to improve.”

Facebook has repeatedly been scrutinized by critics who say the company hasn’t done enough to limit the spread of harmful content on its platform. That topic has come into sharper focus as coronavirus-related misinformation has run rampant on social media and as the 2020 presidential election approaches.

Author: Shakir Essa