What to Expect From Facebook, Twitter and YouTube on Election Day

On Tuesday, an operations center with dozens of employees — what Facebook calls a “war room” — will work to identify efforts to destabilize the election. The team, which will work virtually because of the coronavirus pandemic, has already been in action and is operating smoothly, Facebook said.

Facebook’s app will also look different on Tuesday. To prevent candidates from prematurely and inaccurately declaring victory, the company plans to add a notification at the top of News Feeds letting people know that no winner has been declared until election results are verified by news outlets like Reuters and The Associated Press.

Facebook also plans to deploy, if needed, special tools that it has used in “at-risk countries” like Myanmar, where election-related violence was a possibility. The tools, which Facebook has not described publicly, are designed to slow the spread of inflammatory posts.

After the polls close, Facebook plans to suspend all political ads from circulating on the social network and its photo-sharing site, Instagram, to reduce misinformation about the election’s outcome. Facebook has told advertisers that they can expect the ban to last for a week, though the timeline isn’t set in stone and the company has publicly been noncommittal about the duration.

“We’ve spent years working to make elections safer and more secure on our platform,” said Kevin McAlister, a Facebook spokesman. “We’ve applied lessons from previous elections, built new teams with experience across different areas and created new products and policies to prepare for various scenarios before, during and after Election Day.”

Twitter has also worked to combat misinformation since 2016, in some cases going much further than Facebook. Last year, for instance, it banned political advertising entirely, saying the reach of political messages “should be earned, not bought.”

At the same time, Twitter started labeling tweets by politicians if they spread inaccurate information or glorify violence. In May, it added several fact-checking labels to President Trump’s tweets about Black Lives Matter protests and mail-in voting, and restricted people’s ability to share those posts.
