Disinformation Hearing: Live Updates

Jack Dorsey, Twitter’s chief executive, testifying remotely on Thursday.
Credit…Energy and Commerce Committee, via YouTube

Jack Dorsey, Twitter’s chief executive, said during his congressional testimony on Thursday that the site played a role in the storming of the U.S. Capitol on Jan. 6, in what appeared to be the first public acknowledgment by a top social media executive of the influence of the platforms on the riot.

Mr. Dorsey’s answer came after Representative Mike Doyle, Democrat of Pennsylvania, pressed the tech chief executives at a hearing on disinformation to answer “yes” or “no” as to whether their platforms had contributed to the spread of misinformation and the planning of the attack.

Mark Zuckerberg, Facebook’s chief executive, and Sundar Pichai, Google’s chief executive, did not answer with a “yes” or “no.” Mr. Dorsey took a different tack.

“Yes,” he said. “But you also have to take into consideration the broader ecosystem. It’s not just about the technological systems that we use.”

Before supporters of then-President Donald J. Trump stormed the Capitol on Jan. 6, misinformation about the results of the presidential election had swirled on the social media sites. Lies that the election had been stolen from Mr. Trump were prevalent, as were false conspiracy theories about how President Biden had gained the votes to win.

After the attack, Twitter and Facebook barred Mr. Trump from posting on their platforms. Their actions suggested that they saw a risk of more violence being incited from what was posted on their sites, but the executives had not previously articulated what role the platforms had played.

Representative Jan Schakowsky, Democrat of Illinois, later asked Mr. Zuckerberg about remarks that Facebook’s chief operating officer, Sheryl Sandberg, made shortly after the riot. In a January interview with Reuters, Ms. Sandberg said that the planning for the riot had been “largely organized” on other social media platforms and downplayed Facebook’s involvement.

Ms. Schakowsky asked whether Mr. Zuckerberg agreed with Ms. Sandberg’s statement.

Mr. Zuckerberg appeared to walk back Ms. Sandberg’s remarks.

“Certainly there was content on our services,” he said. “From that perspective, I think there’s further work that we need to do,” he added before Ms. Schakowsky interrupted him.

Mark Zuckerberg, chief executive of Facebook, testifying remotely on Thursday.
Credit…Energy and Commerce Committee, via YouTube

When it comes to Section 230 of the Communications Decency Act, a 1996 statute that shields online platforms from lawsuits over their users’ posts, the chief executives of Google, Facebook and Twitter all agree on one thing: It is a foundational law of the internet.

But as lawmakers on Thursday threatened to strip the liability protection encoded in the law, the chieftains of the biggest social networks couldn’t agree on how to fix it, or whether it even needs fixing.

Section 230 has become a lightning rod for politicians unhappy with how internet platforms like Facebook, YouTube and Twitter handle content on their sites. The law permits internet companies to moderate those sites without being on the hook legally for everything they host. The protections have helped internet companies grow with content posted by users.

In prepared testimony, Mark Zuckerberg, Facebook’s chief executive, urged Congress to take on “thoughtful reform” of Section 230. He said the law needed to be updated for the modern age.

“I believe that Section 230 would benefit from thoughtful changes to make it work better for people,” Mr. Zuckerberg said in the statement. “But identifying a way forward is challenging given the chorus of people arguing — sometimes for contradictory reasons — that the law is doing more harm than good.”

He proposed that liability protection for companies be conditional on their ability to fight the spread of certain types of unlawful content. He said platforms should be required to demonstrate that they have systems in place for identifying unlawful content and removing it.

In the hearing, Mr. Zuckerberg clarified that the rules should be different for smaller social networks for “competition” reasons.

He suggested that a third party be established to determine what is “adequate” based on the size of the platform and to identify fair and clear practices for companies to understand and implement.

Sundar Pichai, chief executive of Alphabet, Google’s parent company, said regulation has a role to play in “addressing harm and improving accountability,” but he cautioned that recent proposals to change Section 230 would have unintended consequences, harming free expression and limiting the ability of companies to police their platforms.

“Without Section 230, platforms would either over-filter content or not be able to filter content at all,” Mr. Pichai said in prepared remarks. “Section 230 allows companies to take decisive action on harmful misinformation and keep up with bad actors who work hard to circumvent their policies.”

Mr. Pichai said at the hearing that there were “definitely good proposals” around transparency and accountability, but he seemed to stop short of endorsing Mr. Zuckerberg’s idea.

Jack Dorsey, the chief executive of Twitter, said he supported greater transparency and accountability. He said, however, that it would be very difficult to distinguish a large platform from a smaller one.

The Capitol riots “and the movement that motivated it started and was nourished on your platforms,” Representative Mike Doyle, Democrat of Pennsylvania, told the C.E.O.s.
Credit…Energy and Commerce Committee, via YouTube

Democratic lawmakers accused the chief executives of making money by allowing disinformation to run rampant online, reflecting their mounting frustration about the spread of extremism, conspiracy theories and falsehoods in the aftermath of the Jan. 6 riot at the Capitol.

Their comments opened the first hearing since President Biden’s inauguration featuring Mark Zuckerberg of Facebook, Sundar Pichai of Google and Jack Dorsey of Twitter. They were a signal that scrutiny of Silicon Valley’s business practices will not let up, and may even intensify, with Democrats in the White House and leading both chambers of Congress.

Lawmakers expressed concern that the platforms had a financial incentive to keep users engaged by feeding them salacious or divisive content, fueling the spread of misinformation, conspiracies and extreme messages.

“You definitely give the impression that you don’t think that you’re actively, in any way, promoting this misinformation and extremism, and I totally disagree with that. You’re not passive bystanders,” said Representative Frank Pallone, the New Jersey Democrat who chairs the Energy and Commerce Committee. “You’re making money.”

The January riot made the issue of disinformation intensely personal for many lawmakers. Some participants have been linked to online conspiracies like QAnon, which the platforms have tried to stem in recent months.

Representative Mike Doyle, a Pennsylvania Democrat, pressed the executives on whether their platforms had responsibility for spreading disinformation related to the outcome of the 2020 election, which fueled the riot.

“How is it possible for you not to at least admit that Facebook played a leading role in facilitating the recruitment, planning and execution of the attack on the Capitol?” he asked Mr. Zuckerberg.

“I think that the responsibility here lies with the people who took the actions to break the law and do the insurrection,” said Mr. Zuckerberg, adding that people who spread the misinformation bore responsibility as well.

“But your platforms supercharged that,” Mr. Doyle said.

Lawmakers argued that the platforms also had enabled misinformation about the coronavirus pandemic.

The lawmakers’ growing frustration comes as they consider whether to more tightly regulate the business models of the platforms. Some have proposed modifying a legal shield that protects websites from lawsuits over content posted by their users, arguing that it allows the companies to get away with negligence in policing their products.

Representative Jan Schakowsky, Democrat of Illinois, said Thursday that the executives should take away that “self-regulation has come to the end of its road.”

Representative Bob Latta, Republican of Ohio, accused the platforms of a “commitment to serve the radical progressive agenda.”
Credit…Energy and Commerce Committee, via YouTube

Republican lawmakers came into the hearing steaming about the Jan. 6 Capitol riots, but their animus was focused on the decisions by the platforms to ban right-wing figures, including former President Donald J. Trump, for inciting violence.

The decisions to ban Mr. Trump, many of his associates and other conservatives, they said, amounted to liberal bias and censorship.

“We’re all aware of Big Tech’s ever-increasing censorship of conservative voices and their commitment to serve the radical progressive agenda,” said Bob Latta, the ranking Republican of the House’s communications and technology subcommittee.

After the Capitol riots, Mr. Trump and some of his top aides were temporarily or indefinitely banned from major social media sites.

Mr. Latta’s comments are expected to be echoed by many Republicans in the hearing. They say the platforms have become gatekeepers of information, and they accuse the companies of trying to suppress conservative views. The claims have been consistently refuted by academics.

Mr. Latta homed in on the legal shield known as Section 230 of the Communications Decency Act and whether the big tech companies deserve the regulatory protection.

“Section 230 provides you with the liability protection for content moderation decisions made in good faith,” Mr. Latta said. But he said the companies have appeared to use their moderating powers to censor viewpoints that the companies disagree with. “I find that highly concerning.”

Republicans pointed to Twitter’s decision to lock the account of The New York Post after it published an article about Hunter Biden, the son of President Biden, and his business dealings in Ukraine. Twitter later acknowledged that locking the account was a mistake, but said the decision had been made because the article relied on hacked materials, not because of political bias.

“This article was censored by Twitter,” said Representative Steve Scalise, Republican of Louisiana, who added that the story came from The New York Post, “a newspaper that goes back to 1801, founded by Alexander Hamilton.” He added that Twitter’s freeze of the account came at a high-stakes time, just weeks before the election.

Jack Dorsey, the chief executive of Twitter, reiterated that the decision was a mistake.

“We had an incorrect interpretation,” he said. “We don’t write policy according to any particular political leaning.”

For many Republicans, the clearest harms from social media involve children, who they said are exposed to harmful content on the sites and experience loneliness and anxiety.

They interrogated the executives about the potential harms their sites have on young people. Representative Cathy McMorris Rodgers, Republican of Washington, said social media was her “greatest fear” as a parent.

“I’ve monitored where your algorithms lead them. It’s frightening. I know I’m not alone,” Ms. Rodgers said.

Democrats have been focused on the spread of disinformation on social media, especially in the wake of the Capitol riot. Republicans, meanwhile, have repeatedly questioned the companies about their decisions to remove conservative personalities and stories from their platforms.

Here’s what you need to know:

After his son was stabbed to death in Israel by a member of the militant group Hamas in 2016, Stuart Force decided that Facebook was partly to blame for the death, because the algorithms that power the social network helped spread Hamas’s content. He joined relatives of other terror victims in suing the company, arguing that its algorithms aided the crimes by regularly amplifying posts that encouraged terrorist attacks. Arguments about the algorithms’ power have reverberated in Washington.

Section 230 of the Communications Decency Act has helped Facebook, YouTube, Twitter and countless other internet companies flourish. But Section 230’s liability protection also extends to fringe sites known for hosting hate speech, anti-Semitic content and racist tropes. As scrutiny of big technology companies has intensified in Washington over a wide variety of issues, including how they handle the spread of disinformation or police hate speech, Section 230 has drawn renewed attention.

After inflaming political discourse around the globe, Facebook is trying to turn down the temperature. The social network started changing its algorithm to reduce the political content in users’ news feeds. Facebook previewed the change earlier this year when Mark Zuckerberg, the chief executive, said the company was experimenting with ways to tamp down divisive political debates among users. “One of the top pieces of feedback we’re hearing from our community right now is that people don’t want politics and fighting to take over their experience on our services,” he said.

As the Electoral College affirmed Joseph R. Biden Jr.’s election, voter fraud misinformation subsided. But peddlers of online falsehoods ramped up lies about the Covid-19 vaccines. Representative Marjorie Taylor Greene, Republican of Georgia, and far-right websites like ZeroHedge have begun pushing false vaccine narratives, researchers said. Their efforts have been amplified by a robust network of anti-vaccination activists like Robert F. Kennedy Jr. on platforms including Facebook, YouTube and Twitter.

In the end, two billionaires from California did what legions of politicians, prosecutors and power brokers had tried and failed to do for years: They pulled the plug on President Trump. Journalists and historians will spend years unpacking the improvisational nature of the bans, and scrutinizing why they arrived just as Mr. Trump was losing his power, and Democrats were poised to take control of Congress and the White House. The bans have also turned up the heat on a free-speech debate that has been simmering for years.

Chief executives from Google, Apple, Amazon and Facebook testifying in July. Mark Zuckerberg of Facebook has testified six times on Capitol Hill.
Credit…Pool photo by Mandel Ngan

In the fall of 2017, when Congress called on Google, Facebook and Twitter to testify about their role in Russia’s interference in the 2016 presidential election, the companies did not send the chief executives lawmakers had requested, instead dispatching their general counsels to face the fire.

During the hearings, the politicians complained that the general counsels were answering questions about whether the companies contributed to undermining the democratic process instead of “the top people who are actually making the decisions,” as Senator Angus King, an independent from Maine, put it.

It was clear that Capitol Hill wanted its pound of C.E.O. flesh and that hiding behind the lawyers was not going to work for long. Any initial concern about how the chieftains of Silicon Valley would handle a grilling from lawmakers is no longer a worry. After a slew of hearings in recent years, both virtual and in person, the executives have had plenty of practice.

Since 2018, Sundar Pichai, Google’s chief executive, has testified on three different occasions. Jack Dorsey, Twitter’s chief executive, has made four appearances, and Mark Zuckerberg, Facebook’s chief, has testified six times.

And when the three men face questioning again on Thursday, they will do so as seasoned veterans in the art of deflecting the most vicious attacks and then redirecting to their carefully practiced talking points.

In general, Mr. Pichai tends to disagree politely and quickly with the sharpest jabs from lawmakers, such as when he was asked last year why Google steals content from honest businesses, without harping on the point. When a politician tries to pin him down on a specific issue, he often relies on a familiar delay tactic: My staff will get back to you.

Mr. Pichai is not a dynamic cult-of-personality tech leader like Steve Jobs or Elon Musk, but his reserved demeanor and earnestness are well suited for the congressional spotlight.

Mr. Zuckerberg has also grown more comfortable with the hearings over time and more emphatic about what the company is doing to combat misinformation. At his first appearance in 2018, Mr. Zuckerberg was contrite, promising to do better after failing to protect users’ data and to prevent Russian interference in elections.

Since then, he has pushed the message that Facebook is a platform for good, while carefully laying out the steps that the company is taking to stamp out disinformation online.

As the sessions have gone virtual during the pandemic, Mr. Dorsey’s appearances, hunched over a laptop camera, carry a just-another-guy-on-Zoom vibe compared with the softly lit, neutral backdrops of the Google and Facebook chiefs.

Mr. Dorsey tends to remain extremely calm, almost Zen-like, when pressed with aggressive questions, and he often engages on technical issues that rarely elicit a follow-up.

In the On Tech newsletter today, Shira Ovide explains that the Section 230 debate reflects our discomfort with the power of Big Tech and our desire to hold someone accountable.
