The Trump era may be receding in the rearview mirror, but the political battles that defined the former president's tenure are alive and well. 

On Thursday, March 25, Congress will hold a remote hearing with testimony from Facebook CEO Mark Zuckerberg, Twitter CEO Jack Dorsey, and Google CEO Sundar Pichai on the "misinformation and disinformation" that some argue continue to plague online platforms.  

This same cast of tech leaders appeared before Congress in late 2020 for a series of hearings that featured multiple back-and-forths with lawmakers, who made the case that tech companies were partially responsible for the contentious nature of the recent presidential election. 

The upcoming joint hearing involves the House Energy and Commerce Committee's subcommittees on Communications and Technology and on Consumer Protection and Commerce. The title, "Disinformation Nation: Social Media's Role in Promoting Extremism and Misinformation," reflects the hard line that many legislators have taken with the tech giants over content moderation. 

“Whether it be falsehoods about the COVID-19 vaccine or debunked claims of election fraud, these online platforms have allowed misinformation to spread, intensifying national crises with real-life, grim consequences for public health and safety,” the committee chairs said in a statement. “This hearing will continue the Committee’s work of holding online platforms accountable for the growing rise of misinformation and disinformation."

The time elapsed since the bruising election season has not softened legislators' rhetoric. 

"For far too long, big tech has failed to acknowledge the role they’ve played in fomenting and elevating blatantly false information to its online audiences. Industry self-regulation has failed," the chairs continued. "We must begin the work of changing incentives driving social media companies to allow and even promote misinformation and disinformation.”

This is only the latest in a seemingly unending series of hearings with America's tech leaders, on topics ranging from their relationship with news publishers to their role in the recent SolarWinds hack. Here's a look at what to expect from this Thursday's hearing.  

Facebook's War on Misinformation

In the lead-up to the hearing this week, Facebook found itself embroiled in the debate over whether it's done an adequate job of preventing fraud, misinformation, and hate speech from spreading on the social media platform. 

Avaaz, a nonprofit advocacy group, released a report last Thursday finding that at least 267 Facebook groups and pages had spread "violence-glorifying material" to a total of 32 million users during the 2020 presidential election. 

Guy Rosen, vice president of integrity at Facebook, responded to the accusation that the company hasn't taken the issue seriously in a blog post breaking down the company's efforts. Those efforts have included disabling 1.3 billion fake accounts between October and December of 2020, shutting down more than 100 networks of "inauthentic behavior" over the past three years, and removing a total of 12 million pieces of misleading content about COVID-19 and vaccines.

"Despite all of these efforts, there are some who believe that we have a financial interest in turning a blind eye to misinformation," he wrote. "The opposite is true. We have every motivation to keep misinformation off of our apps and we’ve taken many steps to do so at the expense of user growth and engagement."

Facebook has also tried to get ahead of the problem. The company this month launched a tool to help users find vaccine centers, and in late 2020 it explicitly banned posts that deny or distort information about the Holocaust. 

Section 230

One issue that is set to return from the last round of tech hearings is the debate over Section 230 of the Communications Decency Act, which shields online platforms from legal liability for content posted by third parties.

Back in the fall, both Republicans and Democrats expressed interest in either repealing or modifying the law, a potentially devastating change for the tech companies' business models. 

"We would be more specific on who can use the liability protections in 230, how it can be applied, and when it can be applied," Sen. Marsha Blackburn (R-Tenn.) told Cheddar in October. 

The two parties, however, have different motivations for going after Section 230. 

Republicans have accused the tech companies of anti-conservative bias and would like to see limits on what they describe as censorship of conservative viewpoints. At the time of the last hearing, Twitter had recently flagged one of then-President Trump's tweets as misleading, spurring a backlash from conservatives.

Other Republicans have been more hands-off.

Democrats, meanwhile, have taken aim at the platforms for their role in spreading the "Stop the Steal" campaign spearheaded by Trump, which culminated in the riot at the U.S. Capitol on January 6. 

Indeed, witness testimony released ahead of Thursday's hearing shows that the 2020 election and its aftermath are likely to be a major topic of discussion, as the tech companies highlight and defend their practices. 

"Our work is never done, and we continue to learn and improve from one election cycle

to the next, and continue to evolve our policies," said Google's Pichai in his testimony. 

According to Zuckerberg's opening remarks, Facebook has its own proposal for setting a higher standard for itself. 

"Instead of being granted immunity, platforms should be required to demonstrate that they have systems in place for identifying unlawful content and removing it," the CEO said. "Platforms should not be held liable if a particular piece of content evades its detection — that would

be impractical for platforms with billions of posts per day — but they should be required to have

adequate systems in place to address unlawful content."

Antitrust Laws 

Crucially, the regulatory tweaks under discussion are happening against the backdrop of a broader conversation around antitrust in the tech industry. 

Sen. Amy Klobuchar (D-Minn.) introduced a bill in February that would reform antitrust laws by shifting the burden of proof onto major companies, requiring closer scrutiny of mergers and acquisitions, and providing additional funding to regulators. 

“Competition and effective antitrust enforcement are critical to protecting workers and consumers, spurring innovation, and promoting economic equity," Klobuchar said. "While the United States once had some of the most effective antitrust laws in the world, our economy today faces a massive competition problem. We can no longer sweep this issue under the rug and hope our existing laws are adequate." 

It's unclear if lawmakers will tackle antitrust issues directly at Thursday's hearing, but the potential for more aggressive federal action looms behind the discussion around content moderation and liability.

At the very least, there is a strong sentiment among the tech leaders that earning back the trust of their users and of the country should be a priority. 

"As we look to the future, I agree with this Committee that technology companies have work to

do to earn trust from those who use our services," said Twitter's Dorsey in his testimony. "For Twitter, that means tackling transparency, procedural fairness, algorithmic choice, and privacy."
