Tech's biggest chief executives faced sharp criticism from members of Congress on Thursday over how their companies handle COVID-19 vaccine misinformation spreading on their platforms.
Members of the House Energy and Commerce Committee pressed Facebook CEO Mark Zuckerberg, Twitter CEO Jack Dorsey and Google CEO Sundar Pichai about their companies' efforts to stem anti-vaccine content.
There is increasing concern from public health experts and legislators that misinformation about COVID-19 and the vaccines is threatening public health efforts to inoculate Americans against the virus.
Rep. Mike Doyle, D-Pennsylvania, chair of the House Subcommittee on Communications & Technology, said his staff easily found misinformation about COVID-19 vaccines on social media platforms including Facebook, YouTube and Twitter.
"You can take this content down. You choose not to. But time after time you are picking engagement and profit over the health and safety of your users," Doyle said during the hearing. "It seems like you just shrug off billion-dollar fines. Your companies need to be held accountable. We will legislate to stop this. The stakes are simply too high."
Just 12 individuals and organizations are responsible for up to 65% of the anti-vaccine content circulating on major social media platforms, according to a report by the Center for Countering Digital Hate and Anti-Vax Watch, which dubbed them the "Disinformation Dozen."
"Why, in the middle of a pandemic, haven’t you taken these accounts down? Your unwillingness to unambiguously commit to enforcing your own policies and remove the 12 most egregious spreaders of vaccine disinformation from your platforms gets right to what I'm concerned about," Rep. Jerry McNerney, D-California, said.
He added, "You too often don’t act even though you have the resources to do that. There are real harms associated with this."
Vaccine hesitancy among some populations continues to hinder efforts to vaccinate Americans against COVID-19. About 30% of Americans continue to express some hesitancy about getting a COVID-19 vaccine, according to a recent NPR/PBS NewsHour/Marist poll.
There is increasing support in Congress for legislation to rein in Big Tech companies. Top Democrats in Congress have renewed their scrutiny of social media platforms following the Jan. 6 attack by Trump supporters on the U.S. Capitol, which was organized on social media.
"The time for self-regulation is over. It’s time we legislate to hold you accountable,” Rep. Frank Pallone, D-New Jersey, the committee’s chairman, said during the hearing.
The three CEOs defended their companies' efforts to combat COVID-19 disinformation on their platforms.
Zuckerberg testified that Facebook uses its platform to proactively connect people to authoritative information.
"We have directed over 2 billion people to our COVID-19 Information Center," he said. That information shows at the top of Facebook news feeds and on Instagram.
The social media platform has removed 12 million pieces of false content related to COVID-19, he testified.
In December, Facebook began removing false claims about COVID-19 vaccines that could lead to imminent harm, including false claims about the safety, efficacy, ingredients or side effects of the vaccines, according to Zuckerberg's written testimony.
Pichai highlighted Google’s role in connecting users with vaccine information and other COVID-19 resources.
The company removed 850,000 YouTube videos containing dangerous or misleading COVID-19 medical information and blocked 100 million COVID-19-related ads in 2020, he testified.
In his written testimony, Pichai called for clearer content policies and a process for users to appeal content decisions as ways to combat misinformation.
Dorsey said Twitter is stepping up innovative efforts to counter harmful misinformation with a focus on user-led content moderation. The company launched a pilot project called "Birdwatch," which Dorsey called a crowdsourcing approach to tackling misinformation. Through Birdwatch, people can identify information in tweets they believe is misleading and write notes that provide informative context.
To rein in the tech industry and address online misinformation, federal lawmakers are considering sweeping changes to Section 230 of the Communications Decency Act, the provision that shields websites from legal liability for much of the content posted by their users.
Zuckerberg is calling for Section 230 reform. He has proposed a form of conditional liability, where online platforms gain legal protection only if they adhere to certain best practices established by an independent third party.
In response to questioning Thursday, Pichai said he was open to some of the Section 230 changes proposed by Zuckerberg.
"There are definitely some good proposals around transparency and accountability. We would certainly welcome legislative approaches in that area," he said.
But the tech giants' self-regulatory efforts might not be enough to satisfy members of Congress.
"Self-regulation has come to the end of its road," said Jan Schakowsky, D-Illinois, chair of the Consumer Protection and Commerce Subcommittee, noting lawmakers are preparing to move forward with legislation and regulation.
"The regulation that we seek that should not attempt to limit constitutionally protected freedom of speech, but it must hold platforms accountable when they are used to incite violence or hatred, or, in the case of the COVID pandemic, spread misinformation that costs thousands of lives," she said.