Facebook is not doing enough to stop the spread of false claims about COVID-19 and vaccines, White House press secretary Jen Psaki said on Thursday, part of a new administration pushback on misinformation in the United States.
Facebook, which owns Instagram and WhatsApp, needs to work harder to remove inaccurate vaccine information from its platform, Psaki said.
She said 12 people were responsible for almost 65 percent of anti-vaccine misinformation on social media platforms. The finding was reported in May by the Center for Countering Digital Hate, but Facebook has disputed the methodology.
“All of them remain active on Facebook,” Psaki said. Facebook also “needs to move more quickly to remove harmful violative posts,” she said.
US Surgeon General Vivek Murthy also raised the alarm over the growing wave of misinformation about COVID-19 and related vaccines, saying it is making it harder to fight the pandemic and save lives.
“American lives are at risk,” he said in a statement.
In his first advisory as the nation’s top doctor under President Joe Biden, Murthy called on tech companies to tweak their algorithms to further demote false information and share more data with researchers and the government to help teachers, health care workers and the media fight misinformation.
“Health misinformation is a serious threat to public health. It can cause confusion, sow mistrust, harm people’s health, and undermine public health efforts. Limiting the spread of health misinformation is a moral and civic imperative,” he said in the advisory, first reported by National Public Radio.
False information feeds hesitancy to get vaccinated, leading to preventable deaths, Murthy said, noting misinformation can affect other health conditions and is a worldwide problem.
A Facebook spokesperson said the company has partnered with government experts, health authorities and researchers to take “aggressive action against misinformation about COVID-19 and vaccines to protect public health.”
“So far we’ve removed more than 18 million pieces of COVID misinformation, removed accounts that repeatedly break these rules, and connected more than 2 billion people to reliable information about COVID-19 and COVID vaccines across our apps,” the spokesperson added.
Facebook has introduced rules against making certain false claims about COVID-19 and its vaccines. Still, researchers and lawmakers have long complained about lax policing of content on its site.
Murthy said at a White House press briefing that COVID-19 misinformation comes mostly from individuals who may not know they are spreading false claims, but also a few “bad actors.”
His advisory also urges people not to spread questionable information online. The head of the Center for Countering Digital Hate, a group that tracks COVID-19 misinformation online, said it was inadequate.
“On tobacco packets they say that tobacco kills,” the group’s chief executive Imran Ahmed told NPR. “On social media we need a ‘Surgeon General’s Warning: Misinformation Kills.’”
US COVID-19 infections last week rose about 11 percent from the previous week, with the highest increases in areas with vaccination rates of less than 40 percent, according to the Centers for Disease Control and Prevention (CDC), and continued to tick up on Wednesday.
Cases plummeted in the spring as the vaccine rolled out following a winter spike in infections, but vaccinations have since slowed, and only about 51 percent of the country has been vaccinated, Reuters data show.
“It’s been hard to get people to move” from not wanting the COVID-19 vaccine “to recognizing that the risk is still there,” Dr. Richard Besser, a former CDC chief who now heads the Robert Wood Johnson Foundation, told MSNBC.
Representatives for the nation’s largest tech companies could not be immediately reached for comment on the advisory.
In related news, Facebook Inc. said on Thursday it was in talks with UK law enforcement officials to help support investigations into online racial abuse against English soccer players following their recent loss to Italy in the Euro 2020 final.
Black players in the England team were subjected to a storm of online racist abuse after their defeat in the final of the soccer tournament, drawing wide condemnation from the squad’s captain, manager, royalty, religious leaders and politicians.
The comments also prompted a police investigation, although critics accused some ministers of hypocrisy for refusing to support a high profile anti-racist stance the players took during the tournament.
Facebook said it was in discussions with Britain’s National Police Chiefs Council, the UK Home Office Football Policing Unit and local police forces to understand how it can support active investigations, while ensuring that valid data requests are submitted.
British Prime Minister Boris Johnson pledged on Wednesday to toughen measures against online hate, banning fans from games if they are found guilty of such offenses and fining social media companies that fail to remove abusive content.
Social media companies including Facebook and Twitter Inc. have come under fire for amplifying hate speech and misinformation globally across their platforms.
Facebook also said in a blog post that, during the first three months of the year, it removed more than 25 million hate speech posts from its platform and more than 6 million posts on Instagram that contained words or emojis promoting racism.
Source: Arab News