The editor of the British Medical Journal has slammed Facebook for ‘censoring’ its report into allegations of malpractice during Pfizer’s Covid vaccine trials.
Dr Kamran Abbasi accused the social media giant of suppressing ‘fully fact-checked’ journalism and ‘trying to control how people think’.
A BMJ investigation in November warned that a contractor which ran a number of Pfizer's original jab studies may have falsified data and skewed findings.
Its report was based on dozens of internal documents, photos, audio-recordings, videos and statements from three former employees.
But when some users shared the journal entry on Facebook, their post was automatically given a ‘missing context’ label.
The shared article was also accompanied by a warning that said it could ‘mislead people’ and a link to a fact-checking website.
The BMJ is lodging a complaint with Facebook's Oversight Board this week, after an appeal to Mark Zuckerberg to have the tags removed, made via an open letter, failed.
Dr Abbasi wrote in the BMJ today: ‘We should all be very worried that Facebook, a multibillion dollar company, is effectively censoring fully fact checked journalism that is raising legitimate concerns about the conduct of clinical trials.’
He added: ‘Facebook’s actions won’t stop The BMJ doing what is right, but the real question is: why is Facebook acting in this way? What is driving its world view?
‘Is it ideology? Is it commercial interests? Is it incompetence?
The BMJ has slammed Facebook after its article on mistakes made during the Pfizer vaccine trials was labelled as 'missing context'. Above shows a post where a user shared the article, which is accompanied by a note from Facebook
People who shared the BMJ article on Facebook also received these notifications. The BMJ has already complained to Mark Zuckerberg, and will be taking the case to its Oversight Board this week. Facebook said the article was labelled because it was being shared by anti-vaxxers
‘Users should be worried that, despite presenting itself as a neutral social media platform, Facebook is trying to control how people think under the guise of “fact checking”.’
The BMJ’s article related to an arm of Pfizer’s trial in Texas which involved 1,000 participants. There is no suggestion that it skewed the overall findings of the broader trial.
The prestigious medical journal has submitted a complaint to the International Fact-Checking Network (IFCN), which helps to police misinformation online.
Social media bosses face tough criminal sanctions for hosting extremist content
Social media bosses could face ‘criminal sanctions with tough sentences’ if they allow extremist content to appear on their platforms, Boris Johnson has said.
He told MPs that the forthcoming Online Safety Bill would tackle web giants if they allow ‘foul content’ to circulate.
And he promised the long-awaited legislation would make quick progress in the Commons, with the bill receiving its second reading before Christmas.
But a Whitehall source later said the second reading might not take place until early next year.
Published in May, the draft bill gives regulator Ofcom the power to impose multibillion-pound fines on technology giants that fail to show a duty of care to users.
But it stops short of bringing criminal sanctions against bosses.
Instead, a new criminal offence for managers has been included as a deferred power that can be introduced if Ofcom finds that firms are failing to keep to their new responsibilities.
Some campaigners have raised fears that the rules risk stifling the free press, ‘silencing marginalised voices’ and introducing ‘state-backed censorship’.
Facebook says the article was initially labelled as ‘missing context’ because it was being used by anti-vaxxers as ‘proof’ that Covid jabs were unsafe.
Social media giants like Facebook, Twitter and Instagram have come under intense scrutiny over misinformation during the pandemic, particularly related to vaccines.
Earlier this month, Boris Johnson said anti-vaxxers were being allowed to spread ‘mumbo-jumbo’ and ‘complete nonsense’ online.
Gary Schwitzer, of the University of Minnesota, who runs a website evaluating health journalism, said Facebook's fact-checking process was not transparent or consistent enough.
The BMJ’s Pfizer trial report, published in November last year, warned corners had been cut in an arm of the study run by medical company Ventavia.
Whistleblower Brook Jackson, who briefly worked for the business in 2020, told the BMJ Ventavia did not always test patients with symptoms, potentially masking how well jabs performed.
She added that it was ‘falsifying’ data, and said underqualified staff had been hired as vaccinators and to follow-up on side effects.
Her statements were corroborated by dozens of internal documents and two former Ventavia employees, who wished to remain anonymous.
The Texas-based contractor was responsible for 1,000 participants at three sites in the state, or just two per cent of all those involved.
But the report raised concerns that similar issues may have crept into other parts of the trial, although these remain unsubstantiated.
The BMJ’s head of journalism Rebecca Coombes and investigations editor Madlen Davies said they had ‘serious concerns’ about the fact checking being undertaken for Facebook.
They said the lack of accountability and oversight of third-party ‘fact-checkers’ was leading to the censorship of accurate information.
Researchers running a small number of Pfizer's original Covid vaccine trials may have 'falsified' study findings, the BMJ has reported
Pfizer has since rehired Ventavia to work on four other trials of its jabs, including trials in children, young adults and pregnant women, and of booster doses.
Ventavia has denied the claims, saying an internal investigation had found they were ‘unsubstantiated’.
It said Ms Jackson was employed by the company for approximately two weeks in September 2020, and that the Covid vaccine trial was not part of her responsibilities.
Meta, which owns Facebook, told MailOnline that its fact checkers were responsible for reviewing content and applying ratings to stories.
A spokesman said: ‘We are transparent with publishers when their content is fact checked, and we have an appeals process in place for publishers who wish to issue a correction or dispute a rating directly with a fact checker.’
Facebook has about 2.85 billion users, equivalent to more than a third of the world's population.
The UK’s upcoming Online Safety Bill is set to make tech giants accountable for ‘harmful’ content being circulated on their platforms.
This will include illegal content, such as terrorist propaganda and child abuse, and ‘legal but harmful’ content, such as cyberbullying and misinformation — including around Covid vaccines.
News organisations have raised concerns, however, that it will lead to legitimate journalism being taken down because companies will use algorithms to sift through posts.
Peter Wright, editor emeritus of DMG Media, has previously said social media platforms' algorithms for monitoring content are 'very poor'.
He told a Parliamentary Committee hearing that attempts by Facebook to moderate journalism in the US were leading to articles being blocked before anyone had a chance to read them.
Mr Wright said: 'It's arbitrary, it often fails to understand the nature of the content, it's imposed without any sort of process, it is not in line with English legal thinking on journalism, which is that the editor must take responsibility for what he or she publishes, and pay the consequences afterwards.'