The most recent release of the Facebook Files by Rep. Jim Jordan (R-OH) reveals that a UK-based nonprofit, closely tied to the British government and affiliated with the far-left Labour Party, allegedly fabricated statistics.
The organization, the Center for Countering Digital Hate (CCDH), led by Imran Ahmed, purportedly spread misleading information about the ‘disinformation dozen’ – a term for individuals with significant social media influence who expressed skepticism toward vaccines or lockdown measures.
The CCDH claimed that Robert F. Kennedy Jr. and eleven others were responsible for 65% of what it classified as “anti-vaccine content” circulating on social media. Rep. Jordan, however, contends that this claim is itself disinformation.
He points out that Facebook employees were highly skeptical of the CCDH’s assertions, noting that many of the posts were simply expressions of “vaccine hesitancy,” which under Facebook’s policies does not necessarily constitute misinformation.
The revelation illustrates how difficult it is, in content moderation, to distinguish genuine concerns about vaccines and public health from misinformation.
It also underscores the importance of transparency and accuracy in information dissemination, particularly on matters as critical as public health measures during a global pandemic.
Additionally, it has come to light that the Biden White House consistently promoted the 65% statistic, despite growing internal concerns among Facebook employees.
Those employees were preparing a memorandum to CEO Mark Zuckerberg airing their grievances about perceived “pressure from … the White House” to take action against the so-called Disinfo Dozen.
Importantly, they doubted that immediate removal was feasible, seeing no straightforward way to address the situation.
The episode shows how political pressure can shape platforms’ decisions about content moderation and ‘misinformation.’
According to internal emails, Facebook continued to closely monitor the ‘disinformation dozen’ in an effort to justify censorship measures. Its findings, however, showed that the “majority” of these individuals were not in fact spreading misinformation.
Despite Facebook’s clear policy against broad cross-platform penalties, save in exceptional cases, the Biden White House continued to press for more censorship – specifically, urging Facebook to remove all URL links leading to websites outside the platform.
The pressure extended to Facebook’s President of Global Affairs, Nick Clegg, who faced calls to further censor the ‘disinformation dozen.’ The campaign frustrated Facebook staff, who felt that President Biden’s involvement ensured that the influence of the Center for Countering Digital Hate (CCDH) persisted without relenting.
Taken together, the revelations show how deeply political influence can reach into content-moderation decisions at social media platforms.