Police authorities’ struggle to counter malign online content will always be an uphill battle until the social media platforms that allow such content to thrive and spread take greater accountability and start implementing effective measures to prevent it. The launch of the new Transparency Centre by the signatories of the 2022 Code of Practice on Disinformation (CoP), including all major online platforms such as Google, Meta, Microsoft, TikTok, and Twitter, signals a positive step towards greater responsibility and cooperation, but much more work remains to be done.
The new Transparency Centre is designed to promote visibility and accountability by providing a centralised repository of information on signatories’ efforts to combat disinformation and implement CoP commitments. This resource will be available to a wide range of stakeholders, including EU citizens, researchers, and NGOs. The launch of the Centre also marks the first time all signatory online platforms have published baseline reports on how they are putting their CoP commitments into practice. These reports provide valuable insights into the platforms’ efforts, such as preventing advertising revenue from flowing to disinformation actors, the number or value of political ads accepted or rejected, and the detection of manipulative behaviours, such as the creation and use of fake accounts. The reports also include information on the impact of fact-checking, enabling a better understanding of the effectiveness of different approaches to countering disinformation.
In general, access to this data gives the expert community valuable insight into the actions online platforms are taking to combat disinformation and helps identify which platforms are falling behind and require closer scrutiny. It also allows experts to gain a deeper understanding of the tactics employed by disinformers on various platforms, providing valuable data to inform future research and to develop more effective countermeasures. Ultimately, this information can also be utilised by the VIGILANT team to enhance our work on providing law enforcement officials with the tools they need to effectively combat disinformation with criminal implications.
The consortium member GLOBSEC, as a signatory to the CoP, has been pushing for greater accountability and effective solutions by participating in the continuous dialogue with the platforms since 2021. And while the creation of the Transparency Centre and the submission of reports by online platforms is a step in the right direction, the reports still lack crucial data. Twitter’s report, for example, was flagged by the European Commission for being too brief and not providing enough detail.
Many of the countermeasures put forward by the platforms so far have been half-baked, and it is clear that considerably more effort will be needed to address these challenges effectively.