The European Commission's original Code of Practice on Disinformation, introduced in 2018, overlooked several critical elements of the fight against manipulative practices used to spread false information, such as deepfakes and bots. In response, EU officials have now set stricter guidelines for how digital companies handle disinformation.
The EU’s strengthened Code of Practice on Disinformation replaces the earlier standards and focuses on, among other commitments, transparency in political advertising, cutting off advertising revenue for purveyors of disinformation, empowering researchers, and supporting fact-checkers.
Google, Meta, Twitter, Amazon’s Twitch, Microsoft, and TikTok are among the digital companies that have agreed to the Commission’s updated, tougher standards. Smaller platforms such as Vimeo and Clubhouse have also signed on. The amended code has 34 signatories in total and is backed by a network of fact-checkers and academics.
Signatories that fail to live up to their commitments could face fines of up to 6% of their global turnover. Now that the new code is in place, participating organizations have six months to demonstrate compliance with the strengthened standards. They must first choose which of the code’s 44 commitments and 128 measures they will implement.
The participating companies must have implemented the amended rules by 2023. The code will also work in tandem with related EU anti-disinformation legislation, such as the Digital Services Act and the Transparency and Targeting of Political Advertising Act.