Fake news defence must not cede to global power shift

Fake news leads to consumer detriment and can undermine trust, cohesion and democratic processes.
Over the past few years, for example, individual electoral officers in the US have been subjected to hate campaigns fuelled by the widely held belief that there was widespread fraud in the 2020 election.
Amid the flow of misinformation and fake news, suspicion has grown about whether evidence-driven public health messages or the long-established scientific consensus on climate change can be trusted.
False information about food and wellness supplements can cause direct harm if people buy products that are bad for their wallets and dangerous to their health.
And, while it can be tricky to isolate direct cause and effect, fake news and widely viewed, frequently repeated online misinformation can undermine people’s trust in their fellow citizens and in the institutions they elect.
How do consumers feel about fake news?
We know that people take fake news very seriously – over three-quarters of respondents (78%) in a recent Euroconsumers survey are worried about the impact of disinformation on their fellow citizens.
Euroconsumers’ latest cybercrime survey also found that fake news is a common part of consumers’ online experience:
- 59% of respondents reported they’ve come across fake news online
- The large majority of those encounters (84%) happened in the last 12 months.
The survey also found that 1 in 10 consumers don’t know how to report fake news when they see it: they are worried, but lack the agency needed to take action.
The EU Code of Practice on Disinformation
Since 2022, a voluntary Code of Practice on Disinformation has been in place in the EU to combat disinformation and misinformation online. Its 40 signatories originally included Twitter/X, Meta, TikTok and Google, as well as ad tech companies, civil society groups and fact-checking organisations.
The Code expected signatories to work with independent fact-checkers, remove financial incentives for spreading fake news, and analyse fake accounts, bots and malicious deepfakes that spread disinformation.
The Code of Practice also encouraged platforms to create consumer-facing tools that make it easier for people to recognise and report any fake news or disinformation they come across.
Last week, as planned, the Code was formally incorporated into the Digital Services Act as one route for very large online platforms (VLOPs) and very large online search engines (VLOSEs) to demonstrate compliance and show they are mitigating the risks of disinformation.
Fake news lines of defence are under fire
But how will this fare? The formalisation of the Code under the DSA comes at a time when some social media firms are signalling they are ready to claw back previous commitments.
A signal of discontent with the European approach to content moderation came when Elon Musk purchased Twitter, rebranded it as X and promptly withdrew from the Code.
Very soon after the Trump administration took office in January 2025, Meta announced that it would stop using third-party fact-checkers and instead rely on the same system X uses, ‘community notes’, which leaves it to users of the service to decide what may be misleading.
Later that month it was reported that Google would not be integrating fact-checking into its Search or YouTube services, as had been expected.
It appears that some of the very largest online platforms and search engines have been emboldened to make changes that align with the current values of their government.
There is also the question of fake news spread by smaller platforms that are not captured by the definition of ‘very large’, and how they can be held to account for the spread of false information.
From what we know of the risks, and what our survey shows about the strength of consumer feeling about fake news, this is very concerning.
DSA: 12 months in and still no sanctions?
What will happen in July 2025, when the Code is applied? Are national and EU authorities ready to take swift action on DSA infringements?
It feels as though we have been waiting too long, and all the while the problem of fake news is not going away. Today, one year on from the DSA coming into force, there have been numerous complaints and some investigations but, crucially, no fines issued.
We have called on platforms and regulators to ramp up efforts to equip consumers with the agency to join the fake news battle.
But the enforcers must now take the lead: dissuasive fines can incentivise companies to change behaviour, manage the volume of damaging fake news and allay people’s fears about its impact in their communities.
If we are not brave enough to do this, then who will be?