The trouble with the truth
Meta's abandonment of fact-checking for community notes exposes an uncomfortable truth about ourselves
One of the first conspicuous news events of 2025 was Meta’s announcement that it would be ending its third-party fact-checking program for dealing with misinformation. Instead, the social platforms it operates (Facebook, Instagram and Threads) will adopt a Community Notes approach (except for posts that are “illegal or high-severity violations”). Already in operation at Twitter/X since Elon Musk acquired the company two years ago, this approach relies on contributing users to identify potentially misleading or incorrect posts and add context, based on “agreement between people with a range of perspectives to help prevent biased ratings”. The reactions split sharply along ideological lines: liberals thought it was a terrible abandonment of truth, while conservatives celebrated it as a terrific way to democratize verification. But if it is possible to establish unequivocally whether a proposition is true or false, surely we could do so for “fact-checkers address misinformation better than community notes”? Isn’t it odd that this was never done?
The ambiguity of truth
When we talk about truth, we can mean two very different things: something that is empirically demonstrable ("it is raining" – to verify, just look outside), but also something that is consensus-based ("ultra-processed food is unhealthy" – a belief that is widely shared, but also widely contested). Our failure to distinguish between these types leads us to treat both with equal certainty, and that can be problematic.

We are practitioners of belief-based evidence, rather than of evidence-based beliefs. Take inflation: few of us have studied monetary policy in depth, yet that does not stop us holding a strong view about whether it is true that government spending is the primary driver of price increases. We don’t start with a blank slate, but with initial convictions, then seek out information that confirms them.
Our initial beliefs can stem from multiple sources: authority figures like parents and teachers, consensus among our peers and, most powerfully, association – only rarely do they rest on genuine primary evidence. Consider ultra-processed food again: someone already suspicious of Big Food's profit motives easily concludes such products must be unhealthy. Meanwhile, someone wary of government overreach readily dismisses health concerns as baseless scaremongering.
This association-based reasoning turns us into reductionists. Complex realities get forced into simple true or false categories. The nutritional impact of food processing varies by ingredient, method, individual response and more – but such nuance requires more mental effort than sticking a simple "healthy" or "unhealthy" label on foods.
Our interpretation of facts suffers from the same binary thinking. We see Elon Musk forcefully stretch out his right arm during a speech – that's the fact. But some see an unambiguous Nazi salute, while others see an innocent (albeit perhaps ill-considered) gesture emphasizing "my heart goes out to you." Same fact, different truths, each held with equal certainty.
A classic series of experiments by Solomon Asch illustrates how powerful a consensus view of the truth can be: participants were asked to say which of three lines matched the length of a target line, while seated among a group of confederates. When these confederates intentionally gave wrong answers, three quarters of the participants went along with the incorrect 'consensus' at least once (and only a quarter never did). If we feel compelled to conform even when we can clearly see we are wrong, just imagine how much stronger the pull of the consensus is when it aligns with our prior beliefs. What matters to us more than the truth is what our peers believe to be true. Consensus trumps evidence.
The honest truth
Is independent expert fact-checking better than community notes at helping us navigate between truth and falsehood? The former has long been accused of pushing left-leaning, filtered truths; the latter has been denounced as right-wing propaganda devoid of scientific rigour. Both critiques reveal less about the methods' effectiveness than about our tendency to reject any conclusion that challenges our beliefs.
<musk gesture>
[Evidence-based belief, or belief-based evidence? (image: screenshot)]

But neither approach can escape its fundamental limitations. Fact-checkers lack direct evidence for many claims and must rely on selective expert opinion that inevitably reflects their own biases. Community notes may draw on diverse perspectives, but balancing contradictory views is not the same as finding truth. An acknowledged expert's interpretation is still just that – interpretation. And a thousand uninformed opinions, however balanced ideologically, cannot produce genuine insight. Both methods ultimately reflect what people believe rather than what is demonstrably true.
The debate about fact-checking versus community notes reveals an amusing irony: we all believe it is others who need to be protected from misinformation, while we ourselves can surely tell truth from falsehood. Yet we are all equally susceptible to embracing 'truths' that merely confirm our existing beliefs.
The path forward starts with intellectual honesty: recognizing our own tendency towards belief-based evidence, and recognizing that certainty does not equate to truth. This is uncomfortable – it's easier to believe we alone see clearly while others need fact-checkers to guide them.
Once we acknowledge this, both fact-checking and community notes can serve us better, each in their own way. Instead of dismissing conclusions that challenge our beliefs, or uncritically accepting those that support them, we can use them as they were intended: not as adjudicators of truth, but as invitations to question our own certainties.