You couldn't create a super microphone that would broadcast what you want to say all over town at 4AM. And if someone stopped you, you wouldn't say, “well, that violates my first amendment rights!”
Or would you?
In 2024, roughly two-thirds of the world's democracies will hold elections, a convergence we won't see again for more than two decades.
In June alone, voters across all 27 member states of the European Union will cast ballots in its parliamentary elections.
This surge in election activity goes hand in hand with a rise in misinformation, all of it amplified by artificial intelligence.
As part of our series on misinformation on the Click to Trust Podcast, I sat down with Steven Brill, the Co-Founder and Co-CEO of NewsGuard.
Steven is also a renowned journalist, author, and pioneer in the fight against misinformation. His book "The Death of Truth" addresses the current misinformation crisis and proposes solutions to restore faith in factual reporting.
Steven shared his thoughts on the role that human judgment, AI, and societal measures play in fighting online misinformation during an election year.
Today, I’ll share some insights from our chat, which you can listen to in full, here.
We’ll start by looking at how NewsGuard’s approach to misinformation sets it apart in this ongoing battle.
NewsGuard's Human-Centric Approach to Fighting Misinformation
When I asked Steven Brill how NewsGuard is fighting misinformation, he sat back and said:
"We have a very strange way of doing it. We use human beings.”
NewsGuard employs trained journalists to review and rate news websites against nine apolitical journalistic criteria, such as the separation of news from opinion, ownership disclosure, and the clear labeling of advertising.
“We read all of the websites responsible for 95% of engagement online in the countries where we operate."
This refreshingly human-centric approach allows NewsGuard to provide transparent, accountable scores for news sources, which are then made available to users, companies and advertising agencies.
NewsGuard also creates "misinformation fingerprints," cataloging false narratives and providing tools to track and counteract disinformation, helping brands and organizations protect their reputations.
In 2019, the University of Michigan’s Center for Social Media Responsibility (CSMR) updated its "Iffy Quotient" metric to use NewsGuard’s ratings in tracking unreliable news URLs on Facebook and Twitter (now X).
The Triad of Misinformation: Social Media, Demagogues, and Programmatic Advertising
Misinformation thrives in an ecosystem fueled by social media algorithms, demagogues exploiting uncertainties, and programmatic advertising.
The latter, in particular, has allowed sites pushing false narratives to earn advertising revenue without advertisers’ knowledge, which further propels the spread of misinformation.
1) Social Media Algorithms
Social media platforms are designed to maximize user engagement, which often means promoting sensational or divisive content.
Algorithms often prioritize content that keeps users on the platform longer. Inflammatory posts, which can include misleading information, tend to drive high engagement: the more likes they collect, the more eyes they reach.
This is called “algorithmic amplification” (we’ve talked about it before) and is a silent actor in the spread of misinformation.
"If your whole business model is advertising revenue, and advertising revenue derives from the time that the highest number of eyeballs spend looking at something, then you're going to produce the content that the most people will want to look at."
2) Programmatic Advertising
By some estimates, Americans are exposed to as many as 10,000 advertisements every day.
Programmatic advertising allows ads to be placed across thousands of websites, often without advertisers knowing where their ads will appear.
This automated buying and selling of online ads makes the process faster and more efficient, but this system can inadvertently fund misinformation sites, as ads are often placed on the basis of cost and reach rather than content quality.
The good news is that many programmatic advertising companies are actively working to solve this issue, partnering with companies like NewsGuard to identify websites that should be avoided.
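In practice, this kind of brand-safety work often amounts to filtering candidate ad placements against a list of flagged domains before bidding. The sketch below is a minimal, hypothetical version of that step: the domain names and the hard-coded set are invented for illustration (real integrations consume a ratings feed from a vendor such as NewsGuard rather than a static list).

```python
# Minimal sketch of a brand-safety exclusion filter in a programmatic
# ad pipeline. Domains and the blocklist are hypothetical examples.

UNRELIABLE_DOMAINS = {"fake-news-example.com", "hoax-site-example.net"}

def eligible_placements(candidate_domains: list[str]) -> list[str]:
    # Drop any ad inventory on domains flagged as unreliable
    # before the bidding stage ever sees it.
    return [d for d in candidate_domains if d not in UNRELIABLE_DOMAINS]

print(eligible_placements([
    "reputable-paper-example.com",
    "fake-news-example.com",
    "local-news-example.org",
]))
```

The design choice matters: filtering at the inventory stage means misinformation sites never receive the ad revenue in the first place, rather than being defunded after the fact.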
3) Demagogues and Hucksters
Bad actors exploit platforms’ reach to spread their own agendas.
Political demagogues use misinformation to manipulate public opinion and garner support, while hucksters promote false narratives to gain financially or socially.
We wrote about the different types of misinformation spreaders, here.
Who Killed the Truth?
"The erosion of trust in basic facts is largely the result of too many people getting their news from social media platforms, sorted out and presented by algorithms designed to engage people by offering content that excites rather than informs."
Steven Brill’s "The Death of Truth," delves into the deeper, philosophical aspects of the current misinformation crisis.
During our conversation, Steven shared a story about an older couple who, despite being scammed out of thousands of dollars by Trump University, still supported Donald Trump. Why? They chose to believe alternative explanations for the scam, explanations they read online.
Steven explained how this phenomenon reflects a broader issue where people are increasingly inclined to reject inconvenient facts in favor of comfortable falsehoods.
He recounted another story of a man dying of COVID who, even in his last moments, insisted the disease was a hoax.
These anecdotes underscore the profound challenges we face in restoring trust in objective reality.
Steven argues that the solution lies not only in better regulation and accountability but also in fostering a culture of critical thinking and media literacy.
If you’ve been keeping up with our last couple of blog posts, I think you’ll start to see a pattern.
💡 A couple of Resources for Media Literacy
NewsGuard’s Reality Check Newsletter: How misinformation online is undermining trust—and who’s behind it.
Factually Newsletter: Factually highlights interesting fact-checks, dives into global misinformation trends and features fact-checking and media literacy training opportunities.
Combatting Misinformation: Solutions and Personal Responsibility
"What citizens can do is ask their favorite brands or their employers, where are you advertising? Are you advertising on real journalism or the opposite?"
While tools like NewsGuard play a crucial role in countering misinformation, broader societal measures are needed.
This includes holding internet service providers accountable for carrying non-watermarked AI content and ensuring that social media platforms take responsibility for their content.
As we navigate the complexities of misinformation in the digital age, we must stay vigilant and proactive.
By advocating for greater transparency and accountability in media and advertising, we can contribute to a more informed and responsible online community.
Stay informed, challenge misinformation, and remember to "panic responsibly" as we approach a pivotal period in global democracy.
You can watch our full episode of Click to Trust with Steven Brill here: