Public Understanding of Misinformation


Jenny Benz, NORC at the University of Chicago

As we head into an election season full of unknowns, there is one gamble I am willing to take: Misinformation will be a central character in the 2024 election story. Any number of indicators suggest that’s a safe bet.

Research on the volume of misinformation spread online reveals that it peaks during election years. Many measures of news consumption show that about half of adults in the U.S. regularly get their news from social media, even though they lack confidence in the information and think social media is bad for our democracy. Meanwhile, some social media companies are reducing their efforts to combat the spread of misinformation in 2024.

Layer on the evidence that generative AI is already being used to mislead voters, and the conditions for misinformation becoming a lead player in the 2024 election story are all present. Misinformation will likely be a factor in understanding people’s knowledge about the election, their views on the candidates, opinions on the issues, and their fundamental trust in the democratic process.

So, what does the public understand about misinformation? The Associated Press-NORC Center for Public Affairs Research has been collaborating with scholars and journalism organizations to address that question. Here’s what we’ve learned:

Awareness of the problem is high.

A large majority of adults in the U.S. think misinformation is a problem, and there is concern about misinformation across the political spectrum, age groups, and race and ethnicity. Based on a national survey from AP-NORC and the Pearson Institute at the University of Chicago conducted just before the 2022 midterm election, 74% of adults considered misinformation a major problem for getting news and information about current events and important issues. Another 16% considered it a minor problem.

That survey also found a high level of public awareness about the impact of misinformation. About half of adults recognized that misinformation can fire people up and increase political engagement. Even more felt that it could increase extreme political views and reduce trust in government.

The impact of misinformation is viewed similarly by Democrats and Republicans. Democrats are slightly more likely than Republicans to believe misinformation increases extreme political views, but there’s still a solid majority in both parties.

Similarly, a 2023 AP-NORC survey conducted with the Harris School of Public Policy found that 58% of U.S. adults expected AI to increase the spread of misinformation during the 2024 presidential election. And while few adults reported personal experience using AI, those who did use it were even more likely to be concerned about the risks.

People recognize they are susceptible.

Most people are aware that they encounter misinformation online, and many admit they may have shared misinformation, whether purposefully or unwittingly. In a 2023 AP-NORC survey conducted with Robert F. Kennedy Human Rights, a third of adults reported encountering stories containing false claims from politicians (32%) and misleading headlines (31%) on a daily basis. Nineteen percent reported encountering conspiracy theories in stories daily.

In the AP-NORC/Pearson poll leading into the midterm elections, 71% were concerned that they had been exposed to misinformation. And 43% acknowledged they may have spread misinformation themselves, even if it was unintentional. Again, these concerns about exposure to and spread of misinformation were similar across demographic and political groups.

There is agreement that action is needed.

Many feel that individuals have a role to play in curbing misinformation. Across studies, people are just as likely to blame social media users for spreading misinformation as they are to blame social media companies, politicians, and the press. And people do report taking personal actions to prevent the spread of misinformation – only sharing if they are confident in the information, verifying with multiple sources or fact-checking websites, or paying close attention to the source of the information.

Although it is a positive sign that people know what they should do to combat misinformation, research from the Media Insight Project shows they don’t always do it. For example, an experimental study showed that how much people trust information from a news story they encounter on social media depends more on how much they trust the person sharing the content than on the source of the news article. People who saw a news item from a made-up news source, shared by someone they trusted, were more likely to have confidence in the information and engage with the content than people who saw the same article from a reputable source, shared by someone they did not trust.

The public also places responsibility for stopping the spread of misinformation on companies and the government. Majorities feel that social media companies, news media, politicians, and the government have a responsibility for addressing the spread of misinformation. Again, this is an issue where there is bipartisan support. For example, at least half of both Democrats and Republicans were in favor of eight different proposed actions to control the use of AI in the 2024 election.

A role for AAPOR

The public opinion industry can help prevent the spread of misinformation. High-quality polling that delves deeply into people’s news behaviors, understanding of issues, and perceptions of candidates can provide timely signals of how misinformation is penetrating the election narrative before it is too late. And a collective focus on responsible and transparent measurement and reporting of our work is necessary, albeit insufficient, to prevent public opinion and election polling from being misused to fuel misinformation narratives.