How Americans Navigated the News in 2020: A Tumultuous Year in Review (Feb. 22, 2021)
Americans are divided – that much is obvious after a contentious presidential election and transition, and in the midst of a politicized pandemic that has prompted a wide range of reactions.
But in addition to the familiar fault line of political partisanship, a look back at Pew Research Center’s American News Pathways project finds there have consistently been dramatic divides between different groups of Americans based on where people get their information about what is going on in the world.
Combating Misinformation In Under-Resourced Languages (Dec. 17, 2020)
The languages we speak greatly determine our access to reliable information and fact checks that debunk half-truths or false claims. A recent webinar by First Draft and Global Voices investigates how linguistic minorities can overcome this information inequality.
Gen Z Is Eroding The Power of Misinformation (Sept. 15, 2020)
Gen Z may be more immune to the lure of misinformation because younger people apply more context, nuance and skepticism to their online information consumption, experts and new polling suggest.
Americans Who Mainly Get Their News on Social Media Are Less Engaged, Less Knowledgeable (July 30, 2020)
A new Pew Research Center analysis of surveys conducted between October 2019 and June 2020 finds that those who rely most on social media for political news stand apart from other news consumers in a number of ways. These U.S. adults, for instance, tend to be less likely than other news consumers to closely follow major news stories, such as the coronavirus outbreak and the 2020 presidential election. And, perhaps tied to that, this group also tends to be less knowledgeable about these topics.
'Fake News' Increases Consumer Demands for Corporate Action (April 7, 2020)
In today’s polarized society, fake news has significantly eroded people’s trust in online news. Informed by the third-person effect (TPE) and influence of presumed influence (IPI) theories, this study examined a theoretical model to understand the antecedents and consequences of the presumed effects of fake news on others (PFNE3). Data were collected from 661 respondents through survey research based on fake news about a company shared on Facebook. Results showed significant effects of self-efficacy, social undesirability, and consumer involvement on PFNE3. Furthermore, PFNE3 positively predicted public support for corporate corrective action, media literacy intervention, and governmental regulation. Findings demonstrated the mediating role of PFNE3 in the model. Theoretical and practical implications are discussed.
On Twitter, False News Travels Faster Than True Stories (March 8, 2020)
Research project finds humans, not bots, are primarily responsible for spread of misleading information.
The study provides a variety of ways of quantifying this phenomenon: For instance, false news stories are 70 percent more likely to be retweeted than true stories are. It also takes true stories about six times as long to reach 1,500 people as it does for false stories to reach the same number of people. When it comes to Twitter’s “cascades,” or unbroken retweet chains, falsehoods reach a cascade depth of 10 about 20 times faster than facts. And falsehoods are retweeted by unique users more broadly than true statements at every depth of cascade.
No, We’re Not Living in a Post-Fact World (Jan. 7, 2020)
Evidence we’ve gathered over the past four years—involving more than 10,000 participants and spanning from the 2016 election to well into the Trump presidency—illustrates that the most pessimistic accounts of the decline of facts are, well, not entirely factual. We found that when presented with factually accurate information, Americans—liberals, conservatives and everyone in between—generally respond by becoming more accurate.
Americans Trust Local News. That Belief Is Being Exploited. (Nov. 4, 2019)
Pew Research Center - Journalism & Media (June 17, 2019)
Many Americans Say Made-Up News Is a Critical Problem That Needs To Be Fixed
Indeed, more Americans view made-up news as a very big problem for the country than identify terrorism, illegal immigration, racism and sexism that way. Additionally, nearly seven-in-ten U.S. adults (68%) say made-up news and information greatly impacts Americans’ confidence in government institutions, and roughly half (54%) say it is having a major impact on our confidence in each other.
Industrialized Disinformation: 2020 Global Inventory of Organized Social Media Manipulation (Jan. 13, 2021)
AI Can Predict Twitter Users Likely to Spread Disinformation Before They Do It (Dec. 14, 2020)
COVID-19 Related Infodemic and Its Impact on Public Health: A Global Social Media Analysis (Aug. 11, 2020)
Coronavirus misinformation has been circulating across at least 87 countries in 25 languages, a new study published in the American Journal of Tropical Medicine and Hygiene found. Most of the misinformation was from India, the United States, China, Spain, Indonesia and Brazil. Splitting the 2,311 reports analyzed into “rumors,” “stigmas” and “conspiracy theories,” researchers concluded that rumors made up the vast majority of Covid-19 misinformation. "Rumors can mask themselves as credible infection prevention and control strategies and have potentially serious implications if prioritized over evidence-based guidelines," they wrote. The dangers of coronavirus-related misinformation are well-documented. Several people in the United States were poisoned this year after rumors that consuming household cleaning agents could kill the virus. Coronavirus stigma in India has resulted in attacks against healthcare workers.
Exposure to Social Engagement Metrics Increases Vulnerability to Misinformation (July 28, 2020)
News feeds in virtually all social media platforms include engagement metrics, such as the number of times each post is liked and shared. We find that exposure to these signals increases the vulnerability of users to low-credibility information in a simulated social media feed. This finding has important implications for the design of social media interactions in the post-truth age. To reduce the spread of misinformation, we call for technology platforms to rethink the display of social engagement metrics. Further research is needed to investigate how engagement metrics can be presented without amplifying the spread of low-credibility information.
Analytic-Thinking Predicts Hoax Beliefs and Helping Behaviors in Response to the COVID-19 Pandemic (March 30, 2020)
The coronavirus (COVID-19) outbreak was labeled a global pandemic by the WHO in March of 2020. During that same month, the number of confirmed cases and the death rate grew exponentially in the United States, creating a serious public-health emergency. Unfortunately, many Americans dismissed the pandemic as a hoax and failed to properly engage in helpful behaviors like social-distancing and increased hand-washing. Here, we examine a disposition—willingness to engage in analytic-thinking—that might predict beliefs that the pandemic is a hoax and failures to change behavior in positive ways. Our results indicate that individuals less willing to engage in effortful, deliberative, and reflective cognitive processes were more likely to believe the pandemic was a hoax, and less likely to have recently engaged in social-distancing and hand-washing.
Fake News Shared by Very Few, But Those Over 65 More Likely to Pass on Such Stories, New Study Finds (Jan. 7, 2020)
A small percentage of Americans, less than 9 percent, shared links to so-called “fake news” sites on Facebook during the 2016 presidential election campaign, but this behavior was disproportionately common among people over the age of 65, according to a new analysis by researchers at Princeton University and New York University’s Social Media and Political Participation (SMaPP) Lab.
AP-NORC/USAFacts poll: Americans struggle to ID true facts (Nov. 14, 2019)
A new poll from The Associated Press-NORC Center for Public Affairs Research and USAFacts finds that regardless of political belief, many Americans say they have a hard time figuring out if information is true. Nearly two-thirds of Americans say they often come across one-sided information and about 6 in 10 say they regularly see conflicting reports about the same set of facts from different sources.
The Global Disinformation Order: 2019 Global Inventory of Organised Social Media Manipulation (Oct. 8, 2019)
‘Cyber troops’ are defined as government or political party actors tasked with manipulating public opinion online (Bradshaw and Howard 2017a). We comparatively examine the formal organization of cyber troops around the world, and how these actors use computational propaganda for political purposes. This involves building an inventory of the evolving strategies, tools, and techniques of computational propaganda, including the use of ‘political bots’ to amplify hate speech or other forms of manipulated content, the illegal harvesting of data or micro-targeting, and the deployment of armies of ‘trolls’ to bully or harass political dissidents or journalists online. We also track the capacity and resources invested in developing these techniques to build a picture of cyber troop capabilities around the world.