Do 1 in 5 adults under 30 really believe the Holocaust didn't happen? Not so fast. How some online polling practices are fueling misinformation.

Pew Research Center found that adults under 30 were most likely to be "bogus respondents" to online polls and surveys.

Looking to make a little extra cash? Searching online for ways to make money could lead you to one of several survey sites that promise financial rewards for simply answering a series of questions.

But if you’re planning to race through the polls without giving your responses a second thought, you may be unintentionally inspiring an inaccurate headline or news story — a concern that’s especially relevant during an election year.

A recent investigation by the Pew Research Center analyzed the results of “opt-in” online polling, which has become an increasingly popular method for publications and companies to survey the general public.

The research concluded that these surveys can produce large discrepancies and errors that misrepresent the general public's opinion because of "bogus respondents" who don't answer the questions sincerely. Pew also found that these inaccurate responses come disproportionately from users who identify as adults under 30.

Opt-in polling describes online surveys that users actively choose to participate in rather than ones for which researchers pull a random sample of the population to poll. Some polling sites may offer small rewards, which is what Pew argues is the incentive for people to lie while answering — that is, the quicker they move through the questions, the faster they’ll get their reward.

Pew’s investigation found that across several opt-in sample surveys that had “yes or no” questions on topics like smoking, hypertension, Social Security and workers’ compensation, opt-in adult respondents under 30 were more likely to answer “yes” to the majority of questions, “claiming combinations of characteristics that are virtually nonexistent in reality.”

However, bogus respondents have done more than sway polling results. In some cases, they’ve influenced news stories that overestimate opinions and behaviors, especially within the Gen Z demographic.

The Economist used the results of an opt-in poll for its December 2023 story alleging that 1 in 5 young Americans didn't believe the Holocaust happened. Reportedly, 20% of survey takers who identified as adults under 30 agreed with the statement, "The Holocaust is a myth."

The Economist did not respond to Yahoo News’s request for comment.

Screenshot of Dec. 7, 2023 headline. (via The Economist)

Reexamining sensational headlines inspired by poll results

Pew challenged that conclusion and posed the same question to a survey panel, a group of people who have agreed to fill out polls sent by mail, to see whether the result would hold. On the panel, only 3% of respondents who said they were under 30 agreed with the statement, and that small percentage was consistent across other age groups as well.

“Had this been the original result, it is unlikely that it would have generated the same kind of media attention,” the Pew researchers suggested. The researchers clarified that the point isn’t to prove that Holocaust denial or antisemitism doesn’t exist in the U.S., but that “reporting on complex and sensitive matters such as these requires the use of rigorous survey methods to avoid inadvertently misleading the public.”

Another opt-in poll suggested that 48% of young adults believed abortion should be illegal or legal only in special circumstances. Pew's own 2023 survey, by contrast, found that 26% of young adults felt abortion should be illegal, a figure that aligned with the AP-NORC survey's finding of 27%.

Pew also revisited a 2022 survey — which researchers called an “experiment” — that had 12% of opt-in adult respondents under 30 claiming that they were licensed to operate a specific nuclear submarine. In reality, Pew argued, the percentage of Americans who have this particular license rounds to 0%.
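The arithmetic behind that distortion is straightforward: when a trait is essentially nonexistent, even a modest share of respondents who click "yes" to everything can fully account for the reported percentage. The sketch below is an illustrative simulation, not Pew's data; the respondent count and the 12% bogus rate are assumptions chosen to mirror the submarine example.

```python
import random

random.seed(0)

# Illustrative simulation (not Pew's data): suppose the true share of
# respondents licensed to operate a nuclear submarine is ~0%, but some
# fraction of opt-in respondents answer "yes" to everything to finish faster.
N = 10_000            # simulated opt-in respondents under 30 (assumed)
bogus_rate = 0.12     # assumed share of always-"yes" respondents

yes_count = 0
for _ in range(N):
    is_bogus = random.random() < bogus_rate
    # A sincere respondent essentially never holds this license;
    # a bogus respondent says "yes" regardless of the question.
    if is_bogus:
        yes_count += 1

print(f"Estimated prevalence: {yes_count / N:.1%}")
```

Under these assumptions, the survey "finds" a prevalence close to the bogus rate itself, even though the true figure rounds to zero, which is exactly the pattern Pew describes.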

Why do publications use online opt-in polls in the first place?

There are several reasons why opt-in polling has become such a popular option, a major one being that the number of pollsters who rely exclusively on cold calling has declined since 2012. Another Pew analysis found that the surge in automated telemarketing calls, roughly 3.4 billion per month "in recent years," is likely responsible for the public's reluctance to answer calls from unknown or unfamiliar numbers.

It’s also a lot cheaper to conduct online opt-in surveys compared to other options.

Matthew Dardet, the lab manager and survey methodologist at Harvard’s Digital Lab for the Social Sciences (DLABSS), told Yahoo News that “opt-in online surveys of nonprobability samples are much cheaper and quicker for organizations to administer than their probability sampling-based counterparts.” The difference is that with probability sampling, every member of the target population has a known, equal chance of being selected; with opt-in surveys, respondents select themselves.
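The contrast between the two approaches can be made concrete with a toy example. This is a minimal sketch under assumed numbers (a hypothetical population of 1,000 people, 5% of whom would volunteer); none of it comes from the DLABSS or Pew studies.

```python
import random

random.seed(42)

# Hypothetical population; "motivated" marks people who would volunteer
# for an opt-in survey (the 5% rate is an assumption for illustration).
population = [{"id": i, "motivated": random.random() < 0.05}
              for i in range(1000)]

# Probability sample: the researcher draws respondents uniformly at
# random, so every person has the same chance of being selected.
prob_sample = random.sample(population, 100)

# Opt-in "sample": only the self-selected volunteers show up, so the
# sample's makeup depends entirely on who chooses to participate.
opt_in_sample = [p for p in population if p["motivated"]]

print(len(prob_sample))    # always 100, drawn at random
print(len(opt_in_sample))  # only the volunteers, however many there are
```

The design choice matters because a probability sample's composition is controlled by the researcher, while an opt-in sample's composition is controlled by whoever decides to answer, which is what lets bogus respondents pile up.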

“For news publications and companies with limited budgets, nonprobability online samples are often the most cost-effective option in the short term,” he said. “In the long run, the utility of these surveys is questionable.”

DLABSS was founded in 2014 and is a volunteer online laboratory for experiments and surveys. Researchers administer nonincentivized polls to tens of thousands of volunteers who are willing to take the time to answer questions. (While DLABSS has between 13,000 and 14,000 people on the listserv, Dardet notes around 1,000 to 1,200 are likely to respond to the weekly surveys.)

“Compared to results from incentivized, opt-in online surveys that I have conducted with various nationally representative and highly regarded survey firms, our DLABSS surveys receive lower rates of these sorts of bogus responses or suspicious responses from bots,” Dardet added.

What can polling companies do to get more accurate results?

Even Dardet admits that there are pros and cons to using a system like DLABSS for polling results.

“People who take our surveys may be more politically knowledgeable, have more free time or have distinct personalities that make them more willing to expend the cognitive effort required to respond accurately to survey questions,” he said. “So I would be cautious about declaring nonincentivized volunteer panels to be the panacea for the bogus responding issues.”

According to Dardet, the way to pull the highest quality of data is to conduct surveys face-to-face. But he said that can cost hundreds of thousands or even millions of dollars.

“The only way to fully check for these types of extremely low-quality responses is to manually scrutinize a survey’s data, particularly data from open-ended questions, which can be time-intensive and costly,” he said. Dardet also named Pew’s American Trends Panel as an example of a way to get higher-quality responses.

“It’s possible that, with recent and future advances in artificial intelligence and textual data analysis, companies may be able to create more sophisticated screening techniques,” he said. “But this technology has yet to be developed and implemented.”