Poll Reliability

Can you trust the poll results in Americans Agree?

Key Points

  • Americans Agree uses polling data to identify issues where Democrats, Republicans, and Independents have common ground.

  • Polls are more reliable in this context because they identify large, shared majorities across parties, not narrow margins between opposing sides.

  • To avoid bias, we exclude advocacy and partisan polls. We only draw from reputable, politically independent polling organizations that follow established best practices.

  • Although polling has many limitations, our focus on broad agreement and high-quality polling makes the results trustworthy indicators of real common ground.

December 18, 2025 • 7 min read
[Image: a poll result with one of the numbers under a magnifier]

Americans Agree is based on polling data. That naturally raises a question: Can the data be trusted?

The short answer is yes—the poll results listed by Americans Agree are reliable indicators of agreement across party lines on particular issues. This is not because all polling is reliable; it is because we carefully curate poll results to avoid the problems many people associate with polls being “wrong,” and we address other concerns that cannot be fully avoided.

Measuring Conflict versus Common Ground 

People who say they don’t trust polls often point to the 2016 presidential election. Many polls suggested a comfortable win for Hillary Clinton, but Donald Trump prevailed. Although the outcome hinged on very narrow margins in a few states, the public takeaway was simple: The polls were “wrong.”

What’s often missed is why polls seem fragile in cases like this. When two sides are in close conflict, small errors in sampling, assumptions about who will turn out to vote, or late opinion shifts can easily flip the apparent result. Measuring close contests is inherently difficult, and small mistakes loom large.

Americans Agree uses polling to measure something fundamentally different. We are not trying to determine which side will win a close contest. We are looking only for issues with broad agreement—where Democrats, Republicans, and Independents all support or oppose a policy by at least 55%. In practice, the average level of agreement is 68%. (This average is calculated using the lowest of the three party percentages for each poll result.)
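To make this concrete, here is a minimal sketch of the inclusion test and the average calculation. The poll data, issue names, and field names are hypothetical, invented for illustration rather than drawn from our actual listings.

```python
# Hypothetical poll results: percent support within each party.
polls = [
    {"issue": "Issue A", "dem": 85, "rep": 70, "ind": 78},
    {"issue": "Issue B", "dem": 60, "rep": 82, "ind": 74},
    {"issue": "Issue C", "dem": 52, "rep": 88, "ind": 61},  # excluded: Dem support < 55
]

THRESHOLD = 55

# A result qualifies only if every party clears the threshold.
qualifying = [p for p in polls if min(p["dem"], p["rep"], p["ind"]) >= THRESHOLD]

# The average level of agreement uses the lowest of the three party
# percentages for each qualifying result.
floors = [min(p["dem"], p["rep"], p["ind"]) for p in qualifying]
print(sum(floors) / len(floors))  # -> 65.0 for this toy data
```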

This distinction between measuring conflict and common ground matters. Closely contested political campaigns are difficult to “call” with polls. In contrast, polls are more reliable when identifying large, shared majorities, where modest errors typically don’t change the conclusion. Americans Agree focuses exclusively on the latter, more reliable case. 

Advocacy versus Independent Polling

Another source of distrust about polls is advocacy polling: surveys commissioned by organizations seeking to promote a particular policy outcome. For example, consider this headline: “Poll shows 80 percent of older voters concerned prescription drug reform will hurt drug innovation.” The poll behind it was sponsored by American Commitment, a group “dedicated to restoring and protecting the American Commitment to free markets, economic growth, Constitutionally-limited government, property rights, and individual freedom.” The pollster was McLaughlin & Associates, which was, at that time, the pollster for Donald Trump’s presidential campaign.

Regarding the headline’s “80%” number, here is the question from the poll:

Critics of the [Inflation Reduction Act] plan have warned that placing the government in charge of negotiating prices for medicines in Medicare by using the threat of a 95% tax on drugmakers could lead to fewer lifesaving drugs being made available to patients. How concerned are you that the [Inflation Reduction Act] plan could lead to patient access restrictions to newer cutting-edge medicines?

This question presents only the critics’ message. As a result, the poll measures reaction to an advocacy framing rather than respondents’ uncued views about the policy. 

Americans Agree avoids this kind of situation by excluding polls sponsored by advocacy organizations or other partisan players. We instead focus on results from reputable, politically independent polling organizations like Gallup, Pew Research, and YouGov, as well as academic and media organizations whose polls meet strict standards. The most rigorous pollsters not only follow best practices in the field, but also periodically check their techniques against other, relatively authoritative sources.

The 55% Threshold

For a poll result to qualify for Americans Agree, it must have at least 55% of Democrats, Republicans, and (if polled) Independents aligned. We set the threshold at 55% rather than 50% because polls have margins of sampling error, more so when comparing subgroups like Republicans, Democrats, and Independents. The extra five percentage points above a bare majority of 50% helps avoid the margin-of-error zone.
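For a rough sense of scale, the standard margin-of-error formula for a proportion shows why partisan subgroups are noisier than the full sample. This is a back-of-the-envelope sketch; the sample sizes are illustrative assumptions, not figures from any particular poll.

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Approximate 95% margin of sampling error for a proportion p from n respondents."""
    return z * math.sqrt(p * (1 - p) / n)

# A national poll of 1,000 splits into smaller partisan subgroups,
# so each subgroup's margin is wider than the headline margin.
print(round(margin_of_error(0.55, 1000) * 100, 1))  # ~3.1 points, full sample
print(round(margin_of_error(0.55, 300) * 100, 1))   # ~5.6 points, one subgroup
```

With subgroup margins approaching six points, a bare 50% reading could plausibly reflect a true minority; requiring 55% keeps most qualifying results clear of that zone.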

Policy versus Attitude Questions

Americans Agree focuses on questions about policies—about what should be done—not on general attitudes or beliefs. For example, “Should there be stricter limits on how much an individual can spend in support of a political candidate or campaign?” asks about a specific policy choice. By contrast, “Do you think the rich have too much political power in the U.S.?” is a broader attitudinal question.

Attitudinal questions can reveal important values and concerns, but agreement on them can mask differences about solutions. That is, many people may agree that a problem exists while disagreeing sharply about what, if anything, should be done about it. For Americans Agree, we are interested in points of agreement that are directly connected to policies and actions, not just shared sentiment.

Other Concerns

By measuring common ground rather than conflict, and by excluding advocacy polling, Americans Agree avoids much of what people distrust in polling. And by focusing on policy questions, we ensure that answers are tied to actions. But there are still other valid concerns. Below are a few we’ve been asked about, with our perspective.

Public Ignorance

Research in political science has repeatedly shown that most people are poorly informed about public affairs. This raises a question: If most people are relatively ignorant about an issue, are their responses meaningful?

If the goal is to reflect public opinion as it actually exists, then the answer is yes. Ignorance is part of it. The same can be said of elections: Voters routinely decide ballot initiatives with incomplete information, yet those votes still shape law and policy. That is how it’s always been.

But if there’s a lot of ignorance, why don’t elections and polls just have random results? It turns out that when you add up many people’s opinions—in a poll or in an election—individual ignorance tends to cancel out because it is not systematically concentrated on one side. Put another way, ignorance is mostly randomly distributed, whereas purposeful opinions form non-random patterns. The random responses offset one another, leaving those patterns visible. This does not happen perfectly, but it happens enough to produce coherent results.
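A toy simulation illustrates the cancellation. The share of informed respondents and their level of support below are invented for illustration only.

```python
import random

def simulate_poll(n=1000, informed_share=0.4, informed_support=0.7, trials=200):
    """Informed respondents support the policy at 70%; everyone else answers at random."""
    results = []
    for _ in range(trials):
        support = sum(
            random.random() < (informed_support if random.random() < informed_share else 0.5)
            for _ in range(n)
        )
        results.append(support / n)
    return sum(results) / trials

# Expected headline number: 0.4 * 0.7 + 0.6 * 0.5 = 0.58.
# The coin-flip answers cancel out, leaving a stable signal above 50%
# even though most respondents effectively guessed.
print(round(simulate_poll(), 3))
```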

A simple check is available in the Americans Agree poll results. Policy concepts with conservative or Republican origins almost always show the highest support among Republicans; Democratic support is almost always lower (but at or above 55%), and Independents usually fall in between. The reverse is true of liberal/Democrat-originated policy concepts. Such consistent partisan patterns would not appear if responses were dominated by random noise or pervasive confusion. Their presence suggests that even with imperfect information, polls capture meaningful differences in how groups evaluate policy choices.

Question Framing, Wording, and Structure

Even when pollsters are trying to be neutral, it is easy for framing and wording choices to affect poll results. In an example going back to 1941 and since replicated many times, people were asked either:

  • Do you think that the United States should allow public speeches against democracy?

  • Do you think that the United States should forbid public speeches against democracy?

The only difference in wording was “allow” versus “forbid,” but the difference in equivalent responses was 16 points: 62% said no when asked whether such speeches should be allowed, but only 46% said yes when asked whether they should be forbidden.

This does not mean that every small wording change will produce a large effect, only that it’s possible. Also, wording effects are less likely to matter the further cross-party support is above 55%. For example, a 16-point difference at 76% versus 60% wouldn’t affect inclusion in the Americans Agree poll list because the lower number still clears the 55% threshold.

Other than rare situations where we believe an independent pollster is asking an obviously biasing question, we tend to be permissive about variations in framing, wording, and structure, as long as the concept is the same. We accept different approaches, knowing they will often have different numerical results but seldom in a way that affects inclusion. We take this approach because trying to judge which questions are “best” would introduce yet more problems.

Bad News, Good News

It’s likely apparent by now that polling brings with it many limitations—those we mentioned and others. That’s the bad news.

The good news is that, for the reasons reviewed above, the poll results listed by Americans Agree are far more reliable than a random poll result from the internet. So while Americans Agree is not perfect, you can be reasonably confident that when we show agreement across party lines on an issue, it reflects a meaningful convergence of public opinion.
