


by Proloy Bhatta
American Reporter Correspondent
Los Angeles, Calif.
May 8, 2008
Campaign 2008
WHY POLLS DON'T WORK



LOS ANGELES -- As polling information is distributed widely in the news, and often proven wrong, Americans have begun to wonder whether the information pollsters produce and they consume is credible. The answer is yes - and no.

As an estimation theorist who makes a living analyzing empirical data for political polls, I see a great disconnect between consumers of polls, the media, and those who conduct the polls. Until these three forces start thinking on the same wavelength, polling will always be questioned.

So let's characterize these three groups by showing you how they think.

Consumers of polls have a classic love-hate relationship with them; they love the polls when they agree, but hate them when they disagree. This is human nature's desire to be right at work. Many believe it is also what causes public opinion to shift after people read polls, as they gravitate toward the more popular opinion.

The biggest flaw here, of course, is that consumers look to the polls whose results most closely match their preferences and cite them at every chance. By not looking at how the data was gathered or how credible the source is, consumers allow misinformation to spread because of their desire to disseminate polling data that reflects their views.

The biggest example is the inaccurate "70 percent of Americans disagree with the war" statement widely distributed in the news. It would be accurate to say that "70 percent of Americans disapprove of Bush's handling of the war," but how they feel about the merits of the war itself is debatable - it is possible to approve of the war and still disapprove of how Bush is handling it. As an aside, putting "Bush's handling of..." in front of almost anything has been known to draw 70 percent disapproval.

Numbers and details do not concern the media outlets as much as the "story." News organizations are sometimes driven to create sensational stories, and only afterward do they look for a poll to complement them.

"Question wording" is one of many factors that has been proven to incite different responses from individuals. For example, individuals are more likely to support government-funded health care when told of how many Americans are uninsured, but far less likely when told of a different scenario, in which a new 10 percent tax hike would go to fund such a program.

Media outlets ignore these factors - but not necessarily to dupe you. They are limited by space constraints, they may not have the time to do the necessary research, and they may not even be qualified to analyze the data with the details they have.

A second avenue through which the media have a clear impact is how they process the results. I will give you an example from a Gallup poll conducted in November 2007.

Question A: "Thinking again about health care in the country as a whole: Are you generally satisfied or dissatisfied with the total cost of health care in this country?" The results: Satisfied 17%, Dissatisfied 81%, Unsure 2%.

Question B: "Are you generally satisfied or dissatisfied with the total cost you pay for your health care?" The results: Satisfied 57%, Dissatisfied 39%, Unsure 4%.

The difference between these two questions is this: the first focuses on the total cost for the whole country, while the second asks about the cost you personally pay.

Just look at how drastically different two potential statements about these answers can be.

When Gallup released its results, it titled them "Majority of Americans Satisfied With Their Health Care Plans," which is certainly one way to look at the data. But liberal bloggers found another story to tell. Taken from a random liberal blog, this is how it read: "A Gallup Poll ... shows that the vast majority (81%) of Americans are dissatisfied with costs of health care".

Both statements are completely accurate but both can be used to support significantly different stories.

What both sides should do is try to dissect why Americans say the country is paying too much for health care, yet feel satisfied with what they personally pay.

Polling 101 says that those who are surveyed are not always honest when talking about their own lives. Individuals may feel embarrassed to tell a pollster "Yes, I am struggling to pay for health care, and it is a burden on me," yet find it easy to say "Health care is definitely burdensome for the American people".

This leads us to how pollsters conduct themselves. They understand human psychology and know how to drum up the kinds of responses they want - specifically, they know when people are most likely to embellish the truth or lie outright.

Pollsters are motivated by their desire to be referenced by the media, so they pick out simplistic headlines for the media to gobble up. The media, in turn, are motivated by their desire to plug these simplistic headlines into the stories they are writing. Consumers of polls are then motivated by their desire to pick out the information that backs them up and makes them look like part of the majority. Because of these motivations, it becomes nearly impossible to sift good data from the hodgepodge of bad.

So you have this vicious cycle of non-transparency in the polling world that trickles down from the professional pollster to the consumer. And unless we start questioning what we are presented with, and start looking into the details of polls, we will be unable to really understand public opinion at a scientific level.

Proloy Bhatta is the owner of USA Election Polls, and has an M.S. in Engineering from UCLA. He has extensive experience in number-crunching at NASA and Raytheon, and is based in Los Angeles.

Copyright 2016 Joe Shea The American Reporter. All Rights Reserved.
