Thursday, October 29, 2009

Methinks They Poll Too Much

We are a nation driven by polls. Much of the news every day oozes with the results of polls. Some politicians appear to govern according to polling. Anyone promoting anything repeatedly cites polls that back their position. Experts spend time slicing and dicing polling data, and our news organizations regurgitate the results.

Yes, we are a nation awash in polls. And all too often we give them more credence than they are due.

Once in an undergraduate statistics course as we struggled with the concepts of the chi-square test and the implications of Bayes’ theorem, one exasperated student asked the instructor, “Why do I have to know this? When am I ever going to use it?!”

The professor, who actually did a very good job, explained that most of us would never in our professional careers use the technical skills we were learning in that course. He expected that within a few years (or even months), most of us wouldn't remember the basics of the Student's t-test. However, he said it was important that we be aware of these methods because many of the sources that seek to influence us rely on them. He added that the exercises we were undertaking were also designed to sharpen our reasoning.

Then the professor said something that caught my attention. “If you get nothing else out of this course, I want you to come out with a strong skepticism of any quoted statistic, especially if it comes in sound bite fashion.” He said that you know nothing about a statistic unless you know how the data was gathered, what assumptions were applied, what statistical methods were used, who sponsored the study, and why the study was undertaken.

According to the professor’s definition, most of the polling data to which we are exposed daily amounts to nothing more than meaningless drivel. We don’t really know who was asked what, what conditions applied, what assumptions and statistical methods were used, who was really behind the poll, or why the poll was conducted. While we are frequently given bits of this information, we are almost never given all of it. Besides, most people would tune it out if more than a few details were provided.

Most polling that comes to our attention has been undertaken and/or published to promote a specific agenda. Rarely is that agenda made clear.

Faulty incentives
But this is not the only reason to consider polling results suspect. Perhaps the most important factor is that those being polled have no skin in the game, as it were. They can say whether they prefer answer A, B, C, or D to a question posed by a pollster, and they can even be perfectly honest. But it usually doesn't matter much, because the incentives for answering a pollster differ greatly from the incentives for dealing with the same matter in real life.

I remember an instance years ago when two young women where I worked were discussing the case of a popular celebrity who was not known for being kind to women. He had divorced his wife and then, a short time later, married a woman their own age whom he had just met at a gathering. It was a scandal, but the guy was rich, buff, and popular.

Both of these young women told each other that they would marry the guy too if he were to ask them. The one who was married said that she'd leave her husband and child to do so. Such conversation makes for interesting workplace banter, but it is hardly representative of the choice either of these women would have made had she actually been faced with such an opportunity.

In the case of workplace chatter, one is free to engage in romanticism and all kinds of fantasy without having to consider the realities of a real-life situation. The potential infliction of pain and permanent damage to family relationships doesn't have to enter the equation. Perhaps each of these women would have acted as she suggested had the fantasized opportunity been real. But she would first have had to weigh consequences that cannot be fully considered in an imaginary scenario.

Polls work the same way. Those polled do not have to live with the consequences of their answers, so they do not (cannot) consider the matter with the same cogency they would if the consequences were real. For these reasons, it is wise to ignore most polling reports. They rarely approximate reality.

Where polling is useful
Pollsters are not unaware of the inherent inaccuracy of their art. But that does not mean that all polling is useless. Polling can be valuable in trend analysis, for example. When a similar sampling of the same population is polled on the exact same questions at intervals, the resulting trend data can be useful, even though the actual results of each poll may differ substantially from reality. Thus, repeated, well-controlled polling can provide useful information.
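As a rough illustration of why that works, the little simulation below uses entirely made-up numbers: assume every poll understates the true figure by the same fixed bias, yet the sequence of polls still reveals the downward trend. This is a sketch of the general idea, not anything from an actual pollster's toolkit.

```python
# Sketch: repeated polls with a consistent bias can miss the level but capture the trend.
# All figures here (true approval values, bias, sample size) are hypothetical.
import random

random.seed(42)

true_approval = [0.55, 0.53, 0.50, 0.46, 0.43]  # hypothetical true values over five waves
bias = -0.06           # assume the poll systematically understates approval by 6 points
sample_size = 800      # respondents per wave

for wave, p in enumerate(true_approval, start=1):
    # Each respondent "approves" with the biased probability p + bias.
    approvals = sum(random.random() < p + bias for _ in range(sample_size))
    measured = approvals / sample_size
    print(f"Wave {wave}: true {p:.2f}, poll reports {measured:.2f}")

# Each wave's reported number is off by roughly the bias, but the steady decline
# across waves still shows up clearly in the polled figures.
```

In other words, a single poll's headline number may be well off the mark, but the same flawed instrument applied consistently over time can still tell you which way opinion is moving.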

Polling about political races can more closely approximate actual results, if done with a pool that closely represents actual voters. One of the reasons for this is that the polling question about which candidate one plans to vote for is pretty much the same question that will be faced in the voting booth. But it is still wise to be skeptical when presented with such polling information, because candidates, political parties, and other interested groups sometimes stand to gain by publicizing tailored polling results.

I think that if more Americans understood the principles outlined by my statistics professor years ago, they’d be less likely to be swayed by published polling results. Few polls can hope to closely approximate reality and a lot of published polling data is agenda driven. That’s why my basic posture toward any poll data is one of distrust.

2 comments:

Cameron said...

Hmmm... timely post. Former governor Norm Bangerter was speaking today about raising taxes and told the story of how, when he was governor, all the polling said the public was in favor of raising taxes. He raised taxes, his approval rating promptly dropped from 75% to 41%, and he had a tough reelection two years later.

Unknown said...

The media (among others) loves to give us lots of facts under the pretense that they are actually helping to inform us. Being in possession of facts is not the same as being informed. Polling statistics are simply facts; without all the background information you mentioned from your professor, they do not even begin to approach the status of information.

Polling is a rather delicate art (as you mentioned), which I studied as a graduate student. How a question is asked, and in what order, can have a profound effect on the results. Wording can be very important. For example, Gallup does a lot of polling, and they seem to know what they are doing. They sell what they call the Q12 to businesses, a very refined 12-question poll that helps determine workplace satisfaction. The poll consists of 12 statements with responses on a 5-point Likert scale. Intermountain Healthcare has Gallup administer the poll to their employees yearly to keep a pulse on how people feel about working here.

As we were reviewing our team's results this year, a lot of people were asking about one specific statement in the poll - "I have a best friend at work." Our team asked what that meant, how we were supposed to judge that in answering the question, and why it was important to workplace satisfaction. The answer is that when the statement is worded like that, it is a better predictor of workplace satisfaction (as measured through more concrete means such as employee turnover) than if it is worded in other ways. The only way to learn that is through trial and error. Polls that are commissioned with an agenda are almost guaranteed to have a low degree of accuracy - especially compared to long-established polls that are used repeatedly to gather information for trend analysis, as you said.