Thursday, September 11, 2014

What's Up with Political Polls?

It seems like every day we get another news story about another political poll. One day it's good news, and the next we are depressed by the results of another poll. How do we know which poll or pundit to trust? Have you ever wondered how Mitt Romney felt on election night 2012, after polls had him beating President Obama? Romney and his entire campaign put their trust in polls and came up lacking in a big way.

According to the Pew Research Center, pre-election polling is one of the few cases where pollsters can actually measure the validity of their work: they can compare actual voting results against their polling data. http://www.people-press.org/methodology/election-polling/

Did you know that there is a National Council on Public Polls (NCPP)? Well, I didn't either until I started this quest about polling validity. According to the NCPP, there are 20 questions that journalists should use when looking at public polling data. I think they are actually good for all of us to know. Are you interested in knowing how the NCPP suggests a reporter analyze a poll? If so, please go here for the article: http://www.ncpp.org/?q=node/4 Maybe even copy and paste it to your favorite journalist! Here is a PDF file you can send to them: http://www.ncpp.org/files/20%20Questions%203rd%20edition_Web%20ver_2006.pdf

In short, here are the 20 questions:

  1. Who did the poll?
  2. Who paid for the poll and why was it done?
  3. How many people were interviewed for the survey?
  4. How were those people chosen?
  5. What area (nation, state, or region) or what group (teachers, lawyers, Democratic voters, etc.) were these people chosen from?
  6. Are the results based on the answers of all the people interviewed?
  7. Who should have been interviewed and was not? Or do response rates matter?
  8. When was the poll done?
  9. How were the interviews conducted?
  10. What about polls on the Internet or World Wide Web?
  11. What is the sampling error for the poll results?
  12. Who’s on first?
  13. What other kinds of factors can skew poll results?
  14. What questions were asked?
  15. In what order were the questions asked?
  16. What about "push polls?"
  17. What other polls have been done on this topic? Do they say the same thing? If they are different, why are they different?
  18. What about exit polls?
  19. What else needs to be included in the report of the poll?
  20. So I've asked all the questions. The answers sound good. Should we report the results?

Each of the questions is discussed in depth in the article. I hope you will be better informed and not so quick to panic at the next dire polling report you read or hear about.
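For the curious, question 11's "sampling error" has a simple back-of-the-envelope formula. As a sketch (assuming a simple random sample, which real polls only approximate), the 95% margin of error for a reported percentage is roughly z·√(p(1−p)/n):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate margin of error for a simple random sample.

    n: sample size
    p: observed proportion (0.5 is the worst case, so it's the usual default)
    z: z-score for the confidence level (1.96 corresponds to ~95%)
    """
    return z * math.sqrt(p * (1 - p) / n)

# A typical national poll of about 1,000 respondents:
print(round(margin_of_error(1000) * 100, 1))  # ~3.1 percentage points
```

This is why most national polls report a margin of error around plus or minus 3 points, and why a 2-point "lead" in such a poll may not mean anything at all.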

1 comment:

  1. Good informative article, Pat. Thanks for this information and all of the relevant links. The reality, however, is that most people (including journalists) don't have the time to dissect every poll and ask all of these questions about every pollster. Therefore, no one poll should ever be taken as gospel, and it's good to know what the poll aggregators, the people who look at polls for a living, such as Nate Silver at 538, think. Nate grades the various pollsters according to many of the above standards. Also, give less credence to a poll that you've never heard of before.

    Now, about the 2012 elections: The only people who seemed surprised were Rove and the people at Fox. Those of us who followed the 538 blog knew that Obama had over a 90% chance of winning. I was simply surprised that the election was called so early. Fox and Rove had 2-3 polls they were watching which simply were not accurate; they were polls that traditionally leaned Republican. And they were wrong.

    Nate was right on about the 2014 Senate elections as well, unfortunately. I watched as the chances of the Senate returning to the Republicans went up and up during the months of September and October.

    Polls are important and, as a whole, valid. Though it's important to look at polls carefully, it's also important not to be too cynical or skeptical of their results. I read too many comments such as "I don't believe polls; they don't include cell phone users, they are slanted, the powers that be manipulate them, etc." Some are slanted. Some should be taken with a grain of salt. But, taken as a group, they are important signposts as to how the American electorate is feeling. We ignore or invalidate them at our own peril. For example, we have some polls out there now, a year before the 2016 election, showing that Trump might well beat Clinton or Sanders. I'm not that concerned about those polls now; we're a year out. But I'd be very concerned if polls, several polls, in October 2016 showed the same thing. That would be serious.
