According to the Pew Research Center, pre-election polling is one of the few ways pollsters can actually measure the validity of their work: they can compare actual voting results with their polling data. http://www.people-press.org/methodology/election-polling/
Did you know that there is a National Council on Public Polls (NCPP)? Well, I didn't either until I started this quest about polling validity. According to the NCPP, there are 20 questions journalists should ask when looking at public polling data. I think they are good for all of us to know. Are you interested in knowing how the NCPP suggests a reporter analyze a poll? If so, the article is here: http://www.ncpp.org/?q=node/4 Maybe even copy and paste it to your favorite journalist! Here is a PDF file you can send to them: http://www.ncpp.org/files/20%20Questions%203rd%20edition_Web%20ver_2006.pdf
In short, here are the 20 questions:
- Who did the poll?
- Who paid for the poll and why was it done?
- How many people were interviewed for the survey?
- How were those people chosen?
- What area (nation, state, or region) or what group (teachers, lawyers, Democratic voters, etc.) were these people chosen from?
- Are the results based on the answers of all the people interviewed?
- Who should have been interviewed and was not? Or do response rates matter?
- When was the poll done?
- How were the interviews conducted?
- What about polls on the Internet or World Wide Web?
- What is the sampling error for the poll results? (a worked example follows this list)
- Who’s on first?
- What other kinds of factors can skew poll results?
- What questions were asked?
- In what order were the questions asked?
- What about "push polls?"
- What other polls have been done on this topic? Do they say the same thing? If they are different, why are they different?
- What about exit polls?
- What else needs to be included in the report of the poll?
- So I've asked all the questions. The answers sound good. Should we report the results?
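Since the sampling-error question comes up in nearly every poll story, here is a minimal sketch (my own, not NCPP's) of the textbook margin-of-error formula for a simple random sample. Keep in mind that real polls use weighting and more complicated sample designs, so their true error is usually somewhat larger than this formula suggests.

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Approximate margin of error for a simple random sample.

    p: observed proportion (e.g. 0.52 for 52% support)
    n: number of respondents
    z: z-score for the confidence level (1.96 is roughly 95%)
    """
    return z * math.sqrt(p * (1 - p) / n)

# A poll of 1,000 people showing 52% support carries roughly
# a +/- 3.1 point margin of error at 95% confidence.
print(f"{margin_of_error(0.52, 1000) * 100:.1f} points")
```

This is why a one- or two-point "lead" in a single 1,000-person poll is, statistically speaking, a tie.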
For a quick one-minute video explanation, go here: http://www.takepart.com/video/how-political-polls-work-civics-minute-video
Good, informative article, Pat. Thanks for this information and all of the relevant links. The reality, however, is that most people (including journalists) don't have the time to dissect every poll and ask all of these questions about every pollster. Therefore, no one poll should ever be taken as gospel, and it's good to know what the poll aggregators, the people who look at polls for a living, such as Nate Silver at 538, think. Nate grades the various pollsters according to many of the above standards. Also, give less credence to polls from outfits you've never heard of before.
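To show what aggregation buys you, here is a toy sketch of a weighted polling average. The polls, sample sizes, and grade weights below are all invented for illustration; 538's actual model is far more elaborate (house effects, trend adjustment, and so on), but the core idea of down-weighting lower-rated pollsters is the same.

```python
# Toy poll-averaging sketch. All numbers here are made up for
# illustration; a real aggregator grades pollsters on track record
# and methodology before assigning weights.
polls = [
    # (candidate_share, sample_size, pollster_grade_weight)
    (0.48, 1200, 1.0),   # well-rated pollster, full weight
    (0.52,  600, 0.5),   # lesser-known pollster, down-weighted
    (0.50,  900, 0.8),
]

weighted_total = sum(share * n * grade for share, n, grade in polls)
total_weight = sum(n * grade for _, n, grade in polls)
average = weighted_total / total_weight

print(f"Weighted polling average: {average:.1%}")  # 49.2%
```

The point is that one outlier poll moves the average only a little; it takes several independent polls agreeing to shift the aggregate.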
Now, about the 2012 elections: the only people who seemed surprised were Rove and the people at Fox. Those of us who followed the 538 blog knew that Obama had over a 90% chance of winning. I was simply surprised that the election was called so early. Fox and Rove were watching two or three polls that simply were not accurate; they were polls that traditionally leaned Republican, and they were wrong.
Nate was right on about the 2014 Senate elections as well, unfortunately. I watched as the chances of the Senate returning to the Republicans went up and up during the months of September and October.
Polls are important and, as a whole, valid. Though it's important to look at polls carefully, it's also important not to be too cynical or skeptical of their results. I read too many comments such as "I don't believe polls; they don't include cell phone users, they are slanted, the powers that be manipulate them, etc. etc." Some are slanted. Some should be taken with a grain of salt. But, taken as a group, they are important signposts as to how the American electorate is feeling. We ignore or invalidate them at our own peril. For example, we have some polls out there now, a year before the 2016 election, showing that Trump might well beat Clinton or Sanders. I'm not that concerned about those polls now; we're a year out. But I'd be very concerned if several polls in October 2016 showed the same thing. That would be serious.