Wednesday, August 14, 2019

Pigeonholing People

Political pundits get into trouble when they insist on pigeonholing voters, yet they persist in lumping people together. Here’s an example: according to the pundits, an over-70 rural white voter is most likely conservative, feels threatened by people of color and immigrants, feels “economic anxiety,” and is most likely a Trump supporter. Isn’t that how many in that demographic are characterized? Yet many of my friends and I are over-70 rural white voters, and we voted for Hillary Clinton.

Another example: a 40-year-old white male factory worker without a college degree. Again, the pundits pigeonhole this person as most likely threatened by immigrants coming to take his job and as most likely a Trump supporter. But is it true? I know several men who fit this description, and they are avid Bernie Sanders supporters.

The point is that political pundits err when they insist on finding simplistic explanations for complex questions, such as why certain candidates are popular. For a more nuanced explanation than I can give, see

Sunday, January 27, 2019

Political Polling

According to the National Council on Public Polls (NCPP), there are 20 questions journalists should ask when evaluating public polling data. I think they are good for all of us to know. Are you interested in how the NCPP suggests a reporter analyze a poll? If so, please go here for the article: http://www.ncpp.org/?q=node/4 Maybe even copy and paste it to your favorite journalist! Here is a PDF file you can send them: http://www.ncpp.org/files/20%20Questions%203rd%20edition_Web%20ver_2006.pdf

In short, here are the 20 questions:

  1. Who did the poll?
  2. Who paid for the poll and why was it done?
  3. How many people were interviewed for the survey?
  4. How were those people chosen?
  5. What area (nation, state, or region) or what group (teachers, lawyers, Democratic voters, etc.) were these people chosen from?
  6. Are the results based on the answers of all the people interviewed?
  7. Who should have been interviewed and was not? Or do response rates matter?
  8. When was the poll done?
  9. How were the interviews conducted?
  10. What about polls on the Internet or World Wide Web?
  11. What is the sampling error for the poll results?
  12. Who’s on first?
  13. What other kinds of factors can skew poll results?
  14. What questions were asked?
  15. In what order were the questions asked?
  16. What about "push polls"?
  17. What other polls have been done on this topic? Do they say the same thing? If they are different, why are they different?
  18. What about exit polls?
  19. What else needs to be included in the report of the poll?
  20. So I've asked all the questions. The answers sound good. Should we report the results?

Each of the questions is discussed in depth in the article. I hope you will be better informed and not so quick to panic at the next dire polling report you read or hear about.
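
To put question 11 in concrete terms: for a simple random sample, the 95 percent margin of error on a reported percentage can be approximated with a standard formula. Here is a minimal sketch in Python, assuming a simple random sample and ignoring weighting and design effects (real polls rarely meet those assumptions exactly):

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Approximate margin of error for a proportion p (e.g. 0.52 for 52%)
    from a simple random sample of n respondents; z = 1.96 corresponds
    to a 95% confidence level."""
    return z * math.sqrt(p * (1 - p) / n)

# Example: a poll of 1,000 respondents shows a candidate at 52%.
p, n = 0.52, 1000
moe = margin_of_error(p, n)
print(f"{p:.0%} +/- {moe * 100:.1f} points")  # roughly 52% +/- 3.1 points
```

So a two-point “lead” in a poll of 1,000 people is inside the noise, which is exactly why question 11 is worth asking before panicking over a headline.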