I hate surveys.
I hate them because you cannot trust them.
Why can you not trust them? Because generally you only see the “results” through the lens of the medium reporting them. Be it Fox News or CNN or the Obama Administration or (I’m not trying to pick on politics here) Habitat for Humanity or Greenpeace or whoever, they all have their own axe to grind, so rarely will you be able to review the survey questions or the actual results.
Peer-reviewed work is of slightly higher caliber, but even in peer-reviewed articles it depends on the quality of the reviewers. Do these people have any education in the ethics and preparation of survey questions? Sometimes that answer is a loud, “Hell No!”
This post was prompted by two things, one lesser and one greater. First the lesser:
I had seen a blog post that was a link to a link to a link to a link that finally led me to this “study” by retrevo.com, which produced some (in my opinion) questionable statistics about iPhone users (in defense of Retrevo, I’m sure they ran their survey to generate content and buzz, not out of any deep-seated need to academically refine their audience database). As I mentioned in my tweet about it this morning, I couldn’t find anything blog-worthy in it and so decided to just let it be.
That is, until I read the latest posting on fivethirtyeight.com about a polling research firm apparently fabricating survey results showing poor performance by Oklahoma students. This would be the greater thing.
As an aside, I tend to read items that originate in Oklahoma because over half of my direct family lives out there. Same way I tend to read news stories that come from Northern California, where another portion of the family tree is at root.
As a further aside, I just discovered a bias in myself. While I will read news stories from anywhere in Oklahoma and lump them in as “family affective,” I will only pop up and read stories from the communities immediately surrounding Santa Cruz, communities which do not include San Francisco, Oakland, etc. My geographic filters for “family affective” stories seem to have some skew.
Asides over. The story out of Oklahoma is about how Strategic Vision LLC likely fabricated the survey results for how well Oklahoma high school students could do on a basic citizenship exam. I say likely because an Oklahoma legislator replicated the study as closely as possible and got entirely different results. FiveThirtyEight covers it much better than I can.
Which brings me back to my original point. Survey reporting cannot be trusted, but Americans don’t think about this. How many people know that the “margin of error” reported on every poll during the political campaign season means absolutely nothing without also knowing the confidence level used?[1] How many people know that margin of error is not some “unknown voter factor” but actually a hard and fast number determined by the number of people polled and that confidence level I just mentioned? Only people who have some background in statistics. News outlets have no incentive to educate the public; they merely want to report the polls in a way that garners the most viewers/readers/clickthroughs.
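To see how mechanical the margin of error really is, here is a minimal sketch using the standard worst-case formula for a simple random sample (z × √(p(1−p)/n)). The function name and the 95%/99% z-scores are my illustrative choices; real pollsters may also apply weighting and design effects that this ignores.

```python
import math

def margin_of_error(n, z=1.96, p=0.5):
    """Worst-case margin of error for a simple random sample of n people.

    z: z-score for the chosen confidence level (1.96 is roughly 95%)
    p: assumed proportion; 0.5 gives the widest, most conservative margin
    """
    return z * math.sqrt(p * (1 - p) / n)

# A typical national poll of ~1,000 respondents at 95% confidence:
print(round(margin_of_error(1000) * 100, 1))  # -> 3.1 (percentage points)

# The same 1,000 people at 99% confidence (z ~ 2.576) gives a wider margin:
print(round(margin_of_error(1000, z=2.576) * 100, 1))  # -> 4.1
```

Same poll, same respondents, different “margin of error” depending on the confidence level chosen, which is exactly why the number is meaningless on its own.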
Surveys and polls too often do not allow you to examine their basis: the questions asked, who was surveyed, what statistical methods were used, how the random sampling was conducted. Even with the best of intentions, surveys can be skewed by the order of the questions, which places people in a particular frame of mind.
Do not trust survey results! At least, do not trust them over your own judgment unless you can see the guts of the work.
That is all. Off to drink less coffee.
[1]: I assume that most pollsters use a 95% confidence level, but I have no real knowledge about that.