Thu, Jun 4th 2009 9:23pm
It's pretty easy to find examples of bad statistics, whether they stem from poor data collection or poor interpretation. But a writer over at the BBC thinks he's fingered the real culprit: people who respond to surveys incorrectly. While on one level his claim may hold a little bit of truth, it really seems like an abdication of responsibility by the media, who all too often don't check out the bogus stats they cite, or swallow manipulated data without much question. Furthermore, when many groups use push polls not just to collect data but to influence opinions, it's hard to blame the average respondent too much. In short, bad responses to survey questions could be a problem -- but that doesn't excuse the people collecting the data, and especially the media reporting it, for ignoring the issue. If they know the data's inaccurate, why do they keep reporting it?