When you read that “researchers have found a statistically significant correlation between [thing x and thing y],” how strong do you imagine that correlation to be? (After, of course, you remember that since it’s just a correlation, all they’ve found is that two things sometimes happen at the same time, with no idea of what causes them, and with the distinct possibility that they are completely unrelated.) Still, what would you expect if a source that is supposed to be reporting the news said that there was a “statistically significant correlation”? Would you expect that correlation to be 80%? 50%? Would you expect it to be .02 percent?
That’s exactly what Mother Jones did in their Facebook post today. Their teaser for a Pacific Standard article they linked to, called “Grand Obese Party” (see what they did there – so clever), read “Researchers have found a statistically significant correlation between support for Mitt Romney and a pudgy populace.” The article reported on a study that found that “a one percent increase in county-level support for Romney corresponds to a 0.02 percent increase in age-adjusted obesity rates.” I guess “Having Cured All Diseases and Ended Poverty and Hunger, Researchers Are Spending Money to See if Republicans Are Fatter than Democrats” was just too long for a headline.
[Edit for a quick discussion of statistical significance: The reason I ask about expectations is that the paper is either counting on its readers not to understand statistical significance, or the reporters don’t understand it themselves. Either way, really poor journalism. Statistical significance is about how unlikely it is that a result happened by pure chance; it says nothing about how large or meaningful the effect is. One of the first things I learned in my stats and research methods classes (right after “correlation never, ever implies causation”) is that statistical significance is not an indicator of meaningful results. Unfortunately, reporters who don’t understand research often think that “statistically significant” sounds important and all “science-y,” and this kind of “news story” is the result – they report findings as if they matter because they were “statistically significant,” and don’t tell you that, for example, the sample was a small group of middle-aged white men, which severely limits how far the results can be extrapolated.]
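If you want to see the “significant but meaningless” problem in action, here’s a toy simulation (completely made-up data, not the study’s actual numbers): with a big enough sample, even a relationship that explains well under one percent of the variance sails past the usual p < .05 bar.

```python
import math
import random

random.seed(0)
n = 50_000  # a big sample, the kind large datasets make easy

# Simulate a *tiny* real relationship: y barely depends on x at all.
xs = [random.gauss(0, 1) for _ in range(n)]
ys = [0.05 * x + random.gauss(0, 1) for x in xs]

# Pearson correlation coefficient r
mean_x = sum(xs) / n
mean_y = sum(ys) / n
cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
sd_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
r = cov / (sd_x * sd_y)

# t statistic for the null hypothesis "no correlation at all";
# anything above ~1.96 counts as "statistically significant" at p < .05
t = r * math.sqrt((n - 2) / (1 - r * r))

print(f"correlation r = {r:.3f}")         # tiny
print(f"variance explained = {r*r:.2%}")  # well under 1%
print(f"t statistic = {t:.1f}")           # far past the 1.96 cutoff
```

The point: “statistically significant” answers “is this probably not exactly zero?”, not “does this matter?” A correlation that weak clears the bar easily once the sample is large, which is exactly why reporting significance without effect size is so misleading.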
Even more ridiculous:
The researchers argue this reflects poorly on the Republican party’s emphasis on “personal responsibility” for reducing obesity risk. Successful fat-fighting strategies “will necessarily involve government intervention,” they argue, “because they involve workplace, school, marketing and agricultural policies.”
Bigger government or bigger waistlines: The choice is yours.
First of all, nobody has a “successful” intervention for obesity – we have no idea how to make fat people not fat. What we do have is a political climate where any idea that someone suggests will “reduce obesity” is likely to be implemented with absolutely no evidence required. That’s how we ended up with programs in schools that didn’t do anything to make fat students smaller, but did increase eating disorders.
So let’s review: Mother Jones contributed to a climate of bullying, stigma, and shaming of fat people for a .02% correlation, some cheap alliteration, and the suggestion that people should vote for the party that will best eradicate a group of people based on how they look. Pretty sure this is the definition of being a hack.
This member of the “pudgy populace” is not impressed.
Let’s give some feedback:
Pacific Standard (who ran the piece): firstname.lastname@example.org
Mother Jones: email@example.com
Comment on MJ’s Facebook thread about the piece: