Everyone who has ever interviewed for a job, asked for a raise, given a speech, or made a sales pitch jumped for joy after watching Amy Cuddy’s TED Talk about “power posing.”

Cuddy is a Harvard Business School professor who studies body language and its effects on biology and behavior. Her research showed that “power posing” – standing like Superman – before intimidating situations would boost testosterone levels, decrease cortisol levels, and immediately make you feel and act like a more powerful individual. It could “significantly change the way your life unfolds.”

The results of this research are a dream: they suggest that individuals can take an action that is free, easy for anyone to implement anywhere, carries no risk, and will change their lives.

But. Of course there is a but. Later studies have been unable to replicate these findings.

There are many reasons this could happen, and the original researchers may not have done anything wrong. The small sample size (only 42 subjects) and exploratory design mean any number of factors could have influenced the results. An initial study like this is really a call for more in-depth research, not a call for behavior change.

The study was published anyway – probably because of those dreamy results. When research results are positive, they are more likely to find their way into publication, and from there into headlines in the popular media. This is called publication bias, and it’s why you now see people imitating Superman and Wonder Woman outside boardrooms across the country, even though the evidence doesn’t support the practice.

This leads to a new question. What if reviewers couldn’t see the results when they evaluated research for publication? What if they could see only the study design? Would they have published it?

Researchers addressed this question broadly in the journal Comparative Political Studies. They solicited submissions with the results blinded, so reviewers evaluated them based on methodology alone, not results. This, too, was a small study, and it leaves many questions unanswered. But it did find that more research with unimpressive findings was selected for publication than is typical. That type of research may not produce many exciting headlines; after all, no newspaper is going to print a headline that says “Science Shows Chocolate Has No Impact on Health.” But if more of that research were published, it might balance out the extreme claims we read about daily.

More to the point, this research highlights a critical lesson: methodology matters.

Whether or not the world of academic publishing takes notice of this research, it is a warning to us all. We can’t take something as gospel just because a scientific journal published it. The scientific process involves taking ideas, testing them, and then refining them. Conclusions must be drawn from the complete body of research, not from one isolated study.

This isn’t such a big deal if you’re spending a minute or two standing tall before a big client presentation. But think twice before you change your health behaviors based on the results of one small study. You may not get the same results the study did.