
How to Photoshop a Research Study

[as demonstrated by Rock et al. (2010) on weight loss results of the Jenny Craig® program]

by Deb Burgard, PhD

1. Publish in a peer-reviewed, respectable journal like JAMA. The average person has no idea such journals are willing to publish research sponsored by industry, so the fact that Jenny Craig paid for this will stay buried in the fine print on page 1810.

2. You’re the researcher, so you get to choose who can be in the study. Forget the bother of a representative sample.  Make sure you eliminate at least 20% of your interested participants right off the bat, even though you don’t do that with your customers.  Who’s going to notice that you don’t have even a representative sample of your customers, let alone a representative sample of the “obese and overweight women” of your title?

3. Don’t bother to test your actual program: too many people would drop out.  Instead, pay your participants for showing up to clinic visits, and give away your diet food.  Readers won’t realize that you are not really testing your real-world program, which costs $100/week.  Don’t report on (or maybe even bother to track) the percentage of people who actually chose to eat the (free!) food; just track whether people showed up at the center or talked on the phone.  Don’t report on the percentage of people who would not eat Jenny Craig food even when it is given away.

4. Identify the study participants to your staff, for no discernible reason.  Could it be so they can be sure to work extra hard to get the desired results?  But report on how you told them to treat everyone the same, as if that is an accepted research procedure.

5. Say your study tests maintenance of weight loss, but don’t ever stop your intervention.  Who’s going to notice the difference between a two-year study of continuous dieting vs. a study that actually follows up, i.e., shows what happens two years after the intervention is over?

6. Report in BMI, kg, and means so that readers won’t do the math and translate into what is familiar to them.  Who’s going to go back to the average baseline weight of 92 kg, multiply by 2.2, and then figure out what 5% of that would be (about 10 pounds), to understand that this statement, “By study end, more than half in either intervention group (62% [n=103] of center-based participants and 56% [n=91] of telephone-based participants) had a weight loss of at least 5% …” means that 59% of the people who showed up at clinic visits were at least 10 pounds lighter at two years out, going from an average of 203 pounds to 193 pounds?  Who’s going to subtract to figure out that even when they were getting paid and the food was given away for free, 41% of the participants could not maintain even a 10-pound average loss?

And really, who would actually divide to notice that it took an average of $6958 over two years to return an average weight loss of 15 pounds, or $463.87 per pound, all while losing your sanity points being on a continuous diet for two years?
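For anyone who does want to do the division, the arithmetic above can be checked in a few lines of Python. The weights, percentages, and dollar figures are the ones quoted from the study in the text; this is just unit conversion and division, not a re-analysis, and the group sizes are back-calculated from the reported n's and percentages.

```python
# Back-of-the-envelope check of the numbers quoted above.
# All inputs are figures cited in the text, not re-derived data.

KG_TO_LB = 2.2  # rough conversion factor, as used in the text

baseline_kg = 92
baseline_lb = baseline_kg * KG_TO_LB        # ~202 lb, i.e. roughly 203 lb
five_percent_lb = 0.05 * baseline_lb        # ~10 lb

# Pooled share of participants maintaining at least a 5% loss,
# with group totals back-calculated from the reported n and %:
center_n, phone_n = 103, 91
center_total = round(center_n / 0.62)       # ~166 participants
phone_total = round(phone_n / 0.56)         # ~162 participants
pooled_pct = (center_n + phone_n) / (center_total + phone_total)

# Cost per pound: average two-year spend over average loss
cost_per_lb = 6958 / 15                     # ~$463.87 per pound

print(f"baseline: {baseline_lb:.0f} lb, 5% of that: {five_percent_lb:.0f} lb")
print(f"pooled maintainers: {pooled_pct:.0%}")
print(f"cost per pound: ${cost_per_lb:.2f}")
```

Run as-is, this reproduces the figures in the text: a 5% loss on a 92 kg baseline is about 10 pounds, the pooled maintainer share comes out to about 59%, and $6958 divided by 15 pounds is $463.87 per pound.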

7. Count on no one noticing that even when you are paying people over $3000/year in food products and counseling rather than asking them to pay over $3000/year in the real world, the average weight trajectory is on the way back up after month 12.

8. Claim in the results section that the intervention groups reported better quality of life and reduced depression at 12 months; maybe people won’t notice that sure enough, at 24 months there were no significant changes from baseline in physical fitness or psychosocial measures.

9. Make sure to end your study at the point when you stop paying people, but describe the study in the abstract as “conducted over 2 years with follow-up between November 2007 and April 2010.”  Who reads the actual article anyway?

10. Publish your study side-by-side with an independently conducted study, but make sure that one stops at the 12-month point in the process, when people tend to have maximum weight loss and benefits, even though studies consistently show this reverses over the next year.  That way your sponsoring company can send out its press release mashing everything together and imply all kinds of results no one found in either study, like you had a “two-year independent, multi-site clinical trial” (OK, the independent trial was only a year and only one site) and “those who took part in the Jenny Craig program adopted healthier eating habits and meaningful health benefits for overall improved quality of life” (OK, the quality of life changes were not significant at 24 months) and “those following the program reduced risk factors that can lead to chronic disease including depression, diabetes, cancer and even stroke” (OK, there were no significant changes at 24 months in total cholesterol, LDL cholesterol, HDL cholesterol, or triglycerides, or step test fitness measures, or any psychosocial measures including depression).

Hey, if Vogue can get away with it, why not JAMA?
