The Journal of Consumer Research (JCR) is widely considered the most prestigious journal in consumer behavior, a sub-field of marketing. Despite its reputation, JCR is plagued with problems. In addition to the high degree of bias visible in forensic meta-analyses of its papers, JCR’s replication rate is an abysmal 3.4%: only one out of twenty-nine attempts to replicate JCR studies has succeeded. And now the JCR Policy Board has decided not to retract two problematic articles by one of its associate editors (AEs), David Dubois. Interestingly, the problems with Dubois’ papers were discovered through one of those failed replication attempts.
Both articles continue to be fatally flawed
- With the exception of Study 3, the 2012 paper lacks internal numerical consistency. That is, there is no evidence that participant data were used to generate the summary and test statistics. It is surprising that JCR is okay with this, given that the Journal of Marketing Research (JMR) retracted a paper last year for the same problem. The JMR anomalist, however, wasn’t an AE.
- In the 2016 paper, among other problems, the manipulation check is almost perfectly correlated between audience and communicator in the matched condition (r = .83). Everyone in this condition receives the same manipulation, participants are then paired up, and yet when the manipulation check is measured it correlates almost perfectly across the pairs? That makes no sense: within a condition, manipulation-check scores should vary only with noise, and noise across matched pairs should not correlate. Nor can anyone involved think of an error that would explain this bizarre numerical relationship. It is convenient, though, that the correlation existed, because it helped the author get a significant result for a robustness test on the manipulation check. Hmmmmm… More information and data about the 2016 paper are forthcoming.
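The post does not say which forensic technique flagged the 2012 paper’s inconsistencies, but one standard check of internal numerical consistency is the GRIM test: when responses are integers (e.g., a 1–7 scale), a reported mean must be reachable as an integer sum divided by the sample size. A minimal sketch of that idea, with the function name and example values being illustrative rather than taken from the paper:

```python
def grim_consistent(reported_mean: float, n: int, decimals: int = 2) -> bool:
    """GRIM-style check: can `reported_mean` (rounded to `decimals`)
    arise from `n` integer-valued responses? We test the integer sums
    nearest to mean * n and see whether any reproduces the reported mean."""
    target = round(reported_mean, decimals)
    k = round(reported_mean * n)  # nearest candidate integer sum
    return any(round(c / n, decimals) == target for c in (k - 1, k, k + 1))

# With n = 25, means must be multiples of 0.04:
print(grim_consistent(3.48, 25))  # 87 / 25 = 3.48, so consistent -> True
print(grim_consistent(3.47, 25))  # no integer sum yields 3.47 -> False
```

A paper whose reported means repeatedly fail a check like this gives no sign that real participant-level data ever produced them, which is the flavor of problem described above.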
Author requests to retract were denied
- All three authors jointly requested retraction of the 2012 paper due to five categories of numerical inconsistency.
- Both senior authors, Derek Rucker and Adam Galinsky, requested retraction of the 2016 paper due to data anomalies. JCR’s Associate Editor, David Dubois, did not want the retraction, so the JCR Policy Board opted not to retract it. The Board actually said it would retract the paper if its AE (the anomalist) was okay with the retraction, but since he didn’t like the idea, it didn’t. Dubois was a grad student when these two papers were produced, which means JCR has sided with a former grad student against his advisors in deciding not to retract this flawed study.
Early-career marketing researchers struggling to navigate the recent fire hose of replication failures, bias studies, and data anomalies should check out the Open MKT list of marketing studies that are (a) preregistered, (b) open data, and (c) good p-curve. That list will give you a good idea of the types of effects and studies that are likely to be worth your time to investigate. If you teach grad seminars, consider using these papers in your teaching. Don’t use the classics unless and until they replicate. Focus on the good stuff. Good = replicable.
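For readers unfamiliar with the “good p-curve” criterion: the intuition is that a set of true effects produces significant p-values skewed toward very small values, while p-hacked results pile up just under .05. The full p-curve method works with pp-values and Stouffer’s method; the sketch below is only the crudest version of the idea, a binomial sign test on whether significant p-values cluster in the low half (all values here are made up for illustration):

```python
from math import comb

def pcurve_sign_test(pvalues):
    """Crude p-curve check: among significant results (p < .05), are more
    in the low half (p < .025) than a 50/50 split would predict?
    A right-skewed curve (small binomial p) suggests evidential value."""
    sig = [p for p in pvalues if p < 0.05]
    low = sum(1 for p in sig if p < 0.025)
    n = len(sig)
    # one-sided binomial probability of >= low "low half" results at 50%
    p_binom = sum(comb(n, k) for k in range(low, n + 1)) / 2 ** n
    return low, n, p_binom

low, n, p_binom = pcurve_sign_test([0.001, 0.01, 0.012, 0.03, 0.049, 0.04])
print(low, n, p_binom)  # 3 of 6 significant p-values fall below .025
```

A literature where most significant p-values sit between .025 and .05 would fail a test like this, which is one reason curated lists such as Open MKT’s filter on p-curve shape.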
- Open MKT: Following a failure to replicate, “Super Size Me” paper turns out to be full of numerical inconsistencies
- Retraction Watch: Journal hasn’t retracted ‘Super Size Me’ paper six months after authors’ request
- Tunca, B., Ziano, I., & Wenting, X. (2022). Super-size me: an unsuccessful preregistered replication of the effect of product size on status signaling. Meta-Psychology, 6.