To the editor of Evolutionary Psychological Science:
Dear Dr. Shackelford,
On April 24, 2017, in your capacity as editor of Evolutionary Psychological Science, you issued an Editorial Note [PDF] that referenced the article "Eating Heavily: Men Eat More in the Company of Women," by Kevin M. Kniffin, Ozge Sigirci, and Brian Wansink (Evolutionary Psychological Science, 2016, Vol. 2, No. 1, pp. 38–46).
The key point of the note is that the "authors report that the units of measurement for pizza and salad consumption were self-reported in response to a basic prompt 'how many pieces of pizza did you eat?' and, for salad, a 13-point continuous rating scale."
For comparison, here is the description of the data collection method from the article (p. 41):
"Consistent with other behavioral studies of eating in naturalistic environments (e.g., Wansink et al. 2012), the number of slices of pizza that diners consumed was unobtrusively observed by research assistants and appropriate subtractions for uneaten pizza were calculated after waitstaff cleaned the tables outside of the view of the customers. In the case of salad, customers used a uniformly small bowl to self-serve themselves and, again, research assistants were able to observe how many bowls were filled and, upon cleaning by the waitstaff, make appropriate subtractions for any uneaten or half-eaten bowls at a location outside of the view of the customers."

It is clear that this description was, to say the least, not an accurate representation of the research record. Nobody observed the number of slices of pizza. Nobody counted partial uneaten slices when the plates were bussed. Nobody made any surreptitious observations of salad either. All consumption was self-reported. It is difficult to imagine how this 100-plus word description could have accidentally slipped into an article.
Even if we set aside what appears to have been a deliberately misleading description of the method, a further substantial problem arises now that the true method is known. The entire study depends on the amounts of food consumed having been measured accurately and objectively. Hence, the use of self-report measures of food consumption (which are subject to obvious biases, including social desirability), when the entire focus of the article is on how much food people actually (and perhaps unconsciously, under the influence of evolutionarily determined forces) consumed in various social situations, casts severe doubt on the validity of the study. The methods described in the Editorial Note and in the article itself are thus contradictory, as they describe substantially different methodologies. The difference between real-time unobtrusive observation by others and post hoc self-report is both practically and theoretically significant in this case.
Hence, we are surprised that you considered an "Editorial Note" to be the appropriate response to the authors' disclosure that they had given an incorrect description of their methods in the article. Anyone who downloads the article today will be unaware that the study simply did not take place as described, or that the results are probably confounded by the inevitable limitations of self-report.
Your note also fails to address a number of other discrepancies between the article and the dataset. These include: (1) The data collection period, which the article reports as two weeks, but which the cover page for the dataset states was seven weeks; (2) The number of participants excluded for dining alone, which is reported as eight in the article but which appears to be six in the dataset; (3) The overall number of participants, which the article reports as 105, a number that is incompatible with the denominator degrees of freedom reported on five F tests on pp. 41–42 (109, 109, 109, 115, and 112).
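The third discrepancy can be made concrete with a little arithmetic. In a one-way ANOVA with N participants divided into k groups, the denominator degrees of freedom are N − k; so even in the most lenient case of only two groups, a reported denominator df of d implies at least d + 2 participants. A minimal sketch of this check (our illustration, assuming the reported F tests are one-way ANOVAs with at least two groups):

```python
# Consistency check: denominator df in a one-way ANOVA is N - k (k = groups).
# With k >= 2, a reported denominator df of d implies N >= d + 2 participants.
claimed_n = 105
reported_dfs = [109, 109, 109, 115, 112]  # denominator dfs from pp. 41-42

# Most lenient assumption: k = 2, so implied minimum N is df + 2.
implied_min_ns = [df + 2 for df in reported_dfs]
for df, n in zip(reported_dfs, implied_min_ns):
    print(f"df = {df} implies N >= {n} (claimed N = {claimed_n})")

# Every reported df requires more participants than the 105 the article claims.
assert all(n > claimed_n for n in implied_min_ns)
```

Even under this most generous assumption, the smallest reported df (109) requires at least 111 participants, six more than the 105 the article reports.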
In view of these problems, we believe that the only reasonable course of action in this case is to retract the article, and to invite the authors, if they wish, to submit a new manuscript with an accurate description of the methods used, including a discussion of the consequences of their use of self-report measures for the validity of their study.
Please note that we have chosen to publish this e-mail as an open letter here. If you do not wish your reply to be published there, please let us know, and we will, of course, respect your wishes.
Sincerely,
Nicholas J. L. Brown
Jordan Anaya
Tim van der Zee
James A. J. Heathers
Chris Chambers
Well done, thanks for sharing here Nick.
Thanks for keeping on top of Wansink's poor work and for helping to ferret out some of this rampant pseudoscience.
This is as good a time as any to recall that when inquiring minds asked Dr Wansink for access to his data, his response was that they should repeat the study themselves. A study that was not repeatable because Wansink's description of the method was a complete fiction.
The editorial note is itself rather sloppy.
The reviewers thought they were reading objective observations of food consumption, biased by the participants' unconscious responses to environmental factors. In fact the results were the participants' subjective, retrospective reports of food consumption, without unconscious bias: results which might not be so interesting.
If I had been one of the reviewers, I would not be chuffed to learn that my favourable review applied to a piece of fiction, quite different from the research that had actually been conducted.
I listened to episode 44 of the Everything Hertz podcast, in which you briefly mention a possible new case of misconduct by a researcher who loves to tell "sexy stories" that the media pick up on. How long do we have to wait for this one? :)
Thank you for all your work concerning the improvement of science!
Not too long, I hope...
You need to be careful about making accusations of research misconduct, even veiled ones like those you make here. The definition of misconduct also includes state-of-mind considerations, into which you have no specific insight and about which you can only speculate. Such statements are prohibited on sites such as PubPeer for good reason; you are potentially exposing yourself to legal liability here.
The inconsistencies that you find may be the result of carelessness rather than intent to mislead. It would not be the first time this author has been accused of being sloppy.
https://link.springer.com/content/pdf/10.1007/s12110-012-9158-4.pdf
Regardless of intent, the inconsistencies/errors are notable.