Stimuli consisted of colored pictures of three male faces with neutral expressions, obtained from the NimStim database (Tottenham et al., 2009), along with 36 colored pictures of meat dishes and 36 colored pictures of vegetable dishes obtained from the Internet. Stimuli were presented using Psychtoolbox 3.0.8 (Brainard, 1997; Pelli, 1997) in MATLAB 7.8 (The MathWorks, Inc., Natick, MA, USA).
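A minimal Psychtoolbox sketch of how a single trial display of this kind could be composed is shown below. The file names, image sizes, and screen coordinates are assumptions for illustration only, not the original presentation script.

```matlab
% Sketch: compose one trial display with Psychtoolbox (assumed file names,
% sizes, and positions; not the authors' original code).
Screen('Preference', 'SkipSyncTests', 1);             % demo only: relax sync checks
screenNum = max(Screen('Screens'));
[win, winRect] = Screen('OpenWindow', screenNum, 0);  % black background

% Hypothetical stimulus files: one face, one meat dish, one vegetable dish
faceTex = Screen('MakeTexture', win, imread('face1.png'));
meatTex = Screen('MakeTexture', win, imread('meat01.png'));
vegTex  = Screen('MakeTexture', win, imread('veg01.png'));

% Emotion cue above the face, face at center, dishes to the left and right
[cx, cy] = RectCenter(winRect);
Screen('TextSize', win, 36);
DrawFormattedText(win, 'happy', 'center', cy - 300, [255 255 255]);
Screen('DrawTexture', win, faceTex, [], CenterRectOnPoint([0 0 256 256], cx, cy));
Screen('DrawTexture', win, meatTex, [], CenterRectOnPoint([0 0 200 200], cx - 350, cy));
Screen('DrawTexture', win, vegTex,  [], CenterRectOnPoint([0 0 200 200], cx + 350, cy));

onset = Screen('Flip', win);       % show the trial display
Screen('Flip', win, onset + 2);    % clear after the 2 s trial duration
sca;                               % close the window
```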
The stimuli were presented using an event-related design. In each trial, one of the three faces was presented with a text cue above the face indicating the person’s emotional state (“happy” or “sad”) and with pictures of a meat dish and a vegetable dish to the left and right of the face (Figure 1). Each trial was presented for 2 s, and trials were separated by a jittered fixation interval of 4–10 s. Each run consisted of six trials per condition (i.e., each face paired with each emotion), giving a total of 36 trials per run and a run duration of 5 min. The program optseq2 was used to generate, for each run, the sequence and spacing of trials that maximized the statistical efficiency of rapid-presentation event-related hemodynamic response estimation (Dale, 1999). The position of the meat and vegetable dishes on the left and right of the face was counterbalanced across trials within each condition and each run. Ten runs were presented.
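As a rough illustration of these design parameters, the following MATLAB sketch assembles one run’s trial list (3 faces × 2 emotion cues × 6 repetitions = 36 trials) and counterbalances the side of the meat dish within each condition. The variable names are hypothetical, and the randomized order and uniformly drawn intervals here stand in for the optseq2-optimized sequence and jitter used in the actual study.

```matlab
% Sketch: build one run's trial list under the stated design (assumed
% variable names; order and jitter are placeholders for optseq2 output).
faces    = 1:3;
emotions = {'happy', 'sad'};
nReps    = 6;                      % repetitions per face x emotion condition

trials = struct('face', {}, 'emotion', {}, 'meatSide', {}, 'iti', {});
for f = faces
    for e = 1:numel(emotions)
        % Counterbalance the meat-dish side within each condition:
        % half of the repetitions place the meat dish on the left, half on the right.
        sides = [repmat({'left'}, 1, nReps/2), repmat({'right'}, 1, nReps/2)];
        sides = sides(randperm(nReps));
        for r = 1:nReps
            t.face     = f;
            t.emotion  = emotions{e};
            t.meatSide = sides{r};
            t.iti      = 4 + 6*rand;     % placeholder for the 4-10 s jittered fixation
            trials(end+1) = t;           %#ok<AGROW>
        end
    end
end
trials = trials(randperm(numel(trials)));  % placeholder for the optseq2-optimized order
```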