The Unity game engine (version 2018.1.9f2) was used for experimental control. The momentary position and orientation of the participant and the experimenter were recorded at each time point with a motion-tracking system (see below). The Unity application read the motion-tracking data in real time during the experiment in order to trigger auditory events that depended on the participant’s momentary location. For instance, it automatically detected when a participant arrived at a hidden target location during a ‘target search’ trial and then triggered the next auditory instruction. It subsequently detected when the participant arrived at the cued wall-mounted sign, at which point another auditory instruction was triggered, directing the participant to find the next target location, and so on.
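A minimal sketch of such a proximity trigger, written as a Unity C# component, is shown below. The field names, the 0.3-m arrival radius and the horizontal-plane distance check are illustrative assumptions rather than the authors’ implementation (see ref. 38 for the actual setup).

```csharp
using UnityEngine;

// Sketch of the position-triggered audio logic described above, assuming
// the motion-tracking system exposes the participant's pose as a tracked
// Transform. All names and parameter values are illustrative assumptions.
public class TargetTrigger : MonoBehaviour
{
    public Transform participant;      // pose streamed from the motion tracker
    public Vector3 hiddenTarget;       // current hidden target location (m)
    public float triggerRadius = 0.3f; // assumed arrival threshold (m)
    public AudioSource instruction;    // next pre-recorded auditory instruction

    private bool triggered;

    void Update()
    {
        if (triggered) return;

        // Compare distance in the horizontal (XZ) plane so that the
        // tracked marker's height above the floor does not matter.
        Vector3 p = participant.position;
        Vector2 onFloor = new Vector2(p.x - hiddenTarget.x, p.z - hiddenTarget.z);

        if (onFloor.magnitude <= triggerRadius)
        {
            triggered = true;
            instruction.Play(); // e.g. direct the participant to the cued sign
        }
    }
}
```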
Moreover, the Unity application was programmed to trigger iEEG data storage automatically and to insert synchronization marks at specific time points into the iEEG data. These marks were sent at the beginning and end of each 3.5–4-min recording interval, allowing the iEEG data to be synchronized with data from the other recording modalities (for example, motion tracking and eye tracking). See ref. 38 for further technical details and specifications of this setup.
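The sketch below illustrates one way such synchronization marks could be emitted from Unity, assuming the iEEG recording system accepts trigger bytes on a serial port. The port settings, byte codes and interval handling are hypothetical; the actual interface used in this setup is specified in ref. 38.

```csharp
using System.Collections;
using System.IO.Ports;
using UnityEngine;

// Sketch of the iEEG synchronization scheme described above, assuming the
// recording system accepts trigger bytes on a serial port. Port name, baud
// rate, byte codes and the fixed interval length are assumptions.
public class IEEGSync : MonoBehaviour
{
    const byte StartMark = 0x01; // assumed code marking interval onset
    const byte EndMark   = 0x02; // assumed code marking interval offset

    SerialPort port;

    void Start()
    {
        port = new SerialPort("COM3", 115200); // hypothetical port settings
        port.Open();
        StartCoroutine(MarkRecordingInterval(210f)); // 3.5 min = 210 s
    }

    IEnumerator MarkRecordingInterval(float seconds)
    {
        SendMark(StartMark);                      // mark the interval start
        yield return new WaitForSeconds(seconds);
        SendMark(EndMark);                        // mark the interval end
    }

    void SendMark(byte code)
    {
        // A single byte on the trigger line appears as a time-stamped
        // event in the iEEG data stream, enabling offline alignment with
        // the motion-tracking and eye-tracking recordings.
        port.Write(new byte[] { code }, 0, 1);
    }

    void OnDestroy()
    {
        if (port != null && port.IsOpen) port.Close();
    }
}
```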