The largest database of trusted experimental protocols

97 protocols using the EyeLink II

1

Eye-Tracking Study of Visual Attention

This study was created and run with SR Research Experiment Builder (SREB) in combination with the EyeLink II (SR Research Ltd., Mississauga, ON) eye-tracking system. The flow of the experiment and participants' eye movements were measured with the EyeLink II head-mounted, video-based eye tracker (sampling rate = 500 Hz; spatial precision < 0.01°; spatial accuracy < 0.8° RMS error). The EyeLink II was calibrated in the same picture plane used to display the experimental stimuli. EyeLink Data Viewer software (SR Research Ltd., Mississauga, ON) digitized the pupil image to determine the locations of visual gaze fixations. Respondents' eye-movement data (saccadic reaction times) and key presses (manual reaction times) were recorded to a text file, which was exported to an Excel file and then loaded into SPSS v15.0 for statistical analysis.
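As a downstream illustration, the exported text file could be summarized before the SPSS step along these lines; the file name and column labels here are hypothetical, since the actual export format is defined in the Experiment Builder / Data Viewer project:

```python
import pandas as pd
from scipy import stats

# Hypothetical file name and column labels; the real export format is
# defined by the Experiment Builder / Data Viewer project.
df = pd.read_csv("session_01_output.txt", sep="\t")

# Per-condition summary of saccadic and manual reaction times.
summary = df.groupby("condition")[["saccadic_rt_ms", "manual_rt_ms"]].agg(["mean", "std"])
print(summary)

# Example inferential step: paired comparison of the two RT measures.
t, p = stats.ttest_rel(df["saccadic_rt_ms"], df["manual_rt_ms"])
print(f"t = {t:.2f}, p = {p:.3f}")
```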
2

Decoding Orientation from Eye Movements

Participants were instructed to maintain fixation on a central fixation point throughout each fMRI run. For the experiment involving spiral stimuli, the participant's eye position was monitored using an MRI-compatible ASL EYE-TRAC eye-tracking system. We applied pattern classifiers to these data to evaluate whether any reliable information about the viewed orientation could be decoded from eye-position signals [22]. Analyses of eye-movement data during a subset of scans revealed that our ability to decode orientation from eye position was at chance level, t(3) = 0.34, p = 0.75. We also conducted additional high-resolution measurements of eye position (500 Hz, SR Research EyeLink II) outside of the MRI scanner for 5 of the subjects who participated in the study, using a visual stimulation paradigm identical to that used in the main attention experiment. When we tried to decode the orientation of a viewed grating based solely on these high-resolution eye-tracking data, performance was again at chance level, t(4) = 1.50, p > 0.2. These findings replicate other published results from our lab in which we attempted to decode stimulus orientation from eye position [21,22,43]. Taken together, these results indicate that eye movements are unlikely to have contributed to our ability to classify orientation from fMRI activity patterns.
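A minimal sketch of this kind of control analysis, assuming per-trial eye-position features and orientation labels; all array shapes and values below are placeholders, not the study's data:

```python
import numpy as np
from scipy import stats
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Hypothetical per-trial eye-position features (e.g., mean x/y position,
# fixation dispersion) and a two-class orientation label per trial.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 6))   # n_trials x n_features
y = rng.integers(0, 2, 200)         # viewed orientation class

# Cross-validated decoding accuracy for one subject.
acc = cross_val_score(SVC(kernel="linear"), X, y, cv=10).mean()
print(f"decoding accuracy: {acc:.2f}")

# Across subjects, compare per-subject accuracies against chance (0.5)
# with a one-sample t-test, yielding a statistic like the t(4) above.
subject_accs = np.array([0.49, 0.52, 0.51, 0.48, 0.53])  # placeholder values
t, p = stats.ttest_1samp(subject_accs, 0.5)
print(f"t({len(subject_accs) - 1}) = {t:.2f}, p = {p:.2f}")
```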
3

Mother-Child Visual Perception Study

Participants were presented with a cover story involving a mother and child looking at a picture book. The mother commented on objects and animals to help the child identify them. Each trial began with presentation of the display. After one second of preview, participants heard a spoken sentence over Sennheiser HD570 headphones and clicked on the referent that best matched the sentence. Eye movements were monitored using a head-mounted SR Research EyeLink II system sampling at 250 Hz, with drift correction procedures performed every fifth trial.
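The trial flow with drift correction every fifth trial could look like the sketch below, using SR Research's pylink bindings; the call signatures follow the EyeLink Developer's Kit but should be verified against the installed version, and run_trial() is a hypothetical stand-in for the display/audio/mouse-click code of a single trial:

```python
import pylink

N_TRIALS = 100                 # placeholder; the protocol does not state a count
SCREEN_CENTER = (512, 384)     # assumes a 1024 x 768 display

def run_trial(trial_index):
    """Hypothetical stand-in: show the display, play the spoken sentence
    after one second of preview, and collect the click on the referent."""
    ...

tracker = pylink.EyeLink("100.1.1.1")  # default EyeLink host address

for trial in range(1, N_TRIALS + 1):
    if trial % 5 == 0:
        # Drift correction every fifth trial, as described above.
        tracker.doDriftCorrect(SCREEN_CENTER[0], SCREEN_CENTER[1], 1, 1)
    tracker.startRecording(1, 1, 1, 1)  # samples + events, to file and link
    run_trial(trial)
    tracker.stopRecording()
```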
4

Eye Tracking in Visual Perception Experiments

In a dark room, stimuli were displayed on a colour CRT monitor (Mitsubishi Electric RDF223H; 75 Hz refresh rate; 2.1 arcmin/pixel; 46.5 deg × 35.7 deg) controlled by a computer (Apple Power Mac G5). Each observer's head was stabilized with a chin rest, and the viewing distance was set at 52 cm. The left eye of each observer was completely occluded by an opaque acrylic cup. Pupil-diameter changes and gaze positions of the right eye were recorded at 250 Hz with an eye tracker (EyeLink II, SR Research). For the experiments shown in Fig. 5, stimuli were displayed on an LCD monitor (VPixx Technologies VIEWPixx; 120 Hz refresh rate; 1.5 arcmin/pixel; 49.0 deg × 31.7 deg).
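The arcmin-per-pixel values above follow directly from monitor width, horizontal resolution, and viewing distance. A minimal sketch of the conversion, with illustrative numbers rather than the authors' exact panel dimensions:

```python
import math

def arcmin_per_pixel(screen_width_cm: float, h_resolution_px: int, distance_cm: float) -> float:
    """Visual angle subtended by one pixel at a given viewing distance."""
    pixel_cm = screen_width_cm / h_resolution_px
    rad = 2 * math.atan(pixel_cm / (2 * distance_cm))
    return math.degrees(rad) * 60

# Illustrative: a ~40 cm-wide CRT at 1152 px horizontal resolution,
# viewed from 52 cm, gives roughly 2.3 arcmin per pixel.
print(f"{arcmin_per_pixel(40.0, 1152, 52.0):.2f} arcmin/pixel")
```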
5

Eye Tracking in Controlled Visual Conditions

Participants sat alone in a dark testing room, facing an LCD screen (15 in. wide, 1,280 × 1,024, 60 Hz). A chin rest held the distance to the screen constant at 57 cm. Stimuli were created in MATLAB (The MathWorks, Natick, MA, USA) using the Psychophysics Toolbox (Brainard, 1997; Pelli, 1997).
Eye movements of the right eye were monitored with a head-mounted eye tracker (EyeLink II, SR Research, Oakville, ON, Canada; 500 Hz sampling rate).
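The 57 cm viewing distance above is a common psychophysics convention: at that distance, 1 cm on the screen subtends almost exactly 1° of visual angle, which makes pixel-to-degree conversions convenient:

```latex
\theta = 2\arctan\!\left(\frac{0.5\ \mathrm{cm}}{57\ \mathrm{cm}}\right) \approx 1.005^{\circ}
```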
6

Eye Movement Tracking in Autism Spectrum Disorder

Participants were seated on a chair with their head restrained by a chin rest, facing a computer screen (47.5 × 30 cm) placed 57 cm away. Eye movements of both eyes were recorded at 500 Hz using an infrared eye tracker (EyeLink II, SR Research, Ottawa, Canada) for the first 44 participants (28 ASD and 16 control participants). For the 31 additional participants (12 ASD and 19 control participants), eye movements were recorded at 1000 Hz using an EyeLink 1000. We verified that no pursuit or saccade parameter differed significantly between the two recording systems.
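That between-system check could be run along the lines below; the parameter values are placeholders, and Welch's t-test is one reasonable choice given the unequal group sizes (the protocol does not state which test was used):

```python
import numpy as np
from scipy import stats

# Hypothetical per-participant parameter values (e.g., saccade peak
# velocity in deg/s), one array per recording system.
eyelink2_vals = np.array([310.0, 295.0, 322.0, 301.0])     # 500 Hz system
eyelink1000_vals = np.array([305.0, 318.0, 299.0, 312.0])  # 1000 Hz system

# Independent-samples comparison with Welch's correction.
t, p = stats.ttest_ind(eyelink2_vals, eyelink1000_vals, equal_var=False)
print(f"t = {t:.2f}, p = {p:.3f}")
```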
7

Dichoptic Visual Stimulus Protocol

Each trial began with 5 minutes of dark adaptation, during which participants were instructed to relax their eyes. Each participant completed 18 SV experimental trials (three at each stimulus amplitude) and six PV trials on separate days. Stimuli were presented dichoptically at 40 cm on two 7-inch LCD monitors (Lilliput, Wolverhampton, UK) within a haploscope (Figure 2). Each eye's visual stimulus subtended 2.73° × 2.73°, with a line width of 0.08° (Figure 2), and contained two vertical or horizontal lines unique to that eye's stimulus, which served as suppression checks. Eye movements were recorded binocularly using video-based infrared oculography at 250 Hz (EyeLink II; SR Research, Ottawa, Canada). All eye movements fell within the linear range (±40°) of the eye tracker, which has a spatial resolution of 0.03° and an average accuracy of 0.5° or less.
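A minimal sketch of verifying that recorded gaze stays within the tracker's ±40° linear range (the gaze trace below is a placeholder):

```python
import numpy as np

# Hypothetical gaze trace in degrees, one (x, y) pair per sample.
gaze_deg = np.array([[0.2, -0.1], [38.5, 1.0], [41.2, 0.3]])

LINEAR_RANGE_DEG = 40.0  # tracker's linear range, per the protocol above
in_range = np.all(np.abs(gaze_deg) <= LINEAR_RANGE_DEG, axis=1)
print(f"{in_range.mean():.0%} of samples within the linear range")
```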
8

Eye-Tracking Protocol for Visual Stimuli

Participants sat in a dimly lit room with their head supported by a chin rest and operated a two-button computer mouse. Stimuli were controlled by a custom-written program in Delphi (Embarcadero). Visual stimuli were displayed on a 19-inch CRT monitor (Philips 109B) at a vertical refresh rate of 100 Hz and a resolution of 1024 × 768 pixels. The monitor was positioned about 30 cm in front of the participant's eyes, encompassing 61° × 46° (H × V) of the visual field. A photodiode placed over the bottom-left corner registered the precise onset and displacement of the visual stimuli with respect to eye movements. Binocular eye position was recorded at 500 Hz using a head-mounted eye tracker (EyeLink II; SR Research). The eye tracker was calibrated using a 9-point grid, and saccades were detected online using a position threshold of 1.5°. Participants were allowed to take breaks every 400 trials; the eye tracker was recalibrated after each break and as needed during testing, for example when the program failed to detect a fixation at the start of a trial.
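The online saccade criterion amounts to flagging any gaze sample that deviates more than 1.5° from fixation. A minimal sketch, assuming pixel-coordinate samples and the roughly 16.8 px/deg implied by the display geometry above (1024 px spanning 61°):

```python
import math

def detect_saccade_online(samples, fixation_xy, threshold_deg=1.5, px_per_deg=16.8):
    """Return True once gaze deviates more than threshold_deg from fixation.

    samples: iterable of (x, y) gaze positions in pixels (hypothetical format).
    """
    threshold_px = threshold_deg * px_per_deg
    fx, fy = fixation_xy
    for x, y in samples:
        if math.hypot(x - fx, y - fy) > threshold_px:
            return True  # position threshold exceeded: treat as saccade onset
    return False

# Example: a 30 px (~1.8 deg) deviation from fixation trips the detector.
print(detect_saccade_online([(512, 384), (542, 384)], (512, 384)))  # True
```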
9

Visual and Joystick-Controlled Self-Motion Perception

Experiments were conducted in a darkened (but not completely dark) sound-attenuated room. Subjects were seated at a distance of 114 cm from a tangential screen (70° × 55° of visual angle) with their head position stabilized by a chin rest. Visual stimuli were generated on a Windows PC using an in-house stimulus package and were back-projected onto the screen by a CRT projector (Electrohome Marquee 8000) at a resolution of 1152 × 864 pixels and a frame rate of 100 Hz. Auditory stimuli were generated in MATLAB and presented over headphones (Philips SHS390). Eye position was recorded by a video-based eye tracker (EyeLink II, SR Research) at a sampling rate of 500 Hz with an average accuracy of ~0.5°. During distance reproduction, subjects controlled the speed of simulated self-motion with an analog joystick (Logitech ATK3) placed on a desk in front of them; the speed of the simulated self-motion was proportional to the inclination angle of the joystick. Joystick data were acquired at 100 Hz, and the minimal change in simulated self-motion speed that the joystick could trigger was 1/1000 of the maximum range of speeds used in the experiments.
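The joystick-to-speed mapping can be sketched as follows; the normalized axis reading and maximum speed are hypothetical, and the 1/1000 quantization matches the minimal speed change described above:

```python
def joystick_to_speed(axis_value: float, max_speed: float) -> float:
    """Map a normalized joystick inclination (0..1, hypothetical axis
    format) to simulated self-motion speed, quantized to 1/1000 of the
    speed range."""
    step = max_speed / 1000.0                      # minimal speed increment
    return round(axis_value * max_speed / step) * step

# Example: a half-tilted joystick with an illustrative 2.0 m/s maximum.
print(joystick_to_speed(0.5, 2.0))  # -> 1.0
```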

About PubCompare

Our mission is to provide scientists with the largest repository of trustworthy protocols and intelligent analytical tools, giving them the information they need to design robust protocols and minimize the risk of failure.

We believe the most important thing we can do is grant scientists access to a wide range of reliable sources, along with new tools that surpass human capabilities.

At the same time, we trust scientists to decide how to construct their own protocols from this information, as they are the experts in their field.


Revolutionizing how scientists search and build protocols!