
Rift S

Manufactured by Oculus
Sourced in the United States

The Oculus Rift S is a virtual reality headset designed for immersive gaming and entertainment experiences. It features a high-resolution display, improved tracking technology, and ergonomic design for comfortable long-term use.

Automatically generated - may contain errors

14 protocols using Rift S

1. Avatar Reactions and Awareness in VR

We used Unity (Unity Technologies, n.d.) as the development platform for implementing our virtual environment and the avatars' behaviors. We deployed our system on the Oculus Rift S ("Oculus Rift S: PC-Powered VR Gaming Headset | Oculus", n.d.) because it connects easily with Unity and supports the desired behaviors. For the avatars, we used two 3D models from the Unity Asset Store (o3n Studio, n.d.); we implemented independent animations for the gaze behaviors and used animations from Adobe Mixamo ("Mixamo", n.d.) for the idle avatar movements (e.g., breathing) to lend their behavior a natural feel.
For the avatar reaction, we implemented a face-looking behavior triggered when the participant's hand comes within a certain distance, following previous work (Cuello Mejía et al., 2021). That study derived pre-touch reaction distances for socially touchable upper-body parts from human interactions: shoulders (24.8 cm), elbows (24.1 cm), and hands (21.5 cm). We also implemented an "awareness" behavior for the avatar based on the concept of proxemics (Hall et al., 1968), defining a looking behavior triggered when the participant enters its personal space (~1.2 m).
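The distance-triggered behaviors described above can be sketched engine-agnostically. The thresholds are those reported in the protocol; the function names and the per-frame decision structure are illustrative, not the authors' actual Unity implementation:

```python
import math

# Pre-touch reaction distances per body part in metres, as reported in the
# protocol (Cuello Mejía et al., 2021): shoulders 24.8 cm, elbows 24.1 cm,
# hands 21.5 cm.
PRE_TOUCH_DISTANCE = {"shoulder": 0.248, "elbow": 0.241, "hand": 0.215}
PERSONAL_SPACE = 1.2  # proxemics "personal space" radius in metres (Hall et al.)

def distance(a, b):
    """Euclidean distance between two 3D points given as (x, y, z) tuples."""
    return math.sqrt(sum((ax - bx) ** 2 for ax, bx in zip(a, b)))

def avatar_reaction(participant_hand, participant_pos, avatar_pos, avatar_parts):
    """Decide which behavior the avatar should play this frame.

    avatar_parts maps a body-part name ('shoulder', 'elbow', 'hand') to its
    world position. The pre-touch face-looking reaction takes priority over
    the proxemics-based awareness behavior.
    """
    # Pre-touch reaction: participant's hand is within the per-part threshold.
    for part, pos in avatar_parts.items():
        if distance(participant_hand, pos) <= PRE_TOUCH_DISTANCE[part]:
            return "look_at_face"
    # Awareness: participant has entered the avatar's personal space.
    if distance(participant_pos, avatar_pos) <= PERSONAL_SPACE:
        return "awareness_look"
    return "idle"
```

In a Unity deployment, this check would run per frame with the tracked hand and head positions supplied by the headset, and the returned label would select the corresponding animation state.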
2. Virtual Reality Immersive Experiences

All visuals were rendered on an Oculus Rift S (https://www.oculus.com/rift-s/, Menlo Park, CA, USA) head-mounted display driven by an MSI GP74 gaming laptop (https://www.msi.com, New Taipei City, Taiwan). Audio was delivered through Sennheiser HD 650 (https://www.sennheiser-hearing.com/en-UK/p/hd-650/, Wedemark, Germany) open-back headphones. Participant head rotation and position within the virtual environment were tracked with 6DoF using the Oculus Rift S cameras. Participants controlled the in-game avatar using the Oculus Touch controllers.
3. Virtual Reality Simulation Setup

The virtual reality simulations in this study were designed and run in Unreal Engine 4 (UE4) on a desktop computer with the following specifications: CPU (Intel i7-9700K), GPU (Nvidia RTX 2080), RAM (32 GB DDR4), and storage (500 GB SSD and 2 TB HDD). The VR head-mounted display (HMD) used was an Oculus Rift S, a tethered headset with a resolution of 1280 × 1440 per eye and an 80 Hz refresh rate that connects to the PC via DisplayPort.
4. Virtual Reality in Physiotherapy Management

The following test materials were used in our study:
Carl Zeiss VR ONE Plus—the ZEISS® VR ONE Plus contains two aspherical, biconvex lenses of +32.5 dioptres that accommodate every subject. A smartphone, which serves as the display, slides into the front tray of the head-mounted system at a distance of 44 mm from the lenses, allowing full 3D immersion.
Oculus Rift S—contains a number of sensors that track body movements. Thanks to infrared LEDs and cameras mounted on the goggles, it is possible to track rotational movements of the head and pinpoint its position in space. With the Touch VR controllers, carrying out various activities in virtual reality feels quite natural, much as in everyday life.
The physiotherapy management sessions in each group were held 3 times a week for 3 weeks, at 30 min per session, except for the first session, which lasted 60 min to ensure that all participants were fully familiarised with how the programme worked. All sessions were carried out in the subject's home environment (home, place of residence).
5. Embodied Experiences in Virtual Reality

The VR was delivered via an Oculus Rift S head-mounted display with connected Touch controllers (Oculus, Facebook Technologies, LLC, Menlo Park, USA) and a Windows computer (Alienware 17 R4 with NVIDIA GTX 1080 GPU, Dell Technologies Inc., Round Rock, Texas, USA). Four applications, all available on the Oculus store, were used. For VR-SH, the applications allowed the participant to embody a boxer (Creed: Rise to Glory), a superhero (The Avengers Powers Unite), and a rock climber (The Climb) (see Figure 1). For the VR-Play exposure, the application (Vacation Simulator) allowed embodiment of a cartoon-like character (Figure 1). The hardware and software afforded six degrees of freedom for head and hand movement. Creed: Rise to Glory and The Avengers Powers Unite included full-body avatars that were animated using head and hand tracking together with "best guess" algorithms, allowing dynamic full-body avatars without full-body motion capture. This approach yielded excellent synchronization of the avatar with the participant's upper body. Synchronization with the lower body was weaker, but the interactions primarily involved the upper body, and the lower body was only visualized when standing still.
6. Virtual Reality Experiment Setup

We used a high-performance gaming computer (CPU: Intel Core i7-7700K; GPU: Nvidia GTX 1070; RAM: 16 GB DDR4-3000; SSD: Samsung 850 Evo), which was connected to an Oculus Rift S head-mounted display (HMD; Oculus, Menlo Park, CA, USA). For interaction and locomotion within the virtual environment, wireless Oculus Touch controllers with 6 degrees of freedom were used. The Oculus Rift S features integrated inside-out tracking (without external sensors), named Oculus Insight, which was used for head and controller tracking. Sound was played through speakers integrated into the headband of the HMD. Participants completed all questionnaires on a notebook PC equipped with a touch screen.
7. CorFix: VR-Enabled Patient-Specific Vascular Graft Design

The VR surgical planning software, CorFix, was developed on the Unity 3D engine. The platform running the software was an Alienware Aurora R8 (Dell) with an Intel Core i7-9700 processor, an NVIDIA GeForce RTX 2080 Ti, and 16 GB of RAM. An Oculus Rift S was used to display CorFix in fully immersive VR, with Oculus Touch controllers integrated into the system for interacting with the interface. CorFix was previously designed to perform simple diagnosis (i.e., zoom, rotation, label, ruler, and clipping) and modeling (i.e., cutting vessels, parametric modeling, and free-form modeling) tasks. This version of CorFix had a modified user interface to accommodate clinicians untrained in VR, modeling software (e.g., CAD), or CFD. The interface was adapted so that users could intuitively design patient-specific vascular grafts in a short amount of time and integrate image analysis into the workflow.
8. Integrating 3D VR Surgical Planning

Anonymized DICOM files, together with 3D segmentation files, were loaded into our CardioVR surgical planning tool. CardioVR was developed in collaboration with MedicalVR (Amsterdam, The Netherlands). The CardioVR software enables immediate automatic CT-to-3D VR rendering and provides additional editing tools: the user can view the conventional 2D CT scan images, change visual scan settings (e.g., opacity), highlight structures by brushing (colouring) and erasing parts of the scan, and add, remove, or recolour additional 3D segmentations. All scans can be accurately reviewed with an HMD (Oculus Rift S, Oculus VR, Irvine, CA, USA) and controllers while 3D projections of the VR view are shown on a computer screen (Figure 1, Supplementary material online, Video S1). After initial assessment of the scans by a resident, the surgeon reviewed the scans in immersive VR, analysing the VR reconstruction one day before the procedure or on the same day (when the procedure was planned for the afternoon). The surgeon's analysis of the CT scan was recorded and saved so that the recordings could optionally be visualized and displayed on a monitor during surgery (Figure 1).
9. Oculus Rift S VR Equipment Setup

The VR equipment consisted of an Oculus Rift S head-mounted display (HMD) (Oculus VR, Irvine, CA, USA), sensors, touch controllers and a computer compatible with VR technology.
10. Virtual Reality Interviews for Qualitative Research

In the VE condition, interviewer and participant were in different rooms within the same building and communicated using an Oculus Rift S virtual reality (VR) headset. The Oculus Rift S creates a sense of complete immersion in a three-dimensional world (here, a bespoke interview environment) via a 2,560 × 1,440 high-resolution display (1,280 × 1,440 per eye) refreshing at 80 Hz. An on-board inertial measurement unit (IMU) combined with inside-out positional cameras allows translational and rotational movement to be tracked with 6 DoF. The headset tracks the movements of both head and body, then translates them into VR with realistic precision. Verbal communication was via 3D positional audio built directly into the headset, which was digitally recorded for transcription and coding. A bespoke virtual interview environment was developed for this research using Unreal Engine 4. The VE interview environment was purposely sparse and neutral, comprising a sofa, a table, and chairs—one chair for the avatar interviewer, the other for the avatar participant (see Figs. 1 and 2). Participants were offered limited choice regarding the appearance of their avatar, as were the interviewers: avatars could appear male or female. Participants and all interviewers chose to match their avatar to their own gender appearance.

[Figure: avatar and environment example]

