DReyeVR

Welcome to DReyeVR, a VR driving simulator for behavioural and interaction research.

Main Figure

Submission Video Demonstration (YouTube)


This project extends the Carla simulator to add virtual reality integration, a first-person maneuverable ego-vehicle, eye tracking support, and several immersion enhancements.

If you have questions, our F.A.Q. wiki page and issues page may already answer them.

IMPORTANT: DReyeVR currently supports only Carla 0.9.13 with Unreal Engine 4.26

Highlights

Ego Vehicle

Fully drivable virtual reality (VR) ego-vehicle with SteamVR integration (see EgoVehicle.h)

  • SteamVR HMD head tracking (orientation & position)
    • We have tested with the following devices:
      | Device           | VR Supported | Eye tracking | OS             |
      | ---------------- | ------------ | ------------ | -------------- |
      | HTC Vive Pro Eye | βœ…           | βœ…           | Windows, Linux |
      | Quest 2          | βœ…           | ❌           | Windows        |
    • While we haven't tested other headsets, they should still work for basic VR usage (not eye tracking) if supported by SteamVR.
    • Eye tracking is currently ONLY supported on the HTC Vive Pro Eye since we use SRanipal as the eye-tracker SDK. We are happy to support more devices through contributions that add other SDKs.
  • Vehicle controls:
    • Generic keyboard WASD + mouse
    • Support for Logitech steering wheels with this open-source LogitechWheelPlugin
      • Includes force-feedback with the steering wheel.
      • We used a Logitech G923 Racing Wheel & Pedals
        • A full list of supported devices can be found here, though we can't guarantee out-of-the-box functionality without testing.
  • Realistic (and parameterizable) rear & side view mirrors
    • WARNING: very performance intensive
  • Vehicle dashboard:
    • Speedometer (in miles-per-hour by default)
    • Gear indicator
    • Turn signals
  • Dynamic steering wheel
    • Adjustable parameters, responsive to steering input
    • See our documentation on this here
  • "Ego-centric" audio
    • Responsive engine revving (throttle-based)
    • Turn signal clicks
    • Gear switching
    • Collisions
  • Fully compatible with the existing Carla PythonAPI and ScenarioRunner
    • Minor modifications were made; see the Usage.md documentation. A minimal client sketch follows this list.
  • Fully compatible with the Carla Recorder and Replayer
    • Including HMD pose/orientation & sensor reenactment
  • Ability to hand off/take over control to/from Carla's AI wheeled-vehicle controller
  • Carla-based semantic segmentation camera (see Shaders/README.md)
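
Since DReyeVR keeps the stock Carla PythonAPI intact, a regular client can locate the ego vehicle and hand control to/from Carla's AI controller. The following is a minimal sketch using only standard PythonAPI calls; the "hero" role_name is an assumption about how the ego vehicle is configured, so check your own setup (see Usage.md).

```python
# Minimal sketch: connect to a running DReyeVR/Carla server and find the
# ego vehicle with the stock Carla PythonAPI. The "hero" role_name is an
# assumption -- check your ego-vehicle configuration (see Usage.md).
import carla

client = carla.Client("localhost", 2000)  # default Carla host/port
client.set_timeout(10.0)
world = client.get_world()

ego = None
for actor in world.get_actors().filter("vehicle.*"):
    if actor.attributes.get("role_name") == "hero":  # assumed role_name
        ego = actor
        break

if ego is not None:
    # Hand control to Carla's AI wheeled-vehicle controller (autopilot);
    # pass False to give control back to the human driver.
    ego.set_autopilot(True)
```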

Ego Sensor

The Carla-compatible ego-vehicle sensor (see EgoSensor.h) is an "invisible sensor" that tracks the following:

  • Real-time Eye tracking with the HTC Vive Pro Eye VR headset
    • Eye tracker data includes:
      • Timing information (from the headset, world, and eye-tracker)
      • 3D Eye gaze ray (left, right, & combined)
      • 2D Pupil position (left & right)
      • Pupil diameter (left & right)
      • Eye Openness (left & right)
      • Focus point in the world & hit actor information
      • See DReyeVRData.h:EyeTracker for the complete list
    • Eye reticle visualization in real time
  • Real-time user inputs (throttle, steering, brake, turn signals, etc.)
  • Image (screenshot) frame capture based on the camera
    • Typically used during replay rather than in real time because it is highly performance intensive.
  • Fully compatible with the LibCarla data serialization for streaming to a PythonAPI client (see LibCarla/Sensor); a minimal client sketch follows this list
    • We have also tested and verified (rospy) ROS integration with our sensor data streams
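
Because the ego sensor serializes its data through LibCarla like any other Carla sensor, a PythonAPI client can subscribe to it with the standard listen() callback. The sketch below is illustrative only: the actor filter pattern and the per-field layout of the data object are assumptions; see LibCarla/Sensor, DReyeVRData.h, and Usage.md for the exact identifiers.

```python
# Illustrative sketch: subscribe to the DReyeVR ego-sensor stream from a
# PythonAPI client. The filter pattern below is a placeholder, not a
# confirmed blueprint id; see LibCarla/Sensor and Docs/Usage.md.
import carla

client = carla.Client("localhost", 2000)
client.set_timeout(10.0)
world = client.get_world()

# The ego sensor is spawned by the simulator, so find the existing actor.
sensors = list(world.get_actors().filter("harplab.dreyevr*"))  # placeholder
assert sensors, "DReyeVR ego sensor not found -- check the filter pattern"
ego_sensor = sensors[0]

def on_data(data):
    # Eye-tracking fields (gaze ray, pupil diameter, ...) live on the
    # serialized data object (see DReyeVRData.h); frame/timestamp are the
    # generic carla.SensorData fields.
    print(f"frame={data.frame} timestamp={data.timestamp:.3f}")

ego_sensor.listen(on_data)  # standard carla.Sensor callback registration
```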

Other additions:

  • Custom DReyeVR config file for one-time runtime parameters. See DReyeVRConfig.ini
    • Especially useful for changing parameters without recompiling everything.
    • Uses standard C++ I/O to read the file with minimal performance impact. See DReyeVRUtils.h.
  • World ambient audio
  • Non-ego-centric audio (Engine revving from non-ego vehicles)
  • Synchronized replay with per-frame capture for post-hoc analysis (see Docs/Usage.md); a recorder/replayer client sketch follows this list
  • Recorder/replayer media functions
    • Added in-game keyboard commands for Play/Pause/Forward/Backward, etc.
  • Static in-environment directional signs for natural navigation (See Docs/Signs.md)
  • Added weather to the Carla recorder/replayer/query (see this Carla PR)
  • Custom dynamic 3D actors with full recording support (e.g. HUD indicators for direction, AR bounding boxes, visual targets, etc.). See CustomActor.md for more.
  • (DEBUG ONLY) Foveated rendering for improved performance with gaze-aware (or fixed) variable rate shading
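
The recorder and replayer are driven through the standard Carla client API; DReyeVR records its additional data (HMD pose, eye tracking, etc.) into the same log file. Below is a minimal sketch using stock PythonAPI calls; the file name is illustrative, and replay-time frame capture is configured separately (see Docs/Usage.md).

```python
# Minimal sketch: drive the standard Carla recorder/replayer from a client.
# DReyeVR data (HMD pose, eye tracking, ...) goes into the same log file.
# The file name is illustrative; see Docs/Usage.md for replay details.
import carla

client = carla.Client("localhost", 2000)
client.set_timeout(10.0)

client.start_recorder("dreyevr_session.log")  # begin recording
# ... drive around in VR ...
client.stop_recorder()

# Replay the whole file: start=0.0 and duration=0.0 mean "from the beginning,
# to the end"; follow_id=0 means no actor is followed by the spectator camera.
client.replay_file("dreyevr_session.log", 0.0, 0.0, 0)
```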

Install/Build

See Docs/Install.md to:

  • Install and build DReyeVR on top of a working Carla repository.
  • Download plugins for DReyeVR required for fancy features such as:
    • Eye tracking (SRanipal)
    • Steering wheel/pedals (Logitech)
  • Set up a conda environment for DReyeVR PythonAPI

OS compatibility

| OS      | VR | Eye tracking | Audio | Keyboard+Mouse | Racing wheel | Foveated Rendering (Editor) |
| ------- | -- | ------------ | ----- | -------------- | ------------ | --------------------------- |
| Windows | βœ… | βœ…           | βœ…    | βœ…             | βœ…           | βœ…                          |
| Linux   | βœ… | ❌           | βœ…    | βœ…             | ❌           | ❌                          |
| MacOS   | ❌ | ❌           | βœ…    | βœ…             | ❌           | ❌                          |

  • While Windows (10) is recommended for optimized VR support, all our work translates to Linux systems except for the eye tracking and hardware integration, which have Windows-only dependencies.
    • Unfortunately, the eye-tracking firmware is proprietary & does not work on Linux
      • This is (currently) Windows-only because of proprietary dependencies between the HTC SRanipal SDK and Tobii's SDK. Those interested in the Linux discussion for HTC's Vive Pro Eye tracking can follow the subject here (Vive), here (Vive), and here (Tobii).
    • Additionally, the LogitechWheelPlugin we use currently only supports Windows, though it should be possible to use the G923 on Linux as per the Arch Wiki.
  • Also, although MacOS is not officially supported by CARLA, we have development happening on an Apple Silicon machine and have active forks of CARLA + UE4.26 with MacOS 12+ support. Note that this is primarily for development, as it is the most limited system by far.

Documentation & Guides

  • See the F.A.Q. wiki for answers to frequently asked questions.
  • See Install.md to install and build DReyeVR
  • See Usage.md to learn how to use our provided DReyeVR features
  • See Development.md to get started with DReyeVR development and add new features
  • See Docs/Tutorials/ to view several DReyeVR tutorials such as customizing the EgoVehicle, adding custom signs/props and more.

Citation

If you use our work, please cite the corresponding paper:

@inproceedings{silvera2022dreyevr,
  title={DReyeVR: Democratizing Virtual Reality Driving Simulation for Behavioural \& Interaction Research},
  author={Silvera, Gustavo and Biswas, Abhijat and Admoni, Henny},
  booktitle={Proceedings of the 2022 ACM/IEEE International Conference on Human-Robot Interaction},
  pages={639--643},
  year={2022}
}

Acknowledgements

  • This project builds upon and extends the CARLA simulator
  • This repo includes some code from CARLA, which is copyright the Computer Vision Center (CVC) at the Universitat Autonoma de Barcelona (UAB) & Intel Corporation.
  • This repo includes some code from Hewlett-Packard Development Company, LP. See nvidia.ph. This is a modified diagnostic tool used during development.

Licenses

  • Custom DReyeVR code is distributed under the MIT License.
  • Unreal Engine 4 follows its own license terms.
  • Code used from other sources that is prefixed with a Copyright header belongs to those individuals/organizations.
  • CARLA-specific licenses (and dependencies) are described on their GitHub.