Sensor-enabled Calibration of VR-Integrated Co-Simulation Platforms for Enhanced Accuracy in Multi-modal Mobility Models
2024-10-01
Details
Edition: Final Report
Abstract: This paper introduces VR-WISE (Virtual Reality-based Work Zone Integrated Simulation Environment), a high-fidelity co-simulation platform that integrates realistic roadway worker models into CARLA-hosted driving simulators to capture a range of vehicle-work zone interactions. The virtual reality (VR)-based platform uses eye-tracking and biometric sensors to monitor drivers' situational awareness as they navigate work zones populated by 3D animated workers, whose movements are mapped with human motion tracking in Maya. VR-WISE supports rapid customization of diverse work zone scenarios and simulates worker-driver interactions, emphasizing human-robot integration to offer sustainable solutions for improving work zone safety. In parallel, this research task aims to develop a comprehensive vocabulary of calibration metrics for each simulation environment within the multi-simulation platform, initially focusing on integrating VR with driving and microscopic traffic simulators such as CARLA and SUMO. This vocabulary will serve as the foundation for accurate calibration and synchronization of the diverse simulation environments, ensuring that simulation outcomes align closely with real-world observations. The research will also develop methods to expand the library of calibration data sources, encompassing human data inputs, vehicle dynamics, traffic flow, and situational awareness metrics from VR-based simulations. The ultimate goal is to create an immersive environment that facilitates the calibration of multi-modal mobility models, with calibration metrics designed to bridge the gap between real-world conditions and the simulation platform. The platform integrates a high-fidelity physics engine, external hardware for human-in-the-loop interaction, sensor-based capture of human behavior, and full-body animation resources. Captured human data, such as gaze duration, fixation ratio, blood volume pulse (BVP), heart rate (HR), and electrodermal activity (EDA), support behavior analysis and are crucial for calibrating both the transportation models and the overall system.
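
The co-simulation integration described in the abstract hinges on keeping the driving simulator and the traffic simulator on a shared clock. The following minimal sketch, which is not the project's actual bridge code, shows one way to run CARLA and SUMO in lock step using their public Python interfaces; the SUMO configuration file name and the step length are assumptions for illustration, and the mapping of SUMO vehicles onto CARLA actors is only indicated in a comment.

# Minimal sketch of lock-step co-simulation between CARLA and SUMO.
# Assumes a CARLA server on localhost:2000 and a hypothetical SUMO
# configuration file "work_zone.sumocfg"; both are placeholders.

import carla   # CARLA Python API
import traci   # SUMO's TraCI interface

STEP = 0.05  # shared simulation step length in seconds (assumed value)

def run_cosimulation(steps=1000):
    # Connect to the CARLA server and force synchronous, fixed-step mode
    client = carla.Client("localhost", 2000)
    client.set_timeout(10.0)
    world = client.get_world()
    settings = world.get_settings()
    settings.synchronous_mode = True
    settings.fixed_delta_seconds = STEP
    world.apply_settings(settings)

    # Start SUMO with the same step length so both clocks stay aligned
    traci.start(["sumo", "-c", "work_zone.sumocfg", "--step-length", str(STEP)])

    try:
        for _ in range(steps):
            traci.simulationStep()   # advance SUMO by one step
            world.tick()             # advance CARLA by one step

            # Pull SUMO vehicle states that a bridge layer would map onto
            # CARLA actors (the actor mapping itself is omitted here)
            for vid in traci.vehicle.getIDList():
                x, y = traci.vehicle.getPosition(vid)
                speed = traci.vehicle.getSpeed(vid)
                # ... update or spawn the matching CARLA actor ...
    finally:
        traci.close()

if __name__ == "__main__":
    run_cosimulation()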
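
The abstract also names the sensor channels used for calibration (gaze duration, fixation ratio, BVP, HR, EDA). The short sketch below illustrates one way such per-sample signals could be aggregated into scalar calibration metrics; the data structures, field names, and sampling assumptions are hypothetical and do not reflect the platform's actual data schema.

# Illustrative aggregation of eye-tracking and BVP-derived metrics.
# GazeSample and its fields are assumed structures, not the platform's schema.

from dataclasses import dataclass
from typing import Dict, List

@dataclass
class GazeSample:
    t: float            # timestamp in seconds
    is_fixation: bool   # sample classified as part of a fixation
    on_work_zone: bool  # gaze falls on the work-zone area of interest

def gaze_metrics(samples: List[GazeSample], dt: float) -> Dict[str, float]:
    """Aggregate per-sample gaze data recorded every dt seconds."""
    if not samples:
        return {"gaze_duration_s": 0.0, "fixation_ratio": 0.0}
    total_time = len(samples) * dt
    fixation_time = sum(dt for s in samples if s.is_fixation)
    work_zone_time = sum(dt for s in samples if s.on_work_zone)
    return {
        "gaze_duration_s": work_zone_time,             # time spent on the work zone
        "fixation_ratio": fixation_time / total_time,  # share of time in fixations
    }

def mean_heart_rate(bvp_peak_times: List[float]) -> float:
    """Mean HR in beats per minute from successive BVP peak timestamps
    (expects at least two peaks)."""
    intervals = [b - a for a, b in zip(bvp_peak_times, bvp_peak_times[1:])]
    return 60.0 / (sum(intervals) / len(intervals))

Metrics of this kind, computed identically from simulator logs and from field or naturalistic data, give the comparable quantities that calibration against real-world observations requires.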