Using extended reality (XR) for medical training and real-time clinical support during deep space missions

Publication Date

1-1-2023

Document Type

Article

Publication Title

Applied Ergonomics

Volume

106

DOI

10.1016/j.apergo.2022.103902

Abstract

Medical events can affect space crew health and compromise the success of deep space missions. To manage such events successfully, crew members must be sufficiently prepared to handle certain medical conditions for which they are not technically trained. Extended Reality (XR) can provide an immersive, realistic user experience that, when integrated with augmented clinical tools (ACT), can improve training outcomes and provide real-time guidance during non-routine tasks and diagnostic and therapeutic procedures. The goal of this study was to develop a framework to guide XR platform development, using astronaut medical training and guidance as the domain for illustration. We conducted a mixed-methods study, using video conference meetings with 45 subject-matter experts, Delphi panel surveys, and a web-based card sorting application, to develop a standard taxonomy of essential XR capabilities. We augmented this taxonomy by identifying additional models and taxonomies from related fields. Together, this "taxonomy of taxonomies" and the essential XR capabilities identified serve as an initial framework for structuring the development of XR-based medical training and guidance for use during deep space exploration missions. We provide a schematic approach, illustrated with a use case, for how this framework and the materials generated through this study might be employed.

Funding Number

0506

Funding Sponsor

National Aeronautics and Space Administration

Keywords

Clinical decision support, Clinical guidance, Deep space missions, Extended reality, Medical training

Department

Research Foundation
