Publication Date

3-1-2024

Document Type

Article

Publication Title

Applied Sciences (Switzerland)

Volume

14

Issue

5

DOI

10.3390/app14052005

Abstract

The conventional design cycle in human–computer interaction faces significant challenges when applied to users in isolated settings, such as astronauts in extreme environments. These challenges include obtaining user feedback and effectively tracking human–software and human–human dynamics during system interactions. This study addresses these issues by exploring the potential of remote conversation analysis to validate the usability of collaborative technology, supplemented with a traditional post hoc survey approach. Specifically, we evaluate an integrated timeline software tool used in NASA’s Human Exploration Research Analog. Our findings indicate that voice recordings, which capture the topical content of intra-crew speech, can serve as non-intrusive metrics of essential dynamics in human–machine interactions. The results emphasize the collaborative nature of the self-scheduling process and suggest that tracking conversations may serve as a viable proxy for assessing workload in remote environments.

Funding Number

80JSC017N0001-BPBA

Funding Sponsor

National Aeronautics and Space Administration

Keywords

collaborative work, conversation analysis, human-in-the-loop, remote observation, self-scheduling, usability evaluation, user interaction development methodology

Creative Commons License

This work is licensed under a Creative Commons Attribution 4.0 License.

Department

Research Foundation
