Adaptive Human-Machine Interaction for Enhanced Human-Robot Collaboration in Construction

Restricted (Penn State Only)
- Author:
- Shayesteh, Shayan
- Graduate Program:
- Architectural Engineering
- Degree:
- Doctor of Philosophy
- Document Type:
- Dissertation
- Date of Defense:
- June 07, 2024
- Committee Members:
- James Freihaut, Program Head/Chair
John Messner, Major Field Member
Houtan Jebelli, Chair & Dissertation Advisor
Asok Ray, Outside Unit & Field Member
Dorukalp Durmus, Major Field Member
- Keywords:
- Human-Robot Collaboration
Immersive Virtual Environment
Physiological Sensing
Adaptive User Interface
Human-Machine Interaction
Construction Safety
- Abstract:
- The construction industry is undergoing a significant transformation as advanced artificial intelligence (AI), mechatronics, and sensing technologies are integrated to tackle long-standing challenges such as stagnant productivity, safety concerns, and labor shortages. Deploying robotic systems to assist human workers in demanding tasks, while beneficial, requires continuous human supervision and collaboration due to the unpredictable and complex nature of construction sites, underscoring the need for safe and efficient human-robot interaction. To address this need, immersive user interfaces have emerged to facilitate human-robot interaction by offering simulated settings that replicate real-world construction scenarios. However, conventional immersive interfaces often fail to meet the diverse needs of individual users. To bridge this gap, this research introduces an adaptive, immersive human-machine interface designed to enhance human-robot collaboration in construction. This novel interface combines immersive technologies, physiological sensing, and AI, enabling users to acquire a multi-sensory understanding of the system’s status, communicate spontaneous responses intelligently, and dynamically adjust the interface through an AI-driven adaptive mechanism. The research progressed through three main steps: 1) the development of a multi-modal sensory system integrating enhanced visual and kinesthetic cues to augment user perception within the immersive environment; 2) the establishment of a physiological sensing method to objectively assess user responses by measuring cognitive load, trust, and engagement; and 3) the creation of an AI-based co-adaptation mechanism that tailors the system’s performance in real time according to user feedback. Experimental results demonstrated that the proposed interface significantly improved the user experience across various human-robot collaborative tasks, with physiological metrics effectively conveying users’ states to drive responsive performance adjustments. As such, this research presents an innovative approach to combining immersive technologies, predictive modeling, and non-intrusive wearable physiological sensing to enhance human-robot collaboration, with the potential to improve worker safety and overall productivity in the construction sector.
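
To make the third step concrete, the sketch below illustrates one plausible shape of such a sense-estimate-adapt loop. It is not taken from the dissertation: the signal names, the toy load model, the thresholds, and the adjustable interface parameters are all illustrative assumptions standing in for the dissertation's learned models and wearable sensing pipeline.

```python
# Illustrative sketch (not the dissertation's implementation) of an
# AI-driven co-adaptation loop: wearable physiological signals ->
# cognitive-load estimate -> interface adjustment. All names,
# thresholds, and simulated readings are hypothetical.
import random
from dataclasses import dataclass

@dataclass
class InterfaceSettings:
    robot_speed: float = 1.0    # normalized robot motion speed
    cue_intensity: float = 0.5  # strength of visual/kinesthetic cues

def estimate_cognitive_load(eda: float, heart_rate: float) -> float:
    """Toy stand-in for a learned model: maps normalized electrodermal
    activity (EDA) and heart rate to a 0-1 cognitive-load score."""
    return min(1.0, max(0.0, 0.6 * eda + 0.4 * (heart_rate - 60) / 60))

def adapt(settings: InterfaceSettings, load: float) -> InterfaceSettings:
    """Slow the robot and strengthen cues when the user appears
    overloaded; do the opposite when load is low, within bounds."""
    if load > 0.7:
        settings.robot_speed = max(0.2, settings.robot_speed - 0.1)
        settings.cue_intensity = min(1.0, settings.cue_intensity + 0.1)
    elif load < 0.3:
        settings.robot_speed = min(1.0, settings.robot_speed + 0.1)
        settings.cue_intensity = max(0.1, settings.cue_intensity - 0.1)
    return settings

settings = InterfaceSettings()
for step in range(10):                  # one pass per sensing window
    eda = random.uniform(0.0, 1.0)      # simulated wearable readings
    heart_rate = random.uniform(60, 120)
    load = estimate_cognitive_load(eda, heart_rate)
    settings = adapt(settings, load)
    print(f"step {step}: load={load:.2f} "
          f"speed={settings.robot_speed:.1f} cues={settings.cue_intensity:.1f}")
```

In the dissertation's framing, the hand-written estimator and threshold rules above would be replaced by predictive models trained on measured cognitive load, trust, and engagement, closing the loop between the user's physiological state and the immersive interface's behavior.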