Effective and Pleasant Interactive Dialog System Design for Guiding People with Visual Impairments
Open Access
- Author:
- Lee, Sooyeon
- Graduate Program:
- Information Sciences and Technology
- Degree:
- Doctor of Philosophy
- Document Type:
- Dissertation
- Date of Defense:
- September 04, 2020
- Committee Members:
- Jack Carroll, Dissertation Advisor/Co-Advisor
Jack Carroll, Committee Chair/Co-Chair
Mary Beth Rosson, Committee Member
Steven R. Haynes, Committee Member
Vijaykrishnan Narayanan, Outside Member
Mary Beth Rosson, Program Head/Chair
- Keywords:
- People with visual impairments
Non-visual directional guidance
Assistive Technology
Remote Sighted Assistance
Non-visual hand navigation
Blind grocery shopping
Meaning of Independence
Multimodal guidance interface
Haptic guidance
- Abstract:
- The worldwide population of people with visual impairments has been growing rapidly, driven largely by population ageing; the estimated number of people with visual impairments (PVI) now exceeds 285 million. The rising number of PVI and their long-acknowledged struggles have motivated a great deal of research in academia and industry, resulting in an abundance of assistive technology development. Despite these research efforts, however, the everyday challenges that PVI encounter remain daunting. PVI still rely on guide dogs and white canes as their primary navigation and wayfinding tools. Moreover, they frequently seek assistance from sighted people for various daily activities and mundane tasks, such as finding clothes, picking up dropped objects, and grocery shopping. Because of their limited vision, PVI use other viable sensory channels, such as hearing and touch, to understand and interact with the world around them. It is therefore important for assistive technologies to present information or guidance via auditory or tactile modes and to provide an interface that accommodates the way PVI process non-visual information. Research in this area to date has focused on the interpretation of visual information (object and scene recognition), but the interface through which information is conveyed and the communication mode employed, both core parts of the user experience, remain under-researched. Although some studies have assessed different modalities and interaction interfaces for guiding PVI, they are limited in that most focus only on navigation tasks, and their results vary. Previous work has not fully explored the processes by which PVI take in, comprehend, and interpret information in various modalities and in different contexts. In this dissertation, I present a research effort focused on this overlooked area, filling this gap in the research.
A grocery shopping task was selected for this research because it includes all components of guidance for PVI, including object recognition and body and hand navigation. Furthermore, it is an essential everyday activity that has remained a complex, unsolved problem for PVI. I used human-centered design as a framework throughout all of my research investigations. I conducted ethnographic and experimental field and lab studies that shed light on the kinds of assistive interactions needed to support the needs and desires of PVI. I also evaluated the effectiveness of a multimodal interface design using a Wizard of Oz methodology and investigated the conversational assistive experiences provided by remote sighted assistance (RSA). These studies inspired a study of the meaning of independence to PVI, as well as empirical and scenario-based design studies of an enhanced RSA model incorporating multimodal communication and computer vision technology. I synthesize and interpret the findings and results of each of these studies, discuss broad design implications, and propose considerations for the design of assistive guidance interfaces. Finally, I discuss how the concluding studies of remote sighted assistance inform future research and development directions, including fully automated, assistive artificial intelligence guidance systems.