Augmentative and Alternative Communication

Persons with complex communication needs (CCNs) use augmentative and alternative communication (AAC) devices to express thoughts, needs, wants, and ideas when they cannot rely on their own speech to communicate. Examples of these devices include specialized keyboards, adapted controllers, and text-to-speech interfaces.

We are exploring the different ways people use AAC devices in dyadic interactions to inform new designs that reduce the burden on users with CCNs and motor impairments and enable more personal AAC interfaces. For more information, please contact Stephanie.

Markerless 3D Human Pose Forecasting

It is widely agreed that effective interaction with humans is a challenging problem that first requires robots to be able to perceive them. At HARP Lab, we believe that enabling robots to model human intent and predict human behavior would allow for more natural human-robot interaction. To this end, we are working on human pose forecasting methods with low computational cost. Abhijat would be happy to provide more information!
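As a rough illustration of what a low-cost forecasting baseline can look like (this is a hypothetical sketch, not the lab's method), a constant-velocity model simply extrapolates each joint's most recent motion forward in time:

```python
import numpy as np

def forecast_poses(history, horizon):
    """Constant-velocity forecast of 3D joint positions.

    history: array of shape (T, J, 3) -- T observed frames of J joints.
    horizon: number of future frames to predict.
    Returns an array of shape (horizon, J, 3).
    """
    # Per-frame velocity estimated from the last two observed frames.
    velocity = history[-1] - history[-2]
    # Linearly extrapolate from the most recent frame.
    steps = np.arange(1, horizon + 1).reshape(-1, 1, 1)
    return history[-1] + steps * velocity

# Example: two joints translating along x at 0.1 units per frame.
hist = np.zeros((3, 2, 3))
hist[:, :, 0] = np.arange(3).reshape(-1, 1) * 0.1
pred = forecast_poses(hist, 2)
```

Baselines like this run in microseconds and are a common yardstick against which learned forecasters are measured.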

Assistive Manipulation Through Intent Recognition

An upper-body mobility limitation can severely impact a person’s quality of life, preventing everyday tasks such as picking up a cup or opening a door. The U.S. Census Bureau has indicated that more than 8.2% of the U.S. population, or 19.9 million Americans, live with upper-body limitations.

Assistive robots offer a way for people with severe mobility impairments to complete daily tasks. However, current assistive robots primarily operate through teleoperation, which requires significant cognitive and physical effort from the user. We explore how these assistive robots can be improved with artificial intelligence to take an active role in helping their users. Drawing from our understanding of human verbal and nonverbal behaviors (such as speech and eye gaze) during robot teleoperation, we study how intelligent robots can predict human intent during a task and assist toward task completion. We aim to develop technology that decreases operator fatigue and task duration when using assistive robots by employing human-sensitive shared autonomy. Reuben and Ben are the contacts on this project.
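A common form of shared autonomy blends the user's teleoperation command with an autonomous command toward the robot's predicted goal, weighted by how confident the intent predictor is. The sketch below is a minimal, hypothetical illustration of that idea (the function and parameter names are our own, not a lab API):

```python
import numpy as np

def blend_commands(user_cmd, goals, confidences, ee_pos, gain=1.0):
    """Confidence-weighted shared-autonomy blending (illustrative sketch).

    user_cmd:    user's teleoperation velocity command, shape (3,).
    goals:       candidate goal positions, shape (G, 3).
    confidences: predicted probability of each goal, shape (G,).
    ee_pos:      current end-effector position, shape (3,).
    Returns the blended velocity command, shape (3,).
    """
    # Pick the most likely goal under the intent predictor.
    i = int(np.argmax(confidences))
    conf = float(confidences[i])
    # Autonomous command: move the end effector toward that goal.
    robot_cmd = gain * (np.asarray(goals[i]) - np.asarray(ee_pos))
    # High confidence -> more assistance; low confidence -> defer to the user.
    return conf * robot_cmd + (1.0 - conf) * np.asarray(user_cmd)

# Example: the predictor is 80% sure the user wants the cup at (1, 0, 0).
blended = blend_commands(
    user_cmd=np.array([0.0, 0.0, 1.0]),
    goals=np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]),
    confidences=np.array([0.8, 0.2]),
    ee_pos=np.zeros(3),
)
```

Linear blending of this kind is one simple arbitration scheme; richer formulations treat assistance as a policy acting under uncertainty over the user's goal.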