The goal of the Human And Robot Partners (HARP) lab is to understand and develop autonomous, intelligent robots that help people live better. Our robots engage people through social and physical interactions, monitoring human behavior to understand and predict the types of help people need. Our lab’s expertise includes robotics, human-robot interaction, machine learning, computer vision, artificial intelligence, human-computer interaction, and cognitive science. Application domains include assistive robot manipulators for people with motor impairments, robot tutors for education, and robot therapy assistants for people with cognitive or social disabilities.
Stop by our lab in NSH 4502 for a visit!
If you’re interested in getting involved with our research, please read this first.
29 April ’21: Pallavi and Tesca Fitzgerald’s joint work with collaborators at UT Austin on “Understanding the Relationship between Interactions and Outcomes in Human-in-the-Loop Machine Learning” was also accepted to the IJCAI 2021 Survey Track!
29 April ’21: Pallavi Koppol’s paper on “Interaction Considerations in Learning from Humans” was accepted to IJCAI 2021!
20 April ’21: George Yu’s Summer Undergraduate Research Fellowship (SURF) proposal, titled “Determining the Dynamics of the Functional Field of View During Driving,” was accepted, and George will be joining the HARP lab for summer research. Congratulations, George!
5 April ’21: Congratulations to Pallavi Koppol on her Speaking Qualifier talk, “Learning with People”!
1 Feb ’21: Michelle’s paper on “Adapting Language Complexity for AI-Based Assistance” was accepted to the Lifelong Learning and Personalization in Long-Term Human-Robot Interaction workshop at HRI 2021!