Speaker: Prof. Tapomayukh Bhattacharjee, Cornell University (Dept. of Computer Science)
Date: Jul 5, 2024
Time: 2:00PM – 3:30PM SGT
Venue: Meeting Room 20, COM3 #02-59
Abstract:
How do we build robots that can assist people with mobility limitations in activities of daily living? To perform these activities successfully, a robot must be able to physically interact with humans and objects in unstructured human environments. In this talk, I will cover projects in my lab that showcase fundamental advances in physical robotic caregiving, a field that involves complex and uncertain physical human-robot interaction. Specifically, I will show how we build caregiving robots that perform activities of daily living such as feeding, meal preparation, and bed bathing, using our newly developed caregiving simulation tools and algorithms that leverage multimodal perception and user feedback, and how we have deployed these systems in the real world with real users.
Biography:
Tapomayukh “Tapo” Bhattacharjee is an Assistant Professor in the Department of Computer Science at Cornell University, where he directs the EmPRISE Lab (https://emprise.cs.cornell.edu/). He completed his Ph.D. in Robotics at the Georgia Institute of Technology and was an NIH Ruth L. Kirschstein NRSA postdoctoral research associate in Computer Science & Engineering at the University of Washington. His goal is to enable robots to assist people with mobility limitations in activities of daily living. His work spans the fields of human-robot interaction, haptic perception, and robot manipulation, and focuses on the fundamental research question of how to leverage robot-world physical interactions in unstructured human environments to perform relevant activities of daily living. He is the recipient of the TRI Young Faculty Researcher Award’24 and the NSF CAREER Award’23, and his work has won Best Paper Award Finalist and the Best Demo Award at HRI’24, the Best RoboCup Paper Award at IROS’22, Best Paper Award Finalist and Best Student Paper Award Finalist at IROS’22, the Best Technical Advances Paper Award at HRI’19, and the Best Demonstration Award at NeurIPS’18. His work has also been featured in many media outlets, including the BBC, Reuters, The New York Times, IEEE Spectrum, and GeekWire, and his robot-assisted feeding work was selected as one of the best interactive designs of 2019 by Fast Company.