This project aims to enable computer vision capabilities on a mobile robot so that decisions can be made locally, in real time, based on video input. This includes the design and initial development of a vision API. It could also include adding vision to the UR5e arm so that QR-type tags can (1) mark fiducial positions, providing a coordinate-system transformation that enables accurate interaction between the mobile arm and objects on static tables, and (2) identify objects of interest called out by name in a protocol: “beaker 7”, “start button”, “trash bin”.
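As a rough sketch of the coordinate-system transformation idea: if a detected fiducial tag defines the table frame, an object's position expressed in that frame can be chained through the camera pose into the robot base frame using homogeneous transforms. The poses below are illustrative placeholders; in practice they would come from tag detection (e.g. OpenCV ArUco pose estimation) and the UR5e kinematics.

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical example poses (identity rotations for simplicity):
# camera pose expressed in the robot base frame
T_base_cam = make_transform(np.eye(3), [0.3, 0.0, 0.5])
# fiducial tag pose as seen by the camera (the tag defines the table frame)
T_cam_tag = make_transform(np.eye(3), [0.0, 0.1, 0.4])

# A point of interest in the tag/table frame (e.g. where "beaker 7" sits),
# in homogeneous coordinates
p_tag = np.array([0.05, 0.02, 0.0, 1.0])

# Chain the transforms: base <- camera <- tag
p_base = T_base_cam @ T_cam_tag @ p_tag
print(p_base[:3])  # object position in the robot base frame
```

With real rotations in both transforms, the same chaining gives the arm an accurate target pose for any object whose position is known relative to the tag.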
Related projects include the development of vision packages for real-time coordination of mobility (based on the MiR250's cameras), dexterity (based on UR5e movements), and experiment control based on experiment-specific tagging.
Mentors: Rory Butler
Students: Amaan Khan, Halona Dantes, Saaya Patel