Supporting software development to coordinate CAESAR's gestures with voice commands.
Lisa Ali and Michael Zitolo
This week was very intense for us in the Mechatronics and Controls lab. Although we succeeded in having the Arduino communicate with the ArbotiX and the computer over software serial, we noticed that the calculated object locations were incorrect. We brainstormed ways to troubleshoot the problem: first we checked the coordinates the computer was sending to the Arduino, then we checked the algorithm by which the distance was calculated, and finally we examined the information the ArbotiX was sending to the Arduino. After extensive testing, we believe that one or both of the cameras serving as CAESAR's eyes are not mounted on the plastic platform perfectly radially, which might account for the discrepancy in our location readings. Preliminary testing suggests an offset of about 4 degrees. We will use our final week to find a way to get a more accurate reading.
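To see why a small mounting error matters, here is a minimal sketch of two-camera triangulation with a fixed angular offset folded into one camera's bearing. The baseline, the 4-degree figure's assignment to the left camera, and all function names are illustrative assumptions, not CAESAR's actual code; the point is only that subtracting the measured mounting offset from the raw bearing restores the correct location estimate.

```python
import math

def triangulate(theta_l, theta_r, baseline):
    """Locate an object from the bearing angles (radians, measured from each
    camera's forward axis, positive to the right) reported by two cameras
    mounted a fixed baseline apart, facing the same direction."""
    denom = math.tan(theta_l) - math.tan(theta_r)
    if abs(denom) < 1e-9:
        raise ValueError("rays are parallel; no intersection")
    y = baseline / denom                       # depth in front of the cameras
    x = -baseline / 2 + y * math.tan(theta_l)  # lateral offset from midpoint
    return x, y

# Illustrative values -- the real baseline and offset would be measured
# on CAESAR's camera platform.
BASELINE = 0.10               # meters between camera centers (assumed)
OFFSET_L = math.radians(4.0)  # suspected 4-degree mounting error, left camera

# Simulate an object 1 m straight ahead; the left camera's reading
# is skewed by the mounting offset.
true_l = math.atan2(0.05, 1.0)
true_r = math.atan2(-0.05, 1.0)
measured_l = true_l + OFFSET_L

x_bad, y_bad = triangulate(measured_l, true_r, BASELINE)             # skewed
x_fix, y_fix = triangulate(measured_l - OFFSET_L, true_r, BASELINE)  # corrected
```

With these numbers the uncorrected depth estimate lands well short of the true 1 m, while subtracting the offset recovers it exactly, which is consistent with a few degrees of camera misalignment producing the kind of discrepancy we observed.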
As of last week, the design team and I solved the issue of how to rework CAESAR's mouth so it can smile and frown more effectively to convey emotions. I referenced Robodyssey's ESRA II robot, which uses four servo motors to flex an elastic band, and asked whether we could design something similar with CAESAR's two servo motors. Miles created a sketch based on this suggestion, with two attachments to CAESAR's mouth that let the servos stretch and bend a small band spanning the mouth. After printing the new mouth on the MakerBot, we tested various expressions. This week I also interviewed Julie Russell, the Educational Director at the Brooklyn Autism Center, for her feedback on our project and to see whether she would be interested in testing our robot at a future date. She agreed that repetition of facial expressions is useful and consistent with existing autism therapies, but added that it would be even more useful for the robot to teach coping strategies for handling difficult emotions. I incorporated her insights into a draft of the paper for the Human-Robot Interaction conference, and Ms. Russell expressed interest in testing CAESAR when we're ready.