Jules, the posh robot from the University of Bristol, U.K., is equipped with tiny motors under its skin that let it accurately mimic human facial expressions. Jules is only a disembodied head, though, and while its copycat technique is impressive, robots need to do more than copy us to interact with us on an emotional level. A step up the emotional-adeptness scale is AIDA, a driving companion that uses facial expressions to respond to the driver's mood—looking sad if a seatbelt is undone, for instance, or detecting that the driver is tense and helping them relax.