NINA
Humanoid Robot

Videos

Beaming of the iCub NINA with a chessboard, using a Sony HMD with an Arrington eye tracker: head (21/7/2014)
Beaming of the iCub NINA using an HTC Vive HMD with an embedded SMI eye tracker: head, eyes, jaw and lips (20/6/2017)
NINA replicating a real interview conducted by Alessandra Juphard with an elderly subject. The videos are filmed from the interviewee's perspective. NINA holds a dummy tablet to give the impression that it is triggering the displays (the words to be learnt are shown on another tablet placed on the table in front of the subject) and keeping score.
  1. first test (19/5/2016): behaviors are triggered by events: expressive text-to-speech synthesis, gazing, pointing & clicking, etc. (see the sketch after this list)
  2. latest performance (10/6/2016): adding the iris, blinks, new gaze events, etc.
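
In these recordings, behaviors are fired by interview events rather than played on a fixed timeline. The sketch below illustrates that idea with a hypothetical event-to-behavior dispatcher in Python; the channel names, helper functions and rules are illustrative assumptions, not the actual NINA control code.

```python
# Minimal sketch of event-triggered behaviors, assuming a hypothetical
# send_command() bridge to the robot's speech, gaze and arm controllers;
# none of these names come from the NINA software itself.

def send_command(channel, value):
    """Placeholder: forward a primitive (TTS, gaze, gesture) to the robot."""
    print(f"[{channel}] {value}")

def speak(text):
    return lambda event: send_command("tts", text)

def gaze_at(target):
    return lambda event: send_command("gaze", event.get("target", target))

def point_and_click(target):
    return lambda event: (send_command("arm", f"point:{target}"),
                          send_command("tablet", "click"))

# Each interview event maps to a short list of behaviors, mirroring the
# "behaviors are triggered by events" idea in the first test video.
RULES = {
    "word_displayed": [gaze_at("tablet"), speak("Please read the word aloud.")],
    "answer_correct": [speak("Well done."), point_and_click("score_button")],
}

def dispatch(name, event=None):
    for behavior in RULES.get(name, []):
        behavior(event or {})

dispatch("word_displayed", {"target": "tablet"})
dispatch("answer_correct")
```
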
First demonstration of autonomous interviews (using Google® speech recognition): latest performance (10/5/2017), Autonomous Nina
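
In the autonomous interviews, the recognized speech of the subject drives the next action. The sketch below shows one possible question/answer loop, assuming the third-party Python SpeechRecognition package as a stand-in for whichever Google speech API the demonstration actually used; TTS output and scoring are replaced by prints.

```python
# Minimal sketch of an autonomous interview loop, assuming the SpeechRecognition
# package (pip install SpeechRecognition); not the pipeline used in the video.
import speech_recognition as sr

def listen_once(recognizer, mic):
    """Capture one utterance from the subject and return its transcript."""
    with mic as source:
        recognizer.adjust_for_ambient_noise(source)
        audio = recognizer.listen(source)
    try:
        return recognizer.recognize_google(audio)   # Google Web Speech API
    except sr.UnknownValueError:
        return ""                                    # nothing intelligible heard

def run_interview(questions):
    recognizer = sr.Recognizer()
    mic = sr.Microphone()
    for question in questions:
        print(f"NINA asks: {question}")              # stand-in for TTS output
        answer = listen_once(recognizer, mic)
        print(f"Subject said: {answer!r}")           # stand-in for scoring

run_interview(["Can you repeat the first word?",
               "Which word was shown before this one?"])
```
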