RHUM
ROBOTS in HUMAN ENVIRONMENTS

Project publications


Perception

  • Labourey, Q., D. Pellerin, M. Rombaut, O. Aycard and C. Garbay (2015) Sound classification in indoor environment thanks to belief functions, European Signal Processing Conference (EUSIPCO), Nice, France.
  • Labourey, Q., O. Aycard, D. Pellerin and M. Rombaut (2014) Audiovisual data fusion for successive speaker tracking, International Conference on Computer Vision Theory and Applications (VISAPP), Lisbon, Portugal.
  • Azim, A. and O. Aycard (2014) Layer-based supervised classification of moving objects in outdoor dynamic environment using 3D laser scanner, IEEE Intelligent Vehicles Symposium (IV), Dearborn, MI.
  • Chavez, O., T.D. Vu and O. Aycard (2014) Fusion at detection level for frontal object perception, IEEE Intelligent Vehicles Symposium (IV), Dearborn, MI.
  • Vu, T.D., O. Aycard and F. Tango (2014) Object perception for intelligent vehicle applications: A multi-sensor fusion approach, IEEE Intelligent Vehicles Symposium (IV), Dearborn, MI.

Human activity monitoring

  • Chan Wai Tim, S., M. Rombaut and D. Pellerin (2015) Rejection-based classification for action recognition using a spatio-temporal dictionary, European Signal Processing Conference (EUSIPCO), Nice, France.
  • Vettier, B. and C. Garbay (2014) Abductive Agents for Human Activity Monitoring, International Journal on Artificial Intelligence Tools (IJAIT): Special Issue on New Perspectives on the Use of Agents in Health Care, 23:1, 34 pages.

HRI

  • Badeig, F. and C. Garbay (2014) Supporting distant human collaboration under tangible environments: A normative multiagent approach, International Conference on Agents and Artificial Intelligence (ICAART), Angers, France.
  • Bailly, G., F. Elisei and M. Sauze (2015) Beaming the gaze of a humanoid robot, Human-Robot Interaction (HRI), Portland, OR.
  • Bailly, G., A. Mihoub, C. Wolf and F. Elisei (2015) Learning joint multimodal behaviors for face-to-face interaction: performance & properties of statistical models, Human-Robot Interaction (HRI), Workshop on behavior coordination between animals, humans and robots, Portland, OR.
  • Mihoub, A., G. Bailly and C. Wolf (2015) Learning multimodal behavioral models for face-to-face social interaction, Journal on Multimodal User Interfaces (JMUI), 9:3, 195-210.
  • Parmiggiani, A., M. Randazzo, F. Elisei, M. Maggiali, G. Bailly and G. Metta (2015) Design and Validation of a Talking Face for the iCub, International Journal of Humanoid Robotics, 12:3, 20 pages.
  • Mihoub, A., G. Bailly, C. Wolf and F. Elisei (2016) Graphical models for social behavior modeling in face-to-face interaction, Pattern Recognition Letters (PRL).