RHUM
ROBOTS in HUMAN ENVIRONMENTS

Project publications


Perception

  • Labourey Q., Pellerin D., Rombaut M., Aycard O. and Garbay C. (2015) Sound classification in indoor environment thanks to belief functions, European Signal Processing Conference (EUSIPCO), Nice, France.
  • Labourey Q., Aycard O., Pellerin D. and Rombaut M. (2014) Audiovisual data fusion for successive speaker tracking, International Conference on Computer Vision Theory and Applications (VISAPP), Lisbon, Portugal.
  • Azim, A. and O. Aycard (2014) Layer-based supervised classification of moving objects in outdoor dynamic environment using 3D laser scanner. IEEE International Conference on Intelligent Vehicles (IV), Dearborn, MI.
  • Chavez, O., T.D. Vu and O. Aycard (2014) Fusion at detection level for frontal object perception. IEEE International Conference on Intelligent Vehicles (IV), Dearborn, MI.
  • Vu, T.D., O. Aycard and F. Tango (2014) Object perception for intelligent vehicle applications: A multi-sensor fusion approach. IEEE International Conference on Intelligent Vehicles (IV), Dearborn, MI.

Human activity monitoring & social navigation

  • José Grimaldo Da Silva Filho, Anne-Hélène Olivier, Armel Cretual, Julien Pettré, Thierry Fraichard (2018) Human inspired effort distribution during collision avoidance in human-robot motion. IEEE Int. Symp. on Robot and Human Interactive Communication (RO-MAN), Nanjing and Tai'an, China.
  • José Grimaldo Da Silva Filho, Thierry Fraichard (2017) Human Robot Motion: A Shared Effort Approach. European Conference on Mobile Robotics, Paris, France.
  • Matteo Ciocca, Pierre-Brice Wieber, Thierry Fraichard (2017) Strong Recursive Feasibility in Model Predictive Control of Biped Walking. HUMANOIDS 2017 - IEEE-RAS International Conference on Humanoid Robots, Birmingham, UK.
  • Chan Wai Tim, S., Rombaut M. and Pellerin D. (2015) Rejection-based classification for action recognition using a spatio-temporal dictionary, European Signal Processing Conference (EUSIPCO), Nice, France.
  • Vettier, B. and C. Garbay (2014) Abductive Agents for Human Activity Monitoring, International Journal on Artificial Intelligence Tools (IJAIT): Special Issue on New Perspectives on the Use of Agents in Health Care, 23:1, 34 pages.

HRI

  • Bailly, G. & F. Elisei (2018) Demonstrating and learning multimodal socio-communicative behaviors for HRI: building interactive models from immersive teleoperation data, AI-MHRI: AI for Multimodal Human Robot Interaction Workshop at the Federated AI Meeting (FAIM), Stockholm - Sweden, pages 39-43.
  • Nguyen, D.-C., G. Bailly & F. Elisei (2018) Comparing cascaded LSTM architectures for generating gaze-aware head motion from speech in HAI task-oriented dialogs, HCI International, Las Vegas, USA.
  • Cambuzat, R., Elisei, F., Bailly, G., Simonin, O., & Spalanzani, A. (2018) Immersive teleoperation of the eye gaze of social robots, International Symposium on Robotics (ISR), Munich, Germany: 232-239.
  • Y-S. Liang, D. Pellier, H. Fiorino, S. Pesty, M. Cakmak (2018) Simultaneous End-User Programming of Goals and Actions for Robotic Shelf Organization. In the proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems.
  • Y-S. Liang, D. Pellier, H. Fiorino, S. Pesty (2017) Evaluation of a Robot Programming Framework for Non-Experts Using Symbolic Planning Representations. In the proceedings of the 26th IEEE International Symposium on Robot and Human Interactive Communication, pages 1121-1126.
  • Nguyen, D.-C., G. Bailly & F. Elisei (2017) "Learning Off-line vs. On-line Models of Interactive Multimodal Behaviors with Recurrent Neural Networks", Pattern Recognition Letters, 100C:29-36.
  • Nguyen, D.-C., G. Bailly & F. Elisei (2017) An evaluation framework to assess and correct the multimodal behavior of a humanoid robot in human-robot interaction, Gesture in Interaction (GESPIN), Poznan, Poland: pp. 56-62.
  • Cambuzat R., G. Bailly & F. Elisei (2017) Gaze contingent control of vergence, yaw and pitch of robotic eyes for immersive telepresence, European Conf. on Eye Movements (ECEM), Wuppertal, Germany.
  • Y-S. Liang, D. Pellier, H. Fiorino (2017). A Framework for Robot Programming in Cobotic Environments: First user experiments. In the proceedings of the International Conference on Mechatronics and Robotics Engineering, pages 30-35.
  • O.-S. Mohammed, G. Bailly, D. Pellier (2017) Acquiring Human-Robot Interaction skills with Transfer Learning Techniques. In the proceedings of the International Conference of Human Robot Interaction, pages 359-360.
  • Bailly G. (2016) "Critical review of the book Gaze in Human-Robot Communication", Journal on Multimodal User Interfaces, 1-2.
  • C. Adam, W. Johal, D. Pellier, H. Fiorino, S. Pesty (2016) Social Human-Robot Interaction: a new Cognitive and Affective Interaction-Oriented Architecture. In proceedings of the International Conference on Social Robotics.
  • Mihoub, A., G. Bailly, C. Wolf and F. Elisei (2016). "Graphical models for social behavior modeling in face-to face interaction." Pattern Recognition Letters (PRL), 74:82-89.
  • Bailly, G., F. Elisei and M. Sauze (2015). Beaming the gaze of a humanoid robot. Human-Robot Interaction (HRI), Portland, OR.
  • Bailly, G., A. Mihoub, C. Wolf and F. Elisei (2015). Learning joint multimodal behaviors for face-to-face interaction: performance & properties of statistical models. Human-Robot Interaction (HRI). Workshop on behavior coordination between animals, humans and robots, Portland, OR.
  • Mihoub, A., G. Bailly and C. Wolf (2015). "Learning multimodal behavioral models for face-to-face social interaction." Journal on Multimodal User Interfaces (JMUI), 9:3, 195-210.
  • Parmiggiani, Alberto, Randazzo, Marco, Maggiali, Marco, Elisei, Frédéric, Bailly, Gérard and Metta, Giorgio (2015) "Design and Validation of a Talking Face for the iCub", International Journal of Humanoid Robotics, 12:3, 20 pages.
  • Badeig F. and C. Garbay (2014) Supporting distant human collaboration under tangible environments: A normative multiagent approach, International Conference on Agents and Artificial Intelligence (ICAART), Angers, France.