xahidbuffon/RoboChatGest

Dynamic reconfiguration of mission parameters for underwater human-robot collaboration. #ICRA2018


This repository contains RoboChatGest, a hand gesture-based human-robot communication framework. It allows divers to use a set of simple hand gestures to communicate instructions to an underwater robot and to dynamically reconfigure program parameters during a mission. The ROS version, tested on the Aqua 8 robot, is provided in the robo_chat_gest folder.

Hand gestures

The following set of 10 simple and intuitive hand gestures is used:

zero, one, two, three, four, five, left, right, pic, ok

[Image grid: sample detections of the 10 hand gestures]
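For reference in the sketches below, the gesture vocabulary can be written as a plain Python list; the index order here is illustrative and not necessarily the class order used by the trained detector:

# Illustrative gesture vocabulary; the class order used by the trained
# detector may differ from this listing.
GESTURES = ["zero", "one", "two", "three", "four",
            "five", "left", "right", "pic", "ok"]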

Testing the detector

Use test_detector.py to test images or video files of interest; a minimal sketch of the same workflow is shown after the example detections below.

[Image: sample detections — {pic, pic}, {five, two}, {zero, ok}, {left, left}]
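As a rough sketch of what per-frame testing on a video file looks like, the loop below reads frames with OpenCV and draws whatever the detector returns. The detect() stub, its (label, box) output format, and the input file name are assumptions for illustration, not the actual API of test_detector.py:

import cv2  # OpenCV for video I/O and drawing

def detect(frame):
    # Stub for the trained hand-gesture detector; assumed to return a
    # list of (label, (x, y, w, h)) detections for the given frame.
    return []

cap = cv2.VideoCapture("dive_footage.mp4")  # hypothetical input file
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    for label, (x, y, w, h) in detect(frame):
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.putText(frame, label, (x, y - 5),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)
    cv2.imshow("detections", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press 'q' to quit
        break
cap.release()
cv2.destroyAllWindows()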

Testing RoboChatGest

In RoboChatGest, a sequence of hand gestures is used to generate instructions for:

  • Task switching: STOP the current task and SWITCH to another (predefined) task
  • Parameter reconfiguration: CONTINUE the current program, but UPDATE the value of a (predefined) parameter

For instance, instructing the robot to 'STOP current task and HOVER' can be done as follows:

  • Start token for STOP current task {0, 0} + HOVER token {5, 5} + confirmation token {ok, ok}
  • Hence, the {left-hand, right-hand} gesture tokens are: {0, 0}, {5, 5}, {ok, ok}
[Image: RoboChatGest mode — the STOP {0, 0}, HOVER {5, 5}, and GO {ok, ok} tokens detected in sequence]
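The actual mapping rules live in instructionGenerator.py; the snippet below is only a minimal sketch of the idea, with hypothetical constants, where an instruction is emitted once the start, action, and confirmation tokens arrive in order:

# Minimal sketch of token-sequence decoding (hypothetical constants;
# see instructionGenerator.py for the actual rules).
STOP_START = ("zero", "zero")            # {0, 0}: STOP current task
CONFIRM = ("ok", "ok")                   # {ok, ok}: confirmation (GO)
ACTIONS = {("five", "five"): "HOVER"}    # {5, 5}: switch to HOVER

def decode(tokens):
    # tokens: list of {left-hand, right-hand} gesture pairs
    if (len(tokens) == 3 and tokens[0] == STOP_START
            and tokens[1] in ACTIONS and tokens[2] == CONFIRM):
        return "STOP current task; SWITCH to " + ACTIONS[tokens[1]]
    return None  # incomplete or invalid sequence

print(decode([("zero", "zero"), ("five", "five"), ("ok", "ok")]))
# -> STOP current task; SWITCH to HOVER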

Details about the hand gesture-to-instruction mapping can be found in the paper. A simple Finite-State Machine (FSM) implements the mapping rules and can be modified based on application requirements (see instructionGenerator.py for details). We also use a different FSM for menu selection, i.e., for switching between the (five) menu options in the Aqua robot (see menueSelector.py for details); to select a menu, the {left-hand, right-hand} gesture tokens are: {ok, ok}, {menu #, menu #}. For example:

[Image: Menu mode — the SELECT MENU {ok, ok} token followed by {3, 3}, selecting menu 3]
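menueSelector.py holds the actual implementation; as an illustration of the two-step menu selection described above, a tiny FSM might look like the following (the state names and digit check here are assumptions):

# Illustrative two-state FSM for menu selection; the real rules are in
# menueSelector.py, and the state names here are assumptions.
class MenuSelector:
    def __init__(self):
        self.state = "WAIT_OK"  # waiting for the {ok, ok} token

    def step(self, left, right):
        # Feed one {left-hand, right-hand} token; returns the selected
        # menu number (1-5) once a selection completes, else None.
        digits = {"one": 1, "two": 2, "three": 3, "four": 4, "five": 5}
        if self.state == "WAIT_OK" and (left, right) == ("ok", "ok"):
            self.state = "WAIT_MENU"  # next matching token picks the menu
        elif self.state == "WAIT_MENU" and left == right and left in digits:
            self.state = "WAIT_OK"    # reset for the next selection
            return digits[left]
        return None

fsm = MenuSelector()
fsm.step("ok", "ok")
print(fsm.step("three", "three"))  # -> 3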

Demos

[Demo videos]

ROS version

  • The robo_chat_gest folder contains the ROS-kinetic package version
  • This version is currently running on the Aqua 8 (MinneBot) robot (more details: http://irvlab.cs.umn.edu)
  • Feel free to cite the papers if you find anything useful!
@article{islam2018understanding,
  title={{Understanding Human Motion and Gestures for Underwater Human-Robot Collaboration}},
  author={Islam, Md Jahidul and Ho, Marc and Sattar, Junaed},
  journal={{Journal of Field Robotics (JFR)}},
  pages={1--23},
  year={2018},
  publisher={Wiley Online Library}
}

@inproceedings{islam2018dynamic,
  title={{Dynamic Reconfiguration of Mission Parameters in Underwater Human-Robot Collaboration}},
  author={Islam, Md Jahidul and Ho, Marc and Sattar, Junaed},
  booktitle={{IEEE International Conference on Robotics and Automation (ICRA)}},
  pages={1--8},
  year={2018},
  organization={IEEE}
}
