Survey of the State-of-the-Art
RoSe-HUB, in collaboration with the U.S. Dept. of Energy's Office of Environmental Management and the
National Science Foundation, undertook a study of the robotic handling of high-consequence materials
(see the report on nuclear waste handling in the publication list below, supported by both the DOE and NSF). The survey took stock of nuclear materials handling in the United
States and in several allied nations, including the United Kingdom, Canada, Japan, and South Korea. The
report was a key input to the Dept. of Energy's subsequent development of a Robotics Roadmap for the Handling of
Nuclear Waste (also listed below).
MOTHERSHIP - Hybrid Locomotion for Inspection
SuperBaxter: Gesture-Based Programming for Glove Box Operations
Gesture-Based Programming (GBP) is a form of programming by human demonstration. The process begins by observing a human demonstrate the task to be programmed. Observation of the human's hand and fingertips is achieved through a sensorized glove with special tactile fingertips. The modular glove system senses hand pose, finger joint angles, and fingertip contact conditions. Objects in the environment are sensed with computer vision while a speech recognition system extracts "articulatory gestures."

Primitive gesture classes are extracted from the raw sensor information and passed on to a gesture interpretation network. The agents in this network extract the demonstrator's intentions based upon the knowledge they have previously stored in the system's skill library from prior demonstrations. Like a self-aware human trainee, the system is able to generate an abstraction of the demonstrated task, mapped onto its own skills. In other words, the system is not merely remembering everything the human does, but is trying to understand -- within its scope of expertise -- the subtasks the human is performing ("gesturing").

These primitive capabilities in the skill base take the form of encapsulated expertise agents -- semi-autonomous agents that encode sensorimotor primitives and low-level skills for later execution.
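The observation-and-interpretation flow above can be sketched in a few lines of Python. This is an illustrative toy, not the actual RoSe-HUB implementation: the names (`SkillLibrary`, `classify_primitive`, the primitive labels, and the threshold values) are all assumptions chosen to show the idea of classifying raw sensor samples into primitive gestures and matching them against skills stored from prior demonstrations.

```python
# Toy sketch of the GBP observation pipeline: raw glove readings are
# classified into primitive gesture classes, then interpretation logic
# matches the observed primitives against skills in a skill library.
# All names and thresholds here are illustrative assumptions.
from dataclasses import dataclass, field


@dataclass
class Skill:
    name: str
    # Primitive gestures that must be observed for this skill to match.
    preconditions: set = field(default_factory=set)


class SkillLibrary:
    def __init__(self):
        self.skills = {}

    def add(self, skill):
        self.skills[skill.name] = skill

    def match(self, primitives):
        # A skill matches when its required primitives are a subset of
        # the primitives observed in the demonstration.
        observed = set(primitives)
        return [s for s in self.skills.values()
                if s.preconditions <= observed]


def classify_primitive(sample):
    # Stand-in classifier: thresholds fingertip contact and hand speed
    # into coarse primitive gesture classes.
    if sample["contact"] and sample["speed"] < 0.05:
        return "grasp"
    if sample["contact"]:
        return "slide"
    return "reach"


# Toy demonstration trace: a fast free-space motion, then a still contact.
trace = [{"contact": False, "speed": 0.4},
         {"contact": True,  "speed": 0.01}]

library = SkillLibrary()
library.add(Skill("pick", preconditions={"reach", "grasp"}))
library.add(Skill("wipe", preconditions={"slide"}))

primitives = [classify_primitive(s) for s in trace]
matched = library.match(primitives)
print([s.name for s in matched])  # -> ['pick']
```

In the real system the classifier and interpretation network are far richer (multiple agents voting on intent), but the subset-matching step captures the core idea of mapping a demonstration onto previously stored skills rather than recording it verbatim.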
The output of the GBP system is the executable program for performing the demonstrated task on the target hardware. This program consists of a network of encapsulated expertise agents of two flavors. The primary agents implement the primitives required to perform the task and come from the pool of primitives represented in the skill base. The secondary set of agents includes many of the same gesture recognition and interpretation agents used during the demonstration. These agents perform on-line observation of the human to allow supervised practicing of the task for further adaptation.
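The emitted program described above can be sketched as a pair of agent populations: primary agents that execute task primitives, and secondary observation agents that watch the human during supervised practice and adapt the primaries' parameters. Again, this is a hypothetical illustration; the class names, the `slow_down` gesture, and the adaptation rule are invented for the sketch.

```python
# Toy sketch of the executable program GBP emits: primary agents run the
# task primitives while paired secondary agents observe human gestures
# during supervised practice and adapt parameters on-line.
# Names, gestures, and the adaptation rule are illustrative assumptions.
class PrimaryAgent:
    """Encapsulated expertise agent that executes one task primitive."""
    def __init__(self, primitive, speed):
        self.primitive = primitive
        self.speed = speed

    def execute(self):
        return f"{self.primitive}(speed={self.speed:.2f})"


class SecondaryAgent:
    """Watches the human during practice and nudges the paired
    primary agent's parameter when a corrective gesture is seen."""
    def __init__(self, target):
        self.target = target

    def observe(self, gesture):
        if gesture == "slow_down":
            self.target.speed *= 0.8


# The emitted program: a network of paired primary/secondary agents.
program = [PrimaryAgent("reach", 1.0), PrimaryAgent("grasp", 0.5)]
observers = [SecondaryAgent(a) for a in program]

# Supervised practice: the human gestures "slow_down" during the reach.
observers[0].observe("slow_down")
print([a.execute() for a in program])
# -> ['reach(speed=0.80)', 'grasp(speed=0.50)']
```

The design point is that adaptation happens through the same gesture-recognition machinery used during the original demonstration, so further practice refines the program without reprogramming it.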
Demos
Waste Management Demo
RoSe-HUB Publications
- Rahman, M.M., Sanchez-Tamayo, N., Gonzalez, G., Agarwal, M., Aggarwal, V., Voyles, R.M., Wachs, J., "Transferring Dexterous Surgical Skill Knowledge between Robots for Semi-autonomous Teleoperation," in 28th IEEE Intl Conf on Robot and Human Interactive Communication (RO-MAN 2019), New Delhi, India, Oct., 2019.
- Balakuntala, M.V.S.M., Venkatesh, V.L.N., Bindu, J.P., Voyles, R.M., Wachs, J., "Extending Policy from One-Shot Learning through Coaching," in 28th IEEE Intl Conf on Robot and Human Interactive Communication (RO-MAN 2019), New Delhi, India, Oct., 2019.
- Voyles, R., Chun, W., Hamel, W., Krovi, V., Padir, T., Pryor, M., Santos, V., Sweet, L., Townsend, M., "State-of-the-Art of Robotic Handling of High-Consequence Materials - Nuclear Waste," Technical Report, Mar., 2018.
- Wheeler, J., Rimando, R., Chun, W., Dixon, P., Hamel, W., Harden, T., Heermann, P., Kriikku, E., Lee, J., Mehling, J., Minichan, R., Nance, T., Voyles, R., "DOE Robotics Roadmap 2018," Technical Report, Mar., 2018.
- Cabrera, M.E., Sanchez-Tamayo, N., Voyles, R., Wachs, J.P., "One-Shot Gesture Recognition: One Step Towards Adaptive Learning," in 12th IEEE International Conference on Automatic Face & Gesture Recognition (FG 2017), 2017.
Copyright: © 2018, 2019 by Richard M. Voyles
Purdue University, West Lafayette, IN 47907, (765) 494-3733