The research team for autonomous systems in industrial environments studies how humans interact with autonomous vehicles in such settings. Humans may interact with the robots in three ways: trained operators can control or teach the vehicle in a semi-autonomous mode; workers on foot may directly signal to, or indirectly influence, the behaviour of the robot; and humans in manually driven vehicles must share the driving environment with it. The areas addressed are:
Intention Communication, Robot to Human
This task addresses the visualization of parts of the robot's world model, its state, and its plans (the robot's intentions) in order to facilitate intuitive and smooth human-robot interaction.
Intention Communication, Human Operator to Robot
In this task, we assume a trained human operator who communicates commands (for example, where the robot should navigate in an unknown environment) through hand-drawn maps or emergency maps.
Action Recognition
This task deals with people tracking and action recognition in industrial environments, where regulations ensure that people in areas shared with robots wear reflective vests.
This task is, in part, about learning typical patterns of motion, addressing the question: how can we learn and represent models of the spatial distribution of motion in an environment?
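One simple way to make this question concrete is a grid map in which each cell accumulates a histogram of heading directions observed in recorded trajectories. The sketch below is illustrative only; the class, parameters, and discretization are assumptions for exposition, not the project's actual method.

```python
import math
from collections import defaultdict

class MotionGridMap:
    """Toy spatial motion model (illustrative assumption): each grid cell
    stores a histogram of observed heading directions."""

    def __init__(self, cell_size=1.0, n_bins=8):
        self.cell_size = cell_size
        self.n_bins = n_bins
        self.hist = defaultdict(lambda: [0] * n_bins)

    def _cell(self, x, y):
        return (int(x // self.cell_size), int(y // self.cell_size))

    def _bin(self, heading):
        # Map a heading in [-pi, pi) to one of n_bins discrete directions.
        return int(((heading + math.pi) / (2 * math.pi)) * self.n_bins) % self.n_bins

    def observe(self, trajectory):
        # trajectory: list of (x, y) positions; headings are taken
        # between consecutive points.
        for (x0, y0), (x1, y1) in zip(trajectory, trajectory[1:]):
            heading = math.atan2(y1 - y0, x1 - x0)
            self.hist[self._cell(x0, y0)][self._bin(heading)] += 1

    def direction_probs(self, x, y):
        # Normalized direction histogram for the cell containing (x, y);
        # a uniform prior is returned for unobserved cells.
        counts = self.hist[self._cell(x, y)]
        total = sum(counts)
        if total == 0:
            return [1.0 / self.n_bins] * self.n_bins
        return [c / total for c in counts]

# Example: two trajectories moving eastward through the same corridor.
m = MotionGridMap(cell_size=1.0, n_bins=4)
m.observe([(0.5, 0.5), (1.5, 0.5), (2.5, 0.5)])
m.observe([(0.2, 0.8), (1.2, 0.8)])
probs = m.direction_probs(0.5, 0.5)
```

Such per-cell direction statistics can then support, for instance, predicting where a tracked person is likely to move next; richer representations (continuous densities over velocity, temporal variation) are natural extensions.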
The purpose of this task is to evaluate the impact of the novel methods.