The SNOW flagship aims to develop, integrate, demonstrate and evaluate hybrid AI capabilities for an autonomous system that can operate safely and effectively in an open world.
Recently, robotic systems have moved from closed, prepared environments into the open, real world. While robotic systems are capable of moving in the real world, they also need to be able to operate intelligently in such an open world. They face the challenges that the environment can now change and that it may contain unknowns. Additionally, in an open world, the robotic system will likely serve multiple purposes rather than only a single one. Finally, in an open world, the robotic system will likely have to work together cooperatively with others, be they other humans or other machines.
To this end, the SNOW flagship aims to develop, integrate, demonstrate and evaluate hybrid AI capabilities for an autonomous system that can operate safely and effectively in an open world. Within the project we develop AI that enables robotic systems to autonomously understand their open, real-world surroundings while simultaneously planning the remainder of their operation. Through semantic task agreement, the system establishes that it has received all information necessary to start a task, and it can evaluate, from all that is known about its own capabilities and the external conditions, whether and how it is able to complete this task effectively. Through goal-directed perception, the system determines what new information should be obtained from the real world, and how this information may be acquired, so that it can answer the sub-questions derived from the goal it received.
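The semantic task agreement step can be illustrated with a minimal sketch: before accepting a task, the system checks that the task specification is complete and that its own capabilities cover what the task requires. The task fields, capability names, and function below are hypothetical illustrations, not SNOW's actual interfaces:

```python
# Minimal sketch of semantic task agreement (illustrative only):
# the robot verifies that a proposed task specification is complete
# and that it has the capabilities the task requires.

REQUIRED_FIELDS = {"goal", "area", "deadline"}          # assumed task schema
ROBOT_CAPABILITIES = {"navigate", "detect_victim"}      # assumed capability set

def agree_on_task(task: dict) -> tuple[bool, str]:
    """Return (accepted, reason) for a proposed task."""
    missing = REQUIRED_FIELDS - task.keys()
    if missing:
        return False, f"missing information: {sorted(missing)}"
    lacking = set(task.get("needs", [])) - ROBOT_CAPABILITIES
    if lacking:
        return False, f"lacking capabilities: {sorted(lacking)}"
    return True, "task accepted"

ok, why = agree_on_task({"goal": "find victims", "area": "hall B",
                         "deadline": "12:00", "needs": ["navigate"]})
```

In this sketch, an incomplete specification or an unmet capability requirement leads the system to reject the task with an explicit reason, which is the point of agreeing on a task semantically before starting it.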
SNOW develops these AI capabilities on an actual robotic system and evaluates the robot's increase in autonomy in a real-world use case. An increase in autonomy is achieved when fewer operator interventions are required in environments of similar complexity. SNOW tracks Key Performance Indicators (KPIs), such as the number of interventions in an environment with measurable complexity. The quadruped robot SPOT from Boston Dynamics is, out of the box, a remotely controlled system. SNOW uses this robot to evaluate the increase in SPOT's autonomy after extension with the developed SNOW AI modules.
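One way such a KPI could be computed is as operator interventions normalized by a measurable environment-complexity score, so that runs in environments of similar complexity become comparable. The normalization below is an assumption for illustration, not SNOW's actual metric:

```python
# Illustrative autonomy KPI: operator interventions normalized by a
# measurable environment-complexity score. A lower value means more
# autonomy. (This normalization is an assumption, not SNOW's metric.)

def interventions_per_complexity(interventions: int, complexity: float) -> float:
    if complexity <= 0:
        raise ValueError("complexity must be positive")
    return interventions / complexity

# Comparing a baseline run with a run after adding AI modules,
# in environments of the same assumed complexity score:
baseline = interventions_per_complexity(12, complexity=3.0)   # 4.0
with_ai = interventions_per_complexity(3, complexity=3.0)     # 1.0
improvement = baseline - with_ai
```

The design choice here is simply that autonomy gains only count when measured against environments of comparable complexity, matching the criterion stated above.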
In 2021, the SNOW team extended SPOT with autonomous navigation capabilities, so that it could perform various roles in a search-and-rescue task. Visual and audio data from SPOT's sensors were combined with an internal knowledge base in a hybrid AI approach, giving SPOT good situation awareness of its current context. SPOT's knowledge base contained a map of its surroundings, information about the objects it could find in those surroundings, and information on its tasks. With this hybrid AI approach, SPOT was able to both locate and assess potential victims in its search-and-rescue task. In 2022, SPOT will be used to evaluate the increase in autonomy in an industrial environment.
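The hybrid combination of perceptual detections with a symbolic knowledge base might be sketched roughly as follows; the object labels, knowledge entries, and function are illustrative assumptions, not SNOW's implementation:

```python
# Rough sketch of a hybrid AI step: raw detection labels from the
# robot's sensors are interpreted against a symbolic knowledge base
# that holds task-relevant information about objects.
# All labels and knowledge entries are illustrative assumptions.

KNOWLEDGE_BASE = {
    "person": {"category": "potential_victim", "action": "assess"},
    "chair": {"category": "obstacle", "action": "avoid"},
}

def interpret(detections: list[str]) -> list[dict]:
    """Map raw detection labels to task-relevant interpretations."""
    results = []
    for label in detections:
        entry = KNOWLEDGE_BASE.get(label, {"category": "unknown", "action": "log"})
        results.append({"label": label, **entry})
    return results

report = interpret(["person", "chair", "ladder"])
```

A detected "person" is interpreted as a potential victim to assess, while an unknown label is merely logged; this separation of learned perception from symbolic task knowledge is the essence of the hybrid approach described above.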
- Joris Sijs, Scientist, TNO, e-mail: firstname.lastname@example.org