Dr Steve Fleming Co-PI on new EPSRC Programme Grant
With significant support from EPSRC, UCL, together with the University of Oxford and a cohort of academic, industrial and deployment partners, is embarking on an ambitious five-year programme of work on Engineering, Exploring and Exploiting the Building Blocks of Embodied Intelligence.
The overarching aim of this research programme is to deliver autonomous systems that amplify human capacity and potential. Currently, robots are too specialised, too uncooperative and too unsafe to be productive at scale. To contribute to productivity in strategically important areas such as social care, manufacturing, logistics, service, inspection and agriculture, future generations of robots need to be able to sense, interpret, act, navigate, coordinate and collaborate with unprecedented acuity.
Dr Steve Fleming, who leads the Metacognition Team at WCHN and is a co-PI on the project, said:
“I’m delighted to be part of this ambitious EPSRC Programme Grant that is bringing together investigators from UCL and Oxford to create the next generation of autonomous robot systems that boost human potential.
“As a psychologist and cognitive neuroscientist, my role in the project is to lead work on metacognition – the ability to reflect on and evaluate other cognitive processes. We anticipate that by building metacognitive capability into the robots of the future, they will become able to know what they don’t know, improving collaboration and trust in human-robot teams.
“We will combine research on humans and human-robot teams to map metacognitive processes in humans (such as confidence judgments) to similar processes in robots to identify gaps and opportunities for novel, neuroscience-inspired architectures.”
PaLS Professor Nadia Berthouze, co-PI, said:
“I’m really excited by this EPSRC Programme Grant and by the opportunity to contribute to designing the next generation of autonomous robot systems. As a Human-Computer Interaction and Affective Computing researcher, my role in the project is to lead work on how affective body expressions and touch mediate interactions between workers and other people, and on how we can endow robots with the ability to interpret and leverage such expressions to become effective collaborators.
“Working across a variety of industry sectors, we will be able to better understand how to design affective-aware systems that are able to adapt across different types of affect-demanding contexts.”
Dr Fleming said the team anticipates that building metacognitive capability into the robots of the future will enable them to know what they don’t currently know, improving collaboration and trust in human-robot teams. For instance, autonomous vehicles could be engineered to glow different colours depending on how confident they are in their interpretation of the world – making it intuitive for humans to know when and whether to intervene.
“We will have an unprecedented opportunity to explore how introspective capability can boost robots’ abilities to coordinate with other robots and collaborate with human teams. In turn, as a neuroscientist and psychologist, I’m excited about how robotics can give us novel insight into the types of computational architectures that support metacognition in the human brain,” added Dr Fleming.