New advancements in robotics and the use of artificial intelligence-based computing are leading to an increasingly technology-assisted future.
Although emerging automation technologies are predicted to bring a range of benefits, their development also raises serious concerns around safety and security and, in particular, the potential for disruption from cyber-attacks.
A new £3 million research node will investigate key security issues as these technologies are deployed across an increasingly diverse range of applications.

Autonomous systems can be described as any systems that operate with varying levels of human interaction, up to and including working completely unsupervised. They include autonomous software and smart devices, as well as driverless vehicles, delivery drones, environmental monitoring systems, robots operating in hazardous environments for search and rescue, and nanobots for assistive surgery.
Led by Lancaster University’s Security Institute, the UKRI Trustworthy Autonomous Systems node in Security (TAS-S) will research fundamentals of adaptive security for mission-oriented autonomous systems to address the grand challenge of developing secure machine-machine and human-machine interactions. TAS-S will research fundamental principles of security of autonomous systems along with the security of the users and the environment where these systems are expected to operate.
The TAS-S node is part of the UKRI Trustworthy Autonomous Systems (TAS) programme, funded through the UKRI Strategic Priorities Fund and delivered by the Engineering and Physical Sciences Research Council (EPSRC). The TAS programme brings together the research communities and key stakeholders to drive forward cross-disciplinary fundamental research to ensure that Autonomous Systems are safe, reliable, resilient, ethical and trusted.
Assembling a cross-disciplinary team of internationally reputed security experts from Distributed Systems, Communications, Controls, AI, Sociology and Law, Lancaster’s TAS-S security node will develop a portfolio of fundamental security techniques to provide practical and scientifically rigorous principles for secure Autonomous Systems. Working closely with Cranfield University as a partner institution, the project is supported by multiple external stakeholders including Airbus, BAE, Raytheon, Thales, UK Coast Guard, as well as a number of international bodies such as CMU-USA, CODE-Germany, RISE-Sweden, AIT/TTTech-Austria, Arthurs Legal-Holland, Academia Sinica-Taiwan and NATO.
Professor Neeraj Suri said: “Imagine a driverless car has its GPS signal jammed in a cyber-attack, causing it to go through a red light in rush hour traffic, and the chaos that results from all the other vehicles and pedestrians trying to avoid collisions. Despite the attack, the autonomous car is still expected to keep its occupants safe and cause as little damage as possible to its surroundings. This is a complex situation that needs to balance technical decisions and understanding of human behaviour to realise an ethically acceptable action that also complies with legal and regulatory requirements.”
Professor Suri adds: “Security solutions are not just about technology in Autonomous Systems. For security to be effective, it often needs to address both technical and sociological elements.” Such a composite socio-technical approach to security is a unique competence where Lancaster’s Security Institute excels. “It is exciting that our distinctive security expertise, along with the Cranfield team, well-known for its air-ground Autonomous Systems research, will collectively provide the critical mass to help significantly advance the ‘science of practically usable security’,” said Professor Suri.
Cranfield’s Professor Weisi Guo adds: “Our combined team is uniquely positioned to develop a game-changing approach to the security of networked Autonomous Systems in dynamic mission environments.”
Co-operating with the other TAS nodes, Lancaster’s security node will examine a myriad of complex socio-technical security considerations, offering new insights and guidance for the scientific community, practitioners and policy makers seeking to meaningfully realise Trustworthy Autonomous Systems.
Lancaster researchers involved in the project include Professor Neeraj Suri (School of Computing and Communications), Professor Corinne May-Chahal (Sociology), Professor Plamen Angelov (Computing and Communications), Professor David Hutchison (Computing and Communications), Dr Vasileios Giotsas (Computing and Communications), Dr Joe Deville (Sociology, Department of Organisation, Work and Technology), and Dr Catherine Easton (Law School).