A Cambridge team has been announced as one of the winners of a prize to drive ‘innovation in privacy-enhancing technologies that reinforce democratic values’ for its work on tackling international money laundering.
The announcement came at the second UK-US Summit for Democracy on 30 March 2023. The prize challenges innovators on both sides of the Atlantic to build solutions that enable collaborative development of artificial intelligence (AI) models, while keeping sensitive information private.
Driven by a shared priority to use data to help solve critical global challenges, in a manner that supports US and UK commitments to democratic values and the fundamental right to privacy, the challenges focused on developing privacy-enhancing technology (PET) solutions for two scenarios: forecasting pandemic infection and detecting financial crime.
A team led by Professor Nic Lane from the Department of Computer Science and Technology at the University of Cambridge was named joint winner in the financial crime category. Their task was to develop a privacy-preserving solution to the problem of international money laundering.
Xinchi Qiu, a PhD student in Professor Lane’s lab, said: “We developed an end-to-end privacy-preserving federated learning solution to detect potentially anomalous payments, leveraging a combination of inputs from a number of financial institutions and banks. Our project aims to develop a method that can utilise all the inputs from different institutions while protecting the original data.”
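The core idea of federated learning can be sketched in a few lines: each bank trains locally on its own transactions and shares only model parameters, never the raw payment records, which a central server then averages (the "FedAvg" scheme). The code below is a minimal illustrative sketch on synthetic data, not the team's actual solution; all names, features and labels are invented for the example.

```python
# Minimal federated averaging (FedAvg) sketch: banks share model weights,
# not raw transaction data. Purely illustrative, with synthetic data.
import numpy as np

rng = np.random.default_rng(0)

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One bank's local training: logistic-regression gradient steps on private data."""
    w = weights.copy()
    for _ in range(epochs):
        preds = 1.0 / (1.0 + np.exp(-X @ w))   # sigmoid scores per payment
        grad = X.T @ (preds - y) / len(y)      # logistic-loss gradient
        w -= lr * grad
    return w

# Synthetic "transactions" held privately at three banks (features stay on-premise).
banks = []
for _ in range(3):
    X = rng.normal(size=(200, 4))
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)   # toy anomaly label
    banks.append((X, y))

global_w = np.zeros(4)
for _ in range(20):
    # Each bank computes an update locally; only the weights leave the bank.
    local_ws = [local_update(global_w, X, y) for X, y in banks]
    global_w = np.mean(local_ws, axis=0)              # server averages updates

# The shared model can now score payments at any participating bank.
X_check, y_check = banks[0]
acc = np.mean(((X_check @ global_w) > 0) == y_check)
```

In a production system the weight updates themselves would also be protected (for example with secure aggregation or added noise), since raw updates can still leak information about individual records.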
Professor Lane said: “Right now, machine learning with federated and other privacy-preserving methods is niche. But in the near future it will be the norm. Most of the world’s data is inaccessible for machine learning – however, these new methods are making such data available in a safe manner. This will be a game changer for many high-impact domains that are currently starved of sufficient data, such as health, finance and law. Our solution shows how this can be done effectively for money laundering, but our methods can migrate to these other domains.”
Experts from academic institutions, global technology companies, and privacy start-ups competed for cash prizes from a combined UK-US prize pool of $1.6 million (£1.3 million). The winning solutions combined different PETs to allow the AI models to learn to make better predictions without exposing any sensitive data. This focus on combining privacy approaches encouraged the development of innovative solutions that address practical data privacy concerns in real world scenarios.
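One ingredient commonly combined with federated learning in such solutions is differential privacy: each participant's model update is clipped to a bounded norm and perturbed with calibrated noise, so that no single record can be recovered from what is shared. The sketch below is a generic illustration of that mechanism, not a description of any winning entry; the parameter values are arbitrary.

```python
# Illustrative differential-privacy sanitisation of a model update:
# clip the update's L2 norm, then add Gaussian noise scaled to the clip.
# Parameters here are arbitrary examples, not tuned privacy budgets.
import numpy as np

rng = np.random.default_rng(42)

def dp_sanitise(update, clip_norm=1.0, noise_multiplier=1.1):
    """Bound one participant's influence (clipping), then mask it (noise)."""
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip_norm / max(norm, 1e-12))
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=update.shape)
    return clipped + noise

raw_update = np.array([3.0, -4.0, 0.5])      # a bank's local weight update
private_update = dp_sanitise(raw_update)     # safe to share with the server
```

Clipping caps how much any one record can shift the model; the noise then hides which direction that bounded shift actually took.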
In the final phase of the challenges, the privacy guarantees of the solutions were put to the test by ‘red teams’, who attempted to reveal the original data used for training the models. The resilience of the solutions to these attacks determined the final winners.
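A typical attack in such red-team exercises is membership inference: because over-fitted models tend to have unusually low loss on their training records, an attacker can guess whether a record was in the training set by thresholding its loss. The sketch below illustrates that idea on synthetic loss values; the distributions and threshold are invented for the example and do not reflect the actual challenge attacks.

```python
# Toy loss-threshold membership-inference attack on synthetic losses.
# An over-fitted model gives members (training records) lower loss than
# non-members; the gap between the two detection rates measures leakage.
import numpy as np

rng = np.random.default_rng(1)

# Pretend per-record losses from an over-fitted model (synthetic):
member_losses = rng.exponential(0.05, size=1000)      # training records: low loss
nonmember_losses = rng.exponential(0.50, size=1000)   # unseen records: higher loss

threshold = 0.15
# Attack rule: guess "member" whenever the loss is below the threshold.
tpr = np.mean(member_losses < threshold)       # members correctly flagged
fpr = np.mean(nonmember_losses < threshold)    # non-members wrongly flagged
advantage = tpr - fpr                          # > 0 means the model leaks membership
```

A well-defended solution drives this advantage toward zero, which is exactly what the red teams probed for when ranking the finalists' resilience.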
Michelle Donelan, Secretary of State for the UK Department for Science, Innovation and Technology, said: “Never before has our privacy been so important and we must protect our democratic values by safeguarding the right to privacy. That is why the UK and its allies are collaborating to create innovative technologies that enable public institutions to combat financial crime and promote public health without compromising the confidentiality of the sensitive data they manage.”
UK participants also received support from the UK Information Commissioner’s Office to help them consider how their solutions could demonstrate compliance with key UK data protection regulation principles.
John Edwards, UK Information Commissioner, said: “Privacy enhancing technologies can help analyse data responsibly, lawfully and securely and it will be important for regulators and industry to continue to work together to support responsible innovation in these technologies.”
Arati Prabhakar, Assistant to the President for Science and Technology and Director of the White House Office of Science and Technology Policy, added: “Data has the power to drive solutions to some of our biggest shared challenges, but much of that data is sensitive and needs to be protected.”
Adapted from a press release from the Centre for Data Ethics and Innovation