Carmela Troncoso on surveillance technology used in the US and Germany
Carmela Troncoso is director at the Max Planck Institute for Security and Privacy.
© Max Planck Institute for Security and Privacy
Prof. Troncoso: According to investigative media, the US agency ICE uses Palantir software called Elite to track down migrants who may be living in the USA without a valid residence permit. How does the app work, and what is so problematic about it?
First off, I am not a lawyer and can only answer based on my knowledge and expertise. Elite allegedly has access to rich information from public and private sources, and it uses AI to infer a probability score for whether a person could be a target for ICE. This is information about anyone: a priori, they do not know who the targets could be, which is a key difference from how law enforcement normally works, namely identifying a suspicious person first and then gathering information on them after a court order. The app also infers a confidence score for a particular address of a person of interest after processing data such as bills, social media posts, family links, economic status, future plans, or hobbies. This is extremely problematic, as it allows arbitrary criteria to be created and applied without transparency or control.
AI is often criticized for not being able to explain why it made a decision. Now the fates of human beings depend on this black box?
The inferences the app makes are not certain by definition. A confidence score means that there is always a chance that people are targeted without reason. A system that does automated targeting cannot be built in a proportional, reasonable manner, in contrast to the kind of targeted surveillance we have today. And such an AI system is not interpretable, meaning it is not possible to understand why errors happen.
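To illustrate why confidence-based targeting inevitably sweeps up innocent people, here is a back-of-the-envelope base-rate calculation. All numbers are purely hypothetical assumptions for illustration, not figures from any reported system:

```python
# Hypothetical illustration: when the targeted group is rare, even a
# seemingly accurate scoring system flags more innocent people than
# actual matches. All parameters below are assumptions.
population = 1_000_000      # people scored by the system
base_rate = 0.01            # fraction who actually match the criterion
true_positive_rate = 0.95   # real matches the system correctly flags
false_positive_rate = 0.05  # everyone else the system wrongly flags

actual_matches = population * base_rate                      # 10,000
correctly_flagged = actual_matches * true_positive_rate      # 9,500
wrongly_flagged = (population - actual_matches) * false_positive_rate

print(int(correctly_flagged))  # 9500
print(int(wrongly_flagged))    # 49500 innocent people flagged
```

With these assumed numbers, roughly five out of six flagged people are innocent, which is the proportionality problem in concrete terms: the error rate is borne at scale by people who were never legitimate targets.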
The extreme right-wing party AfD would like to see a task force like ICE in Germany as well. Even though this would violate Germany's Basic Law, would such an app also be technically possible in Germany?
The exact app would be difficult, as the same sources of information do not exist in Europe. Sensitive medical data, banking data, or telephone records cannot simply be accessed or processed for the purpose of profiling and targeting people. But the data exist, and so they can still be accessed with subpoenas or by penetrating the systems.
Yet there is a lot of information around that can be crawled or acquired, and even with less rich information, very damaging applications could be created. We all leave digital footprints on the internet that could be used to infer a movement profile. And we have shown that even AI tools designed to automatically moderate online conversations and filter false information, bullying, or hate speech can be biased and operate on confidence scores, so in some cases they cannot distinguish right from wrong. We therefore always have to be very careful when making decisions based on AI.
The company Palantir, which supposedly built the app that ICE uses, also built Vera, which is used by the German police. What is the difference between Elite and Vera?
According to news reports, Vera only uses information from police records, in contrast to Elite, which gathers external sources to make inferences. So, in my view, Vera is not a light version of Elite. In my understanding, repurposing data from police records to train AI models is not allowed under the General Data Protection Regulation.
Yet Vera uses AI, and it suffers from issues similar to Elite's: data collected for purposes other than surveillance and targeting is reused for this purpose without consent, transparency, or control.
On top of that, the inferences Vera makes will contain errors and will be systematically biased against certain subgroups. Like Elite, Vera is not a proportional tool, as it will generate surveillance of, and harm to, innocent people.
Given autocratic or dictatorial tendencies, do we need to guard our data more carefully?
I do not think that we need to protect our data better because there are autocratic tendencies. We need to protect them better because having data accessible and usable by anyone creates a world in which manipulation is easier, potentially by governments, but also by private parties that use these data for their own profit. Protecting personal data should be a high priority in any case.
Interview: Tobias Beuchert