New Security Attack Targets Laptop, Smart Speaker Mics

University of Florida

The ghostly woman's voice pipes through the speakers, shrouded in radio static but with her message intact from beyond: "The birch canoe slid on the smooth planks."

A secret message from the other side? A spectral insight?

No, something much spookier: Voice recordings captured, secretly, from the radio frequencies emitted by ubiquitous, cheap microphones in laptops and smart speakers. These unintentional signals pass, ghost-like, through walls, only to be captured by simple radio components and translated back to static-filled — but easily intelligible — speech.

For the first time, researchers at the University of Florida and the University of Electro-Communications in Japan have revealed a security and privacy risk inherent in the design of these microphones, which emit radio signals as a kind of interference when processing audio data.

The attack could expose people to industrial espionage or even government spying, all without any tampering with their devices. But the security researchers have also identified multiple ways to address the design flaw and have shared their work with manufacturers for potential fixes going forward.

"With an FM radio receiver and a copper antenna, you can eavesdrop on these microphones. That's how easy this can be," said Sara Rampazzi , Ph.D., a professor of computer and information science and engineering at UF and co-author of the new study. "It costs maybe a hundred dollars, or even less."

They used standardized recordings of random sentences to test the attack, giving the eerie impression of a ghostly woman talking about canoes or imploring you to "Glue the sheet to the dark blue background." Each nonsense sentence was instantly recognizable despite, in some cases, passing through concrete walls 10 inches thick.

The vulnerability is based on the design of digital MEMS microphones, which are widespread in devices like laptops and smart speakers. When processing audio data, they release weak radio signals that contain information about everything the microphone is picking up. Like other radio signals, these transmissions can pass through walls to be captured by simple antennas.

Even when someone is not intentionally using their microphone, it can be picking up and transmitting these signals. Common browser apps like Spotify, YouTube, Amazon Music and Google Drive activate the microphone enough for it to leak radio signals carrying anything said in the room.

The researchers tested a range of laptops, the Google Home smart speaker, and headsets used for video conferencing. Eavesdropping worked best on the laptops, in part because their microphones were attached to long wires that served as antennas amplifying the signal.

Rampazzi's lab also used machine-learning programs from companies like OpenAI and Microsoft to clean up the noisy radio signals and transcribe them to text, demonstrating how easy it would be to then search eavesdropped conversations for keywords.
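The keyword-search step the researchers describe can be illustrated with a small sketch. This is not the study's code: the transcript text, keyword list, and similarity threshold below are invented for illustration. It uses Python's standard-library `difflib` to tolerate the kinds of small word-level errors a speech-to-text model might make on static-filled audio.

```python
# Hypothetical sketch: scanning noisy speech-to-text transcripts for
# keywords of interest, tolerating small transcription errors.
from difflib import SequenceMatcher

def find_keywords(transcript, keywords, threshold=0.8):
    """Return the keywords whose closest word in the transcript
    matches at or above the similarity threshold (0..1)."""
    words = transcript.lower().split()
    hits = []
    for kw in keywords:
        best = max(
            (SequenceMatcher(None, kw.lower(), w).ratio() for w in words),
            default=0.0,
        )
        if best >= threshold:
            hits.append(kw)
    return hits

# "passwird" stands in for a plausible transcription error of "password".
noisy = "please send the passwird for the birch canoe account"
print(find_keywords(noisy, ["password", "canoe", "merger"]))
# → ['password', 'canoe']
```

A fuzzy match like this is one simple design choice; an attacker could equally run exact search over transcripts, since modern transcription models already correct many errors on their own.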

However, a series of fairly simple changes could greatly decrease the effectiveness of the attack. Changing where microphones are placed in laptops could avoid long cables, which amplify the radio leakage. Slight tweaks to the standard audio processing protocols would reduce the intelligibility of the signals.

The researchers have shared these ideas with laptop and smart speaker manufacturers, but it's not clear if the companies will make the upgrades in future devices.
