Technology has made our world comfortable and connected, and with the advent of AI (artificial intelligence)-assisted devices, life’s tasks are just a voice command away. But beware: the Siris and Alexas of the world come with their own vulnerability, namely hacking.

A recent study by a team of researchers at the University of Michigan and the University of Electro-Communications in Tokyo has revealed that lasers can be used to silently speak to any device that accepts voice commands, including smartphones, voice-controlled speakers like the Amazon Echo and Google Home, and even iPads.

Microphones in smart devices turn sound into electrical signals, which the device then interprets as voice commands. But a laser pointed at these devices can do the same job. The researchers found that as little as 5 mW of laser power, roughly the output of a laser pointer, is enough to take control of voice-controlled devices and smartphones.

“It’s possible to make microphones respond to light as if it were sound,” says Takeshi Sugawara, one of the lead researchers on the study. “This means that anything that acts on sound commands will act on light commands.”

This vulnerability could be exploited to send malicious commands to the voice-operated devices in a home, and even to access linked credit cards and make e-commerce purchases.

Smart Speakers Can Be Easily Hacked

The researchers tested light commands on 17 voice-controlled devices over several months, using equipment ranging from laser pointers to telephoto lenses, and were able to operate the devices. Almost all the smart speakers were vulnerable to the light from 164 feet away, iPhones were susceptible at 33 feet, and two brands of Android phones could be commandeered at only 16 feet.


The voice commands carried over the light beam are silent, and the light appears as nothing more than a dot on the device, noticeable only to highly observant people. “Your assumptions about blocking sound aren’t true about blocking light,” says Daniel Genkin, a professor at the University of Michigan who co-led the team. “This security problem manifests as a laser through the window to your voice-activated system.”

An attacker could also begin with an initial command that quiets the device’s responses or turns on the whisper mode available in some of these voice-controlled devices, making subsequent commands even harder to notice.

Companies like Amazon, Apple, and Google will certainly need to take another look at this vulnerability. Requiring a PIN to protect such devices is a necessity.

Researchers advise that the only way to ensure light beams cannot reach your devices is to keep them away from windows and similar points of access.