Digital assistants obey commands that are inaudible to humans.

Technology that can fairly be called a person's "digital assistant" has become enormously popular. The category includes Siri, Cortana, and smart speakers, notably Amazon's. Many of us are used to the idea that a single voice command is enough to make an assistant act: report tomorrow's weather, read a text aloud, or sing a song.

Some aspects of how these assistants operate raise questions among information security experts. The services are undeniably convenient, but they are vulnerable to outside interference. The simplest example came from San Diego, where Amazon Echo speakers that "overheard" a broadcast on the local TV channel CW6 began ordering dollhouses.

The problem was that the broadcast was a story about dollhouses, discussing a girl who had ordered exactly such a house through Alexa. The anchor said on air: "I love the little girl saying, 'Alexa, order me a dollhouse.'" Echo speakers in viewers' homes took the phrase as a command and promptly started ordering dollhouses of their own.

Some time ago the same speaker was also caught laughing eerily for no apparent reason. Most likely there was a reason after all: the device being tampered with (an ordinary bug is unlikely to produce consequences like that).

As for the unauthorized orders, the root of the "failure" is that Alexa has no "friend-or-foe" recognition system: it obeys voice commands from anyone, not just its owner. The situation is similar with Siri, Cortana, and Google Assistant. And as it turns out, these "helpers" also obey commands that the human ear cannot make out at all.
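For illustration only: a "friend-or-foe" check of this kind usually means speaker verification, i.e. comparing the voice issuing a command against a voiceprint enrolled by the owner. Here is a minimal sketch of the idea, assuming a hypothetical `embed()` model that maps audio to a fixed-size voice embedding (real assistants would use a trained neural network for that step):

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two voice embeddings."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

def is_owner(command_audio: np.ndarray,
             owner_embedding: np.ndarray,
             embed,                      # hypothetical model: audio -> embedding
             threshold: float = 0.8) -> bool:
    """Accept a voice command only if it sounds like the enrolled owner.

    `embed` stands in for a real speaker-embedding model; `threshold` would
    be tuned empirically to trade off false accepts against false rejects.
    """
    return cosine_similarity(embed(command_audio), owner_embedding) >= threshold
```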

A joint team of researchers from China and the United States managed to covertly activate the AI systems in smartphones and smart speakers. Inaudible commands made digital assistants dial phone numbers and open websites. But the possibilities are much broader: attackers could exploit the "naivety" of these devices to make money transfers or order goods (and not just dollhouses).

The commands in question can be embedded in recordings of songs of any length, or in ordinary speech. An attacker can play back a recording that sounds perfectly normal to a person, while the digital assistant hears a command and begins carrying it out. The researchers, who have already published their results, believe the technique will sooner or later become common knowledge. The idea is not unique, so anyone could arrive at it independently; it may even already be in use by some intruder.
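The article does not spell out how a command is hidden inside otherwise normal-sounding audio. One published approach, the DolphinAttack technique named below, amplitude-modulates the voice command onto an ultrasonic carrier above roughly 20 kHz: a person cannot hear it, but the nonlinearity of a typical MEMS microphone demodulates the envelope back into the audible band. A minimal sketch of that modulation step (the carrier frequency, modulation depth, and mixing level here are illustrative assumptions, not the researchers' exact parameters):

```python
import numpy as np

RATE = 192_000        # sample rate high enough to represent an ultrasonic carrier
CARRIER_HZ = 25_000   # above the ~20 kHz ceiling of human hearing (illustrative)

def make_inaudible(command: np.ndarray, depth: float = 0.8) -> np.ndarray:
    """Amplitude-modulate a baseband voice command onto an ultrasonic carrier.

    A person hears at most a faint hiss, but the nonlinearity of a typical
    MEMS microphone demodulates the envelope back into the audible band,
    where the assistant's speech recognizer can pick it up.
    """
    t = np.arange(len(command)) / RATE
    carrier = np.cos(2 * np.pi * CARRIER_HZ * t)
    # Normalize the command to [-1, 1] and use it as the AM envelope.
    envelope = 1.0 + depth * command / (np.max(np.abs(command)) + 1e-9)
    return envelope * carrier

def embed_in_track(track: np.ndarray, command: np.ndarray,
                   level: float = 0.1) -> np.ndarray:
    """Mix the modulated command into an ordinary recording at low level."""
    payload = make_inaudible(command)
    n = min(len(track), len(payload))
    mixed = track.copy()
    mixed[:n] += level * payload[:n]
    return mixed
```

All of the resulting energy sits in the sidebands around the 25 kHz carrier, which is why the payload is inaudible when played through a speaker capable of reproducing ultrasound.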

The security researchers are not disclosing the details of the technique, but the threat appears to be quite real. Some time after information about the voice assistants' vulnerability was published, developers of such services at various companies began to talk about fixing the problem soon. Amazon said the problem had already been solved.

Be that as it may, Siri, for one, still listens to such commands. What is more, the commands can include an instruction for the digital assistant to stay silent. After all, what would be the point of issuing a hidden command if the phone "woke up" and read back aloud everything contained in it? So the command comes in, the phone activates, switches to silent mode, and says nothing in reply.

The researchers named the technique DolphinAttack. Developers of digital assistant software have now begun studying the information the scientists provided in order to solve the problem. Doing so will be difficult, but probably possible. If nothing is done, an idea that poses a genuine danger will start finding adherents among all kinds of intruders, who in turn will use it for goals of their own that are incompatible with the law. For now, the study of this potential "acoustic vulnerability" continues.

Source: https://habr.com/ru/post/412331/