This new type of attack guesses your password from the sound of your keyboard
The machine-learning model can guess what we are typing on the keyboard with up to 95% accuracy.
Navigating today’s plethora of data leaks, malware and fraudulent scams, staying secure online demands constant attention, and for those already worried about cyberattacks, a new method that is not at all easy to defend against is particularly bad news.
Researchers at British universities recently showed in a joint experiment that hackers can deduce typed words with a high degree of certainty from the sound of a keyboard alone, which means they don’t need to run a keylogger locally in the traditional sense.
According to reports, the computer scientists have come up with a machine-learning model that recognizes keystrokes through a nearby mobile phone’s microphone, or even through Zoom and Skype calls. Using a phone’s microphone requires installing malware on it, but in the latter case it is enough for someone to simply “listen in” during an online conversation.
The research, published on arXiv.org, found that the model identified what the victim typed, including private messages, passwords and bank details, with 95% accuracy from microphone recordings, 93% over Zoom and 91.7% over Skype. To train the algorithm responsible for recognition, the researchers pressed each of 36 keys on a MacBook Pro 25 times, recording the sounds so that the model could learn the noise associated with each character.
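The training setup described above (a fixed set of keys, 25 recorded presses each, a model that learns each key’s sound) can be sketched at toy scale. This is a minimal illustration under stated assumptions, not the study’s pipeline: synthetic tones stand in for real recordings, raw FFT magnitudes for the spectrograms such models typically use, and a simple nearest-centroid classifier for the deep model.

```python
import numpy as np

rng = np.random.default_rng(0)
SAMPLES, KEYS, PRESSES = 256, 5, 25  # toy scale; the study used 36 keys, 25 presses each

def keystroke(key):
    # Hypothetical stand-in for a recorded keypress: each key gets a
    # characteristic frequency plus background noise.
    t = np.arange(SAMPLES)
    return np.sin(2 * np.pi * (0.02 + 0.01 * key) * t) + 0.3 * rng.standard_normal(SAMPLES)

def features(signal):
    # Spectral magnitudes: a crude analogue of the spectrogram features
    # usually fed to an acoustic classifier.
    return np.abs(np.fft.rfft(signal))

# "Train": average the feature vectors of 25 presses per key
# (nearest-centroid, standing in for the deep model in the study).
centroids = np.array([
    np.mean([features(keystroke(k)) for _ in range(PRESSES)], axis=0)
    for k in range(KEYS)
])

def classify(signal):
    # Predict the key whose average spectrum is closest to this press.
    distances = np.linalg.norm(centroids - features(signal), axis=1)
    return int(np.argmin(distances))

# Evaluate on fresh presses the model has not seen.
trials = 20
hits = sum(classify(keystroke(k)) == k for k in range(KEYS) for _ in range(trials))
accuracy = hits / (KEYS * trials)
print(f"toy accuracy: {accuracy:.2f}")
```

The point of the sketch is only that keystrokes with even slightly distinct spectra become separable once you collect enough labeled presses per key; real keyboards are far messier, which is why the researchers needed a deep model.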
Given that the MacBook Pro has a relatively quiet keyboard, it’s likely that even silent keyboards won’t stand up to this form of attack, so the scientists recommend other defenses. Examples include software that plays back fake keystroke sounds, or white noise that interferes with eavesdropping, but the simplest option is to log in with password managers and autofill, since there is no typing for prying ears to pick up.
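The white-noise defense mentioned above amounts to playing a random audio buffer through the speakers while typing, so the microphone picks up noise along with the keystrokes. A hypothetical sketch (the function name and parameters are my own, not from the study):

```python
import numpy as np

def white_noise_mask(duration_s, sample_rate=44_100, level=0.5, seed=None):
    """Generate a white-noise buffer to play while typing, masking
    keystroke acoustics. `level` scales the amplitude within [-1, 1],
    the usual range for float audio samples."""
    rng = np.random.default_rng(seed)
    n = int(duration_s * sample_rate)
    return (level * rng.uniform(-1.0, 1.0, n)).astype(np.float32)

# Two seconds of masking noise at half amplitude.
mask = white_noise_mask(2.0, seed=1)
print(mask.shape, float(np.abs(mask).max()))
```

In practice the buffer would be looped through an audio output API; the masking only helps if its level is comparable to the keystroke sounds at the eavesdropping microphone.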
It is important to stress, however, that the model used in the experiment was built specifically for research purposes, so there is no need to fear this particular system. But as built-in microphones and eavesdropping technology improve, there is a risk that acoustic attacks like this will become more effective and more common in the future.