Tag Archive for: eavesdropping

Sinister AI ‘eavesdropping’ trick lets ‘anybody read private chats’ on your Android or iPhone, security experts reveal


CYBERCRIMINALS can spy on users’ conversations with artificial intelligence-powered chatbots, experts have warned.

Ever since ChatGPT came out in November 2022, cybersecurity experts have raised concerns about the technology.

Criminals can spy on users’ conversations with AI chatbots. Credit: Getty

ChatGPT is an advanced chatbot that can seamlessly complete tasks like writing essays and generating code in seconds.

Today, several chatbots function like ChatGPT, including Google’s Gemini and Microsoft’s Copilot within Bing.

The chatbots are easy to use, and many users are quickly drawn into conversations with these natural-language companions.

However, experts have expressed concerns over users sharing personal information with AI chatbots.

ChatGPT can collect highly sensitive details users share via prompts and responses.

It can then associate this information with a user’s email address and phone number, and store it.

That’s because, to use the platform, users need to provide both an email address and a mobile phone number.

Users cannot bypass this by using disposable or masked email addresses and phone numbers.

As a result, ChatGPT is firmly tied to your online identity as it records everything you input.

What’s more, this private data can also be obtained by cybercriminals if they are determined enough.

“Currently, anybody can read private chats sent from ChatGPT and other services,” Yisroel Mirsky, the head of the Offensive AI Research Lab at Israel’s Ben-Gurion University, told Ars Technica in an email.

“This includes malicious actors on the same Wi-Fi or LAN as a client (e.g., same coffee shop), or even a malicious actor on the internet — anyone who can observe the traffic.”

This is known as a “side-channel attack,” and it can be very dangerous for victims.

“The attack is passive and can happen without OpenAI or their client’s knowledge,” Mirsky revealed.

“OpenAI encrypts their traffic to prevent these kinds of eavesdropping attacks, but our research shows that the way OpenAI is using encryption is flawed, and thus the content of the…
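
The flaw the researchers describe is a token-length side channel: chatbot services stream replies one token at a time, and standard encryption hides a packet’s contents but not its size, so the sequence of packet sizes traces out the length of every token. As a minimal sketch of that inference step (with a made-up framing overhead, since real traffic is messier):

```python
# Minimal sketch of the token-length side channel described in the research.
# ASSUMPTION: each streamed token travels in its own encrypted record and
# framing adds a fixed overhead; the real protocol details differ.

OVERHEAD = 24  # hypothetical fixed bytes of TLS/HTTP framing per record

def token_lengths(record_sizes: list[int]) -> list[int]:
    """Infer plaintext token lengths from observed ciphertext record sizes.

    Modern ciphers such as AES-GCM preserve payload length, so each record's
    size minus a constant overhead equals the length of the token inside it.
    """
    return [size - OVERHEAD for size in record_sizes if size > OVERHEAD]

# A passive observer on the same Wi-Fi only sees encrypted record sizes...
observed = [27, 25, 29, 26, 31]
print(token_lengths(observed))  # ...yet recovers the lengths: [3, 1, 5, 2, 7]
# In the published attack, a length sequence like this is then fed to a
# language model trained to guess the underlying text of the reply.
```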

Source…

Google Home speakers were at risk of eavesdropping hackers


A security researcher recently revealed that Google Home speakers were susceptible to eavesdropping hackers in close proximity, reports Bleeping Computer.

Now, before you tell everyone on your contact list to unplug their devices, know that Google has already patched the issue and fixed the speaker’s vulnerability.

Alright, now some background. Security researcher Matt Kunze noticed a loophole allowing any clever hacker to install a “backdoor” account on your smart speaker.

More importantly, Kunze found that bad actors could potentially remotely send commands to the device, listen in on your every word, and even snoop on your other smart devices.

Kunze shows how he remotely listened in on a Google Home speaker

Here’s a quick video Kunze uploaded to YouTube showing how he can remotely tap into the device, eavesdrop, and record a conversation.

Before the fix, all an attacker had to do was be within wireless range, and boom – they had full access to your life.

And as if that wasn’t bad enough, they could potentially expose your Wi-Fi password or gain access to other devices.
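
Kunze’s write-up describes the chain roughly as: knock the speaker off its Wi-Fi with deauthentication packets, query the local setup service it then exposes for its identifying details, and use those details to link a new “backdoor” account. The sketch below is a loose, hypothetical reconstruction of that flow; the port, endpoint, URL, and field names are all illustrative stand-ins, not Google’s real API.

```python
# Hypothetical reconstruction of the backdoor-account flow. Endpoint, port,
# and field names are illustrative only -- NOT the real local or cloud API.
import requests

SPEAKER = "192.168.1.50"  # hypothetical address of the speaker in setup mode

# 1. After a Wi-Fi deauth knocks the speaker offline, it falls back to a
#    local setup mode. Ask that service for the details account linking needs.
info = requests.get(f"http://{SPEAKER}:8008/setup/device_info", timeout=5).json()

# 2. Those details are enough to ask the cloud side to link a new account to
#    the device -- the "backdoor" that later accepts remote commands, such as
#    a "call this number" routine that turns the microphone into a wiretap.
requests.post(
    "https://cloud.example.com/link_account",  # illustrative URL
    json={
        "name": info["name"],
        "certificate": info["certificate"],
        "cloud_id": info["cloud_id"],
    },
    timeout=5,
)
```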

Thankfully, the issue is now patched. Kunze brought this to Google’s attention, and the company rewarded him with $107,500 for responsibly disclosing the vulnerability.

Don’t panic – there’s no cause for concern

Now, before you go running for the hills (or at least unplugging all your gadgets), it’s worth noting that these types of vulnerabilities are rare.

In fact, Kunze states that Nest and Home devices are secure for the most part and don’t have many weaknesses for attackers to exploit.

So, you can probably keep your smart speaker plugged in without worrying, at least for now. To learn more, check out Kunze’s blog detailing everything in his research.

Source…

How to Turn a Coke Can Into an Eavesdropping Device


BLACK HAT ASIA — A soda can, a smartphone stand, or any shiny, lightweight desk decoration could pose a threat of eavesdropping, even in a soundproof room, if an attacker can see the object, according to a team of researchers from Ben-Gurion University of the Negev.

At the Black Hat Asia security conference on Thursday, and aiming to expand on previous research into optical speech eavesdropping, the research team showed that audio conversations at the volume of a typical meeting or conference call could be captured from up to 35 meters, or about 114 feet, away. The researchers used a telescope to collect the light reflected from an object near the speaker and a light sensor — a photodiode — to sample the changes in the light as the object vibrated.

A lightweight object with a shiny surface reflects the signal with enough fidelity to recover the audio, said Ben Nassi, an information security researcher at the university.

“Many shiny, lightweight objects can serve as optical implants that can be exploited to recover speech,” he said. “In some cases, they are completely innocent objects, such as a smartphone stand or an empty beverage can, but all of these devices — because they share the same two characteristics, they are lightweight and shiny — can be used to eavesdrop when there is enough light.”
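
To make that measurement chain concrete: once the photodiode’s output has been digitized, recovering intelligible audio is mostly standard filtering. Here is a minimal Python sketch of that final step, with an assumed sample rate and illustrative filter settings rather than the researchers’ actual pipeline:

```python
# Minimal sketch: turn digitized photodiode samples into audible speech.
# ASSUMPTIONS: a 16 kHz sampling rate and telephone-band filtering; the
# researchers' real processing chain is not detailed in this article.
import numpy as np
from scipy.io import wavfile
from scipy.signal import butter, sosfilt

RATE = 16_000  # assumed photodiode sampling rate in Hz

def recover_audio(samples: np.ndarray) -> np.ndarray:
    """Band-pass the light signal to the speech range and normalize it."""
    samples = samples - samples.mean()  # drop the constant ambient-light level
    sos = butter(4, [300, 3400], btype="band", fs=RATE, output="sos")
    speech = sosfilt(sos, samples)      # keep roughly telephone-band speech
    return speech / np.max(np.abs(speech))  # normalize to [-1, 1]

# Synthetic stand-in for real measurements: a tiny flicker on a bright level.
t = np.arange(RATE) / RATE
fake = 0.5 + 0.01 * np.sin(2 * np.pi * 440 * t)
wavfile.write("recovered.wav", RATE, (recover_audio(fake) * 32767).astype(np.int16))
```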

The eavesdropping experiment is not the first time that researchers have attempted side-channel attacks that pick up audio from surrounding objects.

Improving on Past Optical Eavesdropping

In 2016, for example, researchers demonstrated ways to reconfigure the audio-out jack on a computer to an audio-in jack and thereby use speakers as microphones. In 2014, a group of MIT researchers found a way to use a potato chip bag to capture sound waves. And in 2008, a group of researchers created a process to capture the keys typed on a keyboard by their sounds and the time between keystrokes.

The MIT research is similar to the technique pursued by the Ben-Gurion University researchers, except that exploitation required more restrictive placement of the reflective object and required substantial processing power to recover the audio, said Raz Swissa, a researcher with Ben-Gurion…

Source…

Eavesdropping By LED | Hackaday


If you ever get the feeling someone is watching you, maybe they are listening, too. At least they might be listening to what’s coming over your computer speakers thanks to a new attack called “glow worm.” In this novel attack, careful observations of a power LED on a speaker allowed an attacker to reproduce the sound playing thanks to virtually imperceptible fluctuations in the LED brightness, most likely due to the speaker’s power line sagging and recovering.

You might think that anyone close enough to see the LED could simply listen to the speaker’s output directly, but a telescope pointed through a window 100 feet away appears to be sufficient. You can imagine that from a distance across a noisy office you might be able to pull the same trick. We don’t know — but we suspect — even if headphones were plugged into the speakers, the LED would still modulate the audio. Any device supplying power to the speakers is a potential source of a leak.

On the one hand, this is insidious because, unlike more active forms of bugging, this would be pretty much undetectable. On the other hand, there are a variety of low-tech and high-tech mitigations to the attack, too. Low tech? Close your blinds or cover the LED with some tape. High tech? Feed a random frequency into the LED to destroy any leaking information. Super spy tech? Put fake speakers in front of your real speakers that silently play back misinformation on their LEDs.
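
The random-frequency idea is easy to model. Here is a toy numpy sketch (ours, not from the original research) showing how injecting noise into the LED drive crushes the audio correlation an attacker relies on; all signal levels are illustrative:

```python
# Toy model of the random-modulation countermeasure: add noise to the LED
# drive so its brightness no longer tracks the audio. Levels are made up.
import numpy as np

rng = np.random.default_rng(0)
audio = np.sin(2 * np.pi * 440 * np.arange(8000) / 8000)  # stand-in "speech"

led_plain = 1.0 + 0.01 * audio  # brightness faithfully tracks the audio
led_masked = led_plain + 0.5 * rng.standard_normal(audio.size)  # noise added

# Correlation with the audio is exactly what the attacker exploits.
for name, led in (("plain", led_plain), ("masked", led_masked)):
    r = np.corrcoef(audio, led)[0, 1]
    print(f"{name}: correlation with audio = {r:.3f}")
```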

The video plays samples of recovered speech and, honestly, it was clear enough but not great. We wondered if a little additional signal processing might help.
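
One cheap thing that “a little additional signal processing” might mean is spectral gating: estimate the noise floor from a quiet stretch of the recording, then zero out any time-frequency bins that don’t rise above it. The sketch below is our own illustration with untuned parameters and an assumed 16 kHz sample rate, not anything from the video:

```python
# Simple spectral gating to clean up a noisy recovered recording.
# ASSUMPTIONS: 16 kHz audio and a quiet first half-second to profile noise.
import numpy as np
from scipy.signal import istft, stft

RATE = 16_000

def spectral_gate(audio: np.ndarray, noise_seconds: float = 0.5) -> np.ndarray:
    _, _, spec = stft(audio, fs=RATE, nperseg=512)
    # Per-frequency noise floor from the assumed-quiet opening frames
    # (hop size is 256 samples for nperseg=512 with the default overlap).
    quiet = int(noise_seconds * RATE / 256)
    floor = np.abs(spec[:, :quiet]).mean(axis=1, keepdims=True)
    # Keep only bins comfortably above the floor; zero the rest.
    gated = np.where(np.abs(spec) > 2.0 * floor, spec, 0)
    _, clean = istft(gated, fs=RATE, nperseg=512)
    return clean
```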

Passive bugs are hard to find. Even a fancy junction detector won’t tell you if your speakers are compromised by glow worm.

Source…