Easy AI mistake that feels ‘harmless’ risks humans ‘losing control’ as experts warn over costly bank-emptying danger --[Reported by Umva mag]


Oct 7, 2024 - 14:08

ARTIFICIAL intelligence can build an eerily accurate picture of your life and who you are.

Now security experts have warned The Sun readers not to give away too much info when speaking to chatbots – even if it seems “harmless”.

Be very wary when talking to AI chatbots – where is your info going? (Picture: Getty)

The problem with AI-powered chatbots is that they can seem almost humanlike.

They’re designed that way to be more helpful and approachable – but it can also mean you drop your guard.

“We’re getting so comfortable talking to AI tools like ChatGPT that we often forget something important,” said cybersecurity pro Akhil Mittal, speaking to The Sun.

“We’re still sharing data, and that data helps these systems learn more about us.”

You might think you’re sensible enough to not hand over sensitive financial info.

But the kinds of information that you shouldn’t be sharing with artificial intelligence bots go far beyond your bank log-in.

“Most people know not to share passwords or credit card numbers,” said Akhil, a senior security consulting manager at Black Duck Software.

“The real concern is the small and everyday details.

“You might mention your upcoming travel plans, a work project, or even your health, thinking it’s harmless.

“But these small bits of information can add up over time, creating a picture of your life that you never intended to share.

“You’re giving away more than you think.”

There are several reasons why you wouldn’t want an AI knowing too much about you.

Firstly, it’s possible that you’re talking to a dodgy chatbot created by cyber-criminals specifically for extracting info from you.

Secondly, even if the chatbot itself isn’t malicious, your account could be hacked – leaking all of your info to whoever breaks in.

“Hackers could exploit that info for financial gain” – Chris Hauk, cybersecurity expert at Pixel Privacy

And thirdly, you have very little control over where your info ends up once it has been “ingested” by the AI machine.

“Think of it like sending a text – just because it feels private doesn’t mean it can’t be shared or even seen by others,” Akhil warned.

“The point is AI systems learn from what you tell them.

“Even if they don’t store everything, they process that information to improve responses.

“So, it’s important to treat your conversations with the same caution as social media.

“Once it’s out there, it’s out of your control.”

DON’T SHARE!

It’ll become increasingly difficult to avoid chatbots in the future.

So experts say it’s important to be extremely vigilant with what you send to an AI – or it could prove very costly.

HOW TO INTERACT WITH CHATBOTS

Here’s some advice from The Sun’s tech expert Sean Keach

The best way to interact with chatbots is to treat them like total strangers.

You (hopefully) wouldn’t dish out sensitive details about your life to a random person on the internet.

Chatbots are no different – they talk like a human, and you don’t know where the info you share will end up.

Don’t be fooled by the fact that they can come across like a trusted friend or colleague.

In fact – and sorry to say – chatbots don’t care about you at all. So they don’t have your best interests at heart. They don’t have a heart!

It’s just lines of code simulating a human, so remember that if you’re tempted to pour your heart out to what is little more than a smart app.

Chatbots can be immensely powerful and help you with difficult problems – even personal ones – but keep everything anonymous.

Don’t share specifics about your life, and try to sign up to chatbots with info that doesn’t give away exactly who you are – the rough sketch after this advice box shows one way to strip the most obvious identifiers out of a message before you send it.

It’s especially important not to share info about your job with a chatbot, as you don’t want to land yourself in hot water professionally.

Above all, don’t allow chatbots to build up a picture of who you are, because that could eventually be used against you.

If you hand over enough info, hackers might even be able to steal your identity or break into your bank account.
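
For readers who talk to chatbots through scripts or their own code rather than an app, part of that “keep it anonymous” advice can be automated. What follows is a minimal, illustrative sketch in Python – the redact() helper and its handful of patterns are assumptions made for this example, not an official tool or a complete anonymiser – that swaps the most obvious identifiers (email addresses, UK-style phone numbers, card-style digit runs) for placeholders before a message ever leaves your machine.

import re

# Illustrative patterns only - a real anonymiser would need far more
# coverage (names, addresses, employers, account numbers, and so on).
PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "uk_phone": re.compile(r"(?:\+44\s?|\b0)\d{4}\s?\d{6}\b"),
    "card_number": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def redact(prompt: str) -> str:
    """Replace obvious personal identifiers with placeholders
    before the text is sent to any chatbot."""
    for label, pattern in PATTERNS.items():
        prompt = pattern.sub(f"[{label.upper()} REDACTED]", prompt)
    return prompt

if __name__ == "__main__":
    raw = ("I'm flying out on the 12th - email me at jane.doe@example.com "
           "or call 07700 900123 if anything changes.")
    print(redact(raw))
    # -> I'm flying out on the 12th - email me at [EMAIL REDACTED]
    #    or call [UK_PHONE REDACTED] if anything changes.

Anything a simple pattern can’t catch – names, addresses, employer details, travel plans – still has to be left out by hand, which is exactly the point the experts are making.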

Speaking to The Sun, security expert Chris Hauk said you should try to avoid using a chatbot as a therapist or life guide.

“You should never share financial information, or your deepest thoughts,” said Chris, a consumer privacy advocate at Pixel Privacy.

“Some folks may be inclined to share problems with chatbots, using them as a therapist of sorts.

You should even be careful when you’re speaking to respected and reputable chatbots like OpenAI’s ChatGPT (Picture: Getty)

“This is not a good idea, as doing so is a serious privacy concern – both types of information could be used by bad actors to cause issues for you down the line.

“Also, never share confidential workplace information, which could result in the unintentional exposure of information about your employer.

“Never provide login information for your accounts in chatbot conversations. Hackers could exploit that info for financial gain.”



