AI doll trend could bring personal security issues
PHOENIX (AZFamily) — There’s a new ChatGPT trend in which people use AI to create an action figure version of themselves.
The AI dolls come complete with accessories tied to your job, interests or hobbies. But sharing your likeness along with personal information could create security problems.
David Norlin, chief technology officer of Lumifi Cybersecurity, joined Good Morning Arizona on Tuesday to give some expert insight.
“AI has come a long way, and I think there’s a lot of folks who are kind of jumping into the bandwagon of engaging with it and seeing what it can do,” Norlin said. “We now have the ability to create an action figure version of ourselves, which is fun and there’s some novelty to it, and then we have this desire to go share it with others.”
But it’s not without risk and could open the door to data collectors and bad actors.
“You have to give it a lot of information about yourself, about your interests, about your hobbies. And all of those things can be used to then potentially interact with you in a social engineering-type setting or scam,” he explained. “And now as we post this information on social media, it’s available for others to see, so it can be potentially invasive if you don’t know what you’re doing.”
Norlin says AI is going to become more prevalent in our everyday lives, and users should exercise caution when deciding how to engage with it.
“It’s not a person, and it doesn’t know you, at least not yet, so don’t give it personal information, don’t give it personal health information, financial stuff,” he said. “Things you wouldn’t tell a stranger, don’t tell that to AI.”
Copyright 2025 KTVK/KPHO. All rights reserved.