Safety on the internet isn’t a new concept. It’s an idea that has been around since the inception of the internet, and safety in general has been around even longer. Generally speaking, most people think they have internet safety locked down: don’t talk to strangers, don’t give away your personal information, use a VPN and so on. Most people have had these concepts repeated to them throughout grade school. Many people are also probably pretty familiar with the idea of AI. Artificial intelligence is everywhere now.
At this point someone might be asking what this has to do with digital and online safety. After all, regardless of whether someone is for or against AI, it doesn’t make a difference to online safety, right? It can actually cause more harm than someone may think if not handled in a healthy way. During the Oct. 23 Title IX webinar on digital safety, speaker Adam Dodge shared that there are many ways AI can be used by others for harm, and ways that vulnerable individuals can form unhealthy attachments to AI.
This webinar is part of a series coordinated and created by Kirstein Girdy, a student at Dakota County Technical College who formerly served as Assistant Compliance Officer in the Equal Opportunity and Compliance department. The series is meant “to be proactive in sexual assault awareness across the campuses” as part of the Title IX prevention approach, Girdy said. Girdy has worked with several nonprofits to bring awareness to sexual assault, domestic violence and related issues.
But what is the connection between AI and sexual assault awareness? As it turns out, there are AI apps and websites that can be used to create sexual content of people without their consent. A deepfake is AI-generated imagery that puts one person’s face onto another person’s body. This is harmful for a vast number of reasons, but in this context it is harmful because it can be used to create explicit and sexual material of people who never consented to having their likeness used that way.
AI has also been used to virtually remove clothes from individuals who didn’t consent to it. While the images themselves aren’t technically real, their impact very much is. Spreading sexual material of anyone is harmful whether it’s real or not, especially when it comes to minors. Programs like these can do a lot of damage because they depict someone in a sexual manner without that person’s consent.
It’s important to discuss these things publicly because they affect not only adults but also the impressionable and vulnerable minds of actual children. “We believe human beings should be at the forefront of what’s safe and what isn’t,” Dodge said.
AI is having another harmful effect on people who are young or vulnerable. AI chatbots like ChatGPT have become some people’s confidants. People talk to them every day and treat them like real people, and while there’s nothing wrong with wanting a safe space to be vulnerable, there are healthier ways to find one. A statistic shared during the webinar showed that the number of teens using AI for social and emotional support was higher than it should be, which is any amount above zero. AI isn’t a friend or a licensed professional. It doesn’t really know a person or care about them, and it has zero qualifications to give medical or psychological advice.
Humans are hard-wired to crave connection, and that connection can be quite beautiful when people find the right people to have in their lives, people who care about them on a deep, fundamental level. That’s why, instead of turning to AI for emotional connection, people should communicate with those who care about them. This age of technology could be a tumultuous one if people allow it to be.
Below are some resources the webinar provided.
- PimEyes: Finds images on the internet that contain your face
- Cyber Civil Rights Initiative: Can help you take down intimate images of you posted online
- PII Opt Out: Google will remove your private information from Google searches
- Google De-Index: Google will remove content about you from its searches
- Take It Down: Helps remove nude, partially nude or sexually explicit images taken of someone when they were under the age of 18
- Stop Non-Consensual Intimate Image Abuse: Creates a digital fingerprint of images you don’t want online; the fingerprint can be shared with online platforms so the images are flagged and taken down if someone tries to post them