The circulation of explicit and pornographic pictures of megastar Taylor Swift this week shined a light on artificial intelligence's ability to create convincingly real, damaging, and fake images.

But the concept is far from new: people have weaponized this type of technology against women and girls for years. And with the rise of increasingly accessible AI tools, experts say it's about to get a whole lot worse, for everyone from school-age children to adults.

Already, high school students around the world have reported that their faces were manipulated by AI and shared online by classmates. Meanwhile, a well-known young female Twitch streamer discovered her likeness was being used in a fake, explicit pornographic video that spread quickly throughout the gaming community.

"It's not just celebrities [targeted]," said Danielle Citron, a professor at the University of Virginia School of Law. "It's everyday people. It's nurses, art and law students, teachers and journalists. We've seen stories about how this impacts high school students and people in the military. It affects everybody."

But while the practice isn't new, Swift being targeted could bring more attention to the growing issues around AI-generated imagery. Her enormous contingent of loyal "Swifties" expressed their outrage on social media this week, bringing the issue to the forefront. In 2022, a ticketing fiasco ahead of her Eras Tour concert sparked rage online, leading to several legislative efforts to crack down on consumer-unfriendly ticketing policies.

"This is an interesting moment because Taylor Swift is so beloved," Citron said. "People may be paying attention more because it's someone generally admired who has a cultural force. … It's a reckoning moment."

'Nefarious reasons without enough guardrails'

The fake images of Taylor Swift predominantly spread on the social media site X, previously known as Twitter. The photos, which show the singer in sexually suggestive and explicit positions, were viewed tens of millions of times before being removed from social platforms. But nothing on the internet is truly gone forever, and they will undoubtedly continue to be shared on other, less regulated channels.

Although stark warnings have circulated about how misleading AI-generated images and videos could be used to derail presidential elections, there's been less public discourse about how women's faces have been manipulated, without their consent, into often aggressive pornographic videos and photographs.

The growing trend is the AI equivalent of a practice known as "revenge porn." And it's becoming increasingly hard to determine whether the photos and videos are authentic.

What's different this time, however, is that Swift's loyal fan base banded together to use the reporting tools to effectively take the posts down. "So many people engaged in that effort, but most victims only have themselves," Citron said.

Although it took 17 hours for X to take down the photos, many manipulated images remain posted on social media sites. According to Ben Decker, who runs Memetica, a digital investigations agency, social media companies "don't really have effective plans in place to necessarily monitor the content."

Like most major social media platforms, X's policies bar the sharing of "synthetic, manipulated, or out-of-context media that may deceive or confuse people and lead to harm." But at the same time, X has largely gutted its content moderation team and relies on automated systems and user reporting. (In the EU, X is currently being investigated over its content moderation practices.)

The company did not respond to CNN鈥檚 request for comment.

Other social media companies have also reduced their content moderation teams. Meta, for example, has made cuts to its teams that tackle disinformation and coordinated troll and harassment campaigns on its platforms, people with direct knowledge of the situation told CNN, raising concerns ahead of elections in the US and around the world.

Decker said what happened to Swift is a "prime example of the ways in which AI is being unleashed for a lot of nefarious reasons without enough guardrails in place to protect the public square."

When asked about the images on Friday, White House press secretary Karine Jean-Pierre said: "It is alarming. We are alarmed by the reports of the circulation of images that you just laid out, false images, to be more exact, and it is alarming."

A growing trend

Although this technology has been available for a while, it is getting renewed attention because of the offending photos of Swift.

Last year, a New Jersey high school student launched a campaign for federal legislation to address the problem after she said photos of her and 30 other female classmates were manipulated and possibly shared online.

Francesca Mani, a student at Westfield High School, spoke out over the lack of legal recourse to protect victims of AI-generated pornography. Her mother told CNN it appeared "a boy or some boys" in the community created the images without the girls' consent.

"All school districts are grappling with the challenges and impact of artificial intelligence and other technology available to students at any time and anywhere," Westfield Superintendent Dr. Raymond González told CNN in a statement at the time.

In February 2023, a high-profile male video game streamer on the popular platform Twitch was caught viewing deepfake pornography of some of his female Twitch streaming colleagues. The Twitch streamer "Sweet Anita" later told CNN it is "very, very surreal to watch yourself do something you've never done."

The rise of and access to AI-generation tools has made it easier for anyone to create these types of images and videos, too. And there also exists a much wider world of unmoderated, not-safe-for-work AI models on open source platforms, according to Decker.

Cracking down on this remains tough. Nine US states currently have laws against the creation or sharing of non-consensual deepfake photography, synthetic images created to mimic one's likeness, but none exist at the federal level. Many point to Section 230 of the Communications Decency Act, which protects online platforms from being held liable for user-generated content.

"You can't punish it under child pornography laws, and it's different in the sense that no child sexual abuse is happening," Citron said. "But the humiliation and the feeling of being turned into an object, having other people see you as a sex object and how you internalize that feeling, is just so awfully disruptive to your social esteem."

How to protect your images

People can take a few small steps to help protect themselves from their likeness being used in non-consensual imagery.

Computer security expert David Jones, who works for an IT services company, advises that people consider keeping their profiles private and sharing photos only with trusted people, because "you never know who could be looking at your profile."

Still, many people who engage in "revenge porn" personally know their targets, so limiting what is shared in general is the safest route.

In addition, the tools used to create explicit images typically require a lot of raw data, including images that show faces from different angles, so the less material someone has to work with, the better. Jones warned, however, that because AI systems are becoming more efficient, it's possible that in the future only one photo will be needed to create a deepfake version of another person.

Hackers can also seek to exploit their victims by gaining access to their photos. "If hackers are determined, they may try to break your passwords so they can access the photos and videos that you share on your accounts," he said. "Never use an easy-to-guess password, and never write it down."

CNN's Betsy Kline contributed to this report.