Artificial intelligence is threatening to create a new market for child sex abuse materials, prompting South Carolina’s attorney general to warn leaders across the country to tighten laws.  

But in his own state, legislators have yet to discuss the issue.

Attorney General Alan Wilson brought together his counterparts from all 50 states and four territories in a letter to Congress, urging federal legislators to address AI-generated child sexual abuse material, known as CSAM.

The letter emphasizes that AI can create pornographic images of fake children and even alter images of real children to make them sexually explicit.

South Carolina lawmakers on the Joint Citizens and Legislative Committee on Children are partly responsible for legislation protecting children. They’re not optimistic any bills will be proposed anytime soon related to AI-generated CSAM.

“It’s unlikely that some members are going to stand up and say, ‘I want to talk to you today about a bill on artificial intelligence,’ I think until something bad happens,” said Sen. Brad Hutto, D-Orangeburg, a committee member.

New, free technology found through a quick Google search allows anyone to generate realistic images of just about anything, which could include images of unclothed children or children being sexually abused.

Wilson led the effort in sending the letter to Congress in September, asking for the formation of a federal commission to investigate how AI can be used to exploit children and to find ways to prevent that. All 54 attorneys general signed on, according to his office’s website.

Wilson urged Congress to bring the issue to the forefront of conversation when members return from recess.

“We are engaged in a race against time to protect the children of our country from the dangers of AI. Indeed, the proverbial walls of the city have already been breached. Now is the time to act,” the letter concluded. 

But in South Carolina, Hutto said, getting a state bill started could be difficult because the state is “not on the cutting edge of virtually anything.” 

‘If you’re a bad person, you can do diabolical wonders with it’ 

Biplav Srivastava, professor at the AI Institute at the University of South Carolina, has studied AI over the past few decades.

He sees AI as a “decision support tool.”

“If you’re a good person, you can do wonders with it,” Srivastava said. “If you’re a bad person, you can do diabolical wonders with it.” 

The accessibility of AI software that can create images has increased in the past year, Srivastava said.

“If there was a low, medium, high kind of accessibility scale, I would say that just in the last one year, it has moved from a little-, medium-accessibility to very high accessibility,” Srivastava said. 

AI works by loosely mimicking how the human brain processes information, drawing on publicly available data, such as that on Wikipedia.

The more AI is used, the better it becomes at mimicking reality, Srivastava said.

“The technology is just an enabler for wrong people as much as it is an enabler for the right people,” Srivastava said.

He said safeguards are necessary.

“We have it for cars,” Srivastava said. “… The point is there is harm that any technology being put next to people can cause.”

‘We couldn’t tell you which ones were real’

Kevin Atkins is a chief criminal investigator and commander of the S.C. Internet Crimes Against Children task force, or ICAC.

He said it’s only a matter of time before those who view or create CSAM act on their fantasies. He believes that’s true whether the images are of real or fake children. 

“I mean, if I looked at commercials about Big Macs at McDonald’s every day,” Atkins said. “And I just love watching these commercials about Big Macs, I’m probably going to go to McDonald’s and buy one at some point.”

Atkins said the ICAC task force already has had experiences with photos being altered to create CSAM using editing software. 

He said the task force has seen offenders take photos of children from public events, such as a soccer game, and put the children’s heads on existing adult pornography.

“Some of those even were so high-quality we couldn’t tell you which ones were real,” Atkins said.  

Atkins said his team finds adults who have a sexual interest in children, even if they are viewing AI CSAM. He said even those who claim they’re just fantasizing are dangerous.

“How long can you look at it before you act?” Atkins said. “And I would say more than half the people we interact with have offended on children.”

In addition to tracking down predators, ICAC seeks to identify victims depicted in CSAM. Atkins said the possibility of AI images showing children who may not exist could be a new roadblock in investigations. 

“I think, in the beginning, there’s going to be a lot of unknown, unidentified (victims),” Atkins said. “If we start doing a lot of AI here, we’re going to spend a lot of time tracking down people that don’t exist.”

Putting pen to paper

Sen. Katrina Shealy, R-Lexington, serves on the children’s committee with Hutto.

She said she’s not aware of any commission or committee being formed in South Carolina to discuss bills on AI.

But she said she could envision a committee focused on crafting legislation that would prevent AI-generated CSAM from slipping through legal cracks. 

“I think we’ll write the law to the best … (the) Senate or the House are able to,” Shealy said. “And then you know, we always tweak it to make it just a little bit tighter.”

Hutto said writing a bill to present to lawmakers would take time because there is so much to understand yet about AI. 

“I’m sure that we’ll be in the realm of this for a while, too, until we try and figure out how to word the words on a piece of paper to make the law to capture the speed with which technology can change,” Hutto said. “And that has been a challenge for us in many areas of the law.”

Hutto said there’s not a clear timeline on when discussion might begin.

“Technology is changing so quickly,” Hutto said. “It’s going to be hard for us to write a law this year that will capture what’s going on next year.”

It’s unclear whether Wilson will further press members of Congress on the issue.

Several attempts over a two-week period to interview Wilson were unsuccessful. 

AI can be used to alter existing photos. (Claire Carter/Canva AI Photo Editor/Carolina News and Reporter)

AI also can create fake children. (Claire Carter/Fotor AI Image Generator/Carolina News and Reporter)