University of South Carolina senior instructor Dr. Laura Smith is one of many professors adapting AI technology for use in the classroom. Photos by Sencere Rice/Carolina Reporter

Artificial intelligence technology is expanding rapidly into classrooms around the world, even as some worry about the pace of change.

Artificial intelligence describes computer systems that perform tasks once reserved for humans: processing and sharing information, solving problems and making decisions.

Education is one of the fastest-growing sectors for AI. AI in education aims to assist and structure the experience of learning and to prepare students for AI use after graduation.

Dr. Laura Smith, a senior instructor at the University of South Carolina’s School of Journalism and Mass Communications, has had to adapt to using educational AI in her courses.

“A lot of times we think about disadvantages (about AI) first, especially in an academic space,” Smith said. “… If you use it to help you be more efficient, to understand something, that can be really time-saving.”

USC partnered with ChatGPT developer OpenAI in June 2025 to provide free access to the AI program for its faculty, staff and students. The agreement was worth $1.5 million, the school said.

The university encourages its students to use AI programs, said Associate Vice President for University Communications Jeff Stensland.

“AI is expected to change our daily lives in ways we can only imagine today,” Stensland said.

“We believe that students need to understand these changes and stay ahead of the curve when it comes to adopting the AI skills that employers will be looking for in new graduates.”

USC students, depending on their choice of major, use AI for different reasons.

Senior Ethan Turkel said he uses AI for “arbitrary” work and assignments that aren’t important to his major.

“AI helps me deal with (schoolwork), especially when you don’t understand something,” Turkel said. “It’s very convenient.”

Fellow student Caulder Christian said AI use in schools has its upsides.

“A lot of things that you have to do that aren’t that important to your life or to your future, you can just have AI help you with,” Christian said. “Giving AI the delegation of the plain stuff gives you more time to focus on what really matters.”

Its ease of use and immediate results have made AI one of the fastest-growing industries in the tech world.

An April report from United Nations Trade and Development projected a global AI market worth $4.8 trillion by 2033. That group also projects the market will be the dominant sector in technology by the same year.

Common AI usage includes generative AI, a type of system that creates content such as text, images, video and graphics. Many of the world’s top businesses see AI as an efficient source of innovation and information.

American tech and business giants such as IBM, NVIDIA and Tesla have adopted artificial intelligence systems.

AI adoption describes the integration of AI technology into a company’s daily operations to improve efficiency. But this kind of AI use is in decline.

The United States Census Bureau conducted a survey of large business firms earlier this month. The survey found that AI adoption rates are trending downward for large companies across the country.

But AI use in education is still expanding.

AI critics and protesters say increased AI use can cause job displacement, privacy and ethical concerns, misinformation, uniformity and environmental harm from the large amounts of energy AI demands.

Critics of AI use in education point to the risk that students will rely heavily on the technology and weaken their critical thinking skills. An over-reliance on AI also can put academic integrity at risk, they warn.

Stensland acknowledged the risk.

“Obviously, students need to be responsible about using AI and need to be mindful of academic integrity issues,” Stensland said.

Smith said AI in classrooms could be confusing for students if there isn’t a standardized model.

“But if we have an institutional model, we’re not going to have innovation,” Smith said. “And now is the time for innovation.”


A USC student demonstrates the format of results given by OpenAI’s ChatGPT software when asked an academic question.

USC’s Office of Student Conduct and Academic Integrity, housed in the James F. Byrnes building, handles campus code violations related to AI use.