Artificial intelligence apps are too useful to avoid. But can schools use them in ways that won’t harm creativity or turn students into robots?

Two robots write essays for a school assignment. (Illustration by News Decoder)


Savannah Jenkins used ChatGPT for the first time in February 2025. She had responded to a job posting from a public relations firm that was looking to hire a writer who could humanize articles produced with ChatGPT.

“I felt conflicted about signing up to be a ‘Writer’ given I was essentially proofreading outputs from a machine with no critical thinking capacity,” Jenkins said. “Artificial intelligence holds great potential but also begs troubling questions for educators.”

Jenkins is now communications manager at Onja, a social enterprise in East Africa training underprivileged youth to become software developers. The doubts she had mirror concerns we’ve had at News Decoder and doubts that educators across the world share. How can we know if a student, not AI, has done their work? Will students read and reflect less if they rely on AI? Will AI supplant original research?

In short, will students learn to think for themselves?

I asked four News Decoder alumnae to consider these questions. Their answers give an idea of the uncertainties surrounding the use of AI while offering a glimpse into how young people who have recently joined the workforce are navigating the challenges.

When humans interact with machines

Jenkins said that when she first used AI she thought it was magic. “Was this what my parents felt when they watched my sisters and I use Google?” she said. “Did they feel the same mix of awe and fear?”

She was offered the job as an AI humanizer, and it pushed her to try a tool she might otherwise have avoided.

“Now I use ChatGPT more regularly,” she said. “I still feel conflicted about feeding this machine. I typically do it to handle heavy workloads, though I wish things would just slow down instead of humans having to go faster to the detriment of their creativity and critical thinking.”

Alexandra Gray-Harkins, a senior marketing professional and PhD candidate in Multi-Sector Communication at St. John’s University, uses ChatGPT all the time.

“When I’m cooking dinner and I’m not sure if I have an appropriate substitute for an ingredient in a recipe, I’ll ask AI,” she said. “If I’m planning a trip, I’ll use ChatGPT to outline an itinerary as a starting point before conducting my own research. I’ve also used AI to create a personalized workout plan for me.”

Giuliana Nicolucci-Altman, coordinator for climate research and innovation at the International Rescue Committee, started using AI for grammar and spell checks and summaries.

“That said, I quickly noticed how addictive it felt,” Nicolucci-Altman said. “I almost instinctively punched requests into AI chatbots rather than doing the work myself and I noticed my own work getting sloppier and less thoughtful.”

Some guidelines for using AI

News Decoder has developed Ethical Guidelines for Using AI that focus on transparency, accuracy, verification, fairness and avoiding bias. All authors who create and publish content with News Decoder — both student journalists and professional correspondents — are expected to follow these guidelines:

– Take responsibility for your work.

– Disclose AI use.

– Illustrate, don’t imitate.

– Add information about AI-generated media.

– Verify AI-generated content.

– Watch out for AI bias.

– Be inclusive.

– Only tell real stories.

For more comprehensive information, download our Guide for Students and Guide for Teachers.

Even though AI saves her time, Nicolucci-Altman tries to limit its use.

“The practice of writing felt less organic and more clumsy and forced,” she said. “As soon as that became clear, I cut down on it big time. I still occasionally use it to summarize meeting notes or spell check, but always do a thorough review since I often catch mistakes.”

Schools struggle to adapt

The tension between AI’s benefits and its flaws has created a real dilemma for educators. Should they try to stop its use or incorporate it into the classroom?

Nicolucci-Altman said that had she had AI when she was at school she would have relied on it too much. “I am incredibly grateful that the AI chatbot boom, or at least ChatGPT’s launch, occurred after I completed my schooling, including my master’s,” she said. “I would have been half the student if I’d had this option and I don’t think I would have had the discipline I do now at 27.”

Many high school teachers are worried that their students will rely on AI to produce class assignments. Already it is difficult for many of them to tell whether, and how much of, a piece of submitted work was generated by AI.

Jenkins said part of the problem is that we feel the need to do everything fast. “I think as things speed up we need to go slower,” she said. “It feels like the whole system needs an overhaul, moving from delivering outputs, which puts pressure on students to deliver better work, faster and to increasingly superhuman — or supercomputer! — standards, to cultivating learning journeys.”

Emma Bapt, a research collaborator at the Geneva Academy of International Humanitarian Law and Human Rights, said that we must establish protocols for using AI. First, make honesty about the use of AI a default. “Incite your students to be honest about their AI use,” she said. “If they use AI, then they should cite the system used. Due diligence goes for AI as much as it does for regular referencing.”

Second, both educators and students need to be trained on how to use it. “There are fantastic, and free, programs out there on how AI can be leveraged positively as a tool and integrated into work processes,” she said. Bapt pointed to AI for social good as an example.

Should we keep AI out of schools?

Perhaps the larger question is whether AI has a place in high school education at all. 

Jenkins said no. “This time is so critical to development and identity building,” she said. “Having access to tools that rob students of their ability to form their own opinions is a huge disservice to them.”

She is grateful that there wasn’t AI when she was a teenager. Without it, she had time to learn new things without the influence of a biased machine telling her how to do things. “I had to talk to people and decide if they were worth listening to!” she said. “AI is robbing us of human connection and I think the worst thing we could do is to deploy this in schools.”

Gray-Harkins argued that while AI shouldn’t be used to create original work, it can be a useful tool for such things as synthesizing complex primary sources and refining and polishing work that a student first creates.

Nicolucci-Altman noted that we often leave parents out of the conversation. “AI is one of those tools that is going to slow down students’ skill-building, in key areas like reading and writing,” she said. “Parents need to be bought into this effort to both limit the usage of AI and teach responsible practices.”


Questions to consider:

1. What does it mean to “humanize” writing produced by an AI app?

2. Why are some of the people in this article glad there was no AI when they were in school?

3. How do you use AI and do you think there are negatives to doing so?

Nelson Graves is the founder of News Decoder. A dual American-French citizen, he has worked as a foreign correspondent and educator on three continents. Recently he published a memoir entitled “Willful Wanderer”. He lives in France.