
“This is my own work”: Generational Perspectives on AI and Authenticity


by Mira Lalani ’27 and Kate Lee ’26

The Experiment

To explore how different generations perceive artificial intelligence, we conducted a small writing experiment. We asked participants from three age groups — middle school student Katie Peng ’31, upper school student Kaitlyn Troiano ’27, and faculty member and librarian Mr. Walz — to take part.

Each individual read two five-sentence short stories in response to the same prompt: Describe your perfect summer day.

What they did not know was which story was written by a human and which was written by artificial intelligence.

Sample 1:
My perfect summer day begins with a delicious breakfast, followed by a walk and a trip to the beach. The afternoon is spent swimming and relaxing in the sun. A gentle breeze and temperatures ranging from 70°F to 75°F are ideal for the morning. As the day winds down, I eat dinner at a local restaurant by the water that serves oysters, steak, and burrata. Dinner is followed by a trip to the local ice cream shop.

Sample 2:
My perfect summer day starts with bright sunshine and a clear sky. I go for a bike ride along the beach in the morning. In the afternoon, I meet my friends for a picnic by the water. We watch the sunset together as the day cools down. The evening ends with a small bonfire and quiet conversation.

When the Artificial Sounds More Authentic

After reading both samples, every participant was asked to guess which story was written by AI. Surprisingly, all three guessed incorrectly.

All three believed Sample 1 — which was actually the human-written story — had been produced by AI. They described it as “more detailed,” “flowy,” and “professional,” pointing to its vivid imagery and specific details like the temperature range and dinner menu. The AI-generated story, Sample 2, felt simpler and more conversational — which, to the participants, made it seem less polished but more believably human.

Middle school student Katie Peng explained, “Sample 1 is AI because it’s descriptive and uses words you wouldn’t normally think of right away.” Similarly, upper school student Kaitlyn Troiano thought it was “trying too hard” to sound sophisticated. Faculty member Mr. Walz agreed, describing Sample 1 as “overwritten,” while calling Sample 2 “more authentic, like someone just telling you about their day.”

The results suggest that polish has become an unreliable signal: careful, detailed human writing can read as machine-made, while AI that mimics a casual, conversational tone passes as authentic. In an era when algorithms are trained to imitate human tone and rhythm, the boundary between natural and artificial expression is becoming increasingly difficult to see.

The Question of “My Own Work”

As artificial intelligence becomes more present in classrooms and everyday tools, its relationship to authorship and originality is becoming harder to define. At Kent Place, the phrase “This is my own work” has long symbolized academic honesty — but does that still hold the same meaning in the age of AI? When students can generate ideas, outlines, or even polished essays with the click of a button, questions arise about what counts as authentic thinking and what crosses into academic dishonesty. If AI can assist, edit, or even co-create, where should the line be drawn? And more importantly, how can schools preserve student voice and genuine intellectual effort while still embracing the educational benefits these tools offer?

When participants were asked this question, their responses revealed both generational differences and shared uncertainty.

Middle school student Katie Peng reflected on the accessibility of AI in daily life. “Anyone can ask ChatGPT to do their work for them,” she noted. Because she has had limited exposure to AI herself, she remains wary of its use in academic settings. To her, the statement “This is my own work” means that the submitted work reflects the individual’s own thoughts. Still, she recognized the fine line between inspiration and plagiarism: while she would not consider using AI for idea generation to be plagiarism, she noted that it differs from thinking of an idea independently.

Upper School student Kaitlyn Troiano reflected, “I just don’t think that ‘my own work’ is realistic anymore, because then everything would be considered not to be our own work. AI is in Google — it’s already part of what we use every day.” She emphasized that students should be taught how to use AI ethically rather than be told not to use it at all. For her, AI can be a tool for brainstorming and efficiency, not a replacement for creativity.

Mr. Walz, however, offered a more cautious view. Comparing AI to a hammer, he explained, “A hammer is a useful tool — but you can’t build a birdhouse with just a hammer.” To him, AI is only one instrument among many, and its value depends on how wisely it’s used. “You don’t want it to replace human judgment,” he said, warning against trusting AI to determine truth in an age of misinformation.

Together, their perspectives show how definitions of authorship and originality are shifting. While older generations often frame AI as a supplement to human judgment, younger generations are more likely to see it as an inevitable collaborator — a new kind of pencil in a digital classroom.

Generations in Conversation

Across generations, our conversations revealed a fascinating pattern: while everyone recognized the growing role of AI, their comfort levels and expectations differed sharply.

For the middle school student, AI usage is on the rise — from Google searches to digital homework help, artificial intelligence is integrated into nearly every aspect of life. Katie Peng views it less as something new than as something inevitable, yet her focus isn’t on banning AI but on developing foundational academic skills without it. She believes students of the next generation should have a learning experience similar to the pre-generative-AI era. For her own generation, however, AI literacy feels increasingly essential.

The upper school perspective brought more ambivalence. Students like Kaitlyn Troiano see AI as a time-saving tool and creative partner, but also worry about fairness and authenticity. High schoolers face increasing academic pressure, and AI can seem like both a temptation and a lifeline. Many want clearer guidance from teachers — not prohibition, but structure. As Kaitlyn put it, “There are rules for a reason, but there shouldn’t be such a stigma. We just need to learn to use it the right way.”

Meanwhile, the faculty perspective — represented by Mr. Walz — reflects the experience of someone who has watched several technological revolutions unfold. For him, AI is a powerful tool, but one that must never replace human judgment. “The more tools you have, the better off you are,” he said, “but you still need to know which one to use.” His reflections reveal a generation that values discernment and craftsmanship — seeing technology as helpful, but not as a substitute for thinking.

When these voices are placed side by side, they illustrate not conflict but evolution. Each generation’s stance is shaped by the tools they grew up with: for younger students, AI feels natural; for teachers, it still demands caution. The real challenge lies not in choosing between the two mindsets, but in building a shared language of ethics that honors both innovation and integrity.

Ethical Takeaways

Our experiment revealed more than just who could tell human writing from AI. It uncovered how each generation defines authenticity, creativity, and ownership in a changing digital world.

All three participants — from middle school, upper school, and faculty — guessed incorrectly which story was AI-generated. That shared mistake suggests something profound: artificial intelligence has become so skilled at mimicking human tone and style that even thoughtful readers can’t always tell the difference. The question, then, is no longer “Can AI sound human?” but rather “How do we preserve what makes human writing distinct?”

Across generations, a consensus emerged that AI is neither entirely good nor bad — it is, as Mr. Walz described, a tool. The ethical challenge lies in how we choose to use it. For younger students, that may mean learning to integrate AI transparently and responsibly. For teachers, it may mean redefining what originality and authorship look like in a classroom where digital assistance is nearly unavoidable.

Ultimately, “This is my own work” still matters — but its meaning must evolve. In a world where AI can generate ideas, refine writing, and imitate voice, “our own work” may come to signify not isolation from technology, but thoughtful engagement with it. Authenticity, after all, isn’t just about who — or what — wrote the words. It’s about intention, honesty, and the human judgment behind every choice to use, or not use, the machine.
