It was 1973 and time for my first expository writing experience in third grade. Narrative writing (stories) was easy, even enjoyable. Things were about to get complicated, because sources beyond my own brain were now required. While my awesome third-grade teacher explained it all in understandable ways, sourcing information, analysis, synthesis, explication, and explanation were a lot more work.

I struggled to turn my stack of notecards into a coherent piece of writing. I leveraged The World Book Encyclopedia, a manual, pre-personal-computing LLM from the '70s, and its excellent subject-area summaries, which I copied and padded to meet length requirements. My third-grade teacher later taught me to create an outline as a series of questions before researching, a practice I still follow.

Fast forward to CoSN in the spring of 2022 in Austin, Texas. At that time, ChatGPT was at once a media darling and the villain of the education community. I found myself sitting in the back of a panel discussion on AI in education, having matter-of-factly used it during an earlier panel I participated in to preemptively gather ideas for an AI RUP.

It was in this panel discussion that the Fear That Had No Proof found its way into the room. The panelists had discussed how AI might impact students and how their respective districts were handling the overall issue of AI in the classroom, deftly avoiding treating it as a single-issue trigger. Eventually, questions from the audience forced the fear to the surface.

The fear was that our students would never learn to write properly because they would just copy and paste from ChatGPT or some other generative AI tool. I found it somewhat humorous, because I could remember online tools that could produce “writing” from a topic paragraph, or from as little as a topic, as early as 2020. The response, if there was cause for such a response, was a little late.

The panel met the moment, discussing perspectives and viewpoints on AI in the instructional setting that ranged from “shut it down and block it” to “wait and see” to the necessity of exposing kids to the technology as part of the 21st-century vocational skill set.

This “Fear” has left attitudes stuck between tentativeness and paralysis and has stifled conversations between educators and vendors about constructive and beneficial applications of AI in EdTech. Three years later, “The Fear” still finds its way into conversations. Here are “The Facts.”

This Turnitin press release from July 2023 states: “of those 65 million papers, over 2.1 million – 3.3 percent – have been flagged as having at least 80 percent AI writing present. Nearly 6.7 million – 10.3 percent – have over 20 percent AI writing present.” While great emphasis has been placed on the 2.1 million number, it represents just 3.3 percent of the papers submitted. “The Fear” isn’t really a fear; it may be a legitimate concern, but it is a situation that can be addressed by a more education-informed approach to AI design and development, one that supports writing-process and integrity instruction on the human side of the street.

There is other alarming data from Turnitin, published in April 2025:

  • 64% of students worry about the use of AI within education, outpacing academic administrators and educators.
  • 95% of academic administrators, educators, and students surveyed believe AI is being misused.
  • Organizations may be expecting a future workforce that is AI-ready, but 67% of surveyed students feel they are shortcutting their learning by using AI.

Almost two years after actual data was released, the impact of “The Fear” remains. In the interest of moving on, here is a bit of a vision for leveraging AI, and AI as infrastructure, to augment writing instruction in real time, before errors in process like plagiarism become habits.

  • Getting current AI tech out of silos. There are academic integrity tools today that can be integrated into common student productivity platforms, such as Turnitin Draft Coach. Unfortunately, this real-time tool has no point of teacher integration, and feedback does not occur until the student submits the work to another Turnitin tool. Nor does there appear to be a way for the tool to share information about challenges students may be encountering during the writing process that could benefit from instruction or intervention. There is a tool that does this, Clarity, but it does not seem to integrate with common student productivity platforms. This is a case where agents and agentic systems may be useful for bringing all parties and capabilities into a commonly accessible space. The teacher might also value the ability to set whether the learning agent should be proactive or reactive, depending on the student’s need.
  • Integration of the human infrastructure. Turnitin has seen the learning opportunity. Now we need to bring all the humans involved in students becoming capable, high-integrity writers into the mix, to expand the range of the “teachable moment.” Students write in more than one class, or at least need to. If a student has challenges with expository writing in ELA, every subject-area teacher needs to know, because writing in science, history, social studies, music, or any other content area will be expository, whether it is a full-blown paper or just essay-style answers to study questions.

In addition, every teacher needs to be aware of recommended interactions and progress so that support is consistent and accountable. It sounds like more work, but a personal learning management agent will probably orchestrate the learning agents engaging with the student on subject-area writing assignments. Teachers will only need to ensure the student is engaging with the agents and make decisions based on system recommendations and data (e.g., managing whether agents will be proactive or reactive; proactive agents may even be used to initially guide students with writing-project time management).

  • An ecosystem where data, information, AI agents, and humans can be efficiently orchestrated. The village is now inhabited by the biological and the digital (and ultimately the quantum). Information must be available to trigger actions that may yield more information, which may trigger more actions generating more information, until the learning opportunity is realized. This is a modular ecosystem approach to implementing AI that allows applications, agents of various functions, and various information and memory stores to work together intelligently at the microscale of the student, together with teachers, writing coaches, instructional coaches, athletic coaches, transportation scheduling, and parents. This is humans and AI as infrastructure, specifically suited to the uniquely human enterprise of education. (A rough sketch of how such orchestration might be wired together follows below.)

Why coaches and transportation scheduling? If a student has to get to school early or leave late because of a tutorial class, coaches might need to know why the student could be late to or leave practice early, and someone has to transport them. Ask any assistant principal and they will be glad to validate this often-overlooked component of academic achievement. Plus, something has to manage the self-driving school buses.
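To make the orchestration idea a bit more concrete, here is a minimal sketch in Python of how an event-driven version of it might be wired together. Everything in it is a hypothetical assumption for illustration: the event types, the PersonalLearningAgent, the per-student proactive/reactive mode set by the teacher, and the notification channels do not describe any existing Turnitin or vendor API.

```python
# Hypothetical sketch: a personal learning-management agent routes
# writing-process events to subject-area learning agents and to the humans
# around the student. Names and structure are illustrative assumptions only.

from dataclasses import dataclass, field
from enum import Enum
from typing import Callable


class Mode(Enum):
    PROACTIVE = "proactive"   # agent reaches out before problems become habits
    REACTIVE = "reactive"     # agent responds only when the student asks


@dataclass
class WritingEvent:
    student_id: str
    subject: str              # e.g. "ELA", "science", "history"
    kind: str                 # e.g. "citation_missing", "draft_stalled"
    detail: str


@dataclass
class PersonalLearningAgent:
    """Orchestrates subject-area learning agents and human notifications."""
    student_id: str
    mode: Mode = Mode.REACTIVE                       # set by the teacher
    subject_agents: dict[str, Callable[[WritingEvent], str]] = field(default_factory=dict)
    human_channels: list[Callable[[str], None]] = field(default_factory=list)

    def handle(self, event: WritingEvent) -> None:
        # 1. Ask the relevant subject-area agent for a suggested next step.
        agent = self.subject_agents.get(event.subject)
        suggestion = agent(event) if agent else "No subject agent registered."

        # 2. Proactive mode pushes the suggestion to the humans (teachers,
        #    coaches, parents) right away; reactive mode holds it until
        #    the student asks for help.
        message = f"[{event.subject}] {event.kind}: {suggestion}"
        if self.mode is Mode.PROACTIVE:
            for notify in self.human_channels:
                notify(message)
        else:
            print(f"(queued until the student asks) {message}")


# --- illustrative wiring -------------------------------------------------
def ela_agent(event: WritingEvent) -> str:
    return "Review how to cite a source, then revise the flagged paragraph."

def notify_teacher(message: str) -> None:
    print(f"teacher dashboard <- {message}")

plm = PersonalLearningAgent(student_id="s-042", mode=Mode.PROACTIVE)
plm.subject_agents["ELA"] = ela_agent
plm.human_channels.append(notify_teacher)

plm.handle(WritingEvent("s-042", "ELA", "citation_missing",
                        "Quoted text without an attributed source in draft 2."))
```

In a real ecosystem, the print calls would be replaced by integrations with the pieces described above: the productivity platform the student writes in, the teacher's dashboard, the coaches' and parents' channels, and even the transportation scheduler.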

Finally, when the goal of going to the moon was set, only two vehicles had managed to briefly carry a person into space. It took less than a decade to build physical and computer systems from scratch, and even to invent new technology, to go from a few orbits to landing on and returning from the moon.

Education is a uniquely human enterprise, but just like the team that sent men to the moon, we need to engage aggressively with technological innovation if we are going to move education forward. And this mission is far more important.