A line drawing of a person sits behind a laptop next to books, looking to the right at a being made of the letters m-a-c-h-i-n-e. The being types at a computer as the word engage drifts up off the screen like bubbles.

December 12, 2022

Chatbots explain things to me–and an opportunity for educators to engage against the machine.

The main character on Education Twitter this past week was OpenAI’s ChatGPT, which according to itself is an AI assistant “trained to help answer questions and provide information on a wide variety of topics.”

Early testing by instructors revealed that the chatbot received high scores on AP test questions. Professors posted full-length essays generated by the AI. Middle school students cheered, gleefully anticipating an end to pesky short answer questions. Proctoring and plagiarism-detecting software companies experienced a cold chill.

And yet. 

Before sounding the alarm and dragging everyone back to campus for in-person, handwritten proctored exams out of fear of the AI–and haphazardly dismantling all of the progress made in recent years toward accessible education modalities for disabled students and distance learners–let’s take a moment to consider what ChatGPT has actually done. Don’t panic. (And for a nuanced take on this topic and how AI can support outcomes for students, especially disabled students, read this essay by AHEAD Journal CEO Dara Ryder.)

Crafting mostly-relevant, correct-sounding, generic text in multi-paragraph format on almost any topic is now possible for anyone. 

ChatGPT, in other words, didn’t do the assigned reading, according to John Warner (admittedly, his original Twitter thread said it more colorfully). The author of Why They Can’t Write and The Writer’s Practice says, “It has no idea what it’s saying. It understands syntax, not content. It is not thinking in the ways humans think when they write” (2022).

Warner’s work focuses on the systemic ways that teaching writing has been sidelined by generations of standardized assessments, lack of autonomy for teachers and students, and top-down education mandates. Writing (like all disciplines) is at its heart a specific way of thinking, and to become a skilled writer requires practicing that type of thinking. 

The reality is that form-first writing (five paragraphs, anyone?) is prioritized, often at the expense of the critical thinking and metacognition necessary to make intentional writerly moves. 

Far from a death knell for writing in K-16, ChatGPT brings freedom from formats and structures that teachers of writing know don’t always serve our students or support their development as independent thinkers and effective writers. Why assign a generic format on relatively mundane topics when that format can be quickly reproduced by AI?

The trick now for educators is to craft assessments and prompts that stump ChatGPT. The bot can recite facts, reiterate themes, and compare and contrast, but it can’t make connections to lived experiences or the zeitgeist, or speculate on the human condition. Sure, it’ll write you a sestina in under a minute, but it can’t make you feel it.

Buried not-so-far under all this, there lies of course a metaphor for InSpace and the other virtual classroom platforms out there. Do we educators want to teach to the tech? Or do we want freedom from the platforms that don’t serve our students (that we know aren’t serving our students)? Do we want to make space for digging deeper into our subject areas with our students, or do we want to engage only on the surface of our slide decks? Do we want to settle for tech of convenience, or also demand tech of liberation?

At InSpace we are always carefully walking a line–always drawn to the futuristic technology that lets us build virtual classrooms unlike anything else out there, but also fiercely protective of the educators and students whose humanity transforms a platform into a classroom. We build tech that’s designed to get tech out of the way so that teachers can teach in spaces where students have the freedom to learn.

We’ve focused our product roadmap on features that prioritize connectedness, community, and fostering student autonomy (in other words, features that respect the true purpose of an educator) over automations that imagine the role of an educator as a paper-pushing attendance-ticker who can’t function without a slideshow. We’re about building social layers that will support our students long after graduation, not seating charts for the here and now.

If you’d like to join us in building classrooms that center human voices and prioritize education in pursuit of freedom, visit https://InSpace.chat/try.


Ryder, D. (2022). AI is here – If we fight it, we’ll lose and so will our students! AHEAD Journal. Retrieved from https://ahead.ie/journal/CEOs-Corner-AI-is-here-If-we-fight-it-we-ll-loose-and-so-will-our-studentsl

Warner, J. [@biblioracle]. (2022, December 3). GPT3 is a bullshitter. It has no idea what it’s saying. It understands syntax, not content. It is not thinking in the ways humans think when they write. Lots of students get good grades by becoming proficient bullshitters, regurgitating information back at the teacher. [Tweet]. Twitter. https://twitter.com/biblioracle/status/1599106813021995008

Williams, M. (1992). The Shrinking Lonesome Sestina. Retrieved from https://www.webpages.uidaho.edu/~markn/courses/315/pdf/sestina.pdf