Teachers Are Right To Be Skeptical About Classroom AI

According to Forbes, there’s a massive push to get AI products into K-12 classrooms, backed by serious money and influence. The U.S. Department of Education has created a $50 million grant program specifically for “AI literacy.” Former First Lady Melania Trump has launched a task force on the issue. Furthermore, major AI corporations—Microsoft, OpenAI, and Anthropic—are teaming up with the American Federation of Teachers to create a National Academy for AI Instruction. With an explosion of tools available, teachers are being told they’ll be left behind if they don’t adopt AI, but the article argues they should pump the brakes and ask some tough questions first.

The AI is kind of dumb

Here’s the thing that gets glossed over in all the hype: these AI tools aren’t pedagogical experts. The article points to a test drive of ChatGPT for Teachers by Professor Carl Hendrick, who found it building lessons around the thoroughly debunked myth of “learning styles.” It also plans lessons backwards, starting with fun activities instead of learning objectives. So you might get a lesson plan for Hamlet that includes bingo. Is that really the best way to teach complex themes about mortality? Probably not. The tech is impressive, but it’s not magic. It has no deep content knowledge and no understanding of how students actually learn. It’s just pattern-matching from its training data, which includes plenty of outdated educational ideas.

Follow the money and the data

Two of the most critical questions teachers should ask are about long-term costs and data safety. Veteran teachers know this drill: a shiny new tech tool is offered for free, then the district gets hooked, and then the company starts charging real money. Given the astronomical sums being spent on AI development, this bait-and-switch seems inevitable. But the financial cost isn’t the only one. What’s the opportunity cost? Every minute and dollar spent on AI literacy is taken from something else. Is the district prepared to remove other requirements to make room, or is this just another thing piled on the plate?

Then there’s the scary part: student safety and data. The article chillingly references the teen suicide linked to a chatbot, with AI companies arguing it was “unforeseeable use.” Courts will decide liability, but schools can’t wait. They need guardrails now. And on data privacy, remember the FTC action against Illuminate Education after a hack exposed 10 million student records? What safeguards are in place for the data an AI tool collects on your students? If the answer is vague, that’s a huge red flag.

What are we even doing here?

The most fundamental question is also the one administrators seem least able to answer: what’s the actual goal? Is it to have teachers use AI to save time? Is it to have students use AI as a tool for assignments? Or is it to educate students *about* AI—its capabilities, biases, and dangers? These are wildly different objectives requiring different approaches. The article argues that AI literacy—understanding how it works and its pitfalls—should come first. Some teachers are already doing this, having students fact-check AI-generated papers on topics they know. That’s smart. But just waving AI into the classroom because it’s “the future” is a recipe for chaos and wasted time.

The biggest question of all

Let’s say you do let students use AI for, say, the writing process. Maybe it helps with an outline or proofreading. Then what are you actually grading? As a writing teacher of 39 years quoted in the article says, most writing problems are thinking problems. If a student automates part of that thinking, what’s left for you to assess? The student’s skill at prompting the AI? The final product, which is a human-AI collaboration? This isn’t a small detail—it strikes at the heart of assessment and learning. The article frames it as a continuum. On one end, trying to block AI entirely (futile). On the other, AI designs, completes, and grades the work, and “nobody learns anything.” Where a classroom lands depends on the choices teachers make. And making those choices thoughtfully, with clear eyes on the limitations and risks, is the only way this ends well for students.
