My colleagues buzzed with excitement in the teachers’ lounge. They had discovered a new super-program that could answer almost any question. This artificial intelligence could hold actual conversations. It made Google searches feel obsolete. The journey from encyclopedia to Google to ChatGPT felt like a miracle, full of unimaginable possibilities. Or was it?
The enthusiasm was contagious but short-lived. Headlines soon warned of cheating and easy shortcuts. The chatbot churned out inaccuracies with alarming frequency. We had little grasp of what had washed over us here in Norway. Messages streamed in from across the continent. We were not alone in our confusion.
Schools had maintained control as we moved from textbooks into the digital everyday. We could block sites and run plagiarism checks. We could guide what students accessed. With AI’s arrival, however, the problem changed shape entirely.
One evening I tested ChatGPT at home. I used assignments I had given students in the past. The answers were well formed and structured. They were often correct. But not always. Sometimes facts were wrong. Sometimes the logic was flawed. The chatbot presented errors with the same confidence it presented truth. That is when the real worry set in.
The Real Problem
The technology itself is not the problem. What students choose to do with it is. We tell them to check answers and verify sources. We urge them to use critical thinking. But do they? Or do they copy and paste, grateful the work is done? They rarely ask whether the information is reliable.
Here is the concern that keeps me up at night. We are teaching students to be lazy. Not intentionally, but by default. When a chatbot spits out an answer in seconds, why would they spend hours researching? Why would they struggle and think deeply? This happens across all subjects. Math, science, language, history. What are we doing to their ability to think? To question? To create something original? The ease of it all is seductive. And dangerous.
Adapting Our Approach
So what do we do? In my classroom, I had to fundamentally rethink my approach. We can no longer simply assign essays or problem sets. A chatbot completes those in seconds. The direction we took aligned with new guidelines from the Directorate for Education. We had to shift the teacher’s role. From delivering content to guiding and verifying quality in a broader sense. We had to move fast and adapt routines to keep students on track.
Familiar questions resurfaced in new forms. Why learn math when we have calculators? Why learn to write essays when AI can do it? Why remember historical facts when everything can be looked up in seconds? Why even go to school? The weight of this change hit us hard. We needed to illuminate learning from a different angle. We had to help students see value in the process itself. Not an easy task when the target is invisible. Students must uncover that value themselves.
I began telling them what would have sounded strange years ago. Yes, you can use ChatGPT. But here is what you must do. Verify everything. Question the logic. Document your process. Show the steps, the dead ends, the edits. Explain what was yours and what the machine suggested. Real learning happens in the space between human curiosity and machine convenience. When that space is guarded, guided, and made visible, the machine becomes a partner. Not a replacement.
As I talk with colleagues abroad, the questions echo in different languages. They carry the same urgency. What other systems are doing matters. What we might learn from them matters. This is not a local problem. It is a global conversation about what we value in education.
Reflections Across Borders
When we talk about this in the teachers’ lounge, there is a growing sense. We are chasing something we cannot quite catch. Change happens faster than our systems can adapt. We are running with our shoes tied together. We try to revise our methods and update our policies. We build new frameworks. Yet the technology keeps sprinting ahead. It shapes learning in ways we can barely comprehend. The uneasy truth is clear. We no longer have full control. Not even close.
What worries us most is the widening gap. The gap between what education is and what it needs to be. We fear falling further behind. We may lose our ability to ensure that learning remains accurate and meaningful. If we fail to guide students now, they will chart their own courses. Not always wisely. They will follow shortcuts that lead nowhere. The ease of it all lures them toward passivity. It pulls them away from the slow, deliberate act of thinking.
Underneath it all lies the question none of us can answer. How do we control something that refuses to be controlled? A system that learns, evolves, and reacts faster than we can respond?
I have not witnessed classrooms abroad firsthand. But I have spoken to teachers from across the world. They all echo the same concern. We are in the same boat, drifting without a clear map. The time for sitting on the fence is over. Now is the moment for public and private schools across borders to join forces. They must work with the very people developing these systems. Together, we must establish common guidelines. Guidelines that can be implemented, adapted, and lived out in classrooms. Because without them, education may not just lag behind. It may lose its compass altogether.
Diversity in the Age of AI
When we talk about diversity in education, we often think in limited terms. Culture, language, background. But now, diversity means access. Access to technology, to understanding, to frameworks that govern AI use. In Norway, public schools operate under national guidelines from the Directorate for Education. These emphasize digital competence, source criticism, and data privacy. They stress responsible use of tools. Private schools receiving state funding must follow similar standards. They are subject to the same oversight. But schools without such funding may have greater autonomy. They may also have fewer resources to adapt quickly.
This divide is mirrored beyond Norway’s borders. In many Western countries, wealthier private schools can afford advantages. Professional development, digital infrastructure, early implementation of AI policies. In Eastern or lower-income regions, many schools lack the basics. Consistent electricity, reliable internet, teacher training. There, AI tools may arrive first as a luxury, then as crutches without guardrails.
The consequence is real. When some schools lead in enacting guidelines, others scramble to keep the lights on. Equal opportunity becomes a hollow promise. In Norway, 85 percent of high school teachers want national guidelines for AI usage. But few clear mandates exist yet.
Perhaps the true challenge is not how smart machines become. It is how equitable our systems remain. If private schools surge ahead and public ones lag, diversity shrinks. Diversity of thought, of opportunity, of challenge. We must ensure no school is left behind. Private or municipal, at home or abroad. All must learn what AI means. For knowledge, for fairness, for growing minds.