[Image: AI interface showing student feedback]

Written by Stephen Kitomary

AI in Education: Tool or Crutch?

AI is transforming how students learn. But there's a difference between AI that makes students smarter and AI that makes them lazy. Here's how we think about it at Createch.

AI has changed everything.

Students use ChatGPT to write essays. Teachers use it to generate lesson plans and exam questions. Tutoring apps promise personalized learning at scale. Content that used to take weeks to produce now takes minutes.

This isn’t coming. It’s here. Walk into any school with internet access — or just watch students on their phones — and you’ll see it. The question isn’t whether AI will reshape education. It already has.

But something else is happening too.

The Other Side

Teachers are noticing it. Students submit work that’s polished but hollow. Technically correct answers with no understanding behind them. Essays that sound impressive until you ask a follow-up question and get a blank stare.

The problem isn’t hard to diagnose: when the answer is always one prompt away, why struggle with the problem yourself?

Learning is uncomfortable. Real understanding comes from wrestling with concepts, getting stuck, making mistakes, trying again. That friction — the frustration of not knowing — is where growth happens. It’s not a bug in education. It’s the feature.

AI that removes that friction doesn’t help students learn. It helps them avoid learning.

There’s growing concern that easy AI access is eroding creative thinking. Critical analysis. The ability to sit with a hard problem and work through it. Students who outsource their thinking never develop the muscles that thinking requires.

This is a real risk. We take it seriously.

The Distinction That Matters

But here’s where we disagree with the doomsayers: the problem isn’t AI. The problem is how we deploy it.

There’s a fundamental difference between:

AI that does work FOR students — generates their essays, solves their problems, gives them answers to copy. This makes students weaker. It’s a crutch that atrophies the leg.

AI that does work WITH students — evaluates their thinking, identifies their mistakes, shows them where to improve. This makes students stronger. It’s a coach, not a replacement.

The first kind of AI is everywhere. It’s the default. It’s what students reach for when they’re stuck and want the easy way out.

The second kind is harder to build. It requires restraint. The AI has to know the answer but not give it. It has to understand where the student went wrong and explain why — without just doing the work for them.

That’s what we’re building at Createch.

Feedback, Not Answers

Our LMS includes AI-powered assessment that works differently from what students are used to.

Students still do the thinking. They read the question. They write their response. They struggle with it. They submit what they actually understand — not what an AI generated for them.

Then our system tells them how they did. Not just a score — an explanation. What was correct. What was missing. What would have earned full marks. Where their reasoning broke down.

This is fundamentally different from asking ChatGPT to solve your homework. The student does the cognitive work. The AI accelerates the feedback loop.

In a traditional classroom, that feedback might take weeks. A teacher with 200 students can’t return marked practice exams quickly — there aren’t enough hours. So students wait. By the time they see their mistakes, the material is distant. The opportunity to learn from errors has passed.

Instant feedback changes that equation. Students see where they went wrong while the question is still fresh. They can try again immediately. The iteration cycle that builds real competency — attempt, feedback, correction, attempt — actually becomes possible.
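The attempt, feedback, correction, attempt cycle can be sketched as a simple loop. This is an illustrative sketch only: `get_feedback` is a hypothetical stand-in for the AI marker, and the keyword check is a placeholder for real assessment.

```python
# Illustrative sketch of the attempt -> feedback -> correction loop.
# All names here are hypothetical; `get_feedback` stands in for the AI marker.

def get_feedback(answer: str) -> dict:
    """Placeholder for the AI assessment call: marks plus what was missing."""
    required = {"definition", "example"}            # what full marks would need
    covered = {part for part in required if part in answer}
    return {"marks": len(covered), "max_marks": len(required),
            "missing": sorted(required - covered)}  # feedback, not the answer

def practice(initial_answer: str, revise) -> dict:
    """Iterate until full marks: attempt, feedback, correction, attempt."""
    answer = initial_answer
    feedback = get_feedback(answer)
    while feedback["marks"] < feedback["max_marks"]:
        answer = revise(answer, feedback)           # the student acts on feedback
        feedback = get_feedback(answer)
    return feedback

# A student who addresses each missing element on every pass:
final = practice("definition", lambda a, fb: a + " " + " ".join(fb["missing"]))
```

The point of the shape: the loop cannot advance without a new attempt from the student, which is the incentive structure described above.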

Grounded in What Matters

Here’s what makes our approach different from generic AI tutoring:

We don’t grade against some universal rubric invented by engineers in California. We use RAG — Retrieval-Augmented Generation — connected directly to official NECTA marking schemes.

NECTA is what determines whether Tanzanian students pass their national exams. It’s what actually matters for their futures. Our AI evaluates student responses against the same standards their real exams will use.

This means the feedback is relevant. When we tell a student their answer would earn 2 out of 4 marks, that’s not an abstract judgment. That’s what would actually happen on their Form Four exam.
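One way to picture the retrieval-grounded grading described above is a two-step pipeline: fetch the relevant marking-scheme entry, then score the response against the points it lists. This is a minimal sketch, not Createch's implementation; the data, identifiers, and keyword matching are all illustrative stand-ins (in a real RAG system the retrieval would hit an indexed store of official NECTA schemes and the scoring would be done by a language model prompted with the retrieved entry).

```python
from dataclasses import dataclass

# Hypothetical sketch of retrieval-grounded grading. All names and data
# are illustrative; they are not real NECTA scheme content.

@dataclass
class SchemeEntry:
    question_id: str
    points: list[str]   # each point worth one mark
    max_marks: int

# Tiny in-memory "index" standing in for a real retrieval store.
MARKING_SCHEMES = {
    "bio-f4-q1": SchemeEntry(
        question_id="bio-f4-q1",
        points=["osmosis", "semi-permeable membrane", "water moves",
                "high to low concentration"],
        max_marks=4,
    ),
}

def retrieve_scheme(question_id: str) -> SchemeEntry:
    """The retrieval step of RAG: fetch the official marking scheme."""
    return MARKING_SCHEMES[question_id]

def grade(question_id: str, answer: str) -> dict:
    """Score an answer against the retrieved scheme and explain the result."""
    scheme = retrieve_scheme(question_id)
    text = answer.lower()
    earned = [p for p in scheme.points if p in text]
    missing = [p for p in scheme.points if p not in text]
    return {
        "marks": len(earned),
        "max_marks": scheme.max_marks,
        "earned_points": earned,
        "missing_points": missing,  # the feedback: what full marks needed
    }

result = grade("bio-f4-q1",
               "Osmosis is when water moves across a semi-permeable membrane.")
```

Because the score is derived from the retrieved scheme rather than a generic rubric, "2 out of 4 marks" means the same thing here that it would on the actual exam paper.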

We also trained the system to handle how Tanzanian students actually write. Mixed English and Swahili. Local examples. The way teachers here actually mark — not some idealized version that ignores reality.

AI that doesn’t understand its context is useless. Worse than useless — it’s misleading. It tells students they’re wrong when they’re right, or right when they’re wrong, because it wasn’t built for their environment.

The Teacher Question

Let’s be direct: this isn’t about replacing teachers. It can’t be and it shouldn’t be.

Teachers do things AI cannot do. They notice when a student is struggling emotionally, not just academically. They inspire. They adapt to the human in front of them. They make judgment calls that require wisdom, not just pattern matching.

What teachers can’t do — what no human can do — is provide instant, individual feedback to hundreds of students simultaneously. The math doesn’t work. Class sizes are too large. Time is too short. STEM teacher shortages are too real.

So practice work doesn’t get assigned. Or it gets assigned and never returned. Or it gets returned so late that the learning moment has passed. Students don’t get the repetition they need because the system can’t support it.

AI handles the volume. Teachers handle the humanity. That’s the division of labor that actually makes sense.

Using AI Right

We’re not naive about the risks. Students will try to game any system. Some will still look for shortcuts. The temptation to let AI do your thinking doesn’t disappear just because we built a better tool.

But we can design systems that make the right behavior easier than the wrong behavior. When AI is positioned as a feedback mechanism rather than an answer machine, the incentive shifts. You can’t get feedback on work you didn’t do. The only way to use the system is to actually engage with the material.

That’s the design principle: AI that rewards effort, not avoidance.

Education technology should make students more capable, not more dependent. It should build skills, not bypass them. It should prepare students for a world where they’ll need to think for themselves — because AI won’t always be available, and even when it is, the people who can think will always have an advantage over those who can’t.

AI is a tool. Whether it becomes a crutch depends entirely on how we choose to use it.

We’re choosing carefully.