
For Year 11 and 12 students, the temptation to treat AI as a “magic button” for homework is real. We’ve all been there: staring at a complex derivative or a trigonometric identity at 11:00 PM, hoping ChatGPT can just spit out the answer. But if you’ve tried it, you’ve likely noticed a trend—AI is surprisingly bad at “doing” math, yet incredibly good at “teaching” it.
The Calculation Trap
AI is built on patterns of language, not the laws of physics or formal logic. When you ask an AI to execute a multi-step mathematical derivation, it isn’t “calculating” in the way a calculator does; it is predicting the next most likely word or symbol. This often leads to hallucinations, where the AI confidently presents a solution that looks right but contains a fatal logical error in step three. For a VCE or HSC student, relying on AI for execution is a high-risk gamble that bypasses the cognitive struggle required to actually learn the material.
Where AI truly shines is in conceptual translation. Einstein famously suggested that if you can’t explain it to a six-year-old, you don’t understand it yourself. AI allows you to reverse-engineer this. If you’re stuck on the concept of limits or integration by parts, you can ask the AI:
“Explain the fundamental theorem of calculus using a metaphor about a leaking bucket for a 6-year-old.”
Suddenly, abstract symbols become tangible stories. This shifts the AI from a replacement (which does the work for you) to a supplement (which clears the mental fog so you can do the work).
The goal of senior secondary math isn’t just the final number; it’s the mental architecture you build while getting there. Use AI to:
Brainstorm analogies for difficult concepts.
Summarize the “why” behind a formula.
Create practice questions based on a specific topic.
By treating AI as a high-level consultant rather than a ghostwriter, you maintain your academic integrity and, more importantly, you actually show up to your exams with the knowledge in your head, not just on your screen.
Phillip Preketes