AI Use Guidance – Year 13 Digital Technologies
Artificial Intelligence (AI) tools can be helpful if used correctly.
They can also invalidate your assessment if used incorrectly.
This document explains what is allowed, what is not allowed, and how your work is checked across all three Year 13 standards.
The Core Rule (Read This First)
You must understand, explain, and be able to modify anything you submit.
If you cannot do that during a live walkthrough, the work may not count — regardless of how polished it looks.
At Year 13, the expectations are higher. You are expected to make independent design and implementation decisions and explain the reasoning behind them.
What AI IS Allowed For
You may use AI to:
☐ explain concepts (e.g., "explain how binary search works", "what is REST?")
☐ explain error messages and suggest what might be wrong
☐ help you understand documentation or syntax
☐ suggest test cases (which you then adapt and justify)
☐ review your code and explain what it does
☐ generate colour palettes or design inspiration (with citation)
☐ help debug a specific error after you have attempted to fix it yourself
☐ suggest database optimisations (which you verify and explain)
AI should help you learn and understand, not replace your thinking.
What AI Is NOT Allowed For
You must not use AI to:
☒ generate code you submit without understanding it
☒ design your database schema, API endpoints, or architecture without your input
☒ create wireframes or prototypes you cannot explain or modify
☒ write your development journal, reflection, or evaluation
☒ generate user testing feedback or fake evidence
☒ produce entire functions, components, or pages that you copy without modification
☒ rewrite your reflections to sound more professional
If AI does the thinking, the work is not yours.
Assessment-Specific Rules
Web Design (AS91901)
| Allowed | Not Allowed |
|---|---|
| AI explains UX principles or WCAG criteria | AI generates your wireframes or prototypes |
| AI suggests colour palette options | AI creates your design rationale |
| AI helps you understand accessibility tools | AI writes your user testing analysis |
| AI explains what a heuristic evaluation is | AI produces your iteration justification |
Your design decisions must be yours. You must explain why you chose specific layouts, colours, and interactions during your design critique session.
Programming Project (AS91906)
| Allowed | Not Allowed |
|---|---|
| AI explains a concept (e.g., recursion, Big O) | AI writes your algorithm or core logic |
| AI helps you understand an error message | AI generates functions you submit as your own |
| AI suggests test cases for you to adapt | AI writes your test plan or testing code wholesale |
| AI explains why a data structure is suitable | AI designs your solution architecture |
Your code walkthrough is the verification. You will explain your code line-by-line to your teacher. If you cannot explain a function, it does not count as evidence.
Your development journal must document AI use honestly. If AI helped you understand something or debug an error, record:
- what you asked
- what AI suggested
- what you did with the suggestion
- whether it worked and why
Full-Stack Website Project (AS91903)
| Allowed | Not Allowed |
|---|---|
| AI explains how Express middleware works | AI generates your API routes |
| AI helps debug a CORS or integration error | AI builds your frontend or backend components |
| AI suggests SQL query optimisations | AI designs your database schema |
| AI explains deployment configuration | AI writes your README or documentation |
Your live integration demonstration is the verification. You must show data flowing through all layers and explain what happens at each step. You must handle follow-up questions like:
- "What happens if this API call fails?"
- "Why did you structure your database this way?"
- "How would you add a new field to this endpoint?"
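To prepare for questions like the first one, make sure your own code handles failure explicitly rather than assuming every call succeeds. A minimal, framework-agnostic Python sketch (the function name and URL are illustrative, not from any specific project):

```python
import urllib.request
import urllib.error

def fetch_data(url, timeout=5):
    """Fetch a URL, returning (data, error) so the caller must handle failure."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.read(), None
    except urllib.error.URLError as exc:
        # Network failure, DNS error, or unreachable server:
        # report the problem instead of crashing
        return None, str(exc)
```

If you wrote and understood a pattern like this yourself, answering "what happens if this API call fails?" is straightforward: the function returns an error value the caller can act on.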
How Teachers Check AI Use
Teachers will:
- Ask you to explain your code — walk through functions, explain logic, justify decisions
- Ask you to modify something live — add a feature, fix a bug, change a query
- Ask why you chose a specific approach — algorithm, data structure, schema design, API structure
- Compare your commit history with your final code — a project with 2,000 lines in one commit and no earlier history is a red flag
- Review your development journal — missing entries, vague descriptions, and sudden jumps in capability indicate potential issues
- Look for code patterns inconsistent with your skill level — advanced patterns you can't explain
If you can't explain it, it may not be accepted as valid evidence.
Red Flags That Trigger Scrutiny
☒ Code that works perfectly but you can't explain how
☒ A commit history with one or two large commits near the deadline
☒ Development journal written in one sitting (timestamps and style are checked)
☒ Sophisticated patterns you haven't been taught and can't modify
☒ Reflections that sound generic or could apply to any project
☒ Different coding style between in-class work and submitted work
☒ Inability to debug or extend your own code when asked
These patterns don't automatically mean misconduct, but they will prompt your teacher to investigate further.
Safe Ways to Use AI
The "Explain, Don't Generate" Rule
Instead of asking AI to write code for you, ask it to explain concepts so you can write the code:
| ❌ Don't ask | ✅ Do ask |
|---|---|
| "Write me a binary search function in Python" | "Explain how binary search works step by step" |
| "Create an Express route for user CRUD" | "What does a typical Express route handler look like?" |
| "Design a database schema for a task manager" | "What should I consider when designing a relational schema?" |
| "Fix this error in my code" | "I'm getting this error — what does it mean and where should I look?" |
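As a concrete example of the right-hand column: after asking AI to explain how binary search works, you would still write, test, and be ready to defend your own implementation, perhaps along these lines (a sketch, not a model answer):

```python
def binary_search(items, target):
    """Return the index of target in the sorted list items, or -1 if absent."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2       # midpoint of the current search range
        if items[mid] == target:
            return mid
        if items[mid] < target:
            lo = mid + 1           # target is in the upper half
        else:
            hi = mid - 1           # target is in the lower half
    return -1
```

Being able to modify this on the spot (for example, returning the insertion point instead of -1) is exactly what a walkthrough will test.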
The Documentation Rule
If you use AI for anything, document it in your development journal:
### AI Use — March 15
**What I asked:** "What does a 403 status code mean?"
**What AI said:** It means the server understood the request but refuses
to authorise it — different from 401 which means not authenticated.
**What I did:** Updated my API error handling to return 403 for
permission errors instead of using 401 for everything.
**Did it help:** Yes — my error responses are now more accurate.
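The change described in that entry can be sketched in plain Python, independent of any web framework (the helper name is illustrative):

```python
from http import HTTPStatus

def auth_error_status(authenticated: bool, authorized: bool) -> int:
    """Choose the correct HTTP status for an auth failure."""
    if not authenticated:
        return HTTPStatus.UNAUTHORIZED  # 401: identity not proven
    if not authorized:
        return HTTPStatus.FORBIDDEN     # 403: identity known, action refused
    return HTTPStatus.OK                # 200: request allowed
```

A journal entry plus a small, explainable change like this is strong evidence that AI helped you understand, rather than doing the work for you.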
This transparency protects you. Undocumented AI use that is later discovered is much harder to defend.
Common Mistakes That Cause Problems
☒ Generating code at home and submitting it without understanding
☒ Using AI to write "a first draft" that you then submit with minor changes
☒ Only submitting polished final work with no process evidence
☒ Writing reflections that describe what happened without explaining why
☒ Claiming AI was only used for "minor help" when the code doesn't match your demonstrated skill level
☒ Not keeping early drafts, pseudocode, or intermediate commits
These are risks to your grade, not shortcuts.
If You're Unsure
Before using AI for anything assessment-related:
- Ask your teacher first — a 30-second question can prevent a serious problem
- Use AI to explain, not to produce — understanding is the goal
- Document everything — what you asked, what it said, what you did
- Keep your process visible — commit regularly, write journal entries weekly, save drafts
- Test yourself — can you explain every line? Can you modify it on the spot?
When in doubt: do the thinking yourself first.
Final Reminder
AI use that replaces your learning:
- risks your grade
- risks your assessment validity
- cannot be defended in moderation
- undermines the skills you need for further study
AI use that supports your learning:
- is allowed and encouraged
- helps you understand concepts faster
- strengthens your ability to explain your work
- builds genuine capability
The difference is whether you understand the result.
End of AI Use Guidance – Year 13 Digital Technologies