Digital learning adds an average of 4 months' progress.
That's the headline from the Education Endowment Foundation's Teaching and Learning Toolkit — one of the most respected evidence reviews in education. +4 months. Nearly as much as one-to-one tutoring (+5 months). At a fraction of the cost.
But there's a word in that finding that changes everything: average. An average hides enormous variation: some tools deliver far more than four months' progress, and plenty deliver none at all.
What the gamification research says
A systematic review of 24 empirical studies on gamification (Hamari et al., 2014) found that gamification produces positive effects on engagement and learning outcomes. The strongest results were in education — not marketing, not health, not productivity. Education.
But the review was emphatic on one point:
Quality of implementation matters more than which mechanics are used. Points, badges, and leaderboards are the most common mechanics, but they are not what drives the effect; the quality of the design around them is.
The features that actually matter
Cross-reference the gamification research with Dunlosky's learning science and the EEF Toolkit, and a clear pattern emerges. Effective digital learning tools share these features:
- Retrieval practice — the tool asks questions rather than just presenting information. Practice testing is one of only two techniques rated HIGH utility in Dunlosky et al.'s review of ten learning strategies
- Immediate feedback — the child knows instantly if they got it right or wrong, and why. Feedback adds +6 months of progress (EEF). Worksheets return feedback days later — if at all
- Adaptive difficulty — the tool adjusts to the child's level. Too easy = boredom. Too hard = anxiety. The sweet spot is where learning happens
- Spaced repetition — the tool brings back content at increasing intervals. Daily review is Rosenshine's #1 principle, and distributed practice is the only other technique Dunlosky rates HIGH utility
- Visible progress — the child can see themselves improving. TIMSS 2023 data shows confidence correlates with achievement. Visible progress builds confidence
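For readers curious what "increasing intervals" looks like under the hood, here is a minimal sketch of one simple scheduling rule (a hypothetical Leitner-style doubling rule, used here for illustration; real tools use more sophisticated algorithms):

```python
from datetime import date, timedelta

def next_interval(interval_days: int, correct: bool) -> int:
    """Double the gap after a correct answer; reset to one day
    after a wrong one (a simple Leitner-style rule)."""
    return interval_days * 2 if correct else 1

# A question answered correctly four times in a row comes back
# after gaps of 1, 2, 4, then 8 days.
intervals = []
interval = 1
for _ in range(4):
    intervals.append(interval)
    interval = next_interval(interval, correct=True)

# Turn the gaps into actual review dates, starting from today.
review_dates = []
day = date.today()
for gap in intervals:
    day += timedelta(days=gap)
    review_dates.append(day)
```

Get a question right and it drifts further into the future; get it wrong and it comes back tomorrow. That one rule is why a good tool feels like it "knows" what your child is about to forget.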
The features that don't matter (much)
Conversely, some popular EdTech features have weak evidence:
- Badges alone — cosmetic rewards without underlying learning mechanics are engagement theatre. They feel good but don't improve outcomes
- Passive video content — watching an explanation is closer to re-reading (LOW utility) than practice testing (HIGH utility). Video can support, but shouldn't be the main activity
- Gamification without curriculum alignment — if the game mechanics aren't applied to actual curriculum content, children learn the game, not the subject
- Screen time as the metric — time spent matters less than what's done with that time. 20 minutes of retrieval practice beats 60 minutes of passive scrolling
A parent's checklist for evaluating any tool
Before paying for any digital learning tool, ask these questions:
- Does it quiz or just teach? — retrieval practice is the mechanism. If the child is mostly reading or watching, the tool is LOW utility
- Is feedback immediate? — knowing right/wrong within seconds is essential. "We'll mark it later" is a worksheet with a screen
- Does it adapt? — one-size-fits-all content is no better on a screen than it was on paper
- Is it aligned to the actual KS2 curriculum? — fun maths games that don't cover SATs content won't help in May
- Can you see progress? — if the parent dashboard doesn't show what's improving and what isn't, you're flying blind
- Does your child choose to use it? — the best revision tool is the one they actually open
Digital learning works — when it's built on retrieval practice, immediate feedback, and adaptive difficulty. Not when it's a textbook on a screen with cartoon characters. See how SATs Arcade is built on these principles.
Sources: EEF Teaching & Learning Toolkit (2024); Hamari et al. (2014), "Does Gamification Work?", HICSS; Dunlosky et al. (2013), "Improving Students' Learning"; Rosenshine (2012), "Principles of Instruction"; DfE Parental Engagement Evidence Review (2024); IEA TIMSS 2023