Key takeaways

  • Student confidence jumped from 2.78 to 4.26 out of 5 in a single 90-minute session
  • Knowledge improved across all three assessed domains: classification, diagnosis, and treatment
  • Post-session Kahoot! scores correlated with clinical exam (OSCE) performance
  • Playing anonymously, with self-chosen nicknames, turned out to improve participation rather than limit it

“The class was very fun, I didn’t fall asleep, and I can remember everything well.”

That’s a fifth-year medical student, writing anonymously on a feedback form after a 90-minute lecture on lower urinary tract symptoms. Medical school is a high-stakes environment, where each lecture is packed with information, moving rapidly from slide to slide, while students try to keep up, or at least not fall asleep.

In a study published in MedEdPORTAL in 2025, two physicians decided their standard lecture wasn’t serving their students well. And they were right.

When standard lecture format wasn’t working

Lower urinary tract symptoms are covered in a single 90-minute slot within a six-week surgical course spanning 24 lectures and 112 hours of clinical training. Students arrive with varying levels of preparation. The content is dense. And traditional lectures in that setting tend to do what traditional lectures do — deliver information in one direction while students try to absorb it.

The instructors wanted something different. What they built combined two approaches: a flipped classroom structure, where students received preclass materials a week in advance, and during class, Kahoot!-based activities replaced the lecture itself. The in-class session ran as three rounds of Kahoot! questions covering classification of symptoms, differential diagnoses, and treatment, with the instructor pausing after each round to address misconceptions and connect questions back to the preclass material.

The final round, covering the most complex clinical scenarios, introduced a team element. Students discussed each case in groups of four before submitting their individual answers. The group element gave students room to discuss, reason, and practice decision-making with peers.

Post-session scores predicted final exam performance

Students’ knowledge scores improved across all three domains: classification, diagnosis, and treatment. Students who scored higher on the post-session quiz were more likely to pass both the multiple-choice and clinical reasoning parts of the final exam. There was also a statistically significant correlation between post-session scores and OSCE performance, which is the practical exam where students demonstrate clinical skills with simulated patients, in front of an examiner, under pressure.

That’s a connection between a formative, game-based activity and a high-stakes clinical assessment. It’s a modest correlation (r = 0.20), and the study noted that the final exam spanned six surgical specialties, not just urology, while the Kahoot! session covered one. It is still aligned with other research indicating a connection between game-based formative assessment and final exam scores. For instructors, this matters in two directions: as a diagnostic signal to identify students at risk of falling behind, and as a tool that makes learning progress visible for students themselves, giving them a chance to course-correct before the stakes get higher.
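To make the size of that signal concrete, here is a minimal sketch using simulated data (not the study’s actual dataset): a weak positive link between quiz and OSCE scores, and what an r near 0.20 implies about explained variance. All numbers below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical illustration (not the study's data): simulate post-session
# quiz scores and OSCE scores that share a weak positive relationship.
n = 120
quiz = rng.normal(70, 10, n)
osce = 0.2 * (quiz - quiz.mean()) + rng.normal(75, 10, n)

# Pearson correlation between the two score sets.
r = np.corrcoef(quiz, osce)[0, 1]

# An r around 0.20 means the quiz explains only r^2 (about 4%) of the
# variance in OSCE scores: a real but modest signal.
print(f"r = {r:.2f}, r^2 = {r * r:.2%}")
```

The point of the sketch is the interpretation step: squaring r shows why a correlation of this size is useful as an early-warning flag rather than a predictor of any individual student’s exam result.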

Confidence as a leading indicator

In medical education, confidence is not a soft metric. Students who don’t believe they can manage a clinical scenario tend not to manage it well, not because of ability, but because doubt interferes with performance under pressure. Building genuine confidence, grounded in actual knowledge retrieval, is part of what good formative assessment is supposed to do.

Before the session, students rated their confidence with the material at 2.78 out of 5. After the session, it stood at 4.26, a statistically significant jump within a single class period. The time wasn’t spent on long slide decks but on actively answering questions and debating them with peers.
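A pre/post comparison like this is typically analyzed as paired data, since each student rates their own confidence twice. The sketch below uses simulated ratings (not the study’s raw data, whose sample size and spread are not reported here) to show how a paired t statistic summarizes that within-student gain.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical illustration: paired pre/post confidence ratings on a
# 1-5 scale for the same group of students.
n = 60
pre = np.clip(rng.normal(2.78, 0.8, n), 1, 5)
post = np.clip(pre + rng.normal(1.5, 0.6, n), 1, 5)

# Paired analysis works on the per-student difference, not the group means.
diff = post - pre

# Paired t statistic: mean gain divided by its standard error.
t = diff.mean() / (diff.std(ddof=1) / np.sqrt(n))
print(f"mean gain = {diff.mean():.2f}, paired t = {t:.1f}")
```

Because the differences are computed within each student, between-student variation in baseline confidence drops out, which is what makes a jump of this size detectable in a single class period.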

Anonymity was flagged as a limitation — then recognised as a strength

Students participated using self-chosen nicknames. The researchers initially listed this as a limitation: without real names attached to Kahoot! responses, they couldn’t link in-session performance to individual exam results.

By the end of the paper, they’d reversed that assessment. Anonymity, they concluded, created a low-stakes environment that kept participation open and genuine, particularly during the harder questions where the risk of being visibly wrong might otherwise have made students hesitant. A student who doesn’t answer because they fear being wrong learns nothing. A student who guesses, gets feedback, and adjusts is in a learning cycle that sticks.

A session design any instructor can borrow

This study is deliberately specific: one topic, one institution, one session. The researchers are candid about its limits: no control group, no rigorous tracking of how much preclass material students actually completed, and a final exam covering far more than urology.

But specificity has its uses. This is a replicable session design with a clear structure: preparation materials distributed in advance, in-class questions sequenced by complexity, an instructor-led debrief after each round, team discussion introduced for the hardest content, and anonymity preserved throughout. This design isn’t particular to urology, or even medical training. The general principle is simple: structured, low-stakes retrieval practice with immediate feedback can leave a trace on high-stakes outcomes. And it applies wherever students disengage before material has a chance to land.

This brings us back to the student who didn’t fall asleep. That’s where it starts: staying present long enough for learning to happen. What this study suggests is that when the conditions are right, what happens in a well-designed formative assessment session makes a real difference for learning outcomes.