
Designing Better, Together

The monthly post from our Head of Learning Innovation

This article explores five features co-built with our users.

In the first article of this series, I shared our approach to co-construction at Wooclap: how we structure listening, how our teams participate, and why this approach is an integral part of our design philosophy.

Today, it’s time to get practical. Here are five stories of features co-built with our users — five conversations that helped move our tools forward.

Five conversations that moved our tools forward

Integrating Script Concordance Testing

It all began with an in-depth discussion with Professor Bernard Charlin, a surgeon, Full Professor at the Université de Montréal, and a recognized researcher in medical education. His work laid the foundation for integrating Script Concordance Testing (SCT) into Wooclap.

As the creator of this evaluation method, he devoted much of his research to analyzing clinical reasoning and improving training tools for healthcare professionals. Script Concordance Testing invites students to compare their reasoning with that of a panel of experts when faced with uncertain situations — a way to refine their clinical judgment.

Adapting this approach to Wooclap required multiple adjustments: question format, feedback type, and comparison logic. Thanks to this collaboration, followed by a testing phase with a small group of instructors, a structural innovation emerged — the creation of TCS (Script Concordance Test) and TCJ (Judgment Concordance Test) question types, now available in both Wooclap and Wooflash.
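
To make the comparison logic concrete: in the aggregate scoring method described in Charlin's research, a student's answer to an SCT item earns credit in proportion to the number of panel experts who chose the same option, normalised by the count of the most frequent expert answer. The short Python sketch below only illustrates that principle; the function name and panel data are hypothetical, and it does not describe Wooclap's actual implementation.

```python
from collections import Counter

def sct_item_score(student_answer: str, panel_answers: list[str]) -> float:
    """Score one SCT item with the aggregate scoring method:
    credit is proportional to how many panel experts chose the same
    Likert option as the student, normalised by the modal answer."""
    counts = Counter(panel_answers)
    modal_count = max(counts.values())
    return counts.get(student_answer, 0) / modal_count

# Hypothetical panel of 10 experts rating a clinical hypothesis on a -2..+2 scale.
panel = ["+1", "+1", "+1", "+1", "+1", "0", "0", "0", "-1", "+2"]
print(sct_item_score("+1", panel))  # 1.0 -> matches the modal expert answer
print(sct_item_score("0", panel))   # 0.6 -> partial credit
print(sct_item_score("-2", panel))  # 0.0 -> no expert chose this option
```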

🎥 Watch Professor Charlin’s presentation in this webinar

When a collaborative feature becomes a learning lever

During a co-construction workshop on Wooflash, discussions revealed the potential of the “student suggestions” feature, which allows learners to propose questions to be added to a course.

Originally designed as a simple collaboration option, this feature was expanded thanks to user feedback. Instructors can now rate student proposals with a star rating and add written feedback, turning the feature into a space for formative reflection.

The result: what began as a participatory creation tool has become an active learning activity, in which formulating a question is itself an act of learning.

Co-designing an attendance sheet with ten organizations

The idea of an attendance sheet came from converging feedback from organizations such as UPenn, UPorto, UEdinburgh, and Allianz Benelux. Each faced the same challenge: how to track attendance easily without manually merging exports or relying on third-party tools.

We worked with representatives from these institutions throughout the design process, from early brainstorming to prototype testing. We shared mockups and organized a series of real-life tests. These conversations helped us define the key functionalities and simplify the interface.

Learn more about this feature

The process also revealed a key need: integrating attendance data into the LMS. We therefore continued co-construction around integrating this feature through the LTI 1.3 standard.
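
The article doesn't detail the integration itself, but for readers curious about what LTI 1.3 makes possible: the Assignment and Grade Services (AGS) part of the standard lets a tool push results into the LMS grade book over HTTPS. The sketch below shows one plausible way attendance could be reported as a score; the line-item URL, token, and values are placeholders, not Wooclap's implementation.

```python
import requests

# Illustrative only: reporting attendance to an LMS grade book through the
# LTI 1.3 Assignment and Grade Services (AGS) "scores" endpoint.
LINEITEM_URL = "https://lms.example.edu/api/lti/courses/42/line_items/7"  # placeholder
ACCESS_TOKEN = "<token obtained via the LTI client-credentials grant>"    # placeholder

score = {
    "userId": "student-123",          # LTI user id of the learner
    "scoreGiven": 1,                  # e.g. 1 = present, 0 = absent
    "scoreMaximum": 1,
    "activityProgress": "Completed",
    "gradingProgress": "FullyGraded",
    "timestamp": "2024-06-20T09:00:00+00:00",
}

resp = requests.post(
    f"{LINEITEM_URL}/scores",
    json=score,
    headers={
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "Content-Type": "application/vnd.ims.lis.v1.score+json",
    },
)
resp.raise_for_status()
```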

Evolving MCQs and polls through one simple detail

It was Emmanuel Zilberberg, Assistant Professor at ESCP Business School, who suggested a simple yet powerful improvement: allowing students to justify their answers in a multiple-choice question.

“The comment option creates a space for expression that sheds light not just on the answer, but on the reasoning behind it,” he explains.

Developed with his input, this feature fosters clarity, encourages reasoning, stimulates reflection, and helps instructors identify misunderstandings. It also offers insight into the learner’s thought process, enabling more accurate assessment and tailored support.

What 300 instructors taught us about AI in education

At the end of 2022, as we began exploring AI-powered question generation, one thing was clear: we couldn’t move forward without grounding the project in real teaching practice.

We launched a beta testing waitlist offering early access in exchange for concrete feedback through user interviews. Hundreds of educators — from different disciplines, countries, and levels — volunteered to participate.

Our Product team, led by Wandrille, conducted dozens of interviews and analyzed hundreds of written responses in just a few weeks. This iterative process allowed us to release a first public version as early as March 2023.

In parallel, we hosted broader workshops to understand teachers’ expectations, practices, and concerns about AI in education — insights that continue to shape our convictions today.

That work has since evolved into the Wooagents Lab, a community of engaged beta-testers co-creating the next generation of AI-powered pedagogical assistants, designed to make Wooclap easier to use and even more impactful for learning.

What these conversations teach us

Behind every exchange, every piece of feedback, and every test lies an opportunity to learn — provided you listen in a structured way. Here are a few lessons from our co-construction process:

  • What you rarely hear can be decisive. A single remark can trigger meaningful change.
  • Prioritizing is also about respect. Co-construction means making clear choices while maintaining users’ trust.
  • Listening isn’t enough if you don’t widen the circle. The most vocal are often the most tech-savvy — we must also create conditions to hear others.
  • A solid product idea is always a well-formulated pedagogical idea. Clear use cases and learning objectives are the best starting points for design.

And beyond the features, we also think collectively about what surrounds them. We’re currently working on positioning Wooclap and Wooflash within the ABC Learning Design framework.

In late June, I had the opportunity to host a workshop in Paris with instructional designers from France and Belgium to explore how our tools fit into this model.

Wooclap organized a workshop on the positioning of Wooclap and Wooflash within the ABC Learning Design framework, in collaboration with instructional designers.

👉 Go deeper

Explore the principles behind Wooclap’s approach to co-construction.

Writer

Arlène Botokro

Head of Learning Innovation at Wooclap. With 10 years of experience in pedagogy and digital learning, from Sciences Po to international consulting, I make sure our tools are co-designed with educators and grounded in research and real-world teaching practice.
