Pilot Testing

Once the initial version of the All on Board methodology had been shaped, the next crucial phase began: putting it into practice. For a full year, partner organizations piloted the methodology in their respective youth work contexts—applying its tools, adapting its approaches, and, most importantly, learning from the process.

This was not a theoretical trial or a one-time test. It was a long-term implementation effort that unfolded in real environments, with real challenges. Each organization used the methodology in the settings where they already work: local youth groups, workshops, civic engagement projects, and informal learning spaces. This allowed the teams to see how the tools performed in day-to-day realities, under different conditions, with different groups of young people.

The strength of this phase lay in its diversity. Because the partner organizations operate in varied local contexts—urban and rural, structured and informal, culturally distinct—the pilot testing produced a rich and nuanced understanding of how the methodology functioned across Europe. What worked well in one context sometimes needed adjusting in another, and what seemed promising on paper sometimes played out differently in the field.

Throughout the year, partners systematically gathered feedback. They tracked how young people responded, observed changes in group dynamics, noted which tools generated sustained engagement, and documented moments of resistance or drop-off. Equally, youth workers reflected on their own experience—how easy or intuitive the tools were to use, how they influenced their facilitation style, and what support they needed to apply them effectively.

This wasn’t about proving that the methodology “worked”—it was about learning where it needed to grow. Some tools were simplified; others were expanded. New layers were added to make space for different learning styles or to build more ownership from participants. Small but critical changes—like when to introduce reflection activities or how to build up to decision-making—emerged as key improvements.

At the end of the testing phase, the partners and youth workers reunited for a second Learning, Teaching, and Training Activity—this time in Alicante, Spain. This gathering marked the transition from trial to transformation. With a year of experience behind them, participants returned to the tools they had once co-developed, this time with the clarity of hindsight.

Over several days, the group shared their findings, compared outcomes, and collaboratively revised the methodology. The focus was on refining—not reinventing—what had already been built. The atmosphere was both practical and reflective, with an emphasis on honesty: what had truly worked, what hadn’t, and why. Through guided sessions, feedback circles, and hands-on redesign work, the methodology was restructured to incorporate the lessons learned across all contexts.

The outcome of this phase was the final version of the All on Board methodology—an evolved, tested, and deeply informed framework ready for broader use. More than a set of tools, it became a shared resource: shaped by dozens of youth workers, tested by hundreds of young people, and crafted through a process that mirrored the values of the methodology itself—participation, reflection, adaptability, and collaboration.

In the end, the pilot testing phase did more than validate a set of practices. It brought the methodology to life, anchoring it in the messy, rewarding, and always evolving reality of youth work. It confirmed that effective engagement doesn’t come from fixed formulas, but from tools that grow with the people who use them—and from processes that invite everyone, youth and facilitators alike, to stay fully on board.
