CDPS’s skills team have been measuring their past performance against CDPS standards – it’s been a bit painful but worth it
Since late 2021, CDPS has trained more than 600 people from across the Welsh public and third sectors in digital and Agile skills. We’ve had lots of positive feedback, and there’s appetite from across Wales for us to continue. But as we explained in our last blog, we’re pausing our training over the summer to get a better understanding of our users’ needs.
Holding ourselves to account
Even though our training was well received, we needed to explore and evaluate what we did, what worked well and what didn’t. We wanted to walk in our users’ shoes and really understand their experience.
One of the ways we did this was to test our training programme against the CDPS Digital Service Standards – the CDPS benchmark for great, user-centred services in Wales. Although the training wasn’t designed as a service, it was consumed as one, so it was important to assess it as such.
The Service Standards criteria
The Service Standards are broken down into three areas (‘Meeting user needs’, ‘Creating good digital teams’ and ‘Using the right technology’), with requirements within each one.
We reviewed each of these areas in detail against our training and scored ourselves from 1 to 5 (with 1 being the lowest).
How did we perform?
Frankly, not very well. We identified gaps in the way the training was conceived, run and promoted. But we did look through the lens of an ideal scenario, setting the bar high. We’re calling attention to the areas with the biggest shortfalls, as well as the areas we did well on, to learn from the experience.
Here are some of our main findings against the Digital Service Standards:
Understand users and their needs
When we started the training, we hadn’t carried out enough user research to establish what the user needs were and whether we’d met them. For example, how easy was it for participants to translate their training into their day jobs? Which type of training was best suited to which type of participant? We didn’t identify what 'good' looked like.
Work in the open
There was no oversight of our end-to-end communications, which resulted in a misalignment between our promotional communications (for example, on social media and in the CDPS newsletter), our admin communications (for example, the emails that we sent to participants) and communications from our training partners. This created an inconsistent and muddled comms experience for participants.
Use scalable technology
We used a third-party booking site that prevented us from getting full oversight of the end-to-end booking process and associated communications. We weren’t sure:
- when and how calendar invites were sent or if they were duplicated
- whether the invite was compatible with the recipient’s email and calendar software
- how easy it was to change a booking
Have an empowered service owner
We scored highly here, thanks to having a dedicated service owner with the authority to make all business, product and technical decisions about the service. Clear ownership made a real difference to the team, and it’s something we’ll maintain.
Painful but worth it
Assessing our training against the Service Standards exposed the need for a more holistic, wraparound service, with oversight of the whole end-to-end experience. We need to understand more about what brings people from the Welsh public and third sectors to CDPS and how we can help improve the services the public sector runs.
This assessment process has been a little painful at times, but 100% worth it. We can now create better training that:
- meets user needs
- is more relevant to participants’ jobs
- ultimately gives participants skills to improve public services in Wales
If you want to find out more about the process we went through and how it might help your organisation, we’d love a chat.