What we learnt from testing our data visualisation e-learning
We recently published our data visualisation e-learning. This 11-module product is an introductory-level course that gives an overview of best practice approaches for creating and publishing charts and tables, with plenty of detail about making them accessible.
To develop this resource, we conducted user testing with colleagues in the Office for National Statistics (ONS) and other government departments. This testing has been crucial to the success of the e-learning.
Alpha phase – getting volunteers and setting up sessions
We started our alpha phase of testing by advertising for volunteers on a range of communication platforms. We recruited volunteers by:
- asking colleagues in our wider team
- posting on ONS Yammer
- posting on Basecamp and Slack, which are both cross-government platforms
- emailing our presentation champions
This approach was more successful than we had anticipated. Within two days we already had over 35 volunteers!
Advertising on a range of platforms gave us testers from many different departments and roles. Our testers ranged from people who knew the content well to people who were completely new to data visualisation.
We contacted each volunteer to arrange a suitable time for a 30-minute user interview session. We sent each volunteer a hyperlink to one of the 11 modules. We asked them to complete the module and answer a short questionnaire on SmartSurvey before their user interview session.
The questionnaire contained five short multiple-choice questions, with answer options ranging from “strongly agree” to “strongly disagree”. It helped us keep a record of how many people had tested each module and was a useful way to collect quantitative data. It also encouraged the testers to evaluate the e-learning and think about the comments they might give. For example, it made them consider the length of the module and the suitability of the exercises.
Before we started the user interview sessions, we discussed which topics we wanted to collect feedback on. We decided to focus on module length, functionality, structure, and exercises.
Alpha phase – the sessions
During the 30-minute user interview sessions we spoke to our volunteers about their thoughts on the e-learning.
We started by asking about their background and experience of e-learning in general and the topic of data visualisation. We then let them lead the discussion. As they spoke, we noted down their responses in a spreadsheet. We followed up with prompts to cover any topics on our list they had not already mentioned.
The first few people we interviewed were members of our larger team. This allowed us to check that:
- the e-learning was working correctly
- our testing approach was appropriate
It was also a good way for us to get used to conducting user interviews, as we had not done these before. Our colleagues made suggestions on how we could improve the interview sessions.
Beta phase testing
After the alpha phase we looked through the responses from our volunteers. We identified common themes and turned them into action points. We took a week between the alpha and beta phase to work on these action points and make changes to our content.
By the end of the second round of testing, we had spoken to a total of 56 testers from across government. Common themes started to emerge. For example, there were several requests for more exercises, and many testers said it would be helpful to show at the top of each page how long the module would take to complete.
It was interesting to see the different learning styles our testers had. We tried to accommodate these by adding videos and more interactive exercises. We dedicated two weeks to making changes from the beta testing phase. We also caught up on any suggestions from round one that had been reinforced during the second testing phase.
Launching the training
After the updates and changes had been made, we launched the e-learning. We posted an announcement on a range of platforms and asked for our e-learning to be promoted in newsletters across government. We also contacted the ONS Learning Hub and Civil Service Learning. Both portals are now signposting users to our e-learning.
Our promotion seems to have been successful. The e-learning landing page has been the second most viewed page on the Analysis Function website since January 2023, with over 12,000 views.
And interest in our e-learning has not only come from within the UK. The Bureau of Statistics in another country has recently contacted us to ask if they can translate the e-learning so it can be used by their civil servants.
Lessons learnt
Running the testing process was invaluable for improving our e-learning offering. But we were surprised by how much wider insight and experience we gained from the process too.
Through testing and developing the e-learning we have been able to network with other departments. The process has helped us find out more about best practice and highlighted similar projects going on in other areas.
Looking back at the process, we learnt a lot about how to conduct user testing. We have identified areas where we could improve if we were to repeat this process for future products. For example, we would expand the questionnaire to gather more quantitative data about each module. We would also allocate more time to make changes between the alpha and beta phases.
We are continuing to collect feedback and learn from our users. Following the publication and promotion of the e-learning, we have kept a link to a SmartSurvey form asking for feedback. So, if you try our data visualisation e-learning, please let us know what you think of it by filling in our survey.