Tactile testing | Evaluation with blind and visually impaired audiences
Maya Sharma shares her experience of evaluating an engagement project with blind and visually impaired audiences
I recently completed an evaluation of the Tactile Collider project, an exciting initiative delivered by physicists at the Cockcroft Institute in Warrington. Tactile Collider offers fun, hands-on ways of engaging with particle and accelerator physics and was developed specifically for blind and visually impaired young people.
Whilst The Audience Agency routinely works with hugely diverse audiences, this was the first time I’d evaluated with young visually impaired people and so I was excited to be involved.
My top take-homes for undertaking evaluation with visually impaired audiences:
- Co-design the approach with access specialists
- Challenge any assumptions about which methodologies may work
- Test and pilot the tools
- Consider if and how digital can work
- Be careful about how you involve support workers in the process
Two students in the workshop using tactile objects as a tool to enhance learning
Some further thoughts about the project evaluation...
Involving an access specialist and audience gatekeepers in designing the approach
We kicked off by bringing the Tactile Collider team together to develop an evaluation framework. It was great to work with such a rounded team, not only scientists but also an access consultant – herself blind – who worked with the team throughout the project and a specialist teacher who worked with visually impaired pupils. This meant that we could really drill down into what kind of research tools might work with the young people.
Magnet poles diagram, using braille and 3D stimulus
Creative approaches are not always the best option
I initially thought we might need to explore some really creative methodologies, but learnt that wasn't the best solution for this group. We discussed a range of creative evaluation techniques but settled on some fairly simple methods for working with the young people:
- Before-and-after questionnaires
- One-to-one interviews
- Observations at the sessions
Creating the tools and learning about the challenges along the way
We then got to work designing the tools. We originally devised a paper-based system for the young people to use before and after the sessions, presented in braille and a variety of large-print formats to accommodate the range of levels of vision. This seemed, at the time, the ideal way to deliver the questionnaire. In reality, though, it proved far from ideal!
Practical issues arose. Young people and their support workers arrived in dribs and drabs, some late, making it hard to administer the survey in an organised and systematic way. Support workers were helpful in enabling the young people to answer the questions, but in some cases their help also involved some unnecessary steering of the young people's responses.
Pupils engaging in workshop
“The whole approach was the ‘right’ way to do it… the nine-month period really helped – we were properly trained and the tool came out of the process”
- Scientist practitioner
After this pilot run, we decided to try a digital version of the questionnaire that could be used via mobile phones, tablets or laptops. We’d originally been apprehensive about this, assuming that it would be very hard to create an accessible digital survey. We soon discovered however that, with close reference to best practice guidelines on designing digital surveys for visually impaired people, it really wasn’t that hard to create a simple online questionnaire. We then – crucially – tested it with some blind and visually impaired people and were delighted to hear it described by one person as one of the most accessible surveys they’d encountered! Delighted, but also slightly taken aback.
“You’re actually shown things, this really worked. It gives you a better grasp than being told things, you’re actually experiencing it”
- Youth participant
We then used this survey at several Tactile Collider sessions. There were still some practical issues, though these were less to do with the tools and more to do with the practicalities of bringing together groups of young people and their support workers. The digital surveys worked far better. The young people were able to choose whether to use their mobile phones (the preferred option), use tablets or complete the survey with the help of an adult, and we saw that they were able to answer the survey questions far more independently.