How we set up digital user research at MTVH
My role as User Experience Designer is to make sure our websites are easy to use and make sense to the people who use them. One of the ways to understand how to improve a website is through usability testing. In a usability testing session, we ask participants to complete tasks (raising a repair, for example) on a prototype or website, thinking out loud as they do so.
Talking out loud reveals misunderstandings and expectations of what will happen next. We watch for people’s mental models, motivations, worries and assumptions. This gives us a better understanding of how our users perceive our site, a list of things people struggled with, and ideas on how to improve.
Previously, digital research and development at MTVH was done in partnership with digital agency DXW. In 2020, we worked on bringing this research and development in-house. We've started a regular schedule of usability testing sessions, and this article explains how we set up the sessions and the challenges we had to overcome.
We decided to test two of our in-house offerings, MTVH Online/MyTVH and SoResi.co.uk, running a full day of sessions once a month for each product. A regular calendar of testing like this is regarded as best practice when embedding usability testing into the digital development lifecycle.
As it was 2020, the usability testing would all be done remotely. After researching several tools, we chose Zoom. It has been wildly popular over the last year, making it more likely that our participants would be familiar with it. At a participant's request, we would be happy to use their preferred software instead.
Our target research audience is our residents. On our resident engagement site alpha.mtvh.co.uk, we created a page explaining what usability testing is, and a sign-up form. This allowed us to build a list of people interested in participating. Then, a few days before the scheduled sessions, we sent these people an invitation email with a link to a "You Can Book Me" page, on which people could choose a time to take part.
Planning the research session
The decision of what to test was made in consultation with the relevant Product Managers. We prioritised the areas of the product with the most risks and unknowns, usually focusing on new features or known problem areas. A test script was then produced, breaking down the main learning goal into questions, scenarios and activities.
Doing the research session
Once a session had been booked, the Zoom meeting link was auto-generated by You Can Book Me and sent to both us and the participant. On the day, the session was led by a moderator, with a notetaker and the relevant Product Manager observing where possible.
The moderator followed the script, leaving room to react to what the participant was saying, amending questions, tasks and scenarios as appropriate to learn as much as we could from the participant.
The test script was set up as a table, with questions on the left and session notes on the right. When it came to review, it was easy to scan the right-hand column for what the participant said and did for each question.
After the session, we would jump on a separate call to discuss and quickly capture the highlights and interesting observations.
Each participant was given a £30 Love2Shop voucher. Initially these were paper vouchers. Luckily, we didn't have many, as shops were closed during the lockdown and nobody could spend their paper vouchers! We have now ordered e-vouchers, which hopefully will work better for everyone.
A few sessions in…
At the time of writing (end of 2020), we've completed 11 usability testing sessions with residents. We ran sessions on:
- The new ‘request a repair’ flow (MTVH Online)
- Legacy TVH residents’ reactions to the new branding that is soon to be implemented (MyTVH)
- How people search for shared ownership property (SO Resi)
- What people understood from the property listings (SO Resi)
What we quickly learnt
- Too many invited! – Our first invitation email went out to 150 people, and all 6 available sessions were booked up within 5 minutes. This led to frustration among people who tried to sign up only a few minutes later.
- Be ready for both mobile and desktop – To try to minimise variation in the testing, we asked that people only join from a desktop or laptop. In those sessions, not a single person joined from a desktop or laptop! We learnt we must be ready for any device when testing.
What’s worked well
- Seeing people in their home environments, using their own devices. We’ve seen children interrupting, slow devices, and the phone ringing. This gives us real insight into the conditions in which people use our products.
- Screen sharing on Zoom on both mobile and desktop. We’ve been able to see people interact with our live sites and prototypes.
- Being flexible during the sessions. If a participant mentions something that is relevant to another area of the product, we can tease that information out of them, and make the most of the session with them.
What’s been challenging
- Rate of no-shows – Despite all 6 sessions being booked, on average only half of participants would attend, by which time it was too late to find a stand-in.
- Understanding and learning how to get the best out of a session. Often it’s only afterwards that we realise we could have done something different, but this is a skill we are quickly developing. Moderation is an art form!
- Remote testing – We don’t know how the participant will join, so any prototype we make has to work for any device.
- How to test with real data – Using account holders’ real data makes the usability test far more meaningful, as participants can point out things that relate to their real lives. However, with real data there are many privacy and security issues to manage.
- How much to test – We knew the sessions would last an hour, but didn’t know how long each activity would take. Equally, if participants turn up late, we have to squeeze all the planned activities into the time remaining.
- Diversifying our participants – The sessions so far have only been held between 9am and 5pm, meaning some people are excluded from participating. Also, we have only been able to recruit people who already use the sites.
- Accessibility – While we are fairly confident MTVH Online is accessible, we need to run usability testing sessions with people with access needs to know for sure. The challenges here are:
- Recruitment – we have begun opening conversations with colleagues on the care and support side of the business, to see if they are aware of any residents with access needs who may be able to help us with our research.
- Tools – we need to review and refine our set-up to be able to talk to people with different access needs.
Our aim is to include more research earlier in our process. With these logistics and this framework defined, it will be easier to have those open conversations with our residents, so we can continue to build people-powered products.
If you are an MTVH tenant or homeowner, you can sign up to take part in research.