User testing for content and wayfinding

We know that testing content is very important. But how do you actually do it in a way that gives meaningful results and does not take up too much time and mental energy?

Like so much design and strategy work, being organised and having a process comes in really handy when testing content and wayfinding.


This is especially true when you are working in teams with limited time and multiple cycles of testing. You need everyone to get on the same page very quickly.

I have created a testing process that works really well for me. It is something that can be very quickly adapted to fit new contexts, personal working styles, objectives and teams.

 

The eight steps of content testing

Elle Geraghty content testing process

1 - Trigger

When it comes to testing content there are a bunch of questions you want to know the answers to before you start. For example, “What are we trying to find out?” and “Is research the best way to do that?”

Let’s be really clear on why we are doing this work and, very importantly, what resources we have to address the findings. There is no point doing research if you don’t have the capacity to act on your findings.

It is important to understand exactly what has triggered this research. Is it because of new or refined features, widgets, visual design or content? Maybe a new technology or a change in project direction? Or are we just doing a long overdue content review? :)
 

2 - Build

Build the prototype
It doesn’t matter if it is paper sketches, a Google Doc, a Sketch/InVision prototype or another favourite toolset - just make sure it all works as expected and clearly highlights the content you want to test.

Build the test plan
I usually do this in a Google spreadsheet with questions that will show me whether users can understand and act on the content I am testing. I never ask if users like the content; instead I ask them to complete tasks that will show me whether the content is usable or not. This follows user testing best practice.
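If it helps to make that concrete, here is a rough sketch of the kind of rows such a plan might contain, expressed as data. The column names and the example task are illustrative only - use whatever columns suit your project.

```python
# Illustrative only: a task-based test plan expressed as data.
# Column names and the example task are made up - adapt to your project.
test_plan = [
    {
        "task": "Find out how much checked baggage you can take on an international flight",
        "content_under_test": "Baggage allowance page copy",
        "success_criteria": "Candidate reaches the allowance information and states the correct limit",
    },
    # ...one entry per task, never "do you like this content?"
]
```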

Brief in other research team members
It’s really important that the team testing with you understand what you have built and hence what you are trying to test. Treat this as a test of your test. I usually find at least one or two ways to improve it based on areas that are fuzzy for my teammates.
 

3 - Recruit and manage candidates 

Candidate selection and booking generally takes a lot longer than expected, so it’s important to get onto this early. If possible, develop a pool of candidates you can come back to regularly. When I was working at Qantas we were very lucky to be able to drop into the airport and find users there - conveniently with time up their sleeves waiting for a flight - but mostly it can be a hard slog finding the right users.

Writing a candidate brief is really important no matter how you are conducting your research. Part of this is asking screening questions and having a good exit strategy in place if you realise that your candidate is not appropriate. This can happen even with candidates supplied by an agency. Life’s too short to be working with candidates who can’t give you actionable insights.

4 - Rehearse

Rehearsing your testing session is essential. First things first - familiarise yourself with the prototype - you need to know it back to front.

Then do a full tech check and practise using whatever comms methods and other equipment you will be using - for example the meeting room, signage, microphones, recording equipment, phones, Skype, InVision, screen sharing and so on. You want to be fully focussed on the candidate, not on whether you are recording or not :)

Will you be running the research with someone else? If so, ask them to join you to rehearse the tasks from start to finish. This ensures you both know your roles, how the prototype works and what level of note taking is required.

If you are working by yourself, ask a colleague to act as a candidate for you so you know exactly how the session will run and can address any issues early.

Create a checklist for yourself so you know you have everything you need - for example, access to the internet, your phone, the correct links to the prototype, candidates’ contact details, incentives etc.
 

5 - Conduct and record

I’m not going to go over this area in great detail as there are so many great resources about how to conduct user testing already, but there are a couple of things I always do during an interview which help me quickly understand what actions should be taken as a result of the testing overall.

Do a mini review at the end of each session and note the top three relevant findings from the candidate - look for things you know you need to change first and then other significant findings.

If you are working in pairs (which is my preference), as part of your post interview mini review also ask for feedback on your interview style and adjust where necessary - your colleague may notice you leading the candidate at a certain task, or have a suggestion about some extra guidance that is needed because of a prototype failure.

Notes should indicate not just pass / fail, but also levels of uncertainty and, where relevant, comments and questions asked by the candidate. This is particularly important with a fail response. If a candidate can’t complete a task, what was their ‘wrong’ understanding or undesired behaviour? This may show you how the prototype or copy needs to be changed for the next round of testing.
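As a made up illustration (not a fixed template), a single recorded result might look something like this, with the uncertainty and the candidate’s comment kept alongside the pass / fail:

```python
# Illustrative only: one recorded result, keeping uncertainty and
# candidate comments alongside the pass / fail outcome.
result = {
    "candidate": "C3",
    "task": "Find out how much checked baggage you can take",
    "outcome": "fail",        # "pass", "fail" or "partial"
    "uncertainty": "high",    # how hesitant the candidate seemed
    "comments": "Expected to find baggage info under 'Booking', not 'Travel info'",
}
```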
 

6 - Review

At the review stage I do two things:

  • Conduct a research team review of the data to ensure it has been recorded correctly and we are all in agreement.
  • Code results into a consistent roll up that shows - preferably with colour - which tasks passed and which failed (see the sketch below). This will give you prioritisation guidance for the next sprint/focus area.
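If it helps to see the idea of the roll up in code rather than in a spreadsheet, here is a rough sketch - the example data and the 50% fail threshold are made up, not a rule:

```python
from collections import Counter

# Illustrative roll up: count outcomes per task across candidates and
# flag the tasks that failed most often as priorities for the next round.
results = [
    {"task": "Find baggage allowance", "outcome": "pass"},
    {"task": "Find baggage allowance", "outcome": "fail"},
    {"task": "Change a booking", "outcome": "fail"},
    {"task": "Change a booking", "outcome": "fail"},
]

rollup = {}
for r in results:
    rollup.setdefault(r["task"], Counter())[r["outcome"]] += 1

for task, counts in rollup.items():
    fail_rate = counts["fail"] / sum(counts.values())
    flag = "PRIORITY" if fail_rate >= 0.5 else "ok"
    print(f"{task}: {counts['pass']} pass / {counts['fail']} fail ({flag})")
```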
     

7 - Share

Once a cycle of testing is finished, decide the best way to share the results with the rest of the project team. Sometimes you just need to improve the prototype and test again, but I find it is useful to share high level findings with your team so they understand what you are up to and why you are making the decisions you are. It is also a good reference as time passes and those decisions get harder to remember :)

It is a good idea to take the time to visualise these findings so the high level results can be seen at a glance.
 

8 - Repeat

Make improvements to the prototype and do it all over again! 


How are you testing your content? Do you have a clear process? How are you improving that process? I'd love to hear your story.
