The Insighter

January 2011


Neo Insight's e-newsletter on Customer Experience topics and techniques.

We invite you to subscribe to our monthly e-newsletter.

Move people forward - new research techniques to improve navigation

We’ve been applying some new online research techniques to the evaluation of both proposed and existing information architectures on our clients’ websites. These techniques focus on the words and labels that drive visitors through their top tasks. In this article, you’ll learn to combine techniques and tools to provide both qualitative and quantitative insights to improve your information architecture.

“The primary purpose of web navigation is to help people to move forward. It is not to tell them where they have been, or where they could have gone.” – Gerry McGovern

Results from these types of studies will help you identify which words or labels are moving people forward, and which are hindering progress. Your studies can compare different information architecture models and performance across groups of people. Aggregating metrics of success, speed, paths taken and confidence will validate what is working well and identify barriers or gaps in your designs.
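As a rough illustration of aggregating those metrics (the task names, times, and confidence ratings below are entirely hypothetical, and this is a sketch rather than how any particular tool reports results), per-participant records can be rolled up into per-task success rates, median times, and mean confidence in a few lines of Python:

```python
from statistics import mean, median
from collections import defaultdict

# Hypothetical per-participant results from a navigation study.
# Each record: task label, whether the participant reached the correct
# destination, time taken (seconds), and self-rated confidence (1-7).
results = [
    {"task": "Renew a permit", "success": True,  "seconds": 42, "confidence": 6},
    {"task": "Renew a permit", "success": False, "seconds": 95, "confidence": 3},
    {"task": "Renew a permit", "success": True,  "seconds": 38, "confidence": 7},
    {"task": "Find office hours", "success": True, "seconds": 21, "confidence": 7},
    {"task": "Find office hours", "success": True, "seconds": 30, "confidence": 6},
]

def summarize(records):
    """Aggregate success rate, median time, and mean confidence per task."""
    by_task = defaultdict(list)
    for r in records:
        by_task[r["task"]].append(r)
    summary = {}
    for task, recs in by_task.items():
        summary[task] = {
            "success_rate": sum(r["success"] for r in recs) / len(recs),
            "median_seconds": median(r["seconds"] for r in recs),
            "mean_confidence": mean(r["confidence"] for r in recs),
        }
    return summary

print(summarize(results))
```

Comparing these summaries across two candidate architectures, or across groups of participants, is what lets you say where one design moves people forward and the other does not.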

Choose a technique to improve navigation
Let’s assume you have already identified the top tasks on your site. You may have performed a task-voting project or triangulated on your top tasks using a variety of data from your site. When developing a site that supports users' top tasks, many clients first develop an information architecture. These tools and techniques allow very early testing of that architecture and its navigation links against the top tasks, well before any visual design or even page templates have been created. Here are three ways you could test the top task navigation:

1. Unmoderated test: Conduct quick and iterative online tests to produce quantitative data on success rates and overall speed from a large pool of visitors.
Try a test yourself in a quick 3-task demo.

2. Moderated test: Conduct a remote screen-sharing study with a small group of visitors, using wireframes (mock-ups of page designs) or an early prototype, or even just the navigation hierarchy, to collect success data along with qualitative insights gained by probing for the reasons behind each participant’s choices.

3. Blended approach: Combine the tests above to develop a rich picture of the interaction between top tasks and the website design, iterate the design and test again.

Which words and labels move people forward?
By launching an unmoderated test, you can quickly find out whether visitors are being misled by particular headings or are getting lost among similar sets of links. One tool we’ve used recently is TreeJack, by Optimal Workshop, which tests the phrasing and organization of navigation links and menus. We enter the top tasks and the tree structure of the architecture, and assign the ‘correct’ answers for each task. Site visitors are invited to spend 5-10 minutes on an exercise to improve the site. We find it most effective to have each participant perform a random subset (5 to 10) of a larger set of tasks, which ensures complete coverage of all top tasks across participants.
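The task-sampling idea above can be sketched in a few lines (the pool size and per-participant subset size here are hypothetical choices, not a prescription): each participant gets a random subset of the full task pool, so coverage of all top tasks accumulates across the participant group.

```python
import random

# Hypothetical pool of 20 top tasks; each participant sees only a
# random subset, keeping sessions short while the group as a whole
# covers every task.
task_pool = [f"Task {i}" for i in range(1, 21)]

def assign_tasks(pool, per_participant=7, seed=None):
    """Return a random subset of tasks for one participant's session."""
    rng = random.Random(seed)  # seed only to make a session reproducible
    return rng.sample(pool, per_participant)

session = assign_tasks(task_pool, per_participant=7, seed=1)
print(session)
```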

Example of the TreeJack task process


Moderated sessions provide the 'Why'
In parallel with the collection of the tree data, you can hold moderated sessions – online of course, using remote screen sharing. In a recent study, we had the participants perform a set of TreeJack tasks using a "think aloud" protocol. That is, they talked about their choices as they made them. Then the participants performed a set of tasks (from the same pool of TreeJack tasks) but using wireframes that looked more like a home page, with links brought forward under headings. After each task, participants were asked to rate their confidence that the last link they clicked would ultimately take them to the task solution. Probing on these confidence questions provided insights into participants’ behaviours – why they chose one category and not another. These insights, together with the quantitative data, provided the necessary background to inform and recommend design changes.

Example of a Wireframe Mock-Up for Moderated Sessions


Results identify poorly-differentiated and misleading links
People move forward when links are clear and easy to differentiate from other links. Testing your current information architecture can identify poorly-differentiated categories at the lower levels. In a recent test, there were three categories under a particular heading, only one of which held the answer for the task. The data showed that all of the participants successfully reached the correct heading, but then spread their answers across the three categories; half of them selected a wrong category. The testing let us pinpoint where visitors' success rate was low, and where the web team could focus their efforts for high impact. The team is currently redesigning those three categories to clarify and differentiate them.
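A quick way to spot this pattern in your own results is to look at how final selections distribute across sibling categories (the category labels and counts below are hypothetical, not the study's actual data): a flat spread across siblings signals poorly-differentiated labels, while a sharp peak on the correct category signals a clear one.

```python
from collections import Counter

# Hypothetical final selections from a tree test: every participant
# reached the right heading, but their last click spread across its
# three child categories (only "Permits & Licences" was correct).
selections = [
    "Permits & Licences", "Applications", "Permits & Licences",
    "Forms", "Applications", "Permits & Licences",
    "Forms", "Applications",
]

counts = Counter(selections)
total = len(selections)
share = {cat: n / total for cat, n in counts.items()}
print(share)  # near-equal shares across siblings = a differentiation problem
```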

Example of Results for Poorly-Differentiated Categories



The test may also reveal that top-level headings aren’t clear to visitors. The data may show that many try a particular path, only to return and try another. The category labels themselves can be misleading – we’ve seen a top task in which over 80% of participants selected a completely different category than the ‘correct’ one designated by the design team. Insights from the moderated sessions helped the team understand the issues underlying the poor fit, and they will be moving the task content out of that category.

Improving top task success with information architecture testing
Moving people forward is about improving the success rate on top tasks. The methods we've described can be used during the design process to compare an existing architecture to a new design – not to show that one is better than the other, but to learn from both. We recommend incorporating the learning into a revised design, and then testing again to ensure the weaknesses are resolved. 

Finally, don’t feel constrained by the online tools. Many are flexible and can be adapted into the tool you need. In one case, we adapted an online survey tool to mimic a tree test that allowed multiple selections within the categories. We’ve also mocked up wireframes to run multi-click tests when the one-click options like Chalkmark or video tools like Usertesting.com didn't meet a particular research need. We would love to help you test your interaction design and information architecture. Contact us at 613-271-3001 or email us.




Free webinar: Telecom websites: Helping customers help themselves

Regular customers want to complete their tasks quickly when they visit the account or online support areas of Internet and Telecom provider sites. It is in the interest of companies and customers alike to provide simple, unambiguous paths to content that will answer queries or resolve problems. What pitfalls threaten your support site, and those of your peers? How can you avoid them and serve your valuable customers well?

The monthly bill may be the largest reason your regular customers visit your website. But is your website optimized for this increasingly important task? Or do you treat regular customers like you treat everybody else? Do you offer unique navigational menus to regular customers, or do you treat them like a stranger? Once customers sign in, do you remember what they tell you? Or do you show them web content and ads for the general public, and miss opportunities to generate interest in products and services that are important to them?

Hear Gerry McGovern, Mark Crowley, and Neo Insight's Scott Smith share recommendations on how to improve support for the top tasks of your regular customers. We will show examples from Internet and Telecom service providers and offer recommendations that apply to any organization with regular customers.

The webinar takes place Wednesday, March 9, 2011, 11:00 AM - 12:00 PM EST.

Hear Gerry McGovern's recent discussion with John Blackmore of IBM on how they tripled sales leads, now available in video, slide, and text formats.



Quote of the month

"Using more than one user research method during a project provides a more complete and reliable picture of the entire user experience."

Beyond the Usability Lab


If you have any comments on The Insighter, or ideas on usability topics you'd like to see covered, send us an email.


