
Step 2 - Designing your site

Usability testing on the cheap

If you have a reasonable—or even a modest—budget for developing your website, you should invest in professional usability testing, conducted by neutral, experienced usability experts in a lab-like environment.


But if your budget makes this unrealistic, you can conduct serviceable user tests on your own with a few hours and a few willing volunteers. Even this simple setup can reveal important insights into how users will see your site.


"Even a user test of one person reveals an awful lot," says University of Illinois professor Michael Twidale, who offered some of the advice that follows. And, as Steve Krug says in his excellent book Don't Make Me Think, "Testing one user is 100% better than testing none."

1. Set up your "lab"

The setting for your usability test can be as simple as sitting behind a co-worker at his desk and watching over his shoulder as he uses your new site. But to make your tests a bit more professional, it helps to model your setup on a real usability lab.


Most usability labs set up volunteers at a computer with a video camera pointed at them that records their facial expressions. Another camera records what they're doing on screen. This video is used in two ways: It's recorded to tape—for future use and study—and it's run through wires to an adjoining room, where observers can watch the test in real time, seeing the user's face on one TV screen and a live duplicate of the browser screen on another. (Some labs have the volunteer sit in front of a two-way mirror, through which they can be observed.)


Equipment needed

  • Computer with access to the Internet or just the site being tested

  • Video camera or camcorder

  • Adjoining room with video hookup (optional)

  • Television for simultaneous playback of user monitor (optional)

  • Second television for playback of user's face (optional)

2. Recruit volunteers

The type of volunteer you recruit will have an important impact on what you find. Professional usability labs usually draw from long lists of potential volunteers, solicited through phone calls or email by a professional screening service.


Recruiting volunteers

If you're building a large-scale usability effort, you'll probably want to hire a professional recruiting service. But you can recruit volunteers on a less expensive and less formal basis by looking to the following:

  • Customers. Existing customers are prime candidates for usability tests, because they're already familiar with—and interested in—your products. You can work from existing client lists or solicit volunteers through an ad on your site.

  • Co-workers. Co-workers can make fine informal interface testers, provided they don't work directly on the product being tested. Keep in mind, though, that internal testers know more about the product than real users. Also, their responses may be politically motivated.

  • Friends and family. There's nothing wrong with recruiting friends and family members to test new sites for you. Just know that they may not be entirely honest. Their comments may be more diplomatic than normal users. (Or less diplomatic, depending on your friends.)


Screening volunteers

Your volunteers should match the profile of your target user as closely as possible:

  • Familiarity with content. If your site is targeted to users with a specific interest, need, or area of expertise, your volunteers should share it.

  • Familiarity with technology. Your volunteers should match your audience in terms of their comfort level and familiarity with internet technologies.

  • Familiarity with test computer. Volunteers should be comfortable using the test computer and browser.
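These three screening criteria amount to a checklist you run against each candidate. A minimal sketch in Python (the profile fields and example values here are illustrative assumptions, not from any standard screener):

```python
# Target-user profile: content interest, comfort with internet
# technologies, and the browser the test machine will run.
# (All values are hypothetical examples.)
TARGET_PROFILE = {
    "interest": "bird-watching",      # familiarity with content
    "web_comfort": "intermediate",    # familiarity with technology
    "browsers_used": {"Firefox"},     # familiarity with test computer
}

def matches_profile(candidate, target=TARGET_PROFILE):
    """Return True if a candidate volunteer matches the target user
    on all three screening criteria."""
    return (
        candidate.get("interest") == target["interest"]
        and candidate.get("web_comfort") == target["web_comfort"]
        and bool(target["browsers_used"] & set(candidate.get("browsers_used", [])))
    )

# A candidate who shares the interest, skill level, and browser passes:
matches_profile({
    "interest": "bird-watching",
    "web_comfort": "intermediate",
    "browsers_used": ["Firefox", "Safari"],
})
```

Even if you never automate this, writing the criteria down this concretely forces you to decide what "matching your audience" actually means before you start recruiting.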

3. Give them a task

Some user tests are open-ended, allowing volunteers to explore the interface and make of it what they will. This is fine for some sites and some purposes, but it's generally not the best technique.


Rather than have volunteers drift aimlessly about your site, you should give them one or more tasks to accomplish. This engages them more actively and also mimics real-world use more closely (because most users have a goal in mind when they arrive at a site).


Volunteers inevitably feel a little nervous in user tests. When they're given a task to accomplish, they feel like they're being tested—as if they were back in school, being graded for their performance. Now, a nervous tester doesn't deliver realistic or helpful results, so it's very important to assure them that they're testing the system, not vice versa.


In fact, you might want to say just that. Mike Kuniavsky, author of Observing the User Experience and founder of Wired's usability lab, would always begin tests by reassuring volunteers: "Remember, you're testing the interface. The interface is not testing you. You can't do anything 'wrong.'"

Lay out a clear task


Give the volunteer a plausible scenario about what she's trying to accomplish on the site—whether it's finding the location of a store, transferring money in a bank account, or finding a photo of the scarlet tanager. It's important that the volunteer understand what she's trying to accomplish, even if it's something she may not do in real life.


You must resist the temptation to show the volunteer what to do. This is difficult, of course, when you're testing a site you built yourself. (You'll want to explain it to them, or defend it, or show off its best features.) But you have to let the user struggle on her own, if you're going to get a clear picture of how people use your site outside of a usability lab.

4. Observe

As your volunteer makes his way through the task you've assigned, the most important thing to do is keep quiet and watch. Some of the things to watch for include the following:

  • User path. Mentally follow the user's path through the site, and notice where he deviates from the expected or recommended route. Pay attention to how he handles a "dead-end" situation: Can he find his way back and correct the mistake?

  • Hesitation. Notice where the volunteer hesitates or falters, perhaps hovering the cursor over several links. This points to an ambiguous choice in the interface. Even if this user chooses correctly, another may get it wrong.

  • Searching and scrolling. Pay attention when the volunteer seems to be searching for a link but not finding it.

  • Emotional reactions. Notice when the volunteer registers surprise ("Whoa!") or frustration ("Arg!").


Ask questions. As you become more experienced with testing, you can ask questions to clarify the volunteer's thoughts. But take care with your phrasing.

  • Ask open-ended questions. Don't ask, "Does this look like a link to the help section?" Instead ask, "What would you expect to find behind that link?"

  • Don't ask, "Why did you do that?" People associate that with the classroom, says Prof. Michael Twidale. They may become defensive, or may invent an answer. "Instead of asking, 'Why did you do that?'—implying that it was the wrong thing to do—I might say, 'Why do you think the computer led you to do that?'" he said.

  • Just say "Hmm." When you want to understand what's causing frustration or confusion, try prompting the volunteer by saying "Hmm." "It's very strange," Twidale says. "People will often elaborate when you say, 'Hmm.' If you say, 'Can you tell me what you're thinking?' that sounds scary. But if they just said something like, 'Whoa,' and you say, 'Oooh,' they'll start elaborating on what's going on."
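Observations like these are easier to compare across volunteers if you tag and timestamp them as you watch. Here's a bare-bones note-taking sketch; the category names mirror the list above, while the helper itself is just an illustration:

```python
import time

# Categories from the list above: user path, hesitation,
# searching/scrolling, and emotional reactions.
CATEGORIES = {"path", "hesitation", "searching", "emotion"}

observations = []

def note(category, text, log=observations):
    """Record one timestamped, categorized observation."""
    if category not in CATEGORIES:
        raise ValueError(f"unknown category: {category!r}")
    log.append({"t": time.time(), "category": category, "text": text})

note("hesitation", "hovered between 'Help' and 'Support' for ~10 seconds")
note("emotion", "said 'Whoa!' when the form cleared itself")
```

After the session, filtering the log by category makes it easy to see, say, every hesitation point in one list—and the timestamps let you jump straight to the right spot on the tape.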

5. Pinpoint problems

By watching volunteers struggle with your interface, you can identify the major usability issues pretty quickly. Usually, a small test sample of three to five volunteers will uncover the major problems in a site's interface. By watching volunteers closely, you should be able to tease out not only the problem areas, but the underlying causes. Here are some of the things to watch for:

  • Problems with placement

  • Problems with labeling

  • Problems with grouping

  • Problems with pacing or the order of events

  • Problems with the mental model

If you catch a clear, simple problem early on in the testing process (on the first volunteer, perhaps), you may want to fix it before the next volunteer begins. By removing an initial obstacle, you can focus on the other, less-obvious issues.
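The rule of thumb that three to five volunteers uncover most problems is usually traced to Nielsen and Landauer's model, in which the share of problems found grows as 1 - (1 - L)^n, where n is the number of testers and L is the chance that one tester hits a given problem. A quick sketch of the curve (the commonly quoted L ≈ 0.31 comes from that usability literature, not from this article):

```python
def share_of_problems_found(n_testers, l=0.31):
    """Expected fraction of usability problems uncovered by n testers,
    assuming each tester independently finds a given problem with
    probability l (Nielsen & Landauer's rough estimate: ~0.31)."""
    return 1 - (1 - l) ** n_testers

# With l = 0.31, five testers already uncover roughly 84% of the
# problems, which is why small test groups go such a long way.
for n in (1, 3, 5, 10):
    print(n, round(share_of_problems_found(n), 2))
```

The curve flattens quickly, which supports the approach above: run a handful of testers, fix what they find, then test again rather than recruiting a huge batch up front.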

6. Reward volunteers

If you're going to take up someone's time, you'll need to give them something back. In standard user tests, volunteers are paid in cash. But people are motivated by other things as well. Merchandise and swag sometimes work (your company's products, or T-shirts, bags, mugs, etc., with your product logo). Gift certificates—especially for a free movie or meal—also go over well. If you're recruiting within your company, employees may respond to an offer of an extra day or afternoon off.


Buy this book!

Observing the User Experience: A Practitioner's Guide to User Research

by Mike Kuniavsky (Morgan Kaufmann, $44.95)

Lesson from the trenches: how usability testing can go wrong

"It's really hard to trust the data, because you can take what you want from it."

—Sheryl Cababa

"Usability is like spell-checking. Spell-checking doesn't make your essay better. It just makes it correct."

—Jeffrey Veen

Usability testing may well be the best thing that happened to the web since—well, it may be the best thing that happened to the web, period. But it's not an undisputed force for good in the web universe. As designer Jeffrey Zeldman wrote in his book, Taking Your Talent to the Web, "There is good [usability] testing and there is worthless pseudo-science that promotes banality. Unfortunately... it's hard to tell until you're working at a web agency whether its testing practices are informative or a shortcut to hell."


If you want to stay out of hell, it helps to recognize the signposts. Here, then, are a few ways usability testing goes astray:

  1. You don't know what you're testing for. It's important to decide up-front, before you begin a test, what aspects of the site you're testing. This helps focus the test, and also helps you to ignore unhelpful, extraneous input from the volunteer.

    "One of the keys to testing is figuring out what information you want to know, because you do not get to control what the person comments on," says Lance McDaniel, VP of Creative at SBI and Company. Volunteers love, for instance, to talk about color. "The client says, 'Oh, they didn't like the color!' And you have to say, 'But we're not testing the color, we're testing the check-out process. And luckily for us, they actually liked the check-out process.'"

  2. You test something that can't be changed. Many usability tests have been wasted because the volunteer or the tester focused on elements of the site or interface that couldn't be changed.

    "The first thing you need to know, going in, is what can you change and what can't you change," says designer Jim Frew. "Like if your logo is green, and it has always been green, and it always will be green, don't ask if they think the logo should be red. It's never going to be red."

  3. You test something that doesn't matter. Too often, usability tests go astray by focusing on the wrong things—aspects of the site's features or interface that don't really affect usability. Again, color is the major offender. People love to comment on color. And color does impact the user experience, but it rarely affects usability unless it renders the site illegible.

    "People don't leave a website because they don't like purple," says Lance McDaniel. "They leave a site because it doesn't load, or because they can't find what they need, or because there isn't anything on the site that interests them."

  4. The tester hasn't prepared a good script. When you're running your own usability tests, you have to make sure your tester can clearly, succinctly, and non-judgmentally explain to the volunteer what's expected.

    "I've done user testing at the really bare-bones level—the pizza and beer level—and the in-house people doing the testing aren't always familiar with how to encourage people, how to direct the testing process," says designer Jim Frew. "So whole tests would go out the window because the person doing the test couldn't get the person he was testing to the right points."

  5. The volunteer isn't a typical user. The whole point of usability testing is to get inside the users' heads and follow them through a typical experience on your site. But this goes wrong if your volunteer isn't representative of your user group. If your volunteer has fundamentally different attitudes, skills, or computer experience than your typical user—if she knows too much or too little about the subject matter, too much or too little about the web—you won't get valid results.

    "Your research will be exponentially better as the users you talk to are closer to your audience," says Jeffrey Veen. "The more effort you spend on accurate recruiting, the better the testing will be. Always."

  6. The results are misinterpreted. Like most things in life, usability tests are open to interpretation. And conclusions drawn from the same test can vary wildly, depending on the profession and personality of the person watching. A designer, an engineer, and a salesperson may have vastly different takeaways. And a good test poorly interpreted is worse than a bad test.

    "When my product is up for usability testing, I try to go to every single test," said Sheryl Cababa, a product designer for Microsoft. "And then I see how the usability engineers sum up that data and what they glean from it. And it could be totally different from what I glean from it. It's really hard to trust the data, because you can take what you want from it."

  7. You try to test whether a feature will be used. This is the classic problem with usability testing: People will interpret positive test results as confirmation that they're doing the right thing. But usability testing can't tell you this. All usability testing tests is whether you've designed a feature that people can understand and use, when prompted. It doesn't tell you whether they'd voluntarily decide to use it.

    As Martha Brockenbrough, former managing editor of MSN, put it: "Usability testing can't test whether a feature will actually be used."

  8. The tester—or the volunteer—has an axe to grind. When they fall into the wrong hands, usability tests—or their results—become a weapon in the petty political battles so common in the workplace. The tester may lead the volunteer toward a particular conclusion, and the person interpreting test results may interpret them to suit his department or his own opinion. And the volunteers, too, can throw things off, if they have a particular beef with the company, the site, or the world.

  9. It's expected to solve all your problems. "The biggest misconception my clients have is that usability is a solution," says Jeffrey Veen. "Usability is really like spell-checking. Spell-checking doesn't make your essay better, it just makes it correct. And usability is the same way. It's the final, last little step in a whole series of things that you should do to understand who your users are. And all it does is check that your assumptions are right."

