Testing and Usability

Now that the Web portfolio site is up, it is time to ensure usability by performing testing. Usability testing is a very hot topic in human-computer interaction and e-commerce. The user’s ability to get information easily and quickly is a cornerstone of the scientific principles and theories surrounding the area. One pioneer in the area of usability is Jakob Nielsen.

 

Nielsen’s work deals with research and testing on usability and interface design, particularly on the World Wide Web. Nielsen reports on his usability Web site, useit.com, that the study of heuristics is on the rise; he cites over 14,000 hits on Google pointing to heuristic evaluation. If you are wondering, heuristic evaluation is defined by Jakob Nielsen on his Web site, http://www.useit.com, as follows: “Heuristic evaluation is the most popular of the usability inspection methods.

 

Heuristic evaluation is done as a systematic inspection of a user interface design for usability. The goal of heuristic evaluation is to find the usability problems in the design so that they can be attended to as part of an iterative design process. Heuristic evaluation involves having a small set of evaluators examine the interface and judge its compliance with recognized usability principles (the ‘heuristics’).” The work by Nielsen and Molich presented in the following has value on the system level as well as on the designer and developer levels. The designer lacks usability foresight in many cases and needs to go back to grassroots usability design on paper. Creating flowcharts and storyboards, the designer fights their way back to usability standards and to the brink of digital design insanity.

 

If the designer had only followed these heuristics from the start, the workload would have been shortened immensely. The heuristics that Nielsen refers to on his useit.com site are explained in this list of ten heuristics for heuristic evaluation developed by Nielsen and Molich:

• Visibility of system status: The system should always keep users informed about what is going on, through appropriate feedback within reasonable time.

• Match between system and the real world: The system should speak the users’ language, with words, phrases, and concepts familiar to the user, rather than system-oriented terms. Follow real-world conventions, making information appear in a natural and logical order.

• User control and freedom: Users often choose system functions by mistake and will need a clearly marked “emergency exit” to leave the unwanted state without having to go through an extended dialogue. Support undo and redo.

• Consistency and standards: Users should not have to wonder whether different words, situations, or actions mean the same thing. Follow platform conventions.

• Error prevention: Even better than good error messages is a careful design which prevents a problem from occurring in the first place.

• Recognition rather than recall: Make objects, actions, and options visible. The user should not have to remember information from one part of the dialogue to another. Instructions for use of the system should be visible or easily retrievable whenever appropriate.

• Flexibility and efficiency of use: Accelerators, unseen by the novice user, may often speed up the interaction for the expert user such that the system can cater to both inexperienced and experienced users. Allow users to tailor frequent actions.

• Aesthetic and minimalist design: Dialogues should not contain information which is irrelevant or rarely needed. Every extra unit of information in a dialogue competes with the relevant units of information and diminishes their relative visibility.

• Help users recognize, diagnose, and recover from errors: Error messages should be expressed in plain language (no codes), precisely indicate the problem, and constructively suggest a solution.

• Help and documentation: Even though it is better if the system can be used without documentation, it may be necessary to provide help and documentation. Any such information should be easy to search, focused on the user’s task, list concrete steps to be carried out, and not be too large (Molich & Nielsen, 1990, p. 1).

By applying Nielsen and Molich’s usability heuristics in a customized fashion to test Web portfolio usability, we enable a path to further research that can focus on the specific information product structure we see in the Web portfolio.

 

Each of the heuristics listed earlier has a direct influence on the user’s experience. In Web portfolio design, it is important to employ a heuristic evaluation as well. A good use of the rules devised by Nielsen and Molich would be to adapt them to evaluate the usability of the Web portfolio; a simple evaluation form for doing so is sketched below. Many of the rules were addressed in the initial design of our Web portfolio.
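As an illustration only, here is a minimal sketch of what an adapted heuristic-evaluation record could look like as a plain HTML page handed to each evaluator. Only the heuristic names come from Nielsen and Molich’s list; the fieldset layout, the 0–4 severity scale, and the field names are assumptions added for this sketch.

```html
<!-- Hypothetical heuristic-evaluation record for one evaluator reviewing the
     Web portfolio. Only the heuristic names come from Nielsen and Molich;
     the 0-4 severity scale and the field names are illustrative assumptions. -->
<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8">
  <title>Web Portfolio Heuristic Evaluation</title>
</head>
<body>
  <form>
    <fieldset>
      <legend>Visibility of system status</legend>
      <label>Severity (0 = no problem, 4 = catastrophe):
        <input type="number" name="status-severity" min="0" max="4"></label>
      <label>Notes: <input type="text" name="status-notes"></label>
    </fieldset>
    <fieldset>
      <legend>Match between system and the real world</legend>
      <label>Severity (0 = no problem, 4 = catastrophe):
        <input type="number" name="match-severity" min="0" max="4"></label>
      <label>Notes: <input type="text" name="match-notes"></label>
    </fieldset>
    <!-- Repeat one fieldset for each remaining heuristic in the list above. -->
    <p><button type="submit">Save evaluation</button></p>
  </form>
</body>
</html>
```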

 

However, we can learn by applying this usability theory to our own human-computer interaction vehicle, the Web portfolio. Most people, after posting the Web portfolio, ask their friends and colleagues to “check out the Web site”. That approach is fine when you know that the Web portfolio site is going to be successful, but we can’t be assured of that unless we ask some critical evaluation questions before releasing the site to the general public.

 

We can use a sample set of users, maybe friends, maybe colleagues, maybe strangers (the most honest and valuable subjects), to explore the site and test its usability from the real-world view of the unassuming user. With this in mind, I propose an adaptation of the Nielsen and Molich work for the purpose of establishing evaluation questions for the usability of the Web portfolio. These questions can be administered to a group of subjects in order to test usability. This process can be low tech, using what Nielsen describes as a low-fidelity paper prototype, a fancy term for a sketched paper prototype (your completed storyboards qualify).

 

Another usability testing medium that can be used is a high-fidelity paper prototype. These are screens that are printed out and administered to users for evaluation of usability. You can print and use screens designed in Fireworks or Photoshop for this type of usability test. If you have extensive text, you can print out the actual Web portfolio pages from a Web browser, or simply pencil in text on pages that have only small amounts of text. You can also place the text in the page in the screen design in an image editing application (Fireworks or Photoshop). Make the text portions separate slices so that you can delete them and replace the holes with editable HTML text; a minimal sketch of this slice-replacement approach follows. By using text in the image editing application, you are using the text for position only. You should, however, use the HTML font that you plan to use when and if you replace the text in Macromedia Dreamweaver later on. A quick review on Web text fonts: as mentioned before, we should use only Times or Times New Roman, or Helvetica or Arial, for all HTML, Web-based text.
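To make the slice-replacement step concrete, here is a minimal sketch, not taken from an actual export, of a table layout in which one former text slice has been removed and its cell now holds editable HTML text. The file names, dimensions, and copy are placeholders; the single style rule shows one way to keep the planned Arial font consistent across pages.

```html
<!-- Sketch of a Fireworks/Photoshop-style export in which one former text
     slice has been deleted and its table cell now holds editable HTML text.
     File names, sizes, and copy are placeholders, not real assets. -->
<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8">
  <title>Portfolio Page Prototype</title>
  <style>
    /* One rule keeps all HTML text in the planned font, consistently. */
    body, td { font-family: Arial, Helvetica, sans-serif; font-size: 12px; }
  </style>
</head>
<body>
  <table>
    <tr>
      <!-- Image slices exported from the image editor stay as images. -->
      <td colspan="2"><img src="header.gif" alt="Portfolio header" width="600" height="90"></td>
    </tr>
    <tr>
      <td><img src="sidebar.gif" alt="Navigation" width="150" height="300"></td>
      <!-- This cell was a text slice; it is now editable HTML text. -->
      <td valign="top">
        Welcome to my Web portfolio. The body of work, resume, and contact
        information are reachable from the navigation at left.
      </td>
    </tr>
  </table>
</body>
</html>
```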

 

I am partial to the clean look of Arial. Use whichever font appeals to you, but keep it consistent throughout (hint: use cascading style sheets). Enough review; let’s discuss Web portfolio usability standards some more. Ask the user the following questions about the Web portfolio’s design and usability:

 

• In this Web portfolio, is there a visible, clear navigation path to the body of work?

• Was the navigation presented in easy, understandable terms?

• Did you feel you had control of the interface and portfolio content?

• Would you call the Web portfolio site consistent in its visual appearance?

• Did you encounter errors when using the Web portfolio site?

• Was it easy to quit out of the Web portfolio pop-up windows?

• Was the design cluttered or confusing?

• Did you require help at any time while using the Web portfolio site?

• Did the music in parts of the portfolio make the experience better or worse?

• Were you able to control the multimedia (sound and animation) to your liking?

• How would you rate the Web portfolio experience you just had?

With the user giving honest answers, a scale of values can be developed to determine levels of user satisfaction, access attitudes, and usability ratings; a simple questionnaire sketch follows. A pilot study of usability in Web portfolios may be an interesting research topic not only from an academic standpoint but also from a communication perspective. The variable of persuasion can then begin to be measured to determine the effectiveness of the portfolio and its work on user attitudes about the candidate or company.
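As one possible, illustrative format, the questions above could be administered on a simple Web page. The sketch below shows only two of them; the remaining questions would follow the same pattern, and the 1-to-5 satisfaction scale is an assumed convention standing in for the scale of values described above.

```html
<!-- Hypothetical questionnaire page for usability test subjects. Only two of
     the questions above are shown; the rest follow the same pattern. The
     1-to-5 satisfaction scale is an assumed convention, not from the text. -->
<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8">
  <title>Web Portfolio Usability Questionnaire</title>
</head>
<body>
  <form>
    <p>In this Web portfolio, is there a visible, clear navigation path to the body of work?</p>
    <label><input type="radio" name="q1" value="yes"> Yes</label>
    <label><input type="radio" name="q1" value="no"> No</label>

    <p>How would you rate the Web portfolio experience (1 = poor, 5 = excellent)?</p>
    <select name="rating">
      <option>1</option><option>2</option><option>3</option><option>4</option><option>5</option>
    </select>

    <p><button type="submit">Submit answers</button></p>
  </form>
</body>
</html>
```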

 

This is a broad topic that deserves more research, which will most likely occur as Web portfolio popularity and governance grow. On a simple level, testing the Web portfolio means going through it page by page, asset by asset, and link by link to determine what is not working and what does not look good.

 
