The Internet Encyclopedia (Volume 3)


Usability Testing: An Evaluation Process for Internet Communications

human-computer interaction. Baddeley (1999) provides
a succinct overview of memory, and Barsalou (1992)
provides an overview of cognitive psychology. Cogni-
tive science provides a strong theoretical foundation for
building deeper understandings of how users process all
forms of information. Osherson (1995) edited a multi-
volume compendium with individual chapters by leading
researchers in cognitive science.
Carefully developed research and evaluations can provide heuristics or guidelines for developing communica-
tions and products. For example, the National Cancer In-
stitute’s (NCI) Web site (n.d.) provides an orientation to
usability testing and then provides specific research-based
guidelines. The NCI organized the guidelines around cat-
egories: design process, design considerations, content/
content organizations, titles/headings, page length, page
layout, font/text size, reading and scanning/links, graph-
ics, search, navigation, software/hardware, and accessibil-
ity. For each guideline, the NCI had a group of interface
and Web design researchers develop a rating scale and
rate the support of the guideline based on research and
evaluation. The site provides an explanation of the rating
process and criteria used.

USABILITY METHODOLOGIES
Usability testing methodologies emerged in the early
1980s as companies increasingly recognized customers
were having trouble with computers. The methodologies
enable usability practitioners to identify problems with
hardware, software, and computer content as users inter-
act with the technology.
By early 2002, usability testing of e-commerce Web sites had become critical. Unless e-commerce sites
are easy to use, consumers will not stay long enough to
purchase items (Nielsen & Norman, 2000b).
Usability protocols include Web site evaluations by
software engineers and site developers, usability and hu-
man factors experts, and intended users or consumers.
They take many forms. For example, software engineers
may conduct cognitive walkthroughs of a conceptual Web
site to visualize possible discrepancies between online

tasks and users’ expectations of those tasks. Usability ex-
perts may engage in a structured examination of a Web
site prototype, or a heuristic evaluation, to assess its us-
ability against generally accepted standards. Software en-
gineers, usability professionals, and users may assess a
Web site prototype collaboratively in pluralistic usability
walkthroughs. In addition, variations on these protocols
range from simple usability inspections and site design re-
views to complex distance testing with sophisticated soft-
ware programs over the Internet.
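In a heuristic evaluation, each expert's findings are typically rated for severity and then aggregated; Nielsen's widely used scale runs from 0 (not a problem) to 4 (usability catastrophe). The sketch below, with hypothetical findings and ratings invented for illustration, shows one simple way to combine independent ratings and rank problems for repair.

```python
from statistics import mean

# Hypothetical heuristic-evaluation findings: each evaluator rates
# each problem on Nielsen's 0 (not a problem) to 4 (catastrophe)
# severity scale. Problem descriptions are invented examples.
findings = {
    "Checkout button label is ambiguous": [3, 4, 3],
    "Search box hard to find on subpages": [2, 2, 3],
    "Footer links use inconsistent wording": [1, 0, 1],
}

# Average the independent ratings, then rank problems so the most
# severe are addressed first.
ranked = sorted(findings, key=lambda p: mean(findings[p]), reverse=True)

for problem in ranked:
    print(f"{mean(findings[problem]):.2f}  {problem}")
```

Averaging only after evaluators rate independently is the usual practice, since discussion before rating tends to bias individual judgments.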
In recent years, a variety of automated ways of as-
sessing Web site design have been proposed. Among the
more commonly used are click-stream analysis (Burton &
Walther, 2001; Lee & Podlaseck, 2000; Murphy, Hofacker,
& Bennett, 2001) and server logging programs. Usability
practitioners install a software program on the server or
individual computer that captures keystrokes and page
visits as a user navigates the Web site. The programs provide various data, such as the most frequently used pages,
the time spent on individual pages, and the least visited
pages. Usability practitioners vary in their use of such programs. That said, such programs hold promise as an unobtrusive, alternative measure of users' interactions with Web sites.
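The kind of report such logging programs produce can be sketched in a few lines. The following is a minimal illustration, assuming hypothetical click-stream records of (user, timestamp, page); it counts page visits and estimates dwell time as the gap until the same user's next click.

```python
from collections import defaultdict

# Hypothetical click-stream records captured by a logging program:
# (user_id, timestamp_in_seconds, page_visited).
clicks = [
    ("u1", 0, "/home"),
    ("u1", 12, "/products"),
    ("u1", 95, "/checkout"),
    ("u2", 0, "/home"),
    ("u2", 30, "/search"),
    ("u2", 41, "/home"),
]

visits = defaultdict(int)    # how often each page is visited
dwell = defaultdict(float)   # total seconds spent on each page

# Group each user's clicks, sorted by time.
by_user = defaultdict(list)
for user, ts, page in clicks:
    by_user[user].append((ts, page))

for events in by_user.values():
    events.sort()
    # Dwell time on a page is the gap until the next click;
    # the final page's dwell time is unknown, so only its visit counts.
    for (ts, page), (next_ts, _) in zip(events, events[1:]):
        visits[page] += 1
        dwell[page] += next_ts - ts
    visits[events[-1][1]] += 1

most_visited = max(visits, key=visits.get)
least_visited = min(visits, key=visits.get)
```

Real server logs add complications this sketch ignores, such as session timeouts, cached pages that never reach the server, and visits that end without a further click.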
Although usability practitioners can use many meth-
ods, the following discussion elaborates on four examples
of usability techniques. They are representative protocols spanning the range of methodologies: card sorting, contextual inquiry, heuristic evaluation (expert review), and verbal protocol analysis (see Table 2 for a list of references related to each protocol).

Card Sorting
Web sites often fail to transfer information to consumers
because developers and users have different frames of
reference—they use terms differently, may not agree on
the meaning of page labels, or they may organize informa-
tion differently. Card sorting provides insights into how
users classify, label, and organize information. Web site
developers can use these insights to design, label, and
structure a Web site to match users’ frames of reference.
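Analyzing card-sort data usually means measuring how often participants place the same cards together. A minimal sketch, with hypothetical card labels and sorts invented for illustration, builds that pairwise co-occurrence measure:

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical card sorts: each participant groups page-label cards
# into whatever piles make sense to them. Labels are invented examples.
sorts = [
    [{"prices", "shipping"}, {"contact", "about us"}],  # participant 1
    [{"prices", "shipping", "about us"}, {"contact"}],  # participant 2
    [{"prices", "shipping"}, {"contact", "about us"}],  # participant 3
]

# Count how often each pair of cards lands in the same pile.
together = defaultdict(int)
for piles in sorts:
    for pile in piles:
        for a, b in combinations(sorted(pile), 2):
            together[(a, b)] += 1

# Normalize by the number of participants: pairs near 1.0 are cards
# users strongly agree belong under the same navigation heading.
similarity = {pair: n / len(sorts) for pair, n in together.items()}
```

In practice, such a co-occurrence matrix is often fed into cluster analysis to suggest a candidate site structure, which designers then refine.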

Table 2 Usability Testing Methods and Related References

METHOD RESOURCES
Card sorting: Fuccella (1997); Fuccella & Pizzolato (1998); Koubek & Mountjoy (1991); Martin (1999a, 1999b, 1999c); McDonald, Dearholt, Papp, & Schvaneveldt (1986); Nielsen (1993, 2000)
Contextual inquiry: Beabes & Flanders (n.d.); Beyer & Holtzblatt (1995); Holtzblatt & Jones (1993)
Heuristic evaluation: IBM (2001); Nielsen (1994); Van der Geest & Spyridakis (2000)
Verbal protocol analysis: Bailey (1996); Barnum (2002); Ericsson & Simon (1984, 1993); National Cancer Institute (2001); Preece (1993); Nielsen (1993, 2000a, 2000b); Redish & Dumas (1994); Rosson & Carroll (2002); Rubin (1994); van Someren, Barnard, & Sandberg (1994)
Usability testing: Bailey (1996); National Cancer Institute (2001); Preece (1993); Nielsen (1993); Dumas & Redish (1994); Rosson & Carroll (2002); Rubin (1994); Society for Technical Communication (2002); Usability Professionals Association (2002)