
subdirectories. Employees can set up directories on their
personal computers to back up their e-mails. What if an
employee drags an e-mail from her inbox on the server
to a directory on her personal computer, but the message
stays in her inbox? She tries it again and yet again, but it
stays in the inbox.
Clearly, the problem represents a user behavior–system design clash. The user's assumption about how the program
works conflicts with how the programmers designed it to
work. Without prior experience with the particular program,
users fall back on their experience with other programs:
they repeatedly try to use the program as they have used
other programs, applying mental models of how those
programs work. The result is a growing loss of time and,
for users, growing frustration because, from their perspective,
the program “won't work.”
If each of the company's 1,000 employees spends just 5 minutes
trying to solve this problem, with salary, benefits, and
overhead averaging $50 per hour per employee, the cost can
exceed $4,000. Here are the time and dollars lost:

5 minutes × 1,000 employees = 5,000 minutes
5,000 minutes / 60 minutes per hour = 83.3 hours
83.3 hours × $50 an hour = $4,165
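The same estimate can be expressed as a minimal sketch in Python; the helper name lost_time_cost and the code structure are illustrative assumptions, not part of the source, and the figures (5 minutes, 1,000 employees, $50 per hour) are taken from the example above.

```python
# Illustrative sketch: cost of one usability problem across a workforce.
# The helper name and structure are assumptions made for illustration only.

def lost_time_cost(minutes_each: float, employees: int, hourly_rate: float) -> float:
    """Dollar cost of the time lost when every employee hits the same problem."""
    total_minutes = minutes_each * employees   # 5 * 1,000 = 5,000 minutes
    total_hours = total_minutes / 60           # about 83.3 hours
    return total_hours * hourly_rate           # about $4,165 at $50/hour


print(f"${lost_time_cost(5, 1_000, 50):,.0f}")  # prints $4,167; the text's $4,165 rounds the hours to 83.3 first
```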

Although the cost of one problem may not seem excessive, consider
the lost time and dollars if the e-mail system has
10 such problems and 500 employees spend 10 minutes a
day trying to solve them:

500 employees × 10 minutes each day = 5,000 minutes
5,000 minutes / 60 minutes per hour ≈ 83 hours/day
83 hours/day × 5 days a week = 415 hours/week
415 hours/week × 52 weeks/year = 21,580 hours/year
21,580 hours × $50/hour = $1,079,000 per year
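Carrying the same arithmetic through a working year can be sketched the same way; the snippet below mirrors the rounding used above (hours per day rounded down to 83 before scaling to a week and a year), and its variable names are illustrative rather than from the source.

```python
# Illustrative sketch of the annualized estimate above, mirroring the rounding
# in the text: hours per day are rounded down to 83 before scaling up.

employees = 500       # employees affected (from the example above)
minutes_each = 10     # minutes each employee loses per day (from the example above)
hourly_rate = 50      # salary, benefits, and overhead per hour

minutes_per_day = employees * minutes_each   # 5,000 minutes
hours_per_day = minutes_per_day // 60        # 83 hours (83.3 rounded down)
hours_per_week = hours_per_day * 5           # 415 hours
hours_per_year = hours_per_week * 52         # 21,580 hours
annual_cost = hours_per_year * hourly_rate   # $1,079,000

print(f"{hours_per_year:,} hours/year -> ${annual_cost:,} per year")
```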

The time lost trying to solve problems quickly mounts
and becomes extraordinary when costs are calculated for
all employees. Moreover, these figures do not include the
impact of lost productivity.
Usability testing can cost from a few hundred to several
thousand dollars, depending on the methods, audiences,
and products. Bias and Mayhew (1994) provided a
framework for justifying the costs of usability testing in
more than a dozen chapters outlining issues and case
studies.

FOUNDATIONS OF USABILITY TESTING
The social sciences provide a range of methodological and
theoretical foundations for usability testing. If followed,
they can guide usability testing, minimize pitfalls, and en-
hance the evaluation of Web sites.

Methodological Foundations
A wide range of social science research methods have been
adapted for usability testing. One of the major usabil-
ity testing methodologies, verbal protocol analyses has
its early foundations in cognitive psychology and prob-
lem solving research. Psychologists Ericsson and Simon
(1983, 1993) developed verbal protocol analysis to inves-
tigate how people solve problems. Their works provide a
detailed, critical review of the strengths and weaknesses
of verbal protocol analysis. Other methodologies include
questionnaires and surveys, in-depth interviewing, parti-
cipant observation, and focus groups (see Table 1).

Theoretical Foundations
Specific areas of psychology provide insights into how
people interact with Web sites, process information, and
think about and react to the content provided through
Internet products. Bailey (1996) explored human engi-
neering acceptable performance, human characteristics,
limits and differences in sensing and responding, cogni-
tive processing performance, and memory and motivation
and then provides detailed chapters on diverse aspects of

Table 1 Selected Social Science Methods and Resources for Enhancing Usability Testing and Evaluations

METHOD                                        RESOURCES
Overviews of social science methods           Babbie (1998)
Case studies                                  Stake (1995); Yin (1994)
Content analysis                              Weber (1985)
Ethnography, participant observation          Bryman (2001); Fetterman (1997)
Evaluation methodologies                      Rossi, Freeman, & Wirth (1979)
Experimental design                           Campbell & Stanley (1963)
Focus groups                                  Kruger (1994); Morgan & Kruger (1998)
In-depth interviewing                         Rubin & Rubin (1995); Zimmerman & Muraski (1995)
Nominal Group Technique, Delphi Technique     Delbecq, Van de Ven, & Gustafson (1975); Moore (1987); Zimmerman & Muraski (1995)
Protocol analysis                             Ericsson & Simon (1984, 1993); van Someren, Barnard, & Sandberg (1994)
Surveys                                       Babbie (1992, 1998); Dillman (1978, 2000); Fink (1995)
Qualitative research                          Crabtree & Miller (1992); Erlandson, Harris, Skipper, & Allen (1993)
Questionnaires and surveys                    Babbie (2001); Dillman (1978, 2000); Fink (1995); Salant & Dillman (1994)
Validity & reliability                        Campbell & Stanley (1963)
Unobtrusive measures                          Webb, Campbell, Schwartz, & Sechrest (1966, 1999)