Managing Information Technology

194 Part II • Applying Information Technology


through four servers that process the customer inquiries
(tier 2) by accessing data from the company mainframe
(tier 3). A Canadian supplemental health insurer began its
migration to client/server technology by concentrating on
its most mission-critical system—processing claims for
prescription drugs sold at more than 3,500 pharmacies
across Canada—into a three-tier environment. The
clients were PCs, running Windows, located in the phar-
macies (tier 1); the application servers were Sun worksta-
tions and Hewlett-Packard midrange systems (tier 2); and
the database server was a Unisys mainframe computer
(tier 3). Programmers used the C and C++ programming
languages to develop the tier 1 and tier 3 components of
the system. They used a specialized development tool,
BEA (now Oracle) Tuxedo, to develop the transaction
processing component (tier 2) (Ruber, 1997).
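The three-tier split described above can be sketched in miniature. The following Python sketch is purely illustrative (the insurer's real system used C/C++ clients and Tuxedo transaction logic on midrange servers); here tier 3 is modeled with an in-memory SQLite database, tier 2 with a business-logic function, and tier 1 with a client that only presents results. All table names, claim data, and the approval rule are invented for the example.

```python
import sqlite3

# Tier 3: the database server, modeled with an in-memory SQLite database.
# (The insurer's actual tier 3 was a Unisys mainframe; this is only a sketch.)
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE claims (claim_id INTEGER PRIMARY KEY,"
           " drug TEXT, amount REAL, status TEXT)")
db.execute("INSERT INTO claims VALUES (1, 'amoxicillin', 24.50, 'pending')")
db.commit()

# Tier 2: the application server holding the business logic (the role the
# Tuxedo-based component played in the insurer's system).
def adjudicate_claim(claim_id: int) -> dict:
    row = db.execute("SELECT drug, amount, status FROM claims"
                     " WHERE claim_id = ?", (claim_id,)).fetchone()
    if row is None:
        return {"claim_id": claim_id, "status": "not found"}
    drug, amount, _ = row
    # Hypothetical business rule: approve claims at or under a $100 limit.
    status = "approved" if amount <= 100.0 else "referred"
    db.execute("UPDATE claims SET status = ? WHERE claim_id = ?",
               (status, claim_id))
    return {"claim_id": claim_id, "drug": drug, "amount": amount,
            "status": status}

# Tier 1: the client PC at the pharmacy, which only presents results.
def pharmacy_client(claim_id: int) -> str:
    result = adjudicate_claim(claim_id)  # in practice, a network call to tier 2
    return f"Claim {result['claim_id']}: {result['status']}"

print(pharmacy_client(1))  # Claim 1: approved
```

The point of the division is visible even at this scale: the pharmacy client holds no business rules, so the approval logic in tier 2 can change without touching the 3,500 pharmacy installations.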
In the twenty-first century, there is a renewed
emphasis on the thin client model to service remote areas,
small locations, and traveling employees, where it is diffi-
cult to update the client software regularly. As an example,
Maritz Travel Company, a $1.8-billion travel management
company, used a thin client approach based on Microsoft’s
Windows NT Terminal Server Edition and MetaFrame
software, from Citrix Systems. With the Citrix approach,
applications execute on a server and are merely displayed
on the client, with the client acting as a “dumb” terminal.
Maritz initially licensed 15,000 Citrix users and plans to
extend the applications to nearly 50 of its remote offices.
Richard Spradling, the Chief Information Officer of
Maritz, identifies many advantages to the thin client
approach. According to Spradling, it is much easier to
update only the servers; users automatically access the
most current version of an application; performance of the
applications has improved; and, over time, Maritz will
spend less money on hardware by purchasing thin client
devices rather than standard PCs or other fat clients
(Wilde, 1999). Ten years after the initial thin client rollout,
Maritz still uses thin clients in its call centers for all its
customer service representatives.
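The thin client division of labor can be sketched as follows. This is a toy, in-process model of what Citrix-style software does over a real display protocol: all application state and computation live on the server, and the client merely forwards input and displays whatever screen text comes back. The class names and the running-total "application" are invented for illustration.

```python
class ServerSideApp:
    """A trivial 'application' executing on the server: a running total."""
    def __init__(self):
        self.total = 0.0

    def handle_input(self, keystrokes: str) -> str:
        # All computation happens server-side; only the rendered
        # screen text is returned to the client.
        self.total += float(keystrokes)
        return f"Total: {self.total:.2f}"


class ThinClient:
    """The 'dumb terminal': it holds no application state of its own."""
    def __init__(self, server_app: ServerSideApp):
        self.server_app = server_app  # in practice, a network session

    def type(self, keystrokes: str) -> str:
        screen = self.server_app.handle_input(keystrokes)
        return screen  # the client merely displays what it received


session = ThinClient(ServerSideApp())
print(session.type("19.99"))  # Total: 19.99
print(session.type("5.01"))   # Total: 25.00
```

Because the client keeps nothing but a display, upgrading the application means updating only `ServerSideApp` on the server, which is exactly the maintenance advantage Spradling describes.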
Xerox Corporation is also adopting a thin client
approach. Until recently, Xerox replaced employees’ PCs
every three years, meaning about 10,000 employees got new
machines each year. Starting in 2005, Xerox adopted less
expensive thin clients, moving many key applications—such
as those supporting sales and service personnel—to servers.
Centralizing software will reduce support costs and will also
provide better security because the applications are not
scattered among tens of thousands of client devices. “We’re
trying to be more efficient and want to do more with less
money,” says Janice Malaszenko, Xerox’s Vice President
and Chief Technology Officer for Information Management
Strategy, Architecture, and Standards (Chabrow, 2005).


Virtualization

An increasingly popular way of delivering IT services is
through virtualization, which comes in several flavors.
With server virtualization, a physical server is split into
multiple virtual servers. Each virtual server can run its own
full-fledged operating system, and these operating systems
can be different from one virtual server to the next. The
physical server typically runs a hypervisor program to
create the virtual servers and manage the resources of the
various operating systems. Then each virtual server can be
employed as if it were a stand-alone physical server, thus
reducing the number of physical servers needed in an IT
shop and saving the organization money and space.
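The hypervisor's bookkeeping role can be sketched with a toy model. This is not a real hypervisor (products such as VMware ESXi, KVM, or Hyper-V schedule actual hardware); it only illustrates the idea that one physical server's fixed resources are partitioned among virtual servers, each of which can run a different operating system. All names and numbers are invented.

```python
class Hypervisor:
    """Toy model: partitions one physical server among virtual servers."""
    def __init__(self, total_memory_gb: int):
        self.total_memory_gb = total_memory_gb
        self.guests = {}  # virtual server name -> (guest OS, memory in GB)

    def create_virtual_server(self, name: str, os: str,
                              memory_gb: int) -> bool:
        used = sum(mem for _, mem in self.guests.values())
        if used + memory_gb > self.total_memory_gb:
            return False  # not enough physical memory left to carve out
        self.guests[name] = (os, memory_gb)
        return True


hv = Hypervisor(total_memory_gb=64)
hv.create_virtual_server("web01", "Linux", 16)
hv.create_virtual_server("app01", "Windows Server", 32)
print(len(hv.guests), "virtual servers on one physical machine")
```

Note that the two guests run different operating systems side by side, which is the consolidation benefit the paragraph describes: two logical servers, one physical box.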
With desktop virtualization, the desktop environment—
everything the user sees and uses on a PC desktop—is
separated from the physical desktop machine and accessed
through a client/server computing model. This virtualized
desktop environment is stored on a server, rather than on the
local storage of the desktop device; when the user works
from his or her desktop device, all the programs, applica-
tions, and data are kept on the server and all programs and
applications are run on the server. The server does almost all
the work, so a thin client is a very appropriate desktop
device; of course, a standard PC, a notebook computer, or
even a smartphone could also be used as the client.
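A small sketch makes the "desktop follows the user" idea concrete. In this illustrative model (class and field names invented), the desktop state is keyed by user on the server, so any client device that attaches gets the same programs and files back:

```python
class DesktopServer:
    """Toy model: desktop environments live on the server, keyed by user."""
    def __init__(self):
        self.desktops = {}  # user -> desktop state held server-side

    def attach(self, user: str, device: str) -> dict:
        # The same server-side desktop is returned regardless of the
        # device the user happens to connect from.
        desktop = self.desktops.setdefault(user, {"open_files": []})
        desktop["last_device"] = device
        return desktop


server = DesktopServer()
d1 = server.attach("jsmith", device="thin client")
d1["open_files"].append("budget.xlsx")

# Later, the same user attaches from a different device and finds
# the same desktop waiting:
d2 = server.attach("jsmith", device="smartphone")
print(d2["open_files"])  # ['budget.xlsx']
```

Because nothing lives on the local device, the thin client, PC, and smartphone cases mentioned above differ only in the `device` argument, not in what the user sees.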

Service-Oriented Architecture
and Web Services
As we begin the second decade of the twenty-first century,
client/server systems are still important, but service-oriented
architecture and Web services are the hot buzzwords when
considering the development and deployment of appli-
cation systems. Service-oriented architecture (SOA) is an
application architecture based on a collection of functions,
or services, where these services can communicate (or be
connected) with one another. A service is a function that is
well-defined and self-contained, and that does not depend
on the context or state of other services. Then there must be
some means of connecting services to each other, when the
services might be running on different machines, using dif-
ferent protocols, and using different operating systems and
languages. The key advantage of SOA is that once services
are created, they can be used over and over again in different
applications—only the connections will vary. Furthermore,
the services could be developed within an organization, or
the software for the services could be purchased from a ven-
dor, or the services could be obtained from a vendor on a
fee-for-use basis.
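The reuse argument can be sketched in a few lines. In this illustrative example (service names, rates, and composition logic all invented), each service is a self-contained function with a well-defined interface, and two different "applications" are nothing but different wirings of the same services:

```python
def currency_convert(amount: float, rate: float) -> float:
    """A self-contained service: depends only on its own inputs."""
    return round(amount * rate, 2)


def tax_calculate(amount: float, tax_rate: float) -> float:
    """Another independent service; it knows nothing about the first."""
    return round(amount * (1 + tax_rate), 2)


# Application A connects both services in sequence...
def invoice_in_euros(usd_amount: float) -> float:
    converted = currency_convert(usd_amount, rate=0.9)
    return tax_calculate(converted, tax_rate=0.2)


# ...while application B reuses one of them on its own.
# Only the connections differ; the services are unchanged.
def domestic_invoice(usd_amount: float) -> float:
    return tax_calculate(usd_amount, tax_rate=0.07)


print(invoice_in_euros(100.0))   # 108.0
print(domestic_invoice(100.0))   # 107.0
```

In a real SOA the two services might run on different machines, be written in different languages, and be reached over a network protocol rather than a direct call, but the architectural point is the same: build the service once, then vary only the connections.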
Though built on similar principles, SOA is not the
same as Web services, which is a particular collection of
technologies built around the XML (eXtensible Markup