The Internet Encyclopedia (Volume 3)



Expert Group (MPEG) groups. Independently, subsets of the Standard Generalized Markup Language (SGML) are being developed for electronic document and data-mining purposes.
Electronic banking is not just about the ability to access an account; transactions, which are becoming smaller and more numerous, must also be recorded automatically and securely. The problem here is one of cost efficiency per transaction, and the protocols and standards used cannot themselves be allowed to make a transaction prohibitively expensive.
Enabling technologies for the future of the Internet are those concerned with the explosive growth in mobile and wireless applications and, subsequently, the use of personal networks. Without adequate standards, growth will continue to be sporadic.
It is with these emerging technologies that we look to
the future and bring the chapter to a close.
The evolution of standards has historically been ad hoc, and in the field of communications it is known to date back to the Greeks; however, we are not concerned with these early signaling systems but with the advent of electronic communications in the past century. Initially developed as internal mechanisms for cultivating their owners' marketplaces, standards have grown in influence through the recognition of a global need to cooperate, as corporate bodies outgrew their national boundaries and were forced to allow standards bodies to grow and act independently.
In order to understand the need for standards and the
consequent proliferation of standards bodies it is instruc-
tive to look at the nature of telecommunications traffic in
general. There are fundamentally two contexts, voice and
data, with multimedia being a mix of the two. Each of
these is carried on a network that may be of two types,
circuit switched, e.g., telephone, or packet switched, e.g.,
the Internet. The consumer end may be fixed, mobile or
wireless. This is further complicated by the capabilities of
the technologies available at a given time. This was understood by the ITU, which classified the technological advances into three "generations" (IMT-2000, 2000). First-generation (1G) systems are the traditional analog systems; second-generation (2G) systems are digital and cellular; and third-generation (3G) systems are digital and cellular with higher data transfer rates. At each stage, and for each type, suitable recognized standards must be available to service providers so that they can be assured of the structure of their marketplace.
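The ITU's three-generation classification can be summarized as a small lookup table. The following sketch is purely illustrative; the structure and function names are hypothetical and record only the traits named in the text (analog vs. digital, cellular, higher data rates).

```python
# Illustrative sketch of the ITU "generations" classification
# (IMT-2000, 2000). Only traits stated in the text are recorded;
# the dictionary keys and helper name are invented for this example.

GENERATIONS = {
    "1G": {"signal": "analog"},
    "2G": {"signal": "digital", "cellular": True},
    "3G": {"signal": "digital", "cellular": True,
           "higher_data_rates": True},
}

def describe(gen: str) -> str:
    """Return a one-line summary of a generation's defining traits."""
    traits = GENERATIONS[gen]
    parts = [traits["signal"]]
    if traits.get("cellular"):
        parts.append("cellular")
    if traits.get("higher_data_rates"):
        parts.append("higher data transfer rates")
    return f"{gen}: " + ", ".join(parts)
```

For example, `describe("3G")` yields "3G: digital, cellular, higher data transfer rates".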

STANDARDS BODIES
No standard will survive for long if there is no organization to support and promote its use. Some of these organizations are supported by government funding, but a surprisingly important sector is not: voluntary and unpaid collectives, not only of companies but also of scientists, researchers, and other interested parties. There are several
of primary importance to the data communications world
and particularly the future of the Internet; this section
discusses these key organizations. Perhaps the most
important are the ITU and the International Organization
for Standardization (ISO).

International Organization for
Standardization (ISO)
Established in 1947, the ISO is a collection of national
standards bodies from around the world. Each country
is allowed one representative standards body as a mem-
ber and shares in the running costs of the organization.
The agreements reached within the ISO are published as
agreed international standards (the initials ISO are not an
acronym but come from the Greek word “isos,” or equal).
All technical fields are covered by the ISO (through approximately 2,850 committees) except electrical and electronic engineering, which is the responsibility of the International Electrotechnical Commission (IEC). Joint committees formed with the IEC produce standards in the information technology fields.

The Internet
The Internet is extraordinary in the sense that it began
(and mainly continues) as a collection of voluntary and,
in the main, open bodies collectively determining the cur-
rent direction and future of the most important networked
system in the world today. The Internet Society (ISOC), established in 1992 as the need for some control was appreciated, consists of professional Internet experts who concern themselves with policy and oversee a variety of boards and task forces dealing with Internet issues (ISOC, 2002).
Several key associated groupings exist such as the
Internet Engineering Task Force (IETF), the Internet
Engineering Steering Group (IESG), the Internet Archi-
tecture Board (IAB), and the Internet Assigned Numbers
Authority (IANA). The IETF is the protocol engineering
and development arm of the Internet Society. Although it
had existed for some time, the group was formally estab-
lished by the IAB in 1986. It is this group that proposes
protocols for consideration as standards, although any organization may do so via the Requests for Comments (RFC) process. The technical management of IETF activities is conducted by the
IESG, which also has responsibility for the Internet stan-
dards process. It guides the standards process according
to the procedures that have been published by the ISOC.
The IAB, originally known as the Internet Activities Board,
first came into being in 1983. It was the guiding force be-
hind the organized approach to the Internet. Under the
IAB both the IETF and the IRTF (Internet Research Task
Force) were formed in 1986. The IAB was reconstituted in 1992 as the Internet Architecture Board and now serves as the
technology advisory group to the Internet Society and as
such is responsible for defining the architecture of the
Internet. IANA is in charge of all IP (Internet protocol)
addresses and any other parameters defined for the Inter-
net. A major part of its task is to ensure the uniqueness of
such parameters.
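IANA's role of guaranteeing the uniqueness of Internet parameters amounts to maintaining an authoritative registry that rejects duplicate assignments. The sketch below models that idea only; the class and method names are hypothetical and do not reflect any real IANA interface.

```python
# Minimal sketch of a parameter registry enforcing uniqueness,
# in the spirit of IANA's role described above. All names here
# are invented for illustration, not a real IANA API.

class ParameterRegistry:
    def __init__(self):
        self._assigned = {}  # parameter value -> registered owner

    def assign(self, value, owner):
        """Record value for owner; reject any duplicate assignment."""
        if value in self._assigned:
            raise ValueError(
                f"{value!r} already assigned to {self._assigned[value]!r}"
            )
        self._assigned[value] = owner
        return value
```

A second attempt to register the same value raises an error, which is the essence of the uniqueness guarantee.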
In order to ensure that network research and development was conducted in an open manner, the RFC process was formulated in 1969 during the development of the ARPANET, which had been commissioned by the U.S. Department of Defense (DoD) for research into networking. Each RFC is assigned a unique identifier no matter what its content.
Some were just for fun, e.g., RFC 527 and RFC 968;