Concepts of Programming Languages


The FORALL statement specifies a sequence of assignment statements that
may be executed concurrently. For example,

FORALL (index = 1:1000)
  list_1(index) = list_2(index)
END FORALL

specifies the assignment of the elements of list_2 to the corresponding
elements of list_1. However, execution is constrained: the right sides of
all 1,000 assignments must be evaluated before any assignment takes place.
It is this rule that permits concurrent execution of all of the
assignment statements. In addition to assignment statements, FORALL state-
ments can appear in the body of a FORALL construct. The FORALL statement is
a good match with vector machines, in which the same instruction is applied to
many data values, usually in one or more arrays. The HPF FORALL statement
is included in Fortran 95 and subsequent versions of Fortran.
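The evaluate-all-right-sides-first rule matters when the source and destination of the assignments overlap. A minimal sketch of these semantics in Python (the helper name forall_assign is illustrative, not part of HPF):

```python
def forall_assign(dest, indices, rhs):
    # Phase 1: evaluate every right side before any assignment,
    # as the FORALL statement requires.
    values = [rhs(i) for i in indices]
    # Phase 2: perform all the assignments.
    for i, v in zip(indices, values):
        dest[i] = v

# Shifting elements right: list_1(i) = list_1(i-1) for i = 1..3.
# A naive sequential loop would read already-updated values and
# fill the list with copies of list_1[0]; the two-phase rule does not.
list_1 = [1, 2, 3, 4]
forall_assign(list_1, range(1, 4), lambda i: list_1[i - 1])
print(list_1)  # [1, 1, 2, 3]
```

Because no right side can observe any left-side update, the individual assignments are independent and could safely run concurrently.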
We have briefly discussed only a small part of the capabilities of HPF.
However, it should be enough to provide the reader with an idea of the kinds of
language extensions that are useful for programming computers with possibly
large numbers of processors.
C# 4.0 (and the other .NET languages) includes two methods that
behave somewhat like FORALL: Parallel.For and Parallel.ForEach. They
are loop control methods in which the iterations can be unrolled and
the bodies executed concurrently.
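A rough analogue of Parallel.For can be sketched in Python with the standard-library thread pool; this is an illustration of the idea, not the .NET API:

```python
from concurrent.futures import ThreadPoolExecutor

def parallel_for(start, stop, body):
    # Run body(i) for each i in [start, stop), distributing the
    # iterations across a pool of worker threads, roughly as
    # .NET's Parallel.For distributes loop bodies.
    with ThreadPoolExecutor() as pool:
        # Draining the map blocks until every iteration completes,
        # mirroring Parallel.For's join-at-the-end behavior.
        list(pool.map(body, range(start, stop)))

# The element-copy loop from the FORALL example, run concurrently.
# Each iteration writes a distinct index, so no two bodies conflict.
list_2 = list(range(1000))
list_1 = [0] * 1000
parallel_for(0, 1000, lambda i: list_1.__setitem__(i, list_2[i]))
print(list_1 == list_2)  # True
```

As with FORALL, correctness here depends on the iterations being independent; the library does not check for conflicting accesses.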

SUMMARY


Concurrent execution can be at the instruction, statement, or subprogram level.
We use the phrase physical concurrency when multiple processors are actually
used to execute concurrent units. If concurrent units are executed on a single
processor, we use the term logical concurrency. The underlying conceptual model
of all concurrency can be referred to as logical concurrency.
Most multiprocessor computers fall into one of two broad categories—
SIMD or MIMD. MIMD computers can be distributed.
Two of the primary facilities that languages that support subprogram-level
concurrency must provide are mutually exclusive access to shared data struc-
tures (competition synchronization) and cooperation among tasks (cooperation
synchronization).
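The two kinds of synchronization can be illustrated in Python (an analogy using the threading library, not tied to any one language in the chapter): a lock provides competition synchronization, and an event provides cooperation synchronization.

```python
import threading

counter = 0
lock = threading.Lock()         # competition synchronization
data_ready = threading.Event()  # cooperation synchronization
shared = []

def increment():
    global counter
    for _ in range(10_000):
        with lock:              # mutually exclusive access to counter
            counter += 1

def producer():
    shared.append(42)
    data_ready.set()            # signal that the data is available

def consumer(results):
    data_ready.wait()           # block until the producer has run
    results.append(shared[0])

results = []
threads = [threading.Thread(target=increment) for _ in range(4)]
threads.append(threading.Thread(target=consumer, args=(results,)))
threads.append(threading.Thread(target=producer))
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 40000
print(results)  # [42]
```

The lock resolves competition among the four incrementing tasks for the shared counter; the event makes the consumer cooperate with the producer by waiting until the data it needs exists.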
Tasks can be in any one of five different states: new, ready, running,
blocked, or dead.
Rather than designing language constructs for supporting concurrency,
sometimes libraries, such as OpenMP, are used.
The design issues for language support for concurrency are how competi-
tion and cooperation synchronization are provided, how an application can