For the processor to function, at a logic gate
level, it needs to receive the whole word – all
eight bits of information – at once. Looking
back at that AND gate example, it won't work
properly if its two input signals arrive one
after another on a single line, and the same
is true when we scale up to far more
complex processes.
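As a minimal sketch of that idea (the modelling here is ours, not anything from real gate-level hardware), here's a two-input AND gate in Python. The point is that the gate only computes anything meaningful when both inputs are presented at the same moment:

```python
def and_gate(a: int, b: int) -> int:
    """Model of a two-input AND gate: output is 1 only when both inputs are 1."""
    return a & b

# The gate needs both inputs at the same instant:
print(and_gate(1, 1))  # 1
print(and_gate(1, 0))  # 0
# Feed the two bits one after another down a single line and the gate
# only ever sees one input at a time - it can never compute the result.
```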
So, you’ve got two 8-bit binary number
words sat in some sort of memory address,
or represented as holes punched on a piece
of card, and you’re ready for them to be added
together or multiplied or have whatever other
operation performed on them. In those punch
card days, getting the data into the system
was relatively straightforward: each hole
in the card had a direct line to an input of the
processor's circuit, and all the lines were read
in one fell swoop. But how do we transfer data
from our electronic memory device?
We could stream all the digits of each
word down a single wire, one after the other
in series, making the voltage on that wire
high or low in some sort of regular pattern in
accordance with each digit. It would be very
simple in terms of cabling, and you could
easily send data long distances this way.
However, the CPU would then need a way to
tell each digit and word apart, plus it would
need some form of temporary storage area
to hold each digit while the system waits for
the rest of the word to arrive.
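As a rough sketch of that bit-at-a-time scheme (the function name and MSB-first ordering are our own choices, not any particular protocol), each time step the wire carries one bit of the word, represented as a high or low voltage:

```python
def transmit_serial(word: int, width: int = 8):
    """Yield the bits of 'word' one per time step, most significant bit
    first, as if driving a single wire high or low."""
    for i in reversed(range(width)):
        yield (word >> i) & 1

# 170 decimal is 10101010 in binary; the wire sees the bits in sequence:
wire = list(transmit_serial(170))
print(wire)  # [1, 0, 1, 0, 1, 0, 1, 0]
# The receiver now has to work out where each bit, and each
# 8-bit word, begins and ends.
```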
That all requires a lot of extra circuitry and
complication, which would add a great deal
of unnecessary cost and potential for error in
early computers. Instead, why not just have
eight wires in parallel and send all the bits
at once? Sure enough, that's what was done,
and still is in many cases, with so-called
parallel buses remaining an integral part of
PC connections to this day.
Parallel communications
The advantage of parallel communication is
plain to see: it's clearly much easier to
throw multiple bits of data down multiple
wires all at once. However, beyond the
most basic applications, it comes with
some complications.
For a start, you need to ensure that all
those bits stay in sync with each other as
they propagate down the wires, and you still
need to be able to tell when one bit ends
and another begins. To solve this, a further
wire is added that carries a regular clock
signal, ticking on and off to tell the
receiver when each bit starts and ends. It's
a system that can work extremely well,
particularly over shorter distances, which is
why parallel interconnects still form a major
part of computer connections.
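To make that concrete, here's a toy model (entirely our own construction) of a clocked parallel bus: eight data lines plus an implied clock, with the receiver sampling all eight lines on each tick and rebuilding the word:

```python
def send_parallel(words, width=8):
    """For each word, drive all 'width' data lines at once;
    each yielded list represents one clock tick on the bus."""
    for word in words:
        yield [(word >> i) & 1 for i in reversed(range(width))]

def receive_parallel(bus):
    """On each clock tick, sample every data line and rebuild the word."""
    for data_lines in bus:
        word = 0
        for bit in data_lines:
            word = (word << 1) | bit
        yield word

# A whole word crosses the bus on every tick:
print(list(receive_parallel(send_parallel([170, 128]))))  # [170, 128]
```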
The problem is that all those wires make
for a bulky connection system, resulting in
big, thick cables or highly complex traces
on circuit boards. If you recall the old parallel
IDE hard drive and CD-ROM connections
of yesteryear, you'll be all too familiar with
the chunky, relatively short cables they
required, not to mention the half-inch-thick
parallel cables needed to attach a printer. As
cables get longer, it also becomes harder to
ensure that each conductor (each copper
strand within the cable) performs in exactly
the same way, so it's easy for the signals to
get out of sync.
In other words, it's a system that can only
scale so far. To get around these issues, we
can return to the idea of sending data down a
single wire in a continuous series, but first we
have to overcome the problems we identified
with that approach earlier.
Serial communications
The first hurdle with so-called serial
transmission is being able to temporarily store
the individual bits of each incoming chunk
of data. Most modern data transmission
uses some degree of buffering (for error
checking and so on), but for serial comms
it's utterly foundational.
All that’s needed is for the data to be
temporarily stored in some sort of local
memory (whether it’s local to the CPU or to
the device), but in the early days of computers
electronic memory wasn’t a readily available
commodity. Once processors and memory
became capable enough, though, this was a
relatively easy problem to solve.
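The classic circuit for this job is a shift register, and a minimal software model of one (again, our own sketch) looks like this: each incoming bit is shifted into a temporary buffer, and once eight have accumulated, the complete word is handed over:

```python
def deserialise(bits, width=8):
    """Shift incoming bits into a temporary buffer; emit a word
    every time 'width' bits have accumulated."""
    buffer, count = 0, 0
    for bit in bits:
        buffer = (buffer << 1) | bit   # shift the new bit in
        count += 1
        if count == width:             # a full word has arrived
            yield buffer
            buffer, count = 0, 0

stream = [1,0,1,0,1,0,1,0, 1,0,0,0,0,0,0,0]  # 170 then 128, bit by bit
print(list(deserialise(stream)))  # [170, 128]
```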
With that out of the way, the next step is to
work out how to separate the bits of data. Sticking
to an 8-bit system again, if you’re transmitting
the number 170, you end up with a binary
signal of 10101010, which the receiver should
have no trouble interpreting, as long as it
knows when the number was supposed to
have started.
However, what happens if you're
transmitting the number 128, which is
10000000 in binary? After that initial 1, the
line sits low for seven bits in a row, so unless
the receiver knows exactly when the word
began, it has no way of counting those zeroes
reliably, or of telling them apart from an
idle line.
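A quick worked example (our own illustration) shows why that start point matters so much: if the receiver's idea of where the word begins is off by even one bit, it reconstructs a completely different number:

```python
def read_word(stream, start, width=8):
    """Interpret 'width' bits of the stream as a word,
    beginning at position 'start'."""
    word = 0
    for bit in stream[start:start + width]:
        word = (word << 1) | bit
    return word

# 128 is 10000000 in binary; imagine it arriving on an otherwise idle (low) line:
line = [0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0]
print(read_word(line, 2))  # 128 - sampled from the true start
print(read_word(line, 1))  # 64  - one bit early, wrong answer
print(read_word(line, 3))  # 0   - one bit late, the 1 is missed entirely
```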
[Image: Boolean logic gates describe the principal Boolean operations that can be performed on any given one or two inputs to a circuit]
[Image: Parallel cables require dozens of conductors, making them large and unwieldy]