New Scientist - USA (2013-06-08)

[Illustration: a jumble of programming-language names – Ruby, Java, C#, C++, JScript, C--, JavaScript]

Modern software is a bug-infested swamp, says Michael Brooks. Luckily there are paths to a better world

I’VE been using computers for decades now. It’s probably time I taught myself how to program one, but first I have to find the answer to a simple question. There are thousands of programming languages out there – which one should I learn?
Fortunately, the internet has lots of answers: programming blogs and forums are filled with people asking that very question. Unfortunately, those answers are often less than helpful. The Java language “sucks”, apparently, and “all Java programmers are morons”. C++ is “baroque and ugly”. Critics of Ruby are more plain-spoken; to them, this language is simply “a piece of shit”. To the uninitiated, the bile is a little bit frightening.
“It’s just amazing how tribal people get with their programming languages,” says Crista Lopes, a professor in the informatics department at the University of California, Irvine. But beneath all this bilious tribalism lurks a home truth: most of the languages are poorly designed, making programming incredibly abstract and difficult. And they require extensive patching every time a new web application appears. Though computer code runs the modern world, it’s actually a shambles.
You might argue that as long as your machine runs Angry Birds and Facebook, why bother about the online ramblings of angry herds? But that would be naive. The writhing mess that is computer programming in the 21st century is everyone’s business. It goes beyond losing all your work when Windows shows you the Blue Screen of Death. As more and more of the world is digitised, software bugs carry ever bigger consequences, from the highly inconvenient to the undeniably tragic. They ground planes and trigger financial meltdowns. They have even been known to kill when they pop up in healthcare equipment. Fortunately, things may be about to change.
Let’s start with the obvious question: why are there so many languages? Ask this question on the internet and your first answer will be as snarky as it is rhetorical: “Why are there so many kinds of bicycle? Why are there so many different wrenches?” The reason, you will learn, is that different machines speak different languages. Your printer and an F-22 fighter jet don’t respond to the same commands, probably for good reason.
However, in reality it’s much more complicated than that, even though all of these languages do basically the same thing: move strings of binary digits – bits – between the circuits of a microprocessor. Until the 1950s, computer programmers did this by operating levers and switches, sometimes with punch cards. Before they could do that, however, they would first need to carefully translate what they wanted the machine to do – in other words, the program – into the correct string of 0s and 1s that a computer could understand. This so-called “machine code” would tell the computer how to correctly assemble the logic in its bowels.
If this sounds complicated, abstract and difficult, that’s because it was. Computing was time-consuming and expensive, and the programs unsophisticated. It’s not surprising that only a tiny number of people knew how to speak this machine code.
Then along came Fortran. Invented by researchers at IBM’s Watson Laboratory, this “high-level” language made programming
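To make the hand-translation into raw 0s and 1s concrete, here is a minimal sketch in Python of a toy machine running a program written purely as bits. The two-bit opcodes are invented for illustration and correspond to no real processor's instruction set.

```python
# A toy "machine code" interpreter (illustrative only, not any real CPU).
# Opcodes, 2 bits each: 00 = PUSH the next 4 bits as a number,
#                       01 = ADD the top two stack values,
#                       10 = HALT.

def run(bits):
    """Execute a string of 0s and 1s on a tiny stack machine."""
    stack, i = [], 0
    while i < len(bits):
        op = bits[i:i + 2]
        i += 2
        if op == "00":                        # PUSH a 4-bit literal
            stack.append(int(bits[i:i + 4], 2))
            i += 4
        elif op == "01":                      # ADD top two values
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "10":                      # HALT
            break
    return stack[-1]

# "PUSH 3, PUSH 4, ADD, HALT" written out by hand as machine code:
program = "00" + "0011" + "00" + "0100" + "01" + "10"
print(run(program))  # the high-level equivalent is simply: 3 + 4
```

Even this trivial sum demands a careful, error-prone translation into the right bit pattern – which is precisely why high-level languages like Fortran were such a breakthrough.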


8 June 2013 | NewScientist | 37

