Australasian Science — May-June 2017

of cases in one study). Nevertheless, the impacts of increased
aggression and depression in warfare give us reason to be very
careful about how such enhancements are used in practice.
Similarly, while reducing PTSD is undoubtedly a good thing,
some ethicists worry that reducing the emotional impact of
traumatic events in wartime could lead to an increase in the
number of wartime atrocities committed. In her assessment of
ways that enhancements impact moral responsibility, Jessica
Wolfendale states in The American Journal of Bioethics
(http://tinyurl.com/mzng6wq): “Propranolol, it seems, modifies subjects’ capacity to respond to and assess information relevant to rational decision-making, and as a result it would arguably affect the degree of moral responsibility we could assign to them”.

Lest this come across as overly pessimistic, this is not to suggest that cognitive enhancements or efforts to reduce PTSD ought to be discarded; that would be throwing the baby out with the bathwater. The point, rather, is to recognise that these technological interventions are not simple, and that the context of decision-making in warfare is very complex. Things often don’t go as planned. What is needed is a dedicated part of the research and development process that examines any such technological intervention’s possible impacts on, and potential to reduce, the capacity to follow the laws of armed conflict.
Furthermore, as the side-effects of deep brain stimulation
show, long-term and in-conflict monitoring of the enhanced
soldiers is needed to see if the interventions actually work as
hoped, and/or if they have any demonstrable or recognisable
effects on following the laws of armed conflict. Much like a
phase IV clinical trial, the interventions must be continually
studied in their application. Moreover, thought needs to be given to who is legally culpable should such an intervention reduce a soldier’s adherence to the laws of armed conflict. Who is morally responsible is, perhaps, an open question: individual soldiers, their direct commanders, those who implemented the enhancement program, or the designers and providers of the product? Questions of legal culpability, however, need clear answers.
There is also an equally complicated set of questions about how enhancements could affect an adversary’s treatment of one’s own soldiers. Enhancements that increase endurance and reduce sensitivity to pain are of obvious military interest, so if enemy soldiers and civilians hear that incoming troops have been enhanced in such ways, those troops may well be perceived as something more than, or perhaps less than, human. Similarly, if a group of soldiers encased in exoskeletons bears down upon the enemy, those soldiers may be perceived as something other-than-human.
In Humanity: A Moral History of the Twentieth Century,
Jonathan Glover writes: “There are far weaker social pressures against hostile treatment of members of other groups. And in war the pressures often support group hostility.” Seeing opposing soldiers as enhanced half-robots, and as potentially not human, can make hostile and harmful treatment by the other group even easier and more likely. The worry here is that the enhancement negatively impacts proportionality because of the way it changes the adversary’s perceptions.
This is not to say that we jettison the laws of armed conflict, or lower the legal criteria around discrimination, proportionality or treatment of prisoners of war. Rather, it is to recognise
that significantly changing our soldiers could have an impact on
how our enemies treat them. Hence there is a case to be made for anthropological research in the military context, to get some idea of how those we are fighting understand the enhancements and what these different understandings mean in practice.
As a final note, it is important to recognise that much of
what has been discussed here is speculative. While some of
these enhancement technologies are being used and trialled,
others are still in the research stage.
Moreover, the concerns that I’ve been pointing to are “what
if” scenarios. What if these technologies cause a decline in
adherence to the laws of armed conflict? What if the enemy
starts treating all our soldiers like they can’t feel pain?
In this sense, what I’m speaking of is speculative. However,
simply because this is speculative doesn’t mean it is fantasy.
First, the military interest in enhancement technologies exists
and is substantial. Second, in terms of the ethics of such future technologies, we can take an approach that technology philosopher Philip Brey calls “anticipatory ethics for emerging technologies”.
Rather than waiting to see what happens with these technologies in practice, and then patching any problems after they have caused unjustified death and destruction, we can, given the known military interest in these technologies, anticipate them and attend to the ethical concerns in a way that is evidence-based and pre-emptive. Attention to the full range of impacts
of such technologies at the research stage, including complicated issues like adherence to the law and adversarial perception of enhanced soldiers, is a necessary element of the development of any enhancements.
Adam Henschke is an ethicist at the National Security College, the Australian National
University. This research was supported by the Brocher Foundation in Geneva.


