Is it morally right to use Unmanned Aerial Vehicles (UAVs) in war?
KTH, School of Architecture and the Built Environment (ABE), Philosophy and History of Technology, Philosophy.
2011 (English) In: Philosophy & Technology, ISSN 2210-5433, Vol. 24, no. 3, pp. 279–291. Article in journal (Refereed) Published
Abstract [en]

Several robotic automation systems, such as UAVs, are being used in combat today. This raises ethical questions. This paper argues that UAVs, more than other weapons, may determine which normative theory the interpretation of the laws of war (LOW) is based on. UAVs are unique as a weapon in the sense that the advantages they provide in terms of fewer casualties, together with the fact that they make war seem more like a computer game, might lower the threshold for entering war. This indicates the importance of revising the LOW, or of adding rules that focus specifically on UAVs.

Place, publisher, year, edition, pages
2011. Vol. 24, no. 3, pp. 279–291.
Keyword [en]
UAVs, laws of war, robots
National Category
Philosophy Ethics
URN: urn:nbn:se:kth:diva-32432
DOI: 10.1007/s13347-011-0033-8
ScopusID: 2-s2.0-80052618840
OAI: diva2:410638

QC 20110414. Updated from submitted to published, 20120316. Previous title: Is it morally right to use UAVs (unmanned aerial vehicles) in war?

Available from: 2011-04-14. Created: 2011-04-14. Last updated: 2014-04-11. Bibliographically approved.
In thesis
1. Autonomous Systems in Society and War: Philosophical Inquiries
2013 (English) Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

The overall aim of this thesis is to examine some philosophical issues surrounding autonomous systems in society and war. These issues can be divided into three main categories. The first, discussed in papers I and II, concerns ethical issues surrounding the use of autonomous systems – the focus in this thesis being on military robots. The second, discussed in paper III, concerns how to ensure that advanced robots behave in an ethically adequate way. The third, discussed in papers IV and V, has to do with agency and responsibility. A further issue, somewhat aside from the philosophical ones, concerns coping with future technologies and developing methods for dealing with potentially disruptive technologies; this is discussed in papers VI and VII.

Paper I systematizes some ethical issues surrounding the use of UAVs in war, with the laws of war as a backdrop. It is suggested that the laws of war are too broad and might be interpreted differently depending on which normative moral theory is used.

Paper II is about future, more advanced autonomous robots, and whether the use of such robots can undermine the justification for killing in war. The suggestion is that this justification is substantially undermined if robots replace humans to a high extent. Papers I and II both suggest revisions of, or additions to, the laws of war.

Paper III provides a discussion on one normative moral theory – ethics of care – connected to care robots. The aim is twofold: first, to provide a plausible and ethically relevant interpretation of the key term care in ethics of care, and second, to discuss whether ethics of care may be a suitable theory to implement in care robots.

Paper IV discusses robots in connection with agency and responsibility, with a focus on consciousness. The paper takes a functionalist approach, and it is suggested that robots should be considered agents if they can behave as if they are – that is, if they can pass a moral Turing test.

Paper V is also about robots and agency, but with a focus on free will. The main question is whether robots can have free will in the same sense as we consider humans to have free will when holding them responsible for their actions in a court of law. It is argued that autonomy with respect to norms is crucial for the agency of robots.

Paper VI investigates the assessment of socially disruptive technological change. The coevolution of society and potentially disruptive technologies makes decision guidance on such technologies difficult. Four basic principles, involving interdisciplinary and participatory elements, are proposed for such decision guidance.

Paper VII applies the results from paper VI – and a workshop – to autonomous systems, a potentially disruptive technology. A method for dealing with potentially disruptive technologies is developed in the paper.

Place, publisher, year, edition, pages
Stockholm: KTH Royal Institute of Technology, 2013. ix, 57 p.
Theses in philosophy from the Royal Institute of Technology, ISSN 1650-8831
Keyword [en]
UAVs, drones, military robots, laws of war, justification for killing, ethics of care, care robots, functional morality, moral responsibility, Moral Turing Test, robot morality, artificial agent, artificial agency, autonomy, norms, disruptive technology, co-evolution, scenarios, autonomous systems, security, decision guidance, technology assessment
urn:nbn:se:kth:diva-127813 (URN), 978-91-7501-820-1 (ISBN)
Public defence
2013-10-02, Kapellet, Brinellvägen 6-8, KTH, Stockholm, 10:00 (English)

QC 20130911

Available from: 2013-09-11. Created: 2013-09-06. Last updated: 2014-06-17. Bibliographically approved.

By author/editor: Johansson, Linda