Military Robots: Ethics of Lethal Autonomous Weapon Systems
Download: Military Robots.pdf
Date: 2023-10-05
Author: Gülmez, Salih
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Item Usage Stats: 57 views, 90 downloads
Abstract: This thesis investigates the ethical implications of Lethal Autonomous Weapon Systems (LAWS), focusing on whether LAWS give rise to a responsibility gap. The responsibility gap argument holds that no one can be held responsible for the actions of LAWS, leaving a gap in the assignment of responsibility. Against this, I introduce the concept of vicarious responsibility and argue that the designers of LAWS can be held morally responsible for their designs in virtue of their moral entanglement. The central claim of the thesis is that moral responsibility, albeit in a vicarious sense, can be attributed to the designers of LAWS, thereby bridging the responsibility gap.
Subject Keywords: Moral Responsibility, LAWS, Ethics of AI
URI: https://hdl.handle.net/11511/105528
Collections: Graduate School of Social Sciences, Thesis
Citation (IEEE):
S. Gülmez, “Military Robots: Ethics of Lethal Autonomous Weapon Systems,” M.A. - Master of Arts, Middle East Technical University, 2023.