Author(s): Marsili, M.
Editors: Ignas Kalpokas, Julija Kalpokienė
Date: 2023
Title: Military emerging disruptive technologies: Compliance with international law and ethical standards
Volume: 390
Book title/volume: Intelligent and autonomous: Transforming values in the face of technology
Pages: 25 - 50
Collection title and number: Value Inquiry Book Series (VIBS)
Reference: Marsili, M. (2023). Military emerging disruptive technologies: Compliance with international law and ethical standards. In I. Kalpokas, & J. Kalpokienė (Eds.). Intelligent and autonomous: Transforming values in the face of technology (pp. 25-50). Brill.
ISSN: 0929-8436
ISBN: 978-90-04-54726-1
DOI (Digital Object Identifier): 10.1163/9789004547261_004
Keywords: Emerging and disruptive technologies
International law
Ethics
Law of war
Armed conflict
Geneva convention
Human rights
Artificial intelligence
Arms machine
Military technology
Big data
Quantum computing
Abstract: The major powers focus on science and technology development in order to build military power with strategic impact. High-technology weapons, also available to non-state actors, are assumed to shape the nature of warfare in the twenty-first century. Semiconductors, cloud computing, robotics, and big data are all components needed to develop the AI that will model and define the future battlespace. Artificial intelligence will be applied to nuclear, aerospace, aviation and shipbuilding technologies to provide future combat capabilities. The incorporation of AI into military systems and doctrines will shape the nature of future warfare and, implicitly, decide the outcome of future conflicts. Before fielding a weapon system, military and political leaders should consider how it can be used and whether it should be used in a certain manner. A strong and clear regulatory framework is needed. The automatic processing of plans and orders (automatic control) requires policy control. Autonomous machines need some level of human control and accountability. Imagine what could happen if a system like HAL 9000 or the WarGames supercomputer could make an autonomous decision. Some fictional stories have imagined a dystopian future in which machine intelligence increases and surpasses human intelligence until machines exert control over humans. As Freedman concludes in The Future of War, most claims of the military futurists are wrong, but they remain influential nonetheless. Humans tend to give more responsibility to machines in collaborative systems. In the future, the automatic design and configuration of military operations will be entrusted more and more to machines. Given human nature, if we recognize the autonomy of machines, we cannot expect anything better from them than the behavior of their creators. So why should we expect a machine to ‘do the right thing’?
In the light of what has been discussed here, it could be argued that some military applications of EDTs may jeopardize human security. The total removal of humans from the navigation, command and decision-making processes in the control of unmanned systems, and thus from participation in hostilities, makes humans obsolete and dehumanizes war. Because of the nature and the technological implications of automated weapons and AI-powered intelligence-gathering tools, boots on the ground are likely to become an exception. The cyber soldier will probably be a human vestige behind the machine. The rules that will apply to the battlespace are unknown. Increased machine autonomy in the use of lethal force raises ethical and moral questions. Is an autonomous system safe from error? Who will bear responsibility and accountability for a wrong decision: politicians, lawmakers, policy-makers, engineers, or the military? Guidelines are needed, and ethical and legal constraints should be considered. A lexicon and definitions of terms are essential, and the international community should find common, undisputed and unambiguous legal formulations. The distinction between conventional/unconventional, traditional/non-traditional, kinetic/non-kinetic, and lethal/non-lethal seems outdated. A knife, the neck of a broken bottle (if it cuts your jugular), even a fork, a hammer, a baseball bat, or a stone – according to the biblical story, David kills Goliath by hurling a stone from his sling and hitting him in the center of the forehead – are all unconventional, kinetic, and potentially lethal weapons. Nevertheless, distinguishing between weapons, their effects and consequences, is necessary in order to avoid a cascade effect and undesirable outcomes. LAWS can lead to an acceleration of a new arms race, to proliferation among illegitimate actors – non-state actors and terrorist groups – to cyber-attacks and hacking, and to a lowering of the threshold for the use of force.
The debate on the application of technology to warfare should cover international law, including IHL, ethics, neuroscience, robotics and computer science. It requires a holistic approach. It is necessary to investigate whether the new domains are actually comparable to the classical ones, and whether current rules are applicable or new ones are necessary. Further considerations deriving from the extension of the battlefield to the new domains of warfare concern the use of artificial intelligence in the decision-making process, which, in a fluid security environment, needs to be on target and on time in both the physical and virtual informational spaces. The debate is not just legal but also moral and ethical, and it should be deepened. A multi-disciplinary approach would be useful for designing the employment framework for new warfare technologies.
Peer-reviewed: yes
Access type: Open Access
Appears in collections: CEI-CLI - International book chapters

Files in This Item:
File: bookPart_97461.pdf (485.28 kB, Adobe PDF)

