Publication: Fundamental issues concerning the application of international humanitarian law to autonomous weapon systems
Abstract
The legal restriction of new weapon systems that emerge when developing technologies are applied to the methods or means of warfare is a problem that international humanitarian law (IHL) has historically had to confront. One of the current manifestations of this problem is the application of artificial intelligence to weapon systems. With the application of artificial intelligence technology to weapon systems, a new era has begun in which the need for human involvement is diminishing. As human intervention gradually decreases and the striking power of these weapons increases, the response that IHL regulations can offer has become a subject of debate, as it has for every other weapon newly introduced to the battlefield. In the absence of a treaty that specifically regulates autonomous technology, questions have arisen as to how a legal review of such systems should be conducted. Significant uncertainties persist in the application of the law, in particular because humanitarian law rules were formulated on the assumption that a human operator uses the weapon. In the absence of a treaty governing the application of autonomous technology to weapon systems, this study examines the compliance of these systems with humanitarian law within the framework of international legal norms and standards. The central question is whether the general norms of humanitarian law are comprehensive and flexible enough to resolve current and potential legal problems. The study concludes that there is no legal justification for a complete ban on weapon systems merely because they incorporate autonomous technology. Depending on the component in which autonomous technology is used or on the operational conditions of its use, additional measures should first be taken against those configurations that give rise to unlawfulness; where such violations of IHL cannot be eliminated, the systems concerned may be restricted or prohibited to the extent required by the situation and under certain conditions.
