
Friday, June 28, 2024

On the Use of Unmanned Aerial Vehicles

Operationally, remote-controlled weapons are force multipliers. Politically, unmanned aerial vehicles (UAVs) can cut both ways: their use minimizes casualties, but they can also be used in “inconvenient” ways. Ethically, the military use of UAVs or drones is controversial in two ways: their use for assassinations and their overall use in the context of Just War Theory.

An MQ-9 Reaper UAV flies over the Nevada Test and Training Range and performs live-fire exercises.
(U.S. Air Force photo by Airman 1st Class Victoria Nuzzi)

Use for Targeted Assassination

Probably the most controversial use of drones is when the target is an American citizen. This is exactly what happened on 30 September 2011 when Anwar al-Awlaki was executed in the second of two CIA drone missile attacks against him.

In reading biographical information on al-Awlaki, one must ask oneself: in what sense was he a citizen?

Al-Awlaki was born to Yemeni parents who were living in New Mexico at the time, but the family returned to Yemen when he was seven years old. He later returned to the US and soon got on the FBI's radar, which of course amounted to nothing, even though he certainly either influenced or played an operational role in numerous terrorist actions against the United States1:

  • He had extended contact with at least two of the 9/11 hijackers
  • He had direct contact with Nidal Hasan and influenced him to commit the 2009 Fort Hood shooting
  • One of his students, a member of al-Qaeda, was the "Underwear Bomber" who attempted to bomb a Northwest Airlines flight on Christmas 2009
  • The Times Square bomber was also a follower of al-Awlaki
  • He issued a fatwa against the organizer of the "Everybody Draw Mohammed Day" contest
  • He also influenced Omar Mateen, who carried out the shooting at the Pulse Night Club in Florida. (I lost an acquaintance in that attack.)

Through his teachings, al-Awlaki denounced America and influenced others to take direct action against her. Regardless of whether birthright citizenship is legal or ethical, al-Awlaki was a US citizen only in the sense that in some database, a box next to his name was checked.

There is apparently a legal framework against extrajudicial killing. Section 2.11 of the 2008 amended EO 12333 explicitly prohibits assassination2 3. It is unclear how the assassination was squared with that EO.

Al-Awlaki was charged in absentia in Yemen for being a member of al Qaeda. It isn't clear at all why he couldn't be tried in absentia in a US court. The rules for such trials are well-established but were not followed by the Obama administration.

Nasser al-Awlaki, Anwar's father, filed for an injunction prior to both drone strikes, but Nasser lacked locus standi. It is unknown how Nasser learned that his son was on the targeted killing list.

The only objectionable part of al-Awlaki's death by drone strike, I claim, is that he wasn't tried in absentia. It is uncertain whether al-Awlaki could have been extradited from Yemen, which is where he was killed. Lawsuits before and after his death seeking the release of the standards under which an American can be so targeted have apparently been unsuccessful.

So, was al-Awlaki's execution legal? We may never know as the standards for extrajudicial killing still appear to be a secret. All we know is that it couldn't have happened to a more deserving person.

Overall Drone Use

The use of drones has been addressed by just war theorists, including Bradley Jay Strawser. In his 2010 paper “Moral Predators: The Duty to Employ Uninhabited Aerial Vehicles,”4 he begins by distinguishing “man-in-the-loop” and “man-on-the-loop” weapon systems from “man-out-of-the-loop” systems. He rejects the latter but argues that the use of man-in-the-loop and man-on-the-loop weapons is morally justifiable.

Strawser raises and addresses six objections to UAV usage. Here are two of them.

One of his objections is that UAVs lead to violations of the jus in bello Principle of Discrimination, the idea being that poor-quality or unreliable video feeds would limit the ability of the pilot to discriminate between combatant and non-combatant. Strawser argues that this limitation no longer holds, and he quotes statistics showing that, in comparison to conventional forces, UAVs resulted in far fewer noncombatant deaths relative to combatant deaths. Further, because UAVs can fly at lower altitudes than manned aircraft, more accurate targeting is to be expected. Finally, Strawser notes that the manufacturer of a missile designed for use on UAVs has achieved “urban warfare precision,” and that the missile is controlled after it has been launched (“fire, observe, and update” as opposed to “fire and forget”).

Another objection is that UAVs create an unjust asymmetry in combat, in that “one side literally does not take any life-or-death risks whatsoever (or nearly so, since its warfighters are not even present in the primary theater of combat) whereas the opposing side carries all the risk of combat.” His response is perhaps the strongest part of his paper:

[T]here is no chivalrous reason for a just combatant to ‘equal the playing field’ or ‘fight fair.’ If combatant A fights under a just cause, while combatant B fights for an unjust cause, combatant A owes nothing to combatant B by way of exposing his/herself to some minimal threshold of risk. Thus, it is right for combatant A to reduce the risk in an engagement with the unjust enemy.

Worded another way: if we find ourselves in a “fair fight,” we make it unfair to our opponents, however possible.

One objection Strawser doesn’t raise is the problem of the drone pilot’s location. Range matters, and the remote-controlled weapons from previous wars all required the pilot or operator to be either in-theater or very close.

Again, range matters. At some point, the range of American artillery exceeded that of an enemy’s. Does this mean that the enemy surrenders without a fight? No: the enemy will simply attack from a position so close that the use of our artillery would be prohibitive because of the likelihood of harming our own troops.

What would an enemy do when facing off against a UAV system? There are at least four points of vulnerability: the drone itself, the takeoff and landing site(s) of the drone, the communication system controlling the drone, and the drone’s pilot. Just as in the case of the artillery, the enemy will move close to attack. This means that the pilot is a legitimate target.

In the case of drones, though, the pilot could be located within the United States. For example, Anwar al-Awlaki was executed while in Yemen, but the drone pilot was located at Creech Air Force Base outside Las Vegas, Nevada5.

Thus, there was a legitimate target for the enemy within our own borders.

This is not to say that this renders the use of UAVs unethical, but it does raise an issue that policy makers do not seem to consider.


Footnotes

  1. Shane, “The Enduring Influence of Anwar al-Awlaki in the Age of the Islamic State.”
  2. “Executive Order 12333 United States Intelligence Activities.”
  3. Slick, “Modernizing the IC ‘Charter’: The 2008 Amendments to Executive Order 12333, United States Intelligence Activities.”
  4. Strawser, “Moral Predators: The Duty to Employ Uninhabited Aerial Vehicles.”
  5. Zegart, Spies, Lies, and Algorithms: The History and Future of American Intelligence

Bibliography

Shane, S. “The Enduring Influence of Anwar al-Awlaki in the Age of the Islamic State.” CTC Sentinel 9, no. 7, July 2016. Retrieved 28 June 2024 from https://ctc.westpoint.edu/the-enduring-influence-of-anwar-al-awlaki-in-the-age-of-the-islamic-state/

Slick, S. “Modernizing the IC ‘Charter’: The 2008 Amendments to Executive Order 12333, United States Intelligence Activities.” Studies in Intelligence 58, no. 2 (Extracts, June 2014). Retrieved 27 June 2024 from https://www.cia.gov/resources/csi/static/2008-Amendments-Executive-Order.pdf

Strawser, B. “Moral Predators: The Duty to Employ Uninhabited Aerial Vehicles.” Journal of Military Ethics 9 no. 4, 2010. https://doi.org/10.1080/15027570.2010.536403

“Executive Order 12333 United States Intelligence Activities.” Code of Federal Regulations. Retrieved 27 June 2024 from https://dpcld.defense.gov/Portals/49/Documents/Civil/eo-12333-2008.pdf

Zegart, A. Spies, Lies, and Algorithms: The History and Future of American Intelligence. Princeton University Press, 2022.

Thursday, June 27, 2024

Review of Strawser’s "Moral Predators"

Historical Introduction

Remote-controlled weapons are not a new technology: the Soviets used “teletanks” – radio-controlled tanks with a range of just under a mile – during the Winter War, and during World War II the Nazis experimented with the “Goliath” tracked mine, which had a similar range. Going back further, in World War I the Germans developed the “Fernlenkboot” – remote-controlled boats with a range of approximately 12 miles – for use against British ships in the English Channel and off the coast of Flanders.

Three aspects make contemporary unmanned aerial vehicles (UAVs) different from these other unmanned weapons: first, the pilot of a UAV can be on the other side of Earth from the UAV itself; second, they are a proven and effective weapon system; third, they were developed and are heavily used by the United States. This last difference is the main reason why UAVs have drawn the attention of just war theorists – notice the lack of outcry over the suicide drone boats used by Iran and Houthi rebels.

In his 2010 paper “Moral Predators: The Duty to Employ Uninhabited Aerial Vehicles”1, Bradley Jay Strawser argues that there is nothing wrong in principle with the use of UAVs, and in fact their use should be obligatory. He explains the “obligatory” part as being a duty, but later in the paper2 he provides a better explanation. To explain why UAVs are not wrong in principle, he addresses six objections that can be made to their use.

A General Atomics MQ-1 Predator, armed with AGM-114 Hellfire missiles, piloted by Lt. Col. Scott Miller on a combat mission over southern Afghanistan. (U.S. Air Force Photo / Lt. Col. Leslie Pratt)

Objections Addressed by Strawser

The first objection is that the adoption of UAVs will lead to fully autonomous weapons, or what Strawser calls IAWs (independent autonomous weapons); IAWs are morally impermissible; therefore, UAV development is impermissible. Strawser raises this issue to delimit the scope of his paper strictly to “human-in-the-loop” or “human-on-the-loop” weapon systems and to avoid “human-out-of-the-loop” systems. He rejects this argument because there is no proof that development of UAVs necessarily leads to IAWs, and because it is possible to continue UAV development and usage while banning IAW development – which he recommends Western nations do3.

The second objection is that UAVs lead to violations of the jus in bello Principle of Discrimination, the idea being that poor-quality or unreliable video feeds would limit the ability of the pilot to discriminate between combatant and non-combatant. Strawser argues that this limitation no longer holds, and he quotes statistics showing that, in comparison to conventional forces, UAVs resulted in far fewer noncombatant deaths relative to combatant deaths. Further, because UAVs can fly at lower altitudes than manned aircraft, more accurate targeting is to be expected. Finally, Strawser notes that the manufacturer of a missile designed for use on UAVs has achieved “urban warfare precision,” and that the missile is controlled after it has been launched (“fire, observe, and update” as opposed to “fire and forget”).

It mustn’t be forgotten that UAVs are still piloted vehicles; their pilots are just not aboard them. This can lead, so the third objection goes, to cognitive dissonance on the part of UAV operators. The pilot can “kill the enemy from their ‘desk’ at work and then go home to dinner and their child’s soccer match,” and this somehow places unnecessary psychological stress on the pilot. In addition, since the pilot can view his profession as a video game, this would lead either to “weakening the operator’s will to fight” or to frequent violations of the Principle of Discrimination.

Strawser responds to this by noting that since the pilot is in no personal danger, the temptation to commit jus in bello violations is lessened. Also, the pilot can evaluate the target before firing. All UAV action can be recorded and monitored, and this provides an additional layer of accountability: “an entire team of officers and human rights lawyers could oversee every single lethal decision made by a UAV.” Finally, UAV-pilot-specific cognitive dissonance can be lessened by moving the pilot in-theater.

The fourth objection raised is that targeted killings (assassinations) fall outside the bounds of acceptable Just War Theory4, and UAVs somehow make the practice too easy. The issue here, Strawser notes, is one of policy and not the technology. There are three adjacent concerns: sovereignty issues, “ignoble warfare,” and the extent that UAVs make assassinations easy.

The first concern is really a legal issue – lawyers can argue that sending UAVs into another country’s airspace is not the same as sending a manned airplane or an agent – and is not relevant to most drone strikes. The third concern (that UAVs make assassinations not only possible but easy) is not a strike against UAVs themselves but rather against their usage.

The second concern – that using UAVs constitutes a form of “ignoble warfare” – is perhaps the most interesting objection one can make. Strawser states this objection in full as follows: “the battle for the ‘hearts and minds’ of local nationals in a given theater is significantly worsened by what they view as ignoble warfare; UAVs are thought to be ‘cowardly.’”

“So be it,” Strawser responds. He should have stopped there, but he goes on to admit that if studies prove that the winning of ‘hearts and minds’ is indeed made more difficult, then UAVs should not be used.

Strawser’s fifth objection is that UAVs create an unjust asymmetry in combat, in that “one side literally does not take any life-or-death risks whatsoever (or nearly so, since its warfighters are not even present in the primary theater of combat) whereas the opposing side carries all the risk of combat.” His response (except for subsequent over-analysis) is perhaps the strongest part of this paper:

[T]here is no chivalrous reason for a just combatant to ‘equal the playing field’ or ‘fight fair.’ If combatant A fights under a just cause, while combatant B fights for an unjust cause, combatant A owes nothing to combatant B by way of exposing his/herself to some minimal threshold of risk. Thus, it is right for combatant A to reduce the risk in an engagement with the unjust enemy.

The sixth and final objection Strawser addresses is that UAVs lower the jus ad bellum threshold, in that UAVs improve the likelihood of success. This objection fails in the same way that the fifth one does. There is nothing special here about the use of UAVs: any difference in technology between two enemy nations would be exploited to increase the probability of success of any military action.

Two Further Objections

Two additional objections, not addressed by Strawser, can be made to the use of UAVs.

Objection: UAVs are strictly a rich nation’s weapon5.

Response: This is simply not the case. Ukraine (though certainly not a poor nation, thanks to American taxpayer support) is effectively utilizing off-the-shelf, commercially available drones in its war against Russia. These drones do not cost a fortune; they can be obtained for at most a few hundred dollars each.

Objection: the drone pilot’s location is… problematic.

Response: At some point, the range of American artillery exceeded that of an enemy’s. Does this mean that the enemy surrenders without a fight? No: the enemy will simply attack from a position so close that the use of our artillery would be prohibitive because of the likelihood of harming our own troops.

Now consider what an enemy would do when facing off against a UAV system. There are at least four points of attack: the drone itself, the takeoff and landing site(s) of the drone, the communication system controlling the drone, and the drone’s pilot. Just as in the case of the artillery, the enemy will move close to attack. This means that the pilot is a legitimate target. In the case of drones, though, the pilot could be located within the United States. Thus, there is a legitimate target within our own borders. This is not to say that this renders the use of UAVs unethical, but it does raise an issue that policy makers do not seem to consider.

Conclusions

As with any weapons system, the use of UAVs introduces a range of problems. This is exemplified by the 30 September 2011 drone strike that killed terrorist Anwar al-Awlaki. Al-Awlaki was an American citizen; he was killed while in Yemen; the drone strike was a CIA operation; and the pilot was located at Creech Air Force Base outside Las Vegas, Nevada6.

Strawser separates out the legal, political, and technological problems raised by incidents like this one and instead focuses on the ethical problems associated with UAVs. He considers the use of UAVs within the framework of Just War Theory and finds that, by themselves, UAVs do not alter the justness or unjustness of either a war or an action taken during that war.


Footnotes

  1. Bradley Strawser, “Moral Predators: The Duty to Employ Uninhabited Aerial Vehicles.”
  2. See Strawser’s response to the fifth objection, below.
  3. Does he also recommend non-Western nations ban the development of IAWs, too?
  4. A person holding a “Realist War Theory” would not subscribe to that position.
  5. This wording is used at the start of Simpson & Müller, “Just War and Robots’ Killings.”
  6. Amy Zegart, Spies, Lies, and Algorithms.

Bibliography

Simpson, T. & Müller, V. “Just War and Robots’ Killings.” The Philosophical Quarterly 66, no. 263, April 2016. https://www.jstor.org/stable/24672810

Strawser, B. “Moral Predators: The Duty to Employ Uninhabited Aerial Vehicles.” Journal of Military Ethics 9 no. 4, 2010. https://doi.org/10.1080/15027570.2010.536403

Zegart, A. Spies, Lies, and Algorithms: The History and Future of American Intelligence. Princeton University Press, 2022.