
Autonomous Divisions

Blended Categories enable unexpected exploits




Militaries typically play to doctrines of:

  • Power (massed units, large-effect weapons)

  • Mobility (speed, maneuvering, and encirclement)

  • Stealth (concealment, ambush)

Blitzkrieg tactics in WW2 were so effective because they combined massed Power with Mobility, deploying force dynamically. Modern stealth bombers can combine all three elements, but are very expensive to build and operate.

However, a new military doctrine is presently emerging:

  • Autonomy (swarms, expendability, safety)

Autonomy dictates “rush in; if 90% fail, it doesn’t matter”. Autonomy permits long-ranged or lurking weapons, enabling stay-behind ambushes that combine elements of land and sea mines. Autonomous swarms can function like an expendable blitzkrieg, emerging suddenly and unexpectedly and operating stealthily across the land, sea, air, or space domains.

The mass deployment of drone armies can very quickly change the balance of a conflict, especially autonomous weapons with functions comparable to a human soldier or a manned military vehicle. The deployment of large numbers of $6000 drones to the Ukrainian battlefield is already courting controversy. Drones armed with inexpensive MANPADS-style warheads present an increasingly attractive alternative to conventional armies, especially when used defensively: Iron Dome spends $100,000 for every $800 rocket it intercepts. This cost asymmetry also facilitates autonomous Wild Weasel-style tactics. Autonomous (potentially nuclear-powered) stealth carriers could deploy a legion of drones in a range of configurations, potentially lying in wait for weeks or months before sudden activation.
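The cost asymmetry above can be made concrete with some back-of-envelope arithmetic. The interceptor and rocket costs are the figures cited in the text; the defensive budget is a purely hypothetical round number for illustration.

```python
# Rough cost-exchange arithmetic using the figures cited in the text.
# All values are illustrative, not authoritative procurement data.
interceptor_cost = 100_000   # cited cost per Iron Dome interception, USD
rocket_cost = 800            # cited cost of one intercepted rocket, USD

exchange_ratio = interceptor_cost / rocket_cost
print(f"Defender spends {exchange_ratio:.0f}x the attacker's cost per engagement")

# With a fixed defensive budget, how many cheap munitions can be absorbed
# before the interceptor stockpile is exhausted?
defense_budget = 10_000_000  # hypothetical budget, USD
interceptions = defense_budget // interceptor_cost
rockets = defense_budget // rocket_cost
print(f"{interceptions} interceptions vs {rockets} rockets for the same spend")
```

At a 125:1 exchange ratio, the attacker can saturate the defender's budget with two orders of magnitude more munitions than the defender can intercept, which is the dynamic that makes expendable swarms attractive.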

Together, these developments might make conventional war increasingly untenable, accelerating a transition towards AI-driven warfighting through practical necessity. However, if human agency in conflict is reduced, robotic Einsatzgruppen deployed to murder civilians become more feasible: extreme loyalty is no longer required, and there are fewer risks of insider witnesses. Policy organizations should ensure that human beings are always in the loop, and that no warfighting machine ever makes an autonomous decision on whom to kill.


There could be blowback from a snap decision to ban or dissuade the use of AI weapons in war. For example, a conflict with fewer human warfighters may be a less trauma-inducing one, both physically and emotionally.

Autonomous systems can provide powerful mechanisms for smaller, more vulnerable states to defend themselves from larger aggressors, at least long enough to call upon allies for further support. An international ban may weaken good faith actors who cooperate with international law, whilst leaving bad faith actors, including terrorist groups, unaffected, or even strengthened.

However, there are other applications of AI in conflict beyond war per se. Fifth Generation Warfare is defined by its focus on demoralizing the enemy, whilst attempting to inoculate one’s own group against such effects. FGW attempts to undermine the enemy through means that have plausible deniability and cannot be directly linked to any actor, or which may even appear accidental.

We live in a world where malware, including other neural networks, can now be secretly embedded in AI systems, potentially enabling covert unauthorized control at a later time, perhaps even enabling a plausibly deniable false-flag operation. AI also enables Zersetzung-style attacks upon persons of interest, such as dissidents and influential foreign nationals.

A ban on overt use of AI in conflict may therefore create evolutionary pressures towards covert applications instead, which may ultimately be more devastating than a hot conflict. The loss of soldiers or even civilians, whilst tragic, is something nations can recover from. Conversely, it may not be possible to recover from demoralization attacks that foment permanent polarization and the breakdown of trust within societies.

Apocalyptic outcomes may be reached through invisible weapons of mass destruction. This should be taken into account as potential sequelae to legislation against overt AI-enabled warfare.


The outbreak of war in Ukraine, and the international allied effort to supply weapons there (though not soldiers), also exposes a potential loophole in international law. Legally, deploying soldiers in a conflict is a very different matter from deploying weapons. Autonomous weapons, however, are classified as weapons, like a rifle, even when they perform the functions of a soldier, which presents a legal conundrum.

There is a potential for a race to the bottom in terms of safety, with little care afforded to civilians during and after an engagement. The specter of autonomous weapons being used to terrorize civilian populations is also very concerning, especially concealed ‘stay-behind’ semi-active area-denial units that deploy latently. Such units may be designed to maim rather than kill, thereby evading legislation on ‘lethal’ autonomous weapons.

Existing legislation will need to be rewritten to account for these developments, as massive deployments of armed drones in lieu of human troops create exploitable loopholes. Dual-use technologies such as ‘autonomous firefighting and rescue equipment’ or ‘weed-blasting laser drones’ might be supplied for one ostensible purpose, yet rapidly repurposed for another.

It’s crucial that such insidious applications of autonomous weapons are tackled with urgency. These capabilities are emerging at a time when war itself is becoming increasingly tacit and deniable: an invisible war of demoralization and infrastructure attacks. The future of conflict will be two-pronged: one prong clandestine and aimed at civilians, the other a drone-oriented autonomous doctrine for when things unavoidably turn manifestly hot.


Another disruptive factor in war is the inexpensive and routine launch of large payloads into orbit, the price of which has fallen drastically in recent years. This may make kinetic bombardment (‘rods from God’) economically feasible, and deployable to orbit on short notice as an intimidation tactic, with yields comparable to a small nuclear weapon, yet without necessarily invoking Mutually Assured Destruction or the Non-Proliferation Treaty. Even non-state actors could potentially hold populations to ransom by threatening to drop a dense but ostensibly legitimate payload (such as tungsten wing and engine parts, or Radioisotope Thermoelectric Generator isotopes) onto a precarious geological fault or major city. It’s important to ensure that a coordinated response to such threats is possible, ideally in a manner unlikely to induce Kessler Syndrome.
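The yield claim can be sanity-checked with the kinetic energy formula E = ½mv². The rod mass and impact velocity below are assumed illustrative values (commonly quoted in discussions of kinetic bombardment), not sourced specifications of any real system.

```python
# Back-of-envelope kinetic-bombardment yield: E = 1/2 * m * v^2.
# Mass and impact velocity are hypothetical illustrative values.
mass_kg = 8_400              # ~8.4 t tungsten rod (assumed)
impact_velocity_ms = 3_000   # m/s at impact, after atmospheric drag (assumed)

energy_joules = 0.5 * mass_kg * impact_velocity_ms ** 2
TONNE_TNT = 4.184e9          # joules per tonne of TNT equivalent
print(f"~{energy_joules / TONNE_TNT:.0f} tonnes TNT equivalent")
```

Under these assumptions the yield lands in the single-digit tonnes of TNT, on the order of the very smallest tactical nuclear yields rather than kiloton-class weapons; heavier rods or higher impact velocities scale the figure up accordingly.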

Regulators and legislators must act now to ensure that these technologies don’t return us to a 21st century version of trench warfare. The Geneva Conventions must also be updated to accommodate the protection of civilians in these times.