The Pentagon wants autonomy that scales with fewer operators
The Pentagon is moving toward a much larger autonomous-warfare effort, but a basic operational problem remains unresolved: drone systems may be uncrewed, yet they still demand extensive human labor. A new Defense One report says DARPA is pursuing projects intended to make robots smarter, more self-organizing, and less dependent on constant human direction or vulnerable network links.
The immediate goal is not just to field more unmanned systems, but to make it realistic for a relatively small number of personnel to manage a much larger number of robotic platforms. That challenge has become more urgent: under a new spending proposal, the 2027 budget for the Pentagon office leading drone warfare would jump from $226 million this year to $54 billion.
Two DARPA programs target the core bottlenecks
According to the report, DARPA has issued two requests to industry that address different sides of the same operational problem. The first, Materials for Physical Compute in Untethered Robotics, is intended to make autonomous systems more intelligent without relying heavily on connections to remote computing resources. The second, Decentralized Artificial Intelligence through Controlled Emergence, aims to help robots form teams and carry out missions collectively.
Taken together, those efforts suggest a strategic shift away from models in which autonomous systems are only as useful as their connection to centralized infrastructure or their access to large human support staffs. The Pentagon appears to be asking for machines that can reason more locally and coordinate more effectively with one another.
That matters in contested environments, where data links can be degraded, jammed, or exposed. If a robot needs to transmit large quantities of data offboard for processing and then wait for commands to return, it becomes both less resilient and less efficient. DARPA’s physical-compute effort is aimed at reducing that vulnerability by giving robots more onboard intelligence while preserving battery life.
Past drone operations show the staffing burden
The scale of the problem is underscored by the historical examples cited in the report. Retired Army Gen. David Petraeus and scholar Isaac Flanagan argued in a recent commentary that earlier U.S. drone operations in the Middle East were constrained not primarily by aircraft numbers, but by the personnel and organizational structure needed to operate them.
They noted that each Predator combat air patrol providing continuous surveillance required nearly 150 personnel. As the demand for drone coverage increased, the limiting factor was not simply the number of platforms available, but the number of trained people and the institutions required to use them effectively.
That observation has direct implications for the Pentagon’s present spending ambitions. Buying large numbers of autonomous systems without solving the operator, training, maintenance, and integration problem could produce impressive inventories on paper without delivering proportional military value in the field.
Why self-organizing systems matter
The second DARPA effort, centered on decentralized AI and controlled emergence, points to one of the most difficult military automation challenges: how to let multiple robots behave as a team without requiring an unmanageable stream of human micro-direction. If machines can dynamically coordinate roles and actions, a single human supervisor may be able to guide a much larger force package.
That does not eliminate the need for human oversight. Instead, it changes the level at which people operate. Rather than directing every movement or sensor task, operators could define objectives and constraints while autonomous systems handle more of the local coordination. For modern militaries, that is one of the most attractive promises of AI-enabled autonomy.
It is also one of the riskiest areas to get wrong. Swarm-like or team-based robotic behavior must still be predictable enough for commanders to trust, maintain, and integrate into broader force structures. The Pentagon’s interest in controlled emergence reflects that tension. It wants adaptability, but not chaos.
Spending growth raises pressure for doctrine and discipline
The proposed jump from $226 million to $54 billion is so large that it changes the nature of the conversation. At that scale, the issue is no longer whether autonomous systems matter, but whether the Pentagon can absorb them intelligently. Defense One notes that much of that funding could be wasted if the military spends before it clearly understands how operators will buy, train on, use, and maintain autonomous weapons.
That warning is important because defense technology failures often emerge not from a lack of promising hardware, but from weak concepts of operation, fragmented procurement, and inadequate sustainment planning. In other words, the challenge is organizational as much as technical.
The current DARPA projects appear to recognize that. They are not simply asking for better drones. They are asking for systems that reduce the burdens that have historically prevented drone fleets from scaling efficiently.
The broader strategic message
The report captures a military establishment trying to move from drone quantity to autonomous effectiveness. Smarter onboard processing and decentralized teamwork are not abstract research themes in this context. They are attempts to solve a practical bottleneck that has limited the utility of unmanned systems for years.
If those efforts succeed, they could help transform how many robots a unit can credibly employ and how resilient those systems are in contested environments. If they fail, the Pentagon risks spending heavily on autonomy while preserving the same personnel and organizational constraints that made earlier drone operations labor-intensive.
The central lesson is straightforward. Autonomous warfare does not become scalable merely because machines can fly, drive, or sense without pilots onboard. It becomes scalable when the surrounding system of command, computation, and coordination no longer demands unsustainable human effort. That is the problem DARPA now appears to be trying to solve before the Pentagon commits far larger sums to the autonomous battlefield.
This article is based on reporting by Defense One.