Report 6774
AIID Editor's note: This story is truncated. Please visit the original source for the full report.
On a warm morning a few months ago, Lipa, a Ukrainian drone pilot, flew a small gray quadcopter over the ravaged fields near Borysivka, a tiny occupied village abutting the Russian border. A surveillance drone had spotted signs that an enemy drone team had moved into abandoned warehouses at the village's edge. Lipa and his navigator, Bober, intended to kill the team or drive it off.
Another pilot had twice tried hitting the place with standard kamikaze quadcopters, which are susceptible to radio-wave jamming that can disrupt the communication link between pilot and drone, causing weapons to crash. Russian jammers stopped them. Lipa had been assigned the third try but this time with a Bumblebee, an unusual drone provided by a secretive venture led by Eric Schmidt, the former chief executive of Google and one of the world's wealthiest men.
Bober sat beside Lipa as he oriented for an attack run. From high over Borysivka, one of the Bumblebee's two airborne cameras focused on a particular building's eastern side. Bober checked the imagery, then a digital map, and agreed: They had found the target. "Locked in," Lipa said.
With his right hand, Lipa toggled a switch, unleashing the drone from human control. Powered by artificial intelligence, the Bumblebee swept down without further external guidance. As it descended, it lost signal connection with Lipa and Bober. This did not matter: It continued its attack free of their command. Its sensors and software remained focused on the building and adjusted heading and speed independently.
Another drone livestreamed the result: The Bumblebee smacked into an exterior wall and exploded. Whether Russian soldiers were harmed was unclear, but a semiautonomous drone had hit where human-piloted drones missed, rendering the position untenable. "They will change their location now," Lipa said. (Per Ukrainian security rules, soldiers are referred to by their first name or call sign.)
Throughout 2025 in the war between Russia and Ukraine, in largely unseen and unheralded moments like the warehouse strike in Borysivka, the era of killer robots has begun to take shape on the battlefield. Across the roughly 800-mile front and over the airspace of both nations, drones with newly developed autonomous features are now in daily combat use. By last spring, Bumblebees launched from Ukrainian positions had carried out more than 1,000 combat flights against Russian targets, according to a manufacturer's pamphlet extolling the weapon's capabilities. Pilots say they have flown thousands more since.
Bumblebee's introduction raised immediate alarms in the Kremlin's military circles, according to two Russian technical intelligence reports. One, based on dissection of a damaged Bumblebee collected along the front, described a mystery drone with chipsets and a motherboard "of the highest quality, matching the level of the world's leading microelectronics manufacturers." The report noted the sort of deficiencies expected of a prototype but ended with an ominous forecast: "Despite current limitations," it declared, "the technology will demonstrate its effectiveness" and its range of uses "will continue to expand."
That conclusion was prescient but understated, for the simple reason that Bumblebees hardly fly alone. Under the pressures of invasion, Ukraine has become a fast-feedback, live-fire test range in which arms manufacturers, governments, venture capitalists, frontline units and coders and engineers from around the West collaborate to produce weapons that automate parts of the conventional military kill chain. Equipped with onboard proprietary software trained on large data sets, and often run on off-the-shelf microcomputers like Raspberry Pi, drones with autonomous capabilities are now part of the war's bloody and destructive routine.
In repeated visits to arms manufacturers, test ranges and frontline units over 18 months, I observed their development firsthand. Functions now performed autonomously include: pilotless takeoff or hovering, geolocation, navigation to areas of attack, as well as target recognition, tracking and pursuit --- up to and including terminal strike, the lethal endpoint of the journey. Software designers have also networked multiple drones into a shared app that allows for flight control to be passed between human pilots or for drones to be organized into tightly sequenced attacks --- a step toward computer-managed swarms. Weapons with these capabilities are in the hands of ground brigades as well as air defense, intelligence and deep-strike units.
Drones under full human control remain far more abundant than their semiautonomous siblings. They cause most battlefield wounds. But unmanned weapons are crossing into a new frontier. And while no publicly known drone in the war automates all steps of a combat mission into a single weapon, some designers have put sequential steps under the control of artificial intelligence. "Any tactical equation that has a person in it could have A.I.," said the founder of X-Drone, a Ukrainian company that has trained software for drones to hunt for and identify a stationary target, like an oil-storage tank, and then hit it without a pilot at the controls. (The founder asked that his name be withheld for security reasons.)
The Kremlin's forces are also adopting A.I.-enhanced weapons, according to examinations of downed Russian drones by Conflict Armament Research, a private arms-investigation firm. With both sides investing, Mykhailo Fedorov, Ukraine's first deputy prime minister, said A.I.-powered drones are at the center of a new arms race. Ukraine's defenders must field them in large numbers quickly, he said, or risk defeat. "We are trying to stimulate development of every stage of autonomy," he said. "We need to develop and buy more autonomous drones."
To be sure, the familiar weapons of modern battlefields, all under human control, have caused immeasurable harm to generations of soldiers and civilians. Even weapons celebrated by generals and pundits as astonishingly precise, like GPS-guided missiles or laser-guided bombs, have often struck the wrong places, killing innocents, often without accountability. No golden age is being left behind. Rather, semiautonomous drones compound existing perils and present new threats. Peter Asaro, vice chair of the Stop Killer Robots Campaign and a philosopher and an associate professor at the New School, warned of rising dangers as weapons enter unmapped practical and ethical terrain. "The development of increasing autonomy in drones raises serious questions about human rights and the protection of civilians in armed conflict," he said. "The capacity to autonomously select targets is a moral line that should not be crossed."
The concept of a killer robot is vague and prone to hype, invoking T-800 of "The Terminator," an adaptive mobile killing machine deployed by an artificial superintelligence, Skynet, that perceives humanity as a threat. Nothing close exists in Ukraine. "Everybody thinks, Oh, you are making Skynet," said a captain responsible for integrating new technology into the 13th Khartia Brigade of Ukraine's National Guard, one of the country's most sophisticated units, in which Lipa and Bober serve. "No, the technology is interesting. But it's a first step and there are many more steps."
The captain and other technicians working with A.I.-enhanced weapons said they tend to be brittle, limited in function and less accurate than weapons under skilled human control. Many have a short battery life and brief flight times. Autonomous weapons with sustained endurance, high flexibility and the ability to discern, identify, rank and pursue multiple categories of targets independent of human action have yet to appear, and they would require, the captain said, "a waterfall of money" plus much imagination and time. "It's like the staircase of the Empire State Building," he said. "That's how many steps there are, and we are inside the building but only on the first floor."
As a safeguard against A.I.-powered weapons slipping the leash, humanitarians and many technologists have long advocated keeping "humans in the loop," shorthand for preventing weapons from making homicidal decisions alone. By this thinking, a trained human must assess and approve all targets, as Lipa and Bober did, ideally with the power to abort an individual strike and a kill switch to shut an entire system down. Strong guardrails, the argument goes, are necessary for accountability, compliance with laws of armed conflict, legitimacy around military action and, ultimately, for human security.
Schmidt has emphasized the necessity of human oversight. But at the end of a flight, some semiautonomous weapons in Ukraine can already identify targets without human involvement, and many Ukrainian-made systems with human override are inexpensive and could be copied and modified by talented coders anywhere. Some of those designing A.I.-enhanced weapons, who consider their development necessary for Ukraine's defense, confess to unease about the technology that they themselves are conjuring into being. "I think we created the monster," said Nazar Bigun, a young physicist writing terminal-attack software. "And I'm not sure where it's going to go."
The Dawn of Autonomous Attack
Bigun's own journey exemplifies how the duration and particulars of the war incentivized the creation of semiautonomous weapons. When Russia rolled mechanized divisions over Ukraine's border in 2022, Bigun was leading a team of software engineers at a German tech start-up. In early 2023, he founded a volunteer initiative for the military that eventually manufactured 200 first-person-view (F.P.V.) quadcopters a month. It was a significant contribution to Ukraine's war effort at a time when low-cost and explosive-laden hobbyist drones, not yet widely recognized as the transformative weapons they are, remained in short supply. His focus might have remained there. But as he and his peers heard from frontline drone pilots, they became concerned about declining success rates of kamikaze drones in the face of defensive adaptations, and they joined the search for solutions.
The problems were many. As more drones took flight, both sides developed physical and electronic countermeasures. Soldiers erected poles and strung mesh to snag drones from the air, and they covered the turrets and hulls of military vehicles with protective netting, grates or welded cages. Among the most frustrating countermeasures were jammers that flooded the operating frequencies used for flight control and video links, generating electronic noise that reduced signal clarity in pilot-to-drone connections. The systems became standard around high-value targets, including command bunkers and artillery positions. They also appeared on expensive mobile equipment, like air-defense systems, multiple-rocket launchers and tanks.
This complex puzzle led to the creation of drones that fly on fiber-optic cables, one solution that has appeared on the battlefield. It also fueled Bigun's interest in a form of computer-enabled attack, known as last-mile autonomous targeting, in which computer vision and autonomous flight control would guide drones through the final stage of attack without radio-signal inputs from a pilot. Such systems promised another benefit as well: They would increase the efficacy of strikes at longer range and over the radio horizon, when terrain or the Earth's curvature interfere with a steady radio signal.
In theory, the technical remedy was simple. When pilots anticipated a break in communications, they could pass flight control to an automated substitute --- a powerful chipset and extensively trained software --- that would complete the mission. With this tech coupled to onboard sensors and a camera, the pilot could lock the mini-aircraft on a target and release the drone to strike alone. The company Bigun co-founded in 2024, NORDA Dynamics, did not manufacture drones, so it set to work creating an aftermarket component to attach to other manufacturers' weapons. With it, a pilot would still fly the drone from launch until it neared a target. Then the pilot would have the option of autonomous attack.
Boosted by funding from the Ukrainian government and venture capital firms, NORDA spent much of 2024 testing a prototype that evolved into its flagship offering, Underdog, a small module that fastens to a combat drone. When flying an Underdog-equipped quadcopter, a pilot with F.P.V. goggles still controls the weapon from takeoff almost to destination. But in a flight's final phase, the pilot has the choice --- via an onscreen window that zooms in on objects of interest, like a building or car --- of approving an autonomous attack in a process called pixel lock. At that moment, Underdog takes over.
Underdog began with tests on stationary objects, but after repeated updates, its software chased moving quarry. Range extended, too. Early modules allowed 400 meters of terminal attack; by summer 2025, with the fifth version of NORDA's software, pixel lock reached 2,000 meters --- about 1.25 miles. By then the modules had been distributed to collaborating F.P.V. teams at the front. "We have some very good feedback," Bigun said. A company bulletin board listed early hits, among them Russian artillery pieces, trucks, mobile radar units and a tank.
One summer afternoon in western Ukraine, Bigun and several employees arrived at a tree line of wild pear, apple and plum dividing agricultural fields. Cows meandered past, swishing tails to shoo flies. Two white storks glided to the ground, alighted and picked their way through the furrows, hunting. NORDA's technicians sent a black S.U.V. with a driver and two-way radio to drive along the fields.
A test pilot, Janusz, a Polish citizen who had volunteered as a combat medic in Ukraine before joining the company, sat in the van wearing goggles and holding a hand-held radio controller. Once the S.U.V. drove away, he commanded an unarmed F.P.V. drone through liftoff. "I'm flying," he said over the radio.
The video feed showed golden fields and green windbreaks, overlaid with dirt roads. The drone climbed to about 200 feet. Its camera revealed the black S.U.V. less than a mile away. Onscreen, Janusz slipped a square-shaped white cursor over the image of the vehicle. A pop-up window appeared in the upper-left corner containing a stabilized close-up of the S.U.V. With his left hand, Janusz selected pixel lock. The word "ENGAGE" appeared within a red banner onscreen. Thin black cross-hairs settled on the center of the S.U.V.
Janusz lifted his hands from the controller. From an altitude of about 215 feet, the drone entered a slow dive. Within seconds it had flown almost to the moving S.U.V.'s windshield. Janusz switched back to human piloting and banked the quadcopter away, sparing the vehicle damage from impact.
At his command, the drone climbed, spun around and resumed pursuit, this time from more than 500 feet up. Its prey bounded along a road. Janusz lined up the cursor and engaged pixel lock again. The drone entered a second independent dive, accelerating toward the fleeing car. Once more Janusz overrode the software at the last moment. The quadcopter buzzed so closely that the whine of its engines was picked up by the vehicle's two-way radio and broadcast inside the pilot's van. Janusz smiled.
He swung the drone around, showing the van he sat in. The cursor briefly presented the possibility of pixel lock on himself. Janusz chuckled and steered the weapon away, back toward the S.U.V. The driver's voice crackled over the radio. "We will right now make a turn," he said.
For the next half-hour, the driver's maneuvers made no difference. No matter what he did, the drone, once pixel-locked, closed the distance autonomously, harassing the moving vehicle with the tenacity of an obsessed bird.
Compared with conditions common in war, the field exercise was simple. Groundspeeds were slow, flights were by daylight, no power lines or tree branches blocked the way. The drone maintained a constant line of sight with the S.U.V., and the software had to lock on a lone vehicle, not on a target weaving through traffic or passing parked cars. But with more training and computational power, the software could be improved to discern and prioritize military targets based on replacement cost or degree of menace, or fine-tuned to strike armored vehicles in vulnerable places, like exhaust grates or where turrets meet hulls. It might be trained to hunt most anything at all --- a bus, a parked aircraft, a lectern where a speaker is addressing an audience, a step-down electrical transformer distributing power to a grid.
For Bigun, the natural worry that such technology could be turned against civilians has been overridden by the imperatives of survival. Beyond coding, his work involves interacting with arms designers from Ukraine and the West, including at weapons expositions, where he seeks partners and clients. But he often visits the Field of Mars, a cemetery in Lviv that is a repository of solemn memory and raw pain.
Bigun's great-uncle was a Ukrainian nationalist during the totalitarian rule of Stalin. For this, Bigun's grandfather was deemed an enemy of the state by association, and shipped to Siberia at age 16. Both men are buried on the grounds, where they have been joined by a procession of soldiers killed since the full invasion. On an evening following one of Bigun's arms-show appearances, mourners at the field sanded tall wooden crucifixes by hand, then reapplied lacquer; a widow sat beside a grave talking to her lost husband as if he were sipping tea in an opposite chair; a family formed a semicircle around a plot covered in flowers, each member taking a turn catching the deceased soldier up on household news. The field reached full capacity in December --- almost 1,300 graves --- prompting Lviv to open a second cemetery for its continuing flow of war dead. Just before Christmas, the second field held 14 fresh mounds.
Bigun abhors the need for these places. But beside commemorations of friends snatched early from life by the war, he said, he finds inspiration to continue his work. "This is where I feel the price we pay," he said, "and it motivates me to move forward." By the end of the year, NORDA Dynamics had provided frontline units fighting in the East more than 50,000 Underdog modules.
...
Read the rest of the story at the original source.