Russia is saving hypersonic missiles for future use, and Putin is betting on AI.
So far, Moscow has not employed any hypersonic weapons in large numbers, and Ukraine’s current air defense is impotent against them, according to military expert Yury Netkachev.
“During the entire time of combat action, while carrying out the tasks of the special military operation, the 9-А-7660 Kinzhal hypersonic airborne system has been used only three times,” Netkachev noted.
“Only three missiles were launched and all at important targets. Meanwhile, back in 2021, in the Southern Military District, an entire platoon of the carriers of such weapons, hypersonic long-range fighter-interceptor MiG-31K aircraft, was created. So far there is no information as to how many planes it is supposed to include. Yet, I think that it is no less than 30. And their Kinzhal weapons capacity should be no fewer than 10 units per aircraft. So, potentially, Russia can hit no fewer than 300 strategic targets in Ukraine with 100% probability,” Netkachev said.
According to Moscow, strikes by Russia's armed forces against "Ukraine's military command system and related electric power facilities" have been effective, and Moscow disputes Kyiv's claims that most of the projectiles were shot down.
“The Zelensky regime claims that for a greater reliability of its air defense, the US and NATO members should provide new supplies of air defense systems, including Patriot missiles. Both NATO and Ukraine are ready to continue hostilities,” the Russian news site TASS notes.
According to Deputy Economy Minister Maxim Kolesnikov, Russia's AI market has grown significantly over the past three years. Putin, meanwhile, noted that because the international payments system is controlled by a "narrow circle of states and financial groups," Russia is constantly under assault when it comes to cross-border payments, and he demanded a payments system independent of banks and third-country interference.
Economist Vladimir Klimanov told Izvestia that new payment-system solutions are necessary, as the West's sanctions will not be lifted for some time. However, cooperation with friendly countries on this issue is highly constrained because those countries fear facing sanctions themselves if they cooperate with Moscow.
Turkey has been pounding the Syrian Kurds with airstrikes since Nov. 21, and the United States is now demanding that Turkey de-escalate the situation, Pentagon Press Secretary Patrick Ryder said. Turkey has accused Washington’s allies, the YPG, a group tied to the Kurdistan Workers’ Party (PKK), of being involved in the Nov. 14 terrorist attack in Istanbul. US military bases in YPG-controlled regions provide support to the Kurds.
According to Victor Mizin, deputy director of the Moscow State Institute of International Relations, the Americans are not enthusiastic about the prospect of Syrian Kurdish statehood, which would bring greater instability to the region. Turkey, Mizin says, believes it can act with impunity because the US will not stop it: Washington is irritated by Ankara's actions in Syria, but because it does not want to alienate Turkey, it is unlikely to try to block a ground incursion by sanctioning Ankara or curtailing military-technical cooperation.
On the eve of parliamentary elections and the 2023 presidential election, delivering strikes on the Syrian Kurds benefits the current Turkish political leadership. According to Kirill Semyonov, an expert with the Russian International Affairs Council, however, Turkey is not technically prepared for an incursion on the ground and, for now, is content to monitor the situation.
In all likelihood, Russia hasn’t used AI-enabled weapons in Ukraine, but that could change.
A lethal drone developed in Russia has been used in combat in Ukraine, raising fears about the use of artificial intelligence in warfare, WIRED reported in March. According to the story, the KUB-BLA, a small kamikaze drone that smashes into and detonates on enemy targets, was developed by ZALA Aero, a subsidiary of Kalashnikov (maker of the AK-47), which is partly owned by Rostec, a Russian state defense-industrial conglomerate.
The WIRED story, despite its sensational headline, included a critical caveat: “It is unclear whether the drone was operated in this [an AI-enabled autonomous] manner in Ukraine.” Other outlets, however, disregarded the caveat and re-reported the story.
The ZALA Aero press release cited in the WIRED story does not mention the KUB-BLA or military applications; instead, it describes the company's machine-learning AI drone products for industrial and agricultural markets. Incorporating machine-learning AI into military applications is significantly more challenging than into industrial or agricultural ones. Modern machine-learning AI built on deep neural networks can deliver tremendous performance advantages, but that performance depends on gathering a large amount of training data during development, and the data must closely replicate operational conditions.
Obtaining such data is generally easier for commercial enterprises than for militaries, since friendly weapons and sensors rarely cross paths with hostile ones outside of wartime. That is why some of the most mature military AI systems are in areas like satellite reconnaissance, where data can be collected even in peacetime: satellites photograph Russian and Chinese military equipment, and experts digitally label those images so they can serve as training data, the raw material from which machine-learning systems learn. A learning algorithm combined with that training data is what enables an AI system to recognize what is in a picture. Training data is also application-specific: satellite image recognition data cannot be used, at least with today's technology, to teach an AI for a robotic drone's targeting system.
Developing a tank-targeting computer powered by modern AI is therefore possible in theory, but collecting sufficient, accurate training data is much more difficult in practice, according to the Center for Strategic and International Studies.
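To make the training-data point concrete, here is a minimal, purely illustrative sketch (in Python, assuming the PyTorch and torchvision libraries and a hypothetical folder of analyst-labeled images) of how an image-recognition model is trained. Nothing below reflects any actual military system; the directory layout, class labels, and settings are invented for illustration.

```python
# Illustrative only: train a small image classifier from labeled pictures.
# The data directory and classes are hypothetical.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms, models

# Labeled examples are assumed to live in data/train/<class_name>/*.jpg,
# e.g. analyst-labeled overhead images sorted into folders by equipment type.
transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
train_set = datasets.ImageFolder("data/train", transform=transform)
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)

# A small off-the-shelf network, trained from scratch on the labeled data.
model = models.resnet18(weights=None)
model.fc = nn.Linear(model.fc.in_features, len(train_set.classes))

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):
    for images, labels in train_loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()

# The classifier only "knows" what its training data showed it: a model
# trained on overhead satellite imagery will not transfer to the oblique,
# low-altitude viewpoint of a loitering drone's targeting camera.
```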
In the past, Rostec and Kalashnikov have been open about their efforts to create weapons that combine modern AI with combat autonomy, so it would be strange for them to build such capabilities into the KUB-BLA and not disclose them in their marketing materials. And Kalashnikov has advertised the KUB-BLA to both Russian and international customers.
Kalashnikov's announcement describes two methods of delivering the drone and its explosive warhead to target coordinates: "Target coordinates are specified manually or acquired from [the sensor] payload targeting image."
Many assumed this meant the KUB-BLA was using AI, but acquiring coordinates from a "payload targeting image" describes many other precision-guided munitions and loitering drones, including those with no AI capabilities at all. A Rostec executive has said the KUB-BLA is a domestic version of the Israeli Orbiter 1K drone, which appears nearly identical: human operators on the ground monitor the drone's sensor video and select targets directly from the video feed.
Under US Department of Defense policy, the test for whether something is an autonomous weapon system is whether, once activated, it can "select and engage targets" without further human intervention. A system that merely maintains a target lock and guides itself to a target a human operator has already chosen does not meet that standard. By that measure the Orbiter 1K is not an autonomous weapon, and neither are fire-and-forget missiles like the Javelin and Stinger, whose onboard thermal-imaging seekers (technology that was cutting-edge decades ago) home in on targets selected by humans. Dozens of countries have fielded these and similar systems for decades.
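To illustrate the distinction the DoD standard draws, here is a short, hypothetical Python sketch contrasting the two control loops: one in which a human picks the target and the machine only homes in on it, and one in which the machine both selects and engages. Every class, function, and value below is invented for illustration and has nothing to do with any real weapon's software.

```python
# Purely illustrative sketch of the control-loop distinction; all names,
# classes, and values are hypothetical.
from dataclasses import dataclass
import random

@dataclass
class Detection:
    label: str
    confidence: float

def classifier_output() -> list[Detection]:
    """Stand-in for an onboard classifier's detections in one video frame."""
    return [Detection("vehicle", random.random()) for _ in range(3)]

def human_in_the_loop_engagement(human_choice: Detection) -> None:
    """Orbiter-1K / KUB-BLA style: a person picks the target from the video
    feed; the machine only maintains lock and steers toward that choice."""
    print(f"Guiding onto human-selected target: {human_choice.label}")

def machine_selects_and_engages() -> None:
    """What the 'select and engage targets' test is about: the machine both
    chooses the target and engages it, with no per-target human decision."""
    detections = classifier_output()
    target = max(detections, key=lambda d: d.confidence)
    print(f"Engaging machine-selected target: {target.label} ({target.confidence:.2f})")

if __name__ == "__main__":
    human_in_the_loop_engagement(Detection("tank", 1.0))  # human decided upstream
    machine_selects_and_engages()                          # machine decides
```

The difference lies not in the guidance or seeker hardware but in where the per-target decision is made.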
WIRED has yet to show that the KUB-BLA received a lethal AI-targeting upgrade before being used in Ukraine, and Kalashnikov would be expected to advertise such capabilities if the weapon had them. The company has openly said it is seeking to develop a "fully automated combat module" based on deep-neural-network AI, so it is unlikely it would neglect to mention those capabilities if they were already present.
There is little reason to believe that Russia is currently employing AI-enabled autonomous weapons in Ukraine. That is the good news. The bad news is that if its war in Ukraine, which international law prohibits, drags on, Russia may well turn to autonomous weapons, with or without advanced AI.
The Russian state news outlet RIA Novosti recently published an interview with an unnamed military source that is worth quoting at length (rendered here through Microsoft's automatic translation):
The Russian reconnaissance and reconnaissance-strike UAVs will be able to automatically identify military equipment used by NATO countries on the battlefield and create a real-time location map directly on the device. The neural network training algorithms are what allow it to recognize a wide variety of equipment in a variety of environments, including those with short exposures (visible for just a few seconds) and those in which only a portion of the sample is visible—for example, just part of a combat vehicle is visible from behind cover.
As mentioned earlier, collecting training data remains a significant hurdle for many military AI development projects. Russia's invasion of Ukraine is a disaster in many ways, but the NATO-provided weapons and equipment now on the battlefield give Russia its best opportunity yet to gather operational training data for new AI models and a range of military AI applications. Judging from the anonymous source's remarks, Russia's military appears to be taking that opportunity seriously.
Domestic opposition to the war has driven an exodus of tech workers from Russia, and sanctions have made semiconductor chips scarce. These are significant problems, but advanced AI is not required to give weapons lethal autonomy; all that is required is a willingness to delegate decisions and give military machines freedom to act. The Israeli-built Harpy, an unmanned weapon that loiters in the air hunting for enemy radar signals to attack, has been around since the late 1980s. The KUB-BLA and the Lancet are two drones made by Kalashnikov; according to Rostec executive Mikhail Voevodsky, the Lancet is a Russian analogue of the more recent Harpy-2.
The Lancet is a multipurpose weapon with precision strike capabilities. The weapon system consists of reconnaissance, navigation, and communications modules. It can autonomously locate and strike targets without ground or sea-based infrastructure.
The Kalashnikov Lancet appears designed to operate either under remote control or autonomously, depending on the user's needs. The Russian military has already used the Lancet in its autonomous mode in Syria, but observers have not yet confirmed whether it is being used in Ukraine. If the system performs as Kalashnikov advertises, however, Russia will likely be eager to exploit the Lancet's autonomous functionality should the weapon appear there.
Before the war, many experts viewed drones as useful mainly for counterinsurgency operations and doubted they would be effective against a technologically advanced adversary such as Russia; the war in Ukraine has shown otherwise. In military technology competitions, however, every move is countered by another. Many analysts see drones' reliance on a high-bandwidth communications link to their human remote controllers as one of their key weaknesses, so both sides will deploy more jamming and electronic warfare systems in the months and years of fighting ahead, and, as a result, both sides will seek higher levels of autonomy in their drone weapons.
While most nations have been cautious about the development and use of autonomous weapons and the introduction of military AI, Russia has acted as a stubborn obstructionist through years of UN expert discussions on the subject. Much of the debate over AI ethics and autonomous weapons has concentrated on whether AI increases the risk of unintentional, harmful accidents. Russia's unprovoked invasion of Ukraine is a tragic reminder that, while unintentional harm to civilians is a real concern, the problem of intentional harm to civilians remains unsolved: the Russian military has repeatedly attacked residential areas, hospitals, and humanitarian institutions.
Russia's human soldiers in Ukraine have reportedly deserted in large numbers and suffered heavy losses. Given those frustrations, Vladimir Putin would almost certainly employ autonomous weapons in battle if he thought they would give him an advantage.
But we must start thinking about how to ensure that he does not.