Artificial Intelligence, or AI, is the most transformative technology of our time. It has the potential to drive breakthroughs in medical research, education, space exploration, and, yes, black drone swarm death machines.
Yet AI still needs people to train it, teach it, and improve it. Even so, the genie, or rather many genies, is out of the bottle on AI.
Remember that young kid that Zuckerberg snatched up?
Project Maven is an algorithmic system that uses machine learning to identify objects in drone video footage. It is a saving grace for America’s military, reducing the number of hours analysts must spend poring over drone video by as much as 95%. And Project Maven promises to do more than this, too.
Project Maven is an initiative within the US Department of Defense (DOD) to develop algorithmic software that the military can use to analyze drone video footage. The various drones employed by the U.S. capture hours and hours of video on each mission, which then becomes hours and hours of human labor to process and analyze. Project Maven aims to cut that analysis to one-quarter of the time it now takes.
The idea behind Project Maven is to use machine learning and artificial intelligence to identify objects in these videos and flag them for a human analyst to review.
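To make that concrete, here is a minimal sketch in Python of the human-in-the-loop triage pattern described above. To be clear, this is not Maven’s actual code: the Detection type, the detect_objects stub, and the 0.80 review threshold are all hypothetical stand-ins for whatever the real system uses.

```python
from dataclasses import dataclass

# Hypothetical sketch of human-in-the-loop triage, NOT Maven's code.

@dataclass
class Detection:
    label: str         # e.g. "vehicle", "person"
    confidence: float  # model's score, 0.0 to 1.0
    frame_index: int   # which video frame it came from

def detect_objects(frame_index: int) -> list[Detection]:
    """Stand-in for running a trained object detector on one frame."""
    return [
        Detection("vehicle", 0.97, frame_index),
        Detection("person", 0.62, frame_index),
    ]

def triage_footage(num_frames: int, review_threshold: float = 0.80):
    """Route low-confidence detections to a human analyst's queue."""
    analyst_queue, auto_logged = [], []
    for i in range(num_frames):
        for det in detect_objects(i):
            if det.confidence < review_threshold:
                analyst_queue.append(det)  # a human makes the call
            else:
                auto_logged.append(det)    # the machine is confident
    return analyst_queue, auto_logged

if __name__ == "__main__":
    queue, logged = triage_footage(num_frames=3)
    print(f"{len(queue)} detections flagged for a human analyst")
    print(f"{len(logged)} detections auto-logged")
```

The part worth noticing is the routing logic: the model never decides anything by itself; it only sorts which detections a human analyst looks at first.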
What nobody is talking about is how this tech could be used for offensive drone warfare, and what the consequences would be if AI-driven drone warfare ran wild within the Department of Defense. The DOD says that drones will never operate independently of the human operators who make the actual decision on whether ordnance is released on a target. Presumably, this reflects a certain distrust of the decision-making process a robot might use to determine whether to unleash a missile on a target, and a desire to preserve a human chain of accountability when an error occurs.
The black swarm drone is shaped like a fish, a design meant to be hard to spot on radar. The fish-shaped body is also more aerodynamic than that of traditional drones, making it easier for these drones to fly low and fast. Black swarm drones are equipped with cameras that allow them to fly at night, and they are built to evade detection by infrared sensors.
Drone experts predict that black swarm drones will soon be used in covert military missions all over the world. This means that future wars might rely on these new drones instead of soldiers or missiles.
Maven is potentially a Trojan Horse DOD contract that will surely lead to weaponized drone AI at a level we’ve never seen before. If you think accidentally killing Afghan civilians with a Predator was bad… imagine what could happen here.
But there’s another big problem with DOD tech right now, and we’ll talk about it at the end of this report.
Enter Anduril and Palmer Luckey.
Remember that teenage kid who built a VR headset called Oculus in his garage?
Palmer Luckey, the son of a car salesman, was homeschooled by his mother in Long Beach, California (Go home school!).
Luckey developed a new model of VR headset as a seventeen-year-old and, four years later, sold Oculus to Zuck’s Facebook in a deal ultimately worth roughly $3 billion.
Anduril Industries is the latest venture of Palmer Luckey, the now 26-year-old entrepreneur. The company began work on Project Maven last year, alongside efforts to support the Defense Department’s newly formed Joint Artificial Intelligence Center.
The US military is one of the largest users of AI technology today, and that should scare the hell out of all of us. Just look at who was in charge of the Afghanistan pullout and realize that the same leadership is in charge of weaponized AI.
What could possibly go wrong?
The Pentagon has called Project Maven “the most ambitious machine learning effort yet undertaken by the US government.”
Of course, they would.
Project Maven uses machine learning to automatically identify objects in drone footage. It can do this with an incredible degree of accuracy—up to 95% depending on the complexity of the scene.
This system is a breakthrough for America’s military. Manually analyzing video footage would take many hours, and it’s difficult for humans to reliably identify all of the objects in a complex scene. Project Maven reduces this workload by an incredible amount.
But the benefits don’t end there: Project Maven can also use its data and algorithms to track vehicles and even individuals, and to control the drone swarm. Coming soon to a battlefield near you.
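For a sense of what “tracking” means at the algorithm level, here is a toy nearest-centroid tracker in Python. This is a sketch under loose assumptions, not anything taken from Maven; real trackers add motion models, appearance features, and far more.

```python
import math

# Toy frame-to-frame tracking: match each new detection to the nearest
# known track's last position. A hypothetical sketch, not Maven's code.

MAX_MATCH_DISTANCE = 50.0  # pixels; arbitrary for this sketch

def update_tracks(tracks: dict[int, tuple[float, float]],
                  detections: list[tuple[float, float]],
                  next_id: int) -> tuple[dict, int]:
    """Assign each detection to the nearest track, or start a new one."""
    updated = {}
    for (x, y) in detections:
        best_id, best_dist = None, MAX_MATCH_DISTANCE
        for track_id, (tx, ty) in tracks.items():
            d = math.hypot(x - tx, y - ty)
            if d < best_dist and track_id not in updated:
                best_id, best_dist = track_id, d
        if best_id is None:  # nothing close enough: a new object
            best_id, next_id = next_id, next_id + 1
        updated[best_id] = (x, y)
    return updated, next_id

if __name__ == "__main__":
    tracks, next_id = {}, 0
    # Two frames of fake centroids: one vehicle moving, one parked.
    for frame in ([(100.0, 200.0), (400.0, 300.0)],
                  [(112.0, 201.0), (400.0, 300.0)]):
        tracks, next_id = update_tracks(tracks, frame, next_id)
    print(tracks)  # the same two track IDs persist across both frames
```

Even this toy version shows why tracking falls out of detection almost for free: once you can find an object in every frame, following it is mostly bookkeeping.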
Artificial Intelligence has changed the way we live and work in ways most people don’t realize and will only become more prevalent in the future.
It has been predicted that within the next 20 years, AI will outperform humans in every cognitive task we now do on the planet. So, how can we responsibly develop and integrate AI on the battlefield?
The major problem we see, the one we hinted at earlier, is that the current Department of Defense leadership (at the top) is weak and barely credible.
And if they can’t prevent a guy in a Chewbacca costume from storming Capitol Hill or manage the Afghanistan withdrawal (who took responsibility?), how are they qualified to unleash weaponized Artificial Intelligence on the rest of the world?
That 95% rate of accuracy is pretty good; the problem is the other 5%. That is where the collateral damage would happen, in the form of killing the wrong people. We may never be able to attain 100% accuracy from AI in locating, tracking, and targeting terrorists for one very simple reason: the creators of AI are human and not perfect either. When an AI-directed weapon does kill the wrong person, it will be very easy to shift the blame to the robot itself, saying it got confused or mistook or misunderstood something. The truth, though, is that everything that AI program does, reacts to, or decides to do was programmed by a flawed human.
This is where the fault would truly lie, and we should never lose sight of that. We may be able to create perfect AI-directed drone weapons someday, but they will be sent on their missions by imperfect, often badly flawed people.
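To put that “other 5%” in concrete terms, here is a quick back-of-envelope calculation. The daily detection volume below is an invented, illustrative number; only the 95% figure comes from the reporting above.

```python
# Back-of-envelope math on the "other 5%". The accuracy figure comes
# from the reporting above; the volume is a hypothetical assumption.

accuracy = 0.95
detections_per_day = 10_000  # invented volume across all drone feeds

errors_per_day = detections_per_day * (1 - accuracy)
print(f"Expected misidentifications per day: {errors_per_day:.0f}")
# -> 500. At scale, a small error rate is a large absolute number,
#    and in targeting, every one of those errors can be a person.
```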