Michèle Flournoy at Nexus 23, 3000.09, and more
New speakers at Nexus 23, policy developments on responsible AI, REAIM 23, and takes on news
The Nexus Newsletter
Welcome back to The Nexus Newsletter. This week, we announce four new speakers for Nexus 23, comment on recent policy developments on AI and autonomous weapons, and provide our take on recent news from the nexus of autonomy and national security.
New speakers for Nexus 23
We are excited to announce another round of distinguished speakers for Nexus 23:
Michèle Flournoy, Co-Founder and Managing Partner of WestExec Advisors
Capt. Michael D. Brasseur, Commodore of the U.S. Navy’s Task Force 59
We are honored to have these speakers return to the stage for Nexus 23. Last year at Nexus 22, they delivered insightful remarks on lessons from the conflict in Ukraine, operationalizing autonomy in the maritime domain, how AI assurance relates to trust, and how dual-use companies can scale effectively.
Register for Nexus 23 to learn what has changed in the past year.
Comment: Recent policy developments on autonomous weapons
In January, the Department of Defense updated its policy on autonomous weapon systems, per the 10-year review requirement for all DoD directives. DoD Directive 3000.09, first published in 2012, establishes policy and review responsibilities for developing and employing autonomous and semi-autonomous weapon systems “in a responsible and lawful manner.”
The update - which Dr. Mike Horowitz teased at Nexus 22 last year - provides mainly process clarifications and refinements to the Directive, including references to new policies on AI. Read through an annotated version of the Directive by Dr. Paul Scharre, Vice President and Director of Studies at the Center for a New American Security, for a detailed accounting of the changes and updates to DoDD 3000.09.
In addition, last week, the U.S. Department of State published a Political Declaration on Responsible Military Use of Artificial Intelligence and Autonomy. U.S. Under Secretary of State for Arms Control and International Security Bonnie Jenkins told Defense One: “The aim of the political declaration is to promote responsible behavior in the application of AI and autonomy in the military domain, to develop an international consensus around this issue, and put in place measures to increase transparency, communication, and reduce risks of inadvertent conflict and escalation.”
Our take: We support the changes to DoDD 3000.09, especially those that recognize the DOD’s efforts to ensure that autonomous, semi-autonomous, and AI-enabled weapons systems are developed and employed in an ethical and responsible manner. Furthermore, we are happy to see that the U.S. is continuing to lead the effort to set global norms on ethical and responsible AI and autonomy.
We especially support the inclusion of both the DOD’s AI Ethics Principles and the Responsible AI Strategy & Implementation Plan, issued in 2020 and 2022, respectively, in the updated Directive. These Principles, the first adopted by any military, address the new ethical challenges that AI raises and help ensure responsible AI development and use by the Department. The Responsible AI Strategy & Implementation Plan builds upon them with detailed guidance on topics such as requirements, user training, and test and evaluation (T&E) and verification and validation (V&V).
The updated Directive also clarifies that covered weapon systems need to go through rigorous T&E and V&V regardless of acquisition pathway or DOT&E oversight status, which currently applies only to programs of record. The Directive’s emphasis on “rigorous developmental and operational T&E” is important for new software-defined capabilities, especially those with algorithms that can be “rapidly reprogrammed on new input data”. These processes assess whether systems will remain safe and performant even as use cases and operational design domains change.
We are glad to see that the updated version of DoDD 3000.09 specifically highlights how modeling and simulation tools should be used to test system software: “Automated testing tools, such as modeling and simulation, will be used whenever feasible.” Virtual testing solutions, including modeling and simulation, enable test engineers to rapidly evaluate system performance at scale in a way that real-world testing cannot.
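As a rough illustration of why simulation-based testing scales, here is a minimal, hypothetical Python sketch of a scenario-sweep harness. The simulator interface (run_simulation), the scenario parameters, and the pass/fail thresholds are invented for the example; they are not drawn from DoD guidance or any specific tooling.

```python
# Illustrative only: a minimal scenario-sweep harness showing how simulation-based
# T&E can evaluate an autonomy stack across many parameterized scenarios at once.
# All names (Scenario, run_simulation, the thresholds) are hypothetical.
from dataclasses import dataclass
from itertools import product


@dataclass
class Scenario:
    visibility_m: float      # simulated sensor visibility range
    obstacle_count: int      # number of dynamic obstacles in the scene
    comms_dropout_s: float   # duration of a simulated communications dropout


def run_simulation(scenario: Scenario) -> dict:
    """Placeholder for a call into a simulator; returns per-run metrics."""
    # In a real pipeline this would launch the autonomy stack against the
    # scenario in a virtual environment and record the metrics of interest.
    return {"min_separation_m": 12.0, "mission_complete": True}


def evaluate(metrics: dict) -> bool:
    """Pass/fail criteria drawn from a (hypothetical) safety requirement."""
    return metrics["mission_complete"] and metrics["min_separation_m"] >= 10.0


# Sweep the scenario space - far more combinations than real-world testing allows.
results = []
for vis, obs, dropout in product([200.0, 500.0], [0, 5, 20], [0.0, 2.0]):
    scenario = Scenario(visibility_m=vis, obstacle_count=obs, comms_dropout_s=dropout)
    results.append((scenario, evaluate(run_simulation(scenario))))

pass_rate = sum(ok for _, ok in results) / len(results)
print(f"{len(results)} scenarios evaluated, pass rate: {pass_rate:.0%}")
```

Even this toy sweep covers a dozen scenario combinations in milliseconds; the same pattern extends to thousands of parameter combinations that would be impractical to exercise on a range.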
We urge autonomy development programs across the Department, as well as T&E offices such as the Office of the Under Secretary of Defense for Research and Engineering (USD(R&E)) and the Director of Operational Test and Evaluation (DOT&E), to prioritize acquiring commercially proven modeling and simulation solutions to enable rapid, comprehensive, and scalable T&E and V&V.
For more on T&E and V&V of autonomous systems, read our V&V handbook and blog post explaining how virtual modeling and simulation tools facilitate comprehensive T&E efforts for autonomous military vehicles.
REAIM 2023
Last week, our team spoke at the Netherlands’ summit on Responsible AI in the Military Domain, REAIM 2023. We explained how militaries can build trust into autonomy by leveraging commercial best practices for autonomy development and testing, and demonstrated how autonomy programs can build a safety case and a scenario-based test plan to provide traceable, quantitative evidence for trust in autonomy.
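For readers curious what “traceable, quantitative evidence” can look like in practice, below is a deliberately simplified, hypothetical Python sketch that links a safety claim to the simulated test scenarios supporting it. The class and field names are illustrative assumptions, not a representation of our actual safety-case tooling.

```python
# Illustrative sketch: one way to keep trust evidence traceable is to tie each
# safety claim to the test scenarios that support it and roll up the results.
from dataclasses import dataclass, field


@dataclass
class TestResult:
    scenario_id: str
    passed: bool


@dataclass
class SafetyClaim:
    claim_id: str
    statement: str
    results: list = field(default_factory=list)

    def evidence_summary(self) -> str:
        passed = sum(r.passed for r in self.results)
        return f"{self.claim_id}: {passed}/{len(self.results)} supporting scenarios passed"


claim = SafetyClaim("C-01", "Vehicle maintains safe separation in degraded GPS conditions")
claim.results = [
    TestResult("SCN-101", True),
    TestResult("SCN-102", True),
    TestResult("SCN-103", False),
]
print(claim.evidence_summary())  # -> "C-01: 2/3 supporting scenarios passed"
```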
News we’re reading
Autonomous systems are gaining momentum in the national security space. Below, we’ve pulled key quotes from recent articles of interest, plus brief commentary from Applied Intuition’s government team:
Acquisition Talk | The state of Navy unmanned with Dorothy Engelhardt
By Eric Lofgren, Research Fellow at the Center for Government Contracting, George Mason University
Key quote: “We can buy platforms all day long. That’s not the sexy sauce. The sexy sauce is getting our ones and zeros, our command and control, our battery life, our endurance, trying to figure out what autonomy looks like for all of these systems really matters. So, before we go out and buy all of these platforms and figure out, ‘oh my gosh, they’re not talking to each other,’ we’re actually trying to go out and build that ecosystem first with the communications and making it so that when we actually put these platforms out to sea or in the fleet, they are able to communicate and integrate with the manned fleet. So there’s a lot of homework that has to be done. It’s not intentional to delay the platforms. Rather, we’re really trying to take a more, I’ll just say, engineering approach to building the necessary pillars that we need to sustain a capability for MUM-T.”
Our take: Great interview with Dorothy Engelhardt that provides an overview of unmanned and autonomous systems in the maritime domain. Getting the software architecture right first, before acquiring the hardware platforms, is the right approach to maritime autonomy. We are looking forward to hearing more from her at Nexus 23 and are honored to have her as a speaker.
We would also like to take a brief moment to honor Eric Lofgren’s work at the helm of Acquisition Talk. Since 2018, his podcast and blog have been among the best resources in the ecosystem for defense acquisition news, in-depth interviews, and commentary. We wish him the best as he transitions into a new role as a professional staff member (PSM) on the Senate Armed Services Committee (SASC).
Colin Carroll, Head of Government at Applied, joined Eric Lofgren at the end of last year for a conversation on acquiring autonomous and AI-enabled technologies. Listen now: Scaling AI/ML in defense with Colin Carroll.
Breaking Defense | Air Force ‘committed’ to ‘collaborative’ drones, as budgets will show: Gen. Brown
By Theresa Hitchens, Space and Air Force Reporter, Breaking Defense
Key quote: “We’re getting down the path to have much more capability for uncrewed aircraft,” he told an audience at the Brookings Institute today. “I think you’ll see as we start looking at our future budgets and the analysis we’re doing as part of the operational imperatives that we are committed to more uncrewed capability.” [...]
“As we look into our future budgets, there’s three aspects of this. There’s the platform itself, there’s the autonomy that goes with it, and then there’s how we organize, train and equip to build the organizations,” Brown said. “And we’re trying to do all those in parallel.”
That involves recognizing that the future drones should be capable of providing different functions, and that such aircraft would be paired with different types of piloted aircraft.
“It can be a sensor, it could be a shooter, it could be a jammer. But how does it team with a crewed aircraft and be operated from the back of a KC-46? And we’ll have E-7s eventually — could you operate it from the back of E7? Could you operate it from a fighter cockpit? And we’re thinking through those aspects.”
Our take: It’s great to hear CQ Brown reiterate the Air Force’s commitment to uncrewed systems. His comments about budgeting are particularly intriguing - by separating the platform out from the autonomy stack that flies the system, Gen. Brown hopes to accelerate the development and deployment of a range of systems. We have seen this approach work elsewhere in the Department: The Army’s RCV program has separated the software acquisition out from the acquisition of the hardware, allowing the program to work with leading software vendors (including Applied) to develop a robust, capable autonomy stack without waiting for hardware, which comes with slower iteration cycles. However, we have also seen it attempted with less success (see the AFRL Skyborg effort, which learned many lessons during its tenure, but did not successfully merge separated software and hardware as intended). It will be interesting to see how the Air Force implements those lessons learned in its CCA and MQ-Next efforts going forward.
Interested in continuing this conversation? Find us at Booth #230 at the AFA Warfare Symposium March 6-8.
House Armed Services Committee | CITI Hearing: The Future of War: Is the Pentagon Prepared to Deter and Defeat America’s Adversaries?
By Chris Brose, Chief Strategy Officer, Anduril Industries
Key quote: “Nothing you do in this Congress will make larger numbers of traditional ships, aircraft, and other platforms materialize over the next several years. It is possible, however, to generate an arsenal of alternative military capabilities that could be delivered to U.S. forces in large enough quantities within the next few years to make a decisive difference. Those decisions could all be taken by this Congress.
“The goal would be to rapidly field what I have referred to as a Moneyball Military—one that is achievable, affordable, and capable of winning. Such a military would be composed not of small quantities of large, exquisite, expensive things, but rather large quantities of smaller, lower-cost, more autonomous, and consumable things and, most importantly, the digital means of integrating them.”
Our take: Building large, exquisite platforms takes time. These platforms will continue to play a foundational role in our ability to project military power - the sheer inertia of the military Services means we could not replace them quickly even if we wanted to - but we agree that we need to think about the problem in front of us differently than we have for the last 70 years. “Alternative military capabilities” - smaller, cheaper, attritable, and autonomous systems - can be developed and produced quickly and at scale, and they present more complex dilemmas for adversaries than smaller numbers of large platforms. In a PRC-centric conflict, “attritable” probably means something in the $20-30M per platform range - a number we don’t think DOD or Congress understands yet, because we haven’t internalized the lesson from Ukraine that basically any (or every) platform becomes attritable. The real number that matters is the loss of American lives, which is why unmanned attritables are the answer.
Defense News | How to make smart trades across US drone forces
By Dr. Caitlin Lee, Senior Fellow for UAV and Autonomy Studies, The Mitchell Institute
Key quote: Often overlooked in the moneyball analogy is the contribution of an all-star pitching rotation to the 2002 Oakland A’s winning streak. Tim Hudson, Barry Zito and Mark Mulder don’t fit the scrappy underdog narrative — each rank among Oakland’s all-time top 10 in wins, strikeouts and winning percentage — but Oakland couldn’t have done it without them. It’s the same with the U.S. military’s approach to building a combat-credible force to deter China. The U.S. will need to employ advanced fighters, bombers and support aircraft, and combine them in creative ways with existing and new drone technologies.
While low-cost, expendable “position player” drones are critical, especially for decoy and surveillance missions, other U.S. drones will need military-class capabilities as unique as Barry Zito’s famous curveball, such as range, stealth, survivability, autonomy, and sophisticated sensors and payloads.
Our take: Here’s a slightly different take on the “Moneyball” military: The Air Force needs a mix of autonomous assets, including both low-cost attritable systems and more advanced, more capable systems that can tackle tougher challenges. When it comes down to it, however, Caitlin and Chris are saying similar things. Ultimately, we need to build out a broad range of systems rather than devoting all of our time and resources to developing a single platform that we expect to handle a vast range of missions. After all, in the wonderful book “Moneyball: The Art of Winning an Unfair Game,” Michael Lewis tells us: “Every form of strength covers one weakness and creates another, and therefore every form of strength is also a form of weakness and every weakness a strength.”
DefenseScoop | US Central Command’s new Task Force 99 begins drone operations in Middle East
By Jon Harper, Managing Editor, DefenseScoop
Key quote: “Our problem was air domain awareness” to include “not just tracking objects in the air, but maybe finding things that could be on the ground about to be launched into the air and how those could be a threat to us. And Task Force 99 was born out of the idea that if we take unmanned technologies and digital technologies and pair them together, and basically teach the robots and the algorithms to solve some of these problems for us, that it could fill some of those gaps,” Lt. Gen. Alexus Grynkewich, commander of AFCENT, said Monday during an Air and Space Forces Association event. [...]
“Don’t think of it as an innovation lab or anything like that. It is a no-kidding operational task force. So it’s a subordinate command that is out there conducting operations. They’re just doing it with different stuff and really getting after these problems,” he said.
The organization is focusing on three main areas: enhancing domain awareness, accelerating the military’s targeting cycle, and “imposing dilemmas on adversaries” by deploying additional assets that they would have to contend with.
“Rather than just having dozens of airplanes that I can fly to certain locations, what if I had hundreds? Even if the sensors aren’t as good, that capacity is gonna matter when we’re prosecuting operations at scale,” Grynkewich said.
Our take: We are excited to see the results of TF 99’s work. Unmanned and autonomous systems will serve both as a force multiplier for manned aircraft and as a way for the Air Force to fill gaps without imposing the significant costs associated with traditional platforms. Rigorous experimentation will prove essential to understanding and developing the capabilities and use cases for UAS, and we are optimistic about TF 99’s role in facilitating this. We think the most interesting lessons that operational task forces like TF 99 and TF 59 will provide are about how to work with commercial platform providers in a contractor-owned, contractor-operated (COCO) licensing model.
a16z | American Dynamism Summit: AI & The Future of Modern Warfare
Featuring Anduril Cofounder and CEO Brian Schimpf, Skydio Cofounder and CEO Adam Bry, and Shield AI Cofounder and President Brandon Tseng
Key quote: “When I think about autonomy and the importance that plays, we see our adversaries - Russia, China, Iran, North Korea - building up their electronic warfare capabilities. They are jamming GPS, they are jamming communications which are traditional assets that we’ve become so reliant on - our Predator drones, our smaller versions of drones that rely on GPS. That’s what they targeted. They saw what the United States was doing in the war on terror and they said ‘great, a critical vulnerability is their ability to communicate, it’s their reliance on GPS, let’s attack that.’ So, that’s why autonomy plays a super important role, because if you’re going to have the assets that operate in these contested electronic warfare environments, you need to have assets that can maneuver, make intelligent decisions. And it just so happens that you have this amazing technology that the commercial sector invested so much in - self-driving - that is directly applicable to the defense space and to operating in these denied environments.” - Brandon Tseng, Co-Founder and President of Shield AI
“One thing I will say, to DOD’s credit, is that DOD is an organization that over the past 100-plus years has reinvented itself many times over. How they’ve adopted different ways of fighting, using precision munitions, using aircraft, submarines - all of these have been substantial shifts that have changed how they fight, how they organize, how they operate, and they have been able to adopt change. It’s not as fast as anyone wants. I think the discussion has shifted from ‘if we adopt these technologies’ or ‘does this actually make sense’ to ‘how.’ How will we adopt these technologies, what does this practically look like, how do we think about what this costs, what do these programs look like. And I’ve seen that over the last five years in a really substantial way, which is fantastic.” - Brian Schimpf, Co-Founder and CEO of Anduril
Our take: a16z’s American Dynamism Summit hosted a series of fantastic discussions focused on building a bridge between companies, policymakers, and government buyers, and we were thrilled to participate.
Thank you for reading The Nexus Newsletter. Subscribe for more news from the nexus of national security and autonomy.