Welcome back to the Nexus Newsletter, Applied Intuition Defense’s newsletter covering the latest in national security, autonomy, and software-defined warfare.
We’re changing things up. We’re introducing two new sections: one offering our (lightly filtered) takes on one or two news articles, and one closing out each edition with a look at a historic campaign or military engagement. Plus, we riff on why we need to move autonomy and artificial intelligence for aircraft beyond experimentation, give a brief recap of our recent Nexus Talks webinar, and preview some upcoming events. Let’s dive in ⬇️
🗓️ Events
⛰️ Catch us next week in Colorado! EpiSci, an Applied Intuition Company, will be at the Air Force Association Warfare Symposium March 3-5! Swing by booth 1511 to learn how the team is bringing cutting-edge AI and autonomy to the fight. If you’d like a demo, you can sign up here.
🤖 We’ll also be at the inaugural Manifest: Demo Day in Navy Yard, Washington DC, March 17! Applied Intuition Defense will be among 35 companies showcasing real tech — no fluff, no buzzwords, no hypotheticals. You will get to see us demo our collaborative autonomy capability. You can learn more about the demo day here, and if you’d like to meet with the team, you can contact us at defense@applied.co.
ICYMI: Our very own Jason Brown, General Manager of Applied Intuition Defense, and Dan Javorsek, President of EpiSci, were featured on a recent Nexus Talks webinar, “Applied Intuition and EpiSci: Multidomain Autonomy for Competitive Advantage.” Some key takeaways:
➡️ Applied Intuition’s acquisition of EpiSci will unlock a breadth of new capabilities for warfighters. Ground, air, maritime, space. We’re doing it all.
➡️ The future of defense: the many and the small, when effectively networked, will always beat the few and the large.
➡️ Our message to the Department of Defense: It’s time to figure out how to procure software effectively.
➡️ We’re hiring! We’re growing quickly and we don’t plan on stopping. Check out open roles here.
🏇 The Parthian Shot
Defense One | Pentagon may break up tech offices in acquisition-policy shift
Key Quote: “This [Trump] administration cares about weapon systems and business systems and not ‘technologies,’” the official said. “We're not going to be investing in ‘artificial intelligence’ because I don’t know what that means. We're going to invest in autonomous killer robots.”
Our Take: This is bound to raise eyebrows on Capitol Hill and in the larger defense acquisition community. As the author points out, this new vision will require buy-in from lawmakers and the military services. That’s very much going to be a political battle.
However…
We’d be lying if we said this vision didn’t excite us in many ways. The strongest benefits of the Department of Defense working with commercial-first, dual-use software companies like ours are cost effectiveness and speed. On cost: companies like ours have already invested heavily in the research and development of our commercial offerings. On speed: because we are a software company, we can get something out the door in a matter of hours, days, or weeks, depending on the scope of the project.
Finally, killer robots make for good quotes in the press. But, reality is much less sexy. Deploying collaborative machines at scale and in a denied environment requires a TON of foundational software work. Otherwise, your autonomous platforms will be bricked at the edge. So if the Pentagon’s new leadership is serious about this vision, it must direct that the most strategic projects include a dedicated software development pathway.
Breaking Defense | EXCLUSIVE: Hegseth draft memo lays out software acquisition reform push
Key Quote: “Software is at the core of every weapon and supporting system we field to remain the strongest, most lethal fighting force in the world. While commercial industry has rapidly adjusted to a software-defined product reality, DoD has struggled to reframe our acquisition process from a hardware-centric to a software-centric approach,” the memo reads. (As the memo has not been signed out, it is possible wording could change in the final version of the document.)
“When it comes to software acquisition, we are overdue in pivoting to a performance-based outcome and, as such, it is the Warfighter who pays the price.”
Our Take: Looks like someone at the Pentagon listened to our most recent Nexus Talk…
💭 Thought Bubble: Moving Autonomy and AI Beyond Experimental Form
The potential for autonomy mixed with AI in the air domain is exciting and, at the moment, open ended. The more applications we find, the more ideas for future applications we uncover. The fast pace of research and experimentation is exhilarating as we push the boundaries of the possible further onto the horizon.
Creating applications that use autonomy is hard. However, repeatedly demonstrating that those solutions are viable for operational use is even harder. Demonstrating a new technology in an experiment tells us that it is possible, but it does not tell us if it is feasible. As autonomy with AI moves out of the experimental phase into the operational phase, we need autonomy that is both possible and feasible.
The exciting, fast pace of AI and autonomy development can be hard for anyone to keep up with, including governmental regulatory agencies. We fully recognize this problem, and our team is designing experimental autonomy solutions with an eye toward future operational use. However, that is not enough.
The operational use of AI mixed with autonomy requires guaranteed safe, secure, and effective performance every time. That guarantee is not just hard; to date it has proven infeasible under the current regulatory methods used for aircraft airworthiness and operation. We are actively working to define new regulatory standards that can reliably prove safe, secure, and effective operation of autonomy, and we are building autonomy targeted to those standards so we can offer government airworthiness authorities a path for all to follow. We’re working hard to be ready for the day autonomy becomes operational. On that day, you can count on us to provide a feasible, reliable system proven to be secure, safe, and effective.
🪖 Lessons from the Past, Tech for the Future
In each upcoming newsletter edition, we’ll close out with a look at a historic campaign or military engagement. If you think technology has made the study of military history irrelevant, think again.
While technology changes the character of war, the nature of war remains the same. For that reason, the study of war remains essential for anyone developing technologies for tomorrow’s warfighter. Check out this useful piece on The Battle of Gallipoli.
Key Quote: “But then the allies stopped. At S Beach, a British battalion was confronted by an overstretched Turkish platoon. But their orders were to get ashore and wait. And so they did. The British commander in charge of Y Beach, where there were no defenders at all, was told to wait for orders to push on. He received no communication of any kind from his higher headquarters for 29 full hours after landing. During this lull, Hamilton remained afloat having chosen not to make the landing at any one place to preserve his situational awareness. This may make sense today with modern communications systems, but in 1915 it rendered Hamilton unable to affect the situation. So much planning had gone into the landings that, once the landings were accomplished, subordinate commanders had no direction.”
Some questions to ponder after reading about Gallipoli:
➡️ This scenario illustrates the dangers of inaction. Can autonomous swarms seize the operational initiative by making decisions faster than the adversary?
➡️ Skating to the puck is “human in the loop.” Skating to where the puck is going is “collaborative autonomy.” In a scenario like this, how do autonomous systems produce competitive advantage?
➡️ Does collaborative autonomy work in denied environments? (Yes.)
If you want to geek out with us on this, contact us at defense@applied.co.