IQT is the not-for-profit strategic investor focused on national security. We serve as a trusted intermediary between tech founders, venture capitalists, and our government partners, and we have been investing in venture-backed early-stage tech startups for 23 years. Looking at cyber from this vantage point gives us a unique perspective on the good, the bad, and the ugly in a very dynamic, arguably overhyped market. We find it essential to frame the development of our investment theses against the backdrop of several key perspectives.
There is no argument that great power competition is and will continue to be a foundational market driver through the coming decades. In many respects, the competition between Washington, Moscow, and Beijing is nowhere more acute than in the cyber domain. This introduces an extra dimension of complexity as non-market-driven incentives are moving and manipulating technology directions in subtle ways. So, our first consideration is to always pay attention to the Bear and the Panda.
It is impossible to escape the reality that cyber is inherently dual use — a constant tug of war between offensive units and defensive teams. We see these struggles playing out in the software supply chain, in DevOps, and especially in the open-source ecosystem. Capabilities designed to identify and mitigate system and software vulnerabilities will continue to be used by both sides, with competing intentions. This is an unavoidable truth when investing in cyber for national security.
We know that when building tech, abstraction is a feature. However, in cyber, endless layers of abstraction, where each layer can manipulate the next, create a full stack of opportunities for adversaries. Abstractions like clouds, virtualization, containerization, serverless, software-defined infrastructure, the metaverse, digital twins, and fake personas constrain our ability to observe. Establishing ground truth is all but impossible and is dictated more by perspective bias than by principled engineering. Is an observation or event real or not? Applying the physical-domain analogies common to legacy defense-in-depth modeling does not account for this lossiness or lack of observability, and is therefore flawed from the beginning.
The democratization of IT, the shift to self-service architectures, the innovation of agile development methodologies, and the overall complexity and pace of change of our systems make it nearly impossible to understand the state of the infrastructure underpinning our applications at any moment in time. The exponential pace of innovation is outstripping our ability to quantitatively understand, and in many cases even measure, our operating environments. Our systems are running open loop. Stated simply, control is an illusion. Coupled with the previous perspective, this means we cannot control what we do not understand.
Finally, cyber is inherently unstable. There is no end state. We are in a high-velocity, endless arms and knowledge race. Each breach and new technique observed informs and improves the next. Systems engineering practice teaches us that failures, or breaches, should be expected; why should modern IT systems be viewed any differently? As an industry, our goal should be to drive down MTTR (mean time to recover), controlling and limiting the impact of each event. The scale of this problem is post-human: our systems have grown so complex that no one human has the capacity to fully understand, let alone defend, them. We need to consider new models of trust and new models of operations that maximize visibility and improve system resiliency in the face of uncertainty. This leads us to our last perspective — the future demands a focus on resiliency and minimizing MTTR.
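To make the metric concrete, here is a minimal sketch (in Python, using entirely hypothetical incident timestamps) of how MTTR is commonly computed: the average elapsed time between detecting a breach and restoring normal operations. The record format and values are illustrative assumptions, not a prescribed measurement standard.

```python
from datetime import datetime

# Hypothetical incident records: (time the breach was detected, time service was restored).
incidents = [
    (datetime(2023, 1, 4, 9, 15), datetime(2023, 1, 4, 13, 40)),
    (datetime(2023, 2, 11, 22, 5), datetime(2023, 2, 12, 3, 30)),
    (datetime(2023, 3, 2, 6, 50), datetime(2023, 3, 2, 8, 10)),
]

def mean_time_to_recover(records):
    """Average recovery time across incidents, in hours."""
    total_seconds = sum(
        (restored - detected).total_seconds() for detected, restored in records
    )
    return total_seconds / len(records) / 3600

print(f"MTTR: {mean_time_to_recover(incidents):.1f} hours")
```

Tracking this number over time, rather than counting breaches prevented, is one way to judge whether an organization's resiliency is actually improving.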
Against this backdrop, how should innovation be evaluated? How do we measure one strategic investment against another? Which technologies and business models will have the commercial traction and long-term viability necessary to advance national security interests? These are questions we will continue to explore in this series.