The Life Cycles of Cyber Threats

Technology isn't human, but it has stages of life, writes Public Policy Fellow Ben Buchanan.

Technology isn’t human, but it has stages of life. The period after the conception of a new piece of technology is often marked by significant investments of time and resources, often with little tangible return. If this work is successful, the technology begins to enter use, benefiting from iteration and design improvements. It may then begin to spread, gaining in popularity and begetting virtuous economies of scale. If all continues to progress, the technology will mature in the marketplace. Even if it attains market dominance, however, that position will not be permanent. In time, an upstart technology will appear on the scene, and the process will begin again. Sometimes the old technology will stick around in one form or another, carving out a niche role for itself. More frequently, it will be cast aside and supplanted.

The notion of life cycles in technology and innovation is hardly new. Variations of the life-cycle idea can be found in a wide range of theories and case studies, from the economics of creative destruction and Moore’s Law to the case of the landline telephone or the digital camera. For some, this endless cycle of innovation and rebirth is central to progress, and those who drive it forward with new inventions are heralded as visionaries. It is a role the inventors themselves sometimes embrace: Steve Jobs famously said, ‘It isn’t the consumers’ job to know what they want’, a variant on Henry Ford’s likely apocryphal remark that, ‘If I asked my customers what they wanted, they would have said faster horses’. The central place of these two men in American history confirms that, when it comes to mastery of the technological life cycle, to the innovator go the spoils.

Military affairs are in many ways governed by a similar logic. Technologies that were once dominant can be quickly rendered obsolete, changing the course of conflict. History provides a litany of examples, such as the crossbow supplanting skilled archers; the Gatling gun replacing the single-shot rifle (thereby redefining infantry tactics); and the surface-to-air missile largely replacing the anti-aircraft gun. In time, these technologies matured and diffused, eventually being taken up by many states and, in some cases, by dangerous non-state groups.

But what about cyber capabilities? These are looked upon as the newest class of military technology, and there is no shortage of papers arguing for their centrality in twenty-first-century conflict. Many commentators have expressed concern about the low barrier to entry in cyber operations, the ease with which code can be copied and spread, and the dangers of such tools should they fall into the wrong hands. For example, Michael Hayden, a former director of the US National Security Agency, recently observed that ‘even … less capable actors can now develop and/or acquire tools and weapons that we thought in the past were so high-end that only a few nation-states could acquire and use them’. It is clear that the life cycle of cyber capabilities, and particularly the prospects of diffusion, merits analysis.

Read the full article at Survival: Global Politics and Strategy

About the Author

Ben Buchanan

Global Fellow;
Senior Faculty Fellow at Georgetown’s Center for Security and Emerging Technology (CSET); Assistant Teaching Professor at Georgetown University’s School of Foreign Service; Former Postdoctoral Fellow at the Belfer Center’s Cyber Security Project, John F. Kennedy School of Government, Harvard University
Digital Futures Project

Less and less of life, war and business takes place offline. More and more, policy is transacted in a space poorly understood by traditional legal and political authorities. The Digital Futures Project is a map to the constraints and opportunities generated by the innovations around the corner, and a resource for policymakers navigating a world they didn’t build.