Over the past few years, we have heard a great deal about the rapidly emerging revolution in the Internet of Things (IoT). Indeed, major analysts such as Gartner and Forrester have predicted that we are moving from some 10 billion network-connected IoT devices to approximately 25 billion in the next three years alone. This large-scale increase in the number of IP-enabled devices requiring professional IT management will only exacerbate another critical trend – the steadily advancing complexity of enterprise networks in general. That complexity will in turn require significant advances in our ability to understand and respond effectively to network events, especially with regard to cybersecurity threats – and, increasingly, to anticipate and address problematic developments in advance.
Network complexity is being driven by a number of factors familiar to network operators today. Key among these is the ongoing, large-scale movement of enterprise compute, storage, and application workloads from on-premises deployments to various cloud platforms, or to a combination of cloud-based and in-house environments. Again, the burgeoning IoT revolution simply adds another, albeit significant, layer of complexity to the overall equation.
Managing this mix of hybrid, multi-cloud, public/private frameworks has become increasingly challenging, especially as cyber threats continue to advance with each passing year. The technology industry has responded with a plethora of new network management tools, techniques, and processes, such as those embodied in the DevOps movement, to address enterprises’ pressing need to maintain their networks’ security, availability, and overall resilience.
Less evident to the public eye, yet operating alongside and further complicating these challenges, is the parallel movement toward greater integration of modern IT infrastructures and business applications with the realm of systems collectively known as Operational Technology. “OT” refers generally to the technologies deployed to facilitate various control processes, typically in industrial settings and in the management of modern society’s infrastructure. The nation’s electrical grid is one example of a very large-scale OT environment – one that requires continuous, automated monitoring and alerting across its myriad components and subsystems to maintain a consistently reliable power supply for industry and consumers alike. Likewise, manufacturing and distribution enterprises depend on a multitude of business-critical operational technology systems to keep their production lines running, their inventory at proper levels, and their customers supplied as needed. Even the buildings we work in are managed via a complex mix of interconnected monitoring and communication protocols and related systems that provide everything from heating and cooling to power provisioning.
In the past, Operational Technology typically consisted of multiple standalone systems. Precision control systems used for calibrating industrial machinery, for example, had no need to communicate with those that managed the flow of essential components onto an assembly line. None of these technologies were expected to cross the functional barrier separating OT from traditional IT, or to be joined to enterprise networks to share data or systems access with the usual corporate information systems. From IT’s perspective, these systems were effectively off-limits; they were managed by subject-matter experts at the shop floor level. From OT’s perspective, this status quo was easy to accept: letting Operations personnel manage their own technological backyard kept things simpler and more tightly under their control.
But as industries have steadily evolved in recent years to be more agile, productive, and efficient, they have accordingly become far more data-driven – even in the more traditional, self-maintained bastions of a manufacturing operation. Likewise, as OT systems have grown more complex, feature-rich, and interconnected with other key systems, the potential benefits of leveraging IT’s already well-developed networking capabilities have become increasingly clear. As a result, the age-old separation of IT and OT has begun to erode, especially as businesses have realized the benefits of accessing and analyzing the data often contained within OT environments, and of combining those data sets with enterprise systems such as corporate financials and inventory management. In the age of Big Data, given the strategic competitive advantage it can produce, enterprises are understandably hungry to mine all potentially business-relevant data available to them. Doing so with already-connected systems is challenging enough; leveraging historically disconnected OT systems increases the complexity of that effort considerably.
Of course, in today’s world of constant cyber threats to our connected infrastructure, increasing the integration and convergence of OT and IT systems on a common enterprise network brings considerable risk. To begin with, operational technology platforms are quite often completely proprietary; they frequently lack the standard IT operating systems that can be connected and secured in any manner familiar to IT staff. And even when they do run something approaching a recognizable desktop OS, the very long life expectancies of OT systems mean the base platform is often DOS or an early Windows iteration that has long since gone end-of-life and is no longer supported by its supplier – and is therefore quite likely to be unsupportable both in terms of basic patching for software fixes and, even more importantly, in terms of cyber defense.
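As a small illustration of how such legacy platforms might be triaged in practice, the sketch below checks a hypothetical asset inventory against a table of vendor end-of-life dates and flags devices that can no longer receive patches. The host names, the inventory record format, and the `EOL_DATES` table are all assumptions made for illustration (the Windows dates reflect Microsoft's published lifecycle milestones; the DOS date is approximate); a real deployment would pull this data from an asset-management or passive network-discovery tool.

```python
from datetime import date

# Illustrative end-of-life dates for legacy platforms often found in OT
# environments. Verify against vendor lifecycle pages before relying on them.
EOL_DATES = {
    "MS-DOS 6.22": date(2001, 12, 31),  # approximate; support ended long ago
    "Windows XP": date(2014, 4, 8),
    "Windows 7": date(2020, 1, 14),
    "Windows 10": date(2025, 10, 14),
}

def flag_unsupported(inventory, today=None):
    """Return the inventory records whose OS is past its end-of-life date.

    Devices running an OS not listed in EOL_DATES are treated as supported,
    since no end-of-life date is known for them.
    """
    today = today or date.today()
    return [
        device for device in inventory
        if EOL_DATES.get(device["os"], date.max) < today
    ]

# Hypothetical inventory records, as an asset-discovery tool might report them.
inventory = [
    {"host": "plc-cal-01", "os": "MS-DOS 6.22"},
    {"host": "hmi-line-3", "os": "Windows XP"},
    {"host": "eng-ws-12", "os": "Windows 10"},
]

if __name__ == "__main__":
    for device in flag_unsupported(inventory, today=date(2024, 1, 1)):
        print(f"{device['host']}: {device['os']} is past end-of-life")
```

Because patching is no longer an option for flagged devices, they become candidates for compensating controls such as network segmentation and passive monitoring rather than conventional update management.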