Securing Military Communications
Mark Bouch, Managing Director of Leading Change, examines the importance of securing military communications and the critical role training plays in protecting against cyber attacks.
Cyber warfare is a defining capability in the same way the rifled barrel or mechanisation changed the nature of warfare. Nations with cutting-edge cyber capabilities hold a significant competitive advantage. Interception of, and interference with, military communications pose a significant threat to the UK’s economic interests and security. They affect political and military communications systems but also, in today’s technology-driven world, scientific research, all forms of communication including social media, and many other government-related or commercial organisations in the defence supply chain.
Our big challenge is how to keep computers and data networks safe from malign actors. Cyber security is big business. Cybersecurity Ventures (a Californian research agency) assessed that this market was worth more than $120 billion in 2017 and predicted global spending on cyber security products and services to exceed $1 trillion cumulatively between 2017 and 2021. The traditional emphasis has been on developing technological solutions to protect user end-points and data networks, but we should assume those who seek to harm us by exploiting military communications will find ways to circumvent even the best technical solutions.
A 2015 Harvard Business Review (HBR) article cited an alarming statistic from a US Department of Defense source: the DoD experiences 41 million scans, probes and attacks each month. However, ‘secure’ architecture and state-of-the-art technology are only part of the answer. My background in high-threat environments suggests that many, if not most, ‘failures’ have their root causes in the human beings designing, building and operating systems. We therefore need to train and rehearse operators to avoid mistakes and to detect and correct issues before they morph into mission-critical failure.
Not all cyber security breaches are well-publicised, but it was revealed that Islamic State briefly took control of the US Central Command’s Twitter feed in 2015 because the application had not been updated to two-factor authentication. Technology, like body armour, may create a false sense of security when what’s really required is a high-performance culture which consistently analyses and minimises risk. We identify such cultures in many high-reliability organisations needing to avoid errors and minimise their consequences, including airlines, air traffic control systems, Formula 1 motor racing teams and bomb disposal units. They are technical operations conducted in potentially dangerous and complex environments where systems, sub-systems, human operators and the environment interact to cause dynamic risk that must be addressed before it turns into disastrous and potentially fatal problems.
High-reliability organisations are ‘situationally aware’. They have well-developed awareness of the environment and their own vulnerabilities. One such organisation, the US Submarine Service, has identified six interconnected principles that help them reduce and contain the impact of human error. Many of these will be familiar to anyone with experience in high-reliability organisations:
- Integrity. This describes embedded and intuitive behaviour that eliminates departures from operating procedures and immediately highlights lapses, mistakes or shortcuts without fear.
- Depth of knowledge. When people thoroughly understand all aspects of a system, they are more likely to recognise when things go wrong or fall outside normal parameters. Competence is developed through simulation, testing and evaluation.
- Procedural compliance. In high-reliability organisations operators are required to know (or know where to find) standardised operating procedures and to follow them. They also encourage ‘disciplined initiative’: recognising when a situation doesn’t map to existing procedures and new ones are called for.
- Forceful backup. In high-reliability organisations, even experienced operators are closely monitored by peers and seniors. High-risk operations are generally performed by two people, and every team member, even the most junior or inexperienced, is empowered to speak out when a problem is observed.
- A questioning attitude. It’s not easy to cultivate a culture of honesty and openness in hierarchical structures emphasising the need to comply with orders, but this behaviour is fundamental to mission command and vital in any high-reliability organisation. It doesn’t work if leaders practise what L. David Marquet described as ‘Know All – Tell All’ leadership.
- Formality in communication. To minimise any possibility that instructions are unclear or misunderstood at critical moments, operators in high-reliability environments, like aircraft cockpits, are required to communicate in a prescribed manner. They use checklists. They require those giving direction to state their intentions clearly and recipients to repeat back instructions verbatim. By formalising this process, they eliminate communication likely to result in inattention, misplaced assumptions or procedural error.
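The read-back discipline in the last principle can be sketched in a few lines: an instruction is acted on only when the recipient’s repeat-back matches it word for word. This is an illustrative sketch, not any real military procedure; the function names and the whitespace/case normalisation rules are my own assumptions.

```python
def normalise(msg: str) -> str:
    # Collapse whitespace and case so that only differences in the actual
    # wording count as mismatches, not typing or radio-transcription noise.
    return " ".join(msg.split()).lower()


def readback_confirmed(instruction: str, readback: str) -> bool:
    """Accept an instruction only when the read-back repeats it verbatim."""
    return normalise(instruction) == normalise(readback)
```

The value of the formal protocol is that a near-miss such as ‘one five zero’ read back as ‘one five five’ is caught at the moment of communication, not discovered later as an operational error.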
A 2015 HBR article identified that military cyber security breaches caused by human error invariably involved a breach of one or more of these six principles. Military commanders should consider what these six principles mean in their own organisations and ask themselves these questions:
- How do I conduct spot checks on procedure and behaviour?
- How do I respond to lapses in standards and behaviours?
- What training programmes are in place for behavioural aspects of cybersecurity?
- How frequently are those programmes refreshed?
- How do we rehearse responses to high-risk scenarios?
- Are the tasks exposing us to risk of cyberattack identified and subject to formal processes to ensure consistent safe practice and oversight?
Military commanders must ensure accountability is widely shared, making everyone responsible for ‘safety’ and stewardship of military data networks. They should ensure high-reliability behaviours are embedded in day-to-day routine to the same extent as keeping personal weapons clean and operational. This means everyone must know and comply with basic rules and standards, irrespective of rank.
When failures occur – and they will – military organisations must treat unintentional, occasional errors as opportunities to learn and to correct the processes that allowed them to occur. Commanders must nurture a culture in which people will speak up when they make mistakes or notice others doing so, while at the same time being publicly intolerant of people who deliberately ignore well-intentioned standards and procedures. A Formula 1 client of ours adopted a version of the ‘cockpit resource management’ model to help engineers and manufacturing teams listen to their intuition, identify early causes of defects and take corrective action. They reward contributions to the safety framework and have hard evidence of improved reliability and reductions in safety-critical failure.
Communications security depends on leadership, technology and high-reliability people. Technology alone cannot defend military communications networks; reducing human error is at least as important. Building and maintaining a culture of high reliability will help, but it is labour-intensive for leaders at all levels. The return on investment in building high-reliability organisations may be difficult to measure, but it is ultimately worthwhile, as military communications security and thus valuable lives depend on it.