Wherever there is an operational requirement for security, there tends to be a legal obligation. Wherever there is a legal obligation for security, there is always an operational requirement. Security law and operational security practices are essentially twinned and indivisible.
There are many situations where the law imposes requirements for security; examples include the protection of critical national infrastructure and the protection of personal data. Many readers will be able to identify the broad thrust of these obligations, for example the need to take reasonable care to protect computers and communications systems from cyberattacks and to protect confidential data from misuse. Taking reasonable care might be described as taking steps and measures for security, or applying security controls. Borrowing some of the legal language in this area, we often talk about implementing appropriate technical and organisational measures for security, or words to that effect.
However, the full parameters of the legal duties for security are not always clear, especially when we look at the legislation itself. Legislative requirements for security are invariably drafted at a high level, mainly to ensure that the law is “future-proof”, but this leaves those under a duty with a puzzle: where do they find the full detail within the law?
The gap is partially filled by case law, that is, the decisions of judges and tribunals in legal proceedings, and by the guidance provided by regulators, but even in the most active legal system many gaps will remain. The legal system itself can only take things so far. In fact, the law as we currently understand it has barely scratched the surface of what needs to be done to satisfy the duties that it has set. We need to resolve this puzzle.
Take the GDPR as an example. The idea of “appropriate technical and organisational measures” appears in a number of places in the legislation, including in Article 32, which is concerned with the security of processing of personal data. Article 32 requires controllers and processors of personal data to consider a number of measures, including the pseudonymisation and encryption of personal data; the ability to ensure the ongoing confidentiality, integrity, availability and resilience of processing systems and services; the ability to restore the availability and access to personal data in a timely manner in the event of a physical or technical incident; and a process for regularly testing, assessing and evaluating the effectiveness of technical and organisational measures for ensuring the security of processing. These requirements clearly cover a lot of ground, and it does not take much imagination to recognise that they will affect many more aspects of a business than simply the technology and data that need to be secured. Once we ponder Article 32 for a little while, we recognise the puzzle for what it is: Article 32 sets out a series of requirements whose details it does not fully explain.
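To ground this in operational terms, here is a minimal sketch of what one narrow slice of those Article 32 measures, the pseudonymisation and encryption of personal data, might look like in practice. It is an illustration only, not a statement of what the Regulation requires: the choice of Python, the cryptography library, the key-handling arrangements and the field names are all assumptions made for the purposes of the example.

# Illustrative sketch only: pseudonymisation and encryption of personal
# data, two of the measures named in GDPR Article 32. The library, keys
# and field names are assumptions for this example, not legal requirements.
import hmac
import hashlib
import json

from cryptography.fernet import Fernet  # third-party: pip install cryptography

# In practice these keys would be held in a key-management system,
# separately from the data, never hard-coded in source.
PSEUDONYM_KEY = b"example-secret-key-held-separately"
ENCRYPTION_KEY = Fernet.generate_key()

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a keyed hash.

    The mapping can only be reproduced by someone holding PSEUDONYM_KEY,
    which is kept apart from the pseudonymised data itself.
    """
    return hmac.new(PSEUDONYM_KEY, identifier.encode(), hashlib.sha256).hexdigest()

def encrypt_record(record: dict) -> bytes:
    """Encrypt a record at rest using symmetric (Fernet) encryption."""
    return Fernet(ENCRYPTION_KEY).encrypt(json.dumps(record).encode())

def decrypt_record(token: bytes) -> dict:
    """Restore the record, supporting timely restoration of access to the data."""
    return json.loads(Fernet(ENCRYPTION_KEY).decrypt(token))

if __name__ == "__main__":
    record = {"subject": pseudonymise("jane.doe@example.com"), "order_total": 42.50}
    stored = encrypt_record(record)
    print(decrypt_record(stored))

Even this toy example makes the wider point: the statute names the measure, but every meaningful detail, the algorithm, the key management, the separation of keys from data, is supplied by operational security practice rather than by the legal text.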
So we have to look at security law in a different way if we are going to understand the full detail of its requirements. My solution is straightforward and easy to grasp: the details of security are found in the requirements of operational security itself. In other words, wherever a legal duty for security exists, that duty defers to the requirements of operational security to set its details. Operational security experts own the legal details.