Security + Observability = Compliance
A simple definition of security is “state of a system free from threats and attacks”. FISMA defines three security objectives: Confidentiality, Integrity, and Availability:
- Confidentiality: Preserving authorized restrictions on information access and disclosure, including means for protecting personal privacy and proprietary information.
- Integrity: Guarding against improper information modification or destruction, including ensuring information non-repudiation and authenticity.
- Availability: Ensuring timely and reliable access to and use of information.
There are also other security objectives, such as non-repudiation and authenticity. Security limits the “observation” of the internal state of the system to only those entities that are authorized.
Compliance, on the other hand, is all about “observing” the internal state of the system and verifying that those internal states are “well behaved”. That means we need to put “hooks” into the system that allow some authorized entity (e.g., an auditor) to “observe” its internal state. Typically these “hooks” are added much later in the development of the system, and bolting them on after the fact introduces security issues.
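As a minimal sketch of what such a “hook” can look like when it is built into the code path from the start (the names `audited`, `AUDIT_LOG`, and `read_customer_record` are hypothetical, not from any particular framework):

```python
import json
import time
from functools import wraps

# Hypothetical sketch: an audit "hook" built into the code path from the
# start. Every call to a sensitive operation emits a structured audit event
# that an authorized entity (e.g., an auditor) can later review.
AUDIT_LOG = []  # in practice: an append-only, access-controlled store


def audited(action):
    """Record who performed which action and whether it succeeded."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(principal, *args, **kwargs):
            event = {"ts": time.time(), "principal": principal,
                     "action": action, "outcome": "success"}
            try:
                return fn(principal, *args, **kwargs)
            except Exception:
                event["outcome"] = "failure"
                raise
            finally:
                AUDIT_LOG.append(json.dumps(event))
        return wrapper
    return decorator


@audited("read_customer_record")
def read_customer_record(principal, record_id):
    # authorization check and actual data access would go here
    return {"record_id": record_id, "read_by": principal}
```

Because the event is written in a `finally` block, failed accesses leave evidence too, which is usually what an auditor cares about most.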
Observability is a concept that R. Kalman introduced in 1960 in the context of control theory (source: https://www.ece.rutgers.edu/~gajic/psfiles/chap5traCO.pdf).
Observability: In order to see what is going on inside the system under observation, the system must be observable.
With observability baked into the development, we can infer, verify, and determine the (internal) behavior of the system using only its outputs.
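In the linear-systems setting Kalman studied, this has a precise test: the internal state can be reconstructed from the outputs alone exactly when the observability matrix has full rank. A minimal sketch, with illustrative matrices that are not taken from the cited chapter:

```python
import numpy as np

# Minimal sketch of Kalman's observability test for a linear system
#   x[k+1] = A @ x[k],   y[k] = C @ x[k]
# (the matrices below are illustrative examples).
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])          # state transition
C = np.array([[1.0, 0.0]])          # we measure only the first state variable

n = A.shape[0]
# Observability matrix O = [C; CA; CA^2; ...; CA^(n-1)]
O = np.vstack([C @ np.linalg.matrix_power(A, k) for k in range(n)])

# The internal state can be reconstructed from the outputs alone
# if and only if O has full rank n.
print("observable:", np.linalg.matrix_rank(O) == n)   # -> observable: True
```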
Both security and observability should be baked into the development of any system from Day 0. When security and observability are baked into the system, compliance and evidence collection for auditors come for free.
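To make the claim concrete: with an audit hook like the one sketched earlier, an auditor’s request such as “show every failed access to customer records” reduces to a query over events the system already produces. The helper below is hypothetical:

```python
import json

# Hypothetical evidence-collection helper over the structured audit events
# emitted by the audit hook sketched above.
def collect_evidence(audit_log, action, outcome="failure"):
    events = (json.loads(line) for line in audit_log)
    return [e for e in events if e["action"] == action and e["outcome"] == outcome]


# e.g., collect_evidence(AUDIT_LOG, "read_customer_record")
```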