Simplifying Security
Applying Philosophical Razors to Cybersecurity
Recently, I’ve been reading various books that discuss the importance of having a set of operating principles. These principles serve as a strong foundation for life, providing a reliable framework for making decisions and handling different situations, whether personal or professional. While researching principles, I came across the concept of “Razors” in a tweet from George Mack. Razors are principles or rules of thumb that allow us to quickly analyze and eliminate unlikely explanations, or to avoid unnecessary actions. You can think of Razors as mental shortcuts that help with problem solving, decision making and critical thinking by reducing the complexity of a situation.
I’ve decided to adapt these to the context of cybersecurity, as I believe they can be very beneficial for security practitioners.
When it comes to Razors, you might be familiar with Occam’s Razor, a fundamental concept in science and critical thinking that emphasizes simplicity and parsimony when evaluating explanations or theories. It is named after the English philosopher William of Ockham (c. 1287–1347).
A great example of Occam’s Razor can be found in medicine, in the principle “when you hear hoofbeats, think horses, not zebras”: when diagnosing a patient, doctors first consider the simplest and most common explanations for the symptoms (horses represent the common, most likely conditions; zebras the rare, less likely ones).

So how can we apply Occam’s Razor in cybersecurity?
- The simplest explanation: When analyzing a security incident, consider whether the most straightforward explanation (e.g., a malicious actor exploited a known vulnerability) is more likely than a complex one (e.g., a sophisticated nation-state attack). Can we simplify further? Yes: how often does the root cause of an incident turn out to be a misconfiguration? It’s common to assume you are under attack when, in reality, the latest update to the CI/CD system broke something or a change in a service is generating more traffic than expected. When testing a security hypothesis, prefer simpler explanations over more complex ones (see the triage sketch after this list).
- Prioritize simplicity in design: When designing security systems, protocols and processes, favor simplicity over complexity. This reduces the risk of errors, improves maintainability and testability, and makes the system easier to understand and implement. The simpler the system, the smaller the attack surface: fewer opportunities for vulnerabilities, abuse and errors. A simpler system has fewer components, interfaces and interactions, and as threat modeling demonstrates, the interactions between components are where threats manifest. Fewer components and interactions mean fewer potential threats.
“Simplicity is the ultimate sophistication.”— Clare Boothe Luce
- Avoid unnecessary assumptions: In threat analysis, don’t assume that a particular actor or group is responsible for an incident without sufficient evidence. Start with the simplest explanation (e.g., a single individual or a known adversary group) and add complexity only when the evidence supports it. It’s common for people to assume that whichever actor is in the latest news must be behind their incident.
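To make this concrete, here is a minimal sketch in Python of what “prefer the explanation with the fewest assumptions” could look like during alert triage. The hypothesis names, assumption counts and base rates are all hypothetical, invented purely for illustration; the only point is the ordering logic.

```python
# Hypothetical Occam-style triage helper: rank candidate explanations by how
# many unverified assumptions they require, then by how common they are.
from dataclasses import dataclass


@dataclass
class Hypothesis:
    name: str
    assumptions: int   # unverified claims this explanation depends on
    base_rate: float   # rough share of past incidents with this root cause


def rank(hypotheses):
    # Fewest assumptions first; ties go to the more common cause.
    return sorted(hypotheses, key=lambda h: (h.assumptions, -h.base_rate))


candidates = [
    Hypothesis("Misconfiguration in the latest deploy", 1, 0.40),
    Hypothesis("Commodity actor exploiting a known CVE", 2, 0.25),
    Hypothesis("Sophisticated nation-state intrusion", 5, 0.01),
]

for h in rank(candidates):
    print(f"Investigate next: {h.name}")
```

In practice these “scores” live in your head rather than in code, but the discipline is the same: add complexity only once the simpler hypotheses have been ruled out.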
Another useful razor is Hanlon’s Razor, a relatively modern aphorism attributed to Robert J. Hanlon, whose submission appeared in the compilation “Murphy’s Law Book Two: More Reasons Why Things Go Wrong!” (1980).
Hanlon’s principle states: “Never attribute to malice that which can be adequately explained by stupidity.” In this context, stupidity can also be understood as ignorance, not knowing, a lack of awareness. Here are a couple of examples of its application:
- Error/cause analysis: When investigating security incidents, consider human error before assuming malicious intent. As discussed above, misconfigurations and mistakes often lead to vulnerabilities. This is what brings Hanlon’s Razor close to Occam’s: human error is usually the simpler explanation compared with “we are under attack”. Many incidents turn out to be users who are unaware of a company policy and perform a task using a program or website that is not allowed, which the security team then identifies as an attack.
- User behavior: Interpret human actions as ignorance or error rather than deliberate malice. Hence the importance of educating users on common security practices instead of assuming they are intentionally bypassing protocols. How many times have you heard a security team member complain that users are bypassing controls or not following processes, and when you ask whether the users were trained on that process, the answer is “No”? Hanlon’s Razor is very useful for avoiding unnecessary conflict and escalations by not attributing to malice what ignorance explains, as sketched below.
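As a companion to the earlier sketch, here is one way a Hanlon-style check could look before an incident is escalated as malicious. The event fields (user_trained_on_policy, recent_change_window) are invented for the example; what matters is the order of the checks: rule out unawareness and error before treating the activity as an attack.

```python
# Hypothetical pre-escalation check: exhaust the "ignorance or error"
# explanations before labelling activity as malicious.
def classify(event: dict) -> str:
    # Was the user ever trained on the policy they appear to be violating?
    if not event.get("user_trained_on_policy", False):
        return "likely unawareness: schedule training, no escalation"
    # Did the anomaly coincide with a recent deploy or configuration change?
    if event.get("recent_change_window", False):
        return "likely error/misconfiguration: route to the owning team"
    # Only after the simpler explanations fail, investigate for malice.
    return "escalate for malicious-intent investigation"


print(classify({"user_trained_on_policy": False}))
print(classify({"user_trained_on_policy": True, "recent_change_window": True}))
print(classify({"user_trained_on_policy": True, "recent_change_window": False}))
```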
That’s all for now. In the next article, I’ll explore more razors and principles that are highly valuable for cybersecurity. Remember… if you hear hoofbeats, think horses, not zebras.