The effort required for a successful cyber-attack bears no comparison to the effort invested in building a successful business. It takes only one human error to reduce many years of hard work to dust. Verizon research suggests that in 60% of breach incidents, the attacker succeeds within minutes. In the many years I have helped companies deal with cyber-attacks, it has almost always been the case that a single person performed an undesirable action that in turn let the attackers onto the network.
I have been intrigued by the most common human errors identified by the IBM Cyber Security Intelligence Index, so I decided to review them using the taxonomy of operational cyber security risk. It is a common belief that the human factor is the weakest link in any cyber security protection programme. But are we looking at the problem from the right angle?
The problem
Whilst looking at what the IBM report suggests are the biggest sources of human error, I would like to explore the depth of the problem and the solutions that would help reduce the attack surface. Moreover, I will explore solutions that, in my opinion, have long been ignored.
System configuration
System configuration errors can be looked at in two ways: action, or lack of action, taken by people. On many occasions, trained and experienced security professionals have mistakenly configured systems. We also have security professionals who are not appropriately trained but who end up configuring systems, which inevitably leads to configuration errors. On the flip side, we have security professionals who simply do not take the actions they are supposed to take. The Google Hacking Database is a great example of the consequences of system configuration errors. Poor configuration can lead to the discovery of sensitive directories, error messages that reveal key information about the system, exposure of devices online, exposure of backup files, exposure of files containing sensitive information and much more.
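To make this concrete, the sort of exposure catalogued in the Google Hacking Database can often be caught with a simple self-check before an attacker finds it. Below is a minimal sketch in Python that probes a host for a handful of commonly exposed paths; the path list and target URL are illustrative assumptions, not an exhaustive scanner, and it should only ever be run against systems you are authorised to test.

```python
# Minimal sketch: probe a host you own for a few common misconfiguration
# exposures (backup files, leaked metadata). Paths and URL are illustrative.
import requests

COMMON_EXPOSURES = [
    "/.git/config",   # leaked version-control metadata
    "/backup.zip",    # forgotten backup archive
    "/.env",          # environment file that may contain credentials
    "/phpinfo.php",   # verbose diagnostic page revealing system details
]

def check_exposures(base_url: str) -> list[str]:
    """Return the paths on base_url that respond with HTTP 200."""
    found = []
    for path in COMMON_EXPOSURES:
        try:
            resp = requests.get(base_url + path, timeout=5, allow_redirects=False)
            if resp.status_code == 200:
                found.append(path)
        except requests.RequestException:
            pass  # host unreachable or timed out; skip this path
    return found

if __name__ == "__main__":
    for path in check_exposures("https://example.com"):
        print(f"Potentially exposed: {path}")
```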
Solving the problem of poor configuration can be complex but is certainly feasible. The idea of equipping security professionals with more powerful tools is a noble one. However, a powerful tool in the hands of someone who is not fully trained can cause more damage than anticipated. Stephen Bonner, a partner in the cyber security team at KPMG, argues that tools only amplify the knowledge of the security professional: a tool in the hands of a poorly trained security professional can be detrimental to the organisation being protected. I personally believe poor system configuration can also be caused by external factors other than training and knowledge. These will be discussed later in this article.
Poor patch management
Some security professionals mistakenly regard patch management as simply applying updates as they are released. Patch management involves a lot more than that. Successful patch management should contain at least the following elements: a Configuration Management Plan, a Patch Management Plan, Patch Testing, a Backup/Archive Plan, an Incident Response Plan and a Disaster Recovery Plan.
Patch application has traditionally been regarded as the role of the IT help desk. Yet it is not uncommon that end users are allowed to install browser add-ons. In small to medium-sized enterprises, it is very likely that users will be left to manage their own choice of browser plugins. What this means is that part of the responsibility for patch management is passed to the end user. This can be a serious problem, as most users will not understand what patching entails. Patching programmes are likely to fail through lack of knowledge, and through inaction by those who are supposed to act.
Having said that, even trained users could fail to successfully apply patches. This will be further explored when we discuss the solution to human error.
Disclosure of regulated sensitive information via incorrect email address
Whilst this sort of error is not very common, sending a message to an unintended party can have negative consequences. There are software solutions that can be implemented to reduce the risk of sending an email to the wrong address. Could it also be the case that a user under pressure mistypes an email address?
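As an illustration of the kind of guard such software can apply, the sketch below flags recipients outside the organisation's own domain before a message carrying regulated data is sent. The domain name and addresses are hypothetical assumptions for illustration, not a real data loss prevention product.

```python
# Minimal sketch: warn before regulated data leaves the organisation's domain.
# The internal domain and the recipient list are assumed for illustration.
INTERNAL_DOMAIN = "example.com"

def external_recipients(recipients: list[str]) -> list[str]:
    """Return the addresses that do not belong to the internal domain."""
    return [r for r in recipients
            if not r.lower().strip().endswith("@" + INTERNAL_DOMAIN)]

recipients = ["alice@example.com", "bob@exanple.com"]  # note the mistyped domain
flagged = external_recipients(recipients)
if flagged:
    print("Warning: this message would leave the organisation via:", flagged)
```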
Opening infected attachment or clicking URLs
Social engineering is a powerful tool and, since it was introduced as a concept in cyber security, it has developed considerably. Malicious users have become very good at crafting their messages to convince the end user to open attached documents. These attachments come in various shapes and formats. I have even seen a double extension, where the malicious document pretends to be a voicemail but is in fact an executable. Malicious documents are still a popular attack vector.
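The double-extension trick in particular is easy to screen for at a mail gateway. The following sketch flags attachment names whose visible "document" extension hides an executable one; the extension lists are illustrative assumptions, not an exhaustive rule set.

```python
# Minimal sketch: flag attachments like "voicemail.wav.exe" where a decoy
# document extension hides an executable one. Lists are illustrative only.
EXECUTABLE = {".exe", ".scr", ".js", ".vbs", ".bat", ".cmd"}
DECOY = {".wav", ".mp3", ".pdf", ".doc", ".docx", ".xls"}

def is_double_extension(filename: str) -> bool:
    """True for names whose penultimate extension is a decoy document type
    and whose final extension is executable."""
    parts = filename.lower().rsplit(".", 2)
    if len(parts) < 3:
        return False
    decoy, real = "." + parts[1], "." + parts[2]
    return decoy in DECOY and real in EXECUTABLE

print(is_double_extension("voicemail.wav.exe"))  # True - flag this attachment
print(is_double_extension("report.pdf"))         # False - single extension
```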
Having spoken to some end users directly, I have found that many of them are under the illusion that if a message were malicious, the corporate anti-virus would have detected it. Students I have spoken to tend to open documents they do not trust on university computers instead of their own. In doing so, they argue, it is reasonable to protect their own computers at the University's expense, as they believe the University is (or should be) well equipped to deal with malware.
A Verizon report reveals that it takes 82 seconds from the start of a phishing campaign until a computer is infected. This is an alarming observation. Clearly, the messages about malicious actors and their tactics have not received the attention they deserve.
Use of default usernames/passwords or easy-to-guess password
A recent survey found that 73% of online accounts are guarded by duplicated passwords. It also found that 21% of people use passwords that are over 10 years old, whilst 47% of people use passwords that are at least 5 years old. The survey further found that people re-use the same password across multiple accounts. This gives an attacker the ability to gain access to multiple systems by breaching just one.
There are too many systems online that expose the use of default settings. The use of default settings is an open door to malicious users. The problems posed by weak password practices can largely be eradicated. Whenever possible, a strong password policy should be in place and enforced. Even when salted password hashing is used, duplicate passwords should not be allowed. Companies should check their passwords against databases that have been made public from the various security breaches. Multi-factor authentication should be used whenever possible.
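Checking passwords against publicly breached corpora does not require storing those corpora yourself. The sketch below uses the Pwned Passwords range API, which works by k-anonymity: only the first five hex characters of the password's SHA-1 hash ever leave the machine. The example password is an assumption for illustration.

```python
# Minimal sketch: check a candidate password against the Pwned Passwords
# corpus via its k-anonymity range API. Only a 5-character hash prefix is sent.
import hashlib
import requests

def breach_count(password: str) -> int:
    """Return how many times the password appears in known breaches."""
    sha1 = hashlib.sha1(password.encode()).hexdigest().upper()
    prefix, suffix = sha1[:5], sha1[5:]
    resp = requests.get(f"https://api.pwnedpasswords.com/range/{prefix}",
                        timeout=10)
    resp.raise_for_status()
    # Response lines have the form "SUFFIX:COUNT"; find our hash suffix.
    for line in resp.text.splitlines():
        candidate, _, count = line.partition(":")
        if candidate == suffix:
            return int(count)
    return 0

if breach_count("Password123") > 0:  # assumed example password
    print("This password appears in known breaches; reject it at enrolment.")
```

A check like this can be run at password-change time, so the plaintext never needs to be stored or compared against a local breach database.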
Solution
Many people would argue that developing a strong security culture is the solution, but it seems to me that 'security culture' is wrongly equated with security awareness training programmes. Much emphasis is put on security awareness. I would certainly agree that security awareness has value, but it is only one part of the puzzle. In fact, how many people measure the effectiveness of their security awareness programmes? The article "Creating the Culture of Security" presents ten points that can be followed to create a truly successful security culture.
Further to building a security culture, I have learnt from engineers in the field (especially in control system environments) that work-related stress is a major cause of human error. Whilst people would be happy to adopt a security culture, it appears that many do not have the working conditions required to achieve a state of mind that helps them adopt it. Research by the University of Cambridge outlines the major causes of stress. The pressure of selling or making a profit can create unrealistic deadlines and expectations; the workload becomes unmanageable, hence stressful and prone to errors. Lack of control over workload has often been a cause of stress: many workers acknowledge that they would be more productive and less prone to stress if they were given more control over their work, and if their suggestions were taken on board. An aggressive management style is probably the number one cause of stress. The research identifies further sources of stress beyond these.

My argument is that the errors observed in cyber security that are then exploited by malicious actors could be attributed to these stress factors, rather than to the security culture alone. As shown in Figure 1, it is important for people to stay in the right band of stress. Exhaustion, anxiety, panic, anger or breakdown will inevitably put people in situations where they are prone to making errors.

Figure 1 – Stress curve

To conclude, human error can be significantly reduced. For this to be possible, a strong security culture should be in place. Moreover, the people involved in or affected by any security programme should be kept within the band of positive stress; negative stress is a major cause of human error. The wellbeing of workers should be ensured to reduce the number of human errors. Even the most well-versed security professionals are prone to error when their working conditions are not conducive to a positive outcome.
 
To contact Nettitude’s editor, please email media@nettitude.com.