PRE-EMPTIVE FORENSICS: An Interview with Dan Solomon

by Robert Vanaman, MBA, MS

Robert: Dan, would you agree, as asserted by Chris Hargreaves in his article “Pre-Emptive Digital Forensics Research”, that pre-emptive digital forensics refers to any research conducted not in response to a current investigation, but rather in order to acquire knowledge in advance of encountering a particular technology in a real investigation?

Dan: I would agree when considering digital forensics. I would also offer an expansion on this definition beyond technology, and not confine it to digital aspects. I would suggest that many of the causes of security failure, when considering prevention and response, are not technology related, and therefore a holistic consideration of security needs to embrace the physical and human flaws in the systems. If a pre-emptive approach is adopted, then it should aim to pre-empt any factor that can cause a breach and constrain response. If the broad nature of security is considered, then a specific focus on technology is not enough to prepare an organization. It may also not be enough to justify the investment by many firms unless it offers more guarantees for fixing problems. I therefore advocate a definition that incorporates processes and procedures as much as technology. This is justified by the many post-breach investigations that eventually invest as much effort in examining processes and procedures as in the technical areas where failure has occurred.

Robert: Considering the hypercompetitive environment (read: price wars) in which most technology firms exist, what persuasive methods do you employ to convince CEOs and CIOs to commit the necessary resources that, as you state, “justify the testing and exercising” in “analyzing causes of security failure[s]”?

Dan: There are a variety of arguments that can be presented, and most need to focus on protecting valuable intangible assets, which include reputation and share price, or on reducing current or future costs. For high-tech firms that have invested huge sums in R&D, and continue to do so, there is a very compelling case for checking and ensuring that they are not subject to industrial espionage. When technology firms are under such competitive pressures, they are very vulnerable to aggressive competitors that may offer similar technology at lower prices. The loss of IP and blueprints to a competitor could be catastrophic. But the issue of espionage is very difficult to ‘frame’, because it is less evident when a firm has been compromised than in a clear-cut data breach. For the purposes of countering espionage, it is important to conduct ongoing investigations and forensic monitoring of the organization’s infrastructure for signs of espionage.

To come back to your question, there are several ways of justifying the investment in testing and exercising. The first is to take a quantitative approach, by building cyber scenarios to illustrate the potential impact of a specific type of attack. Once senior executives are made aware of the potential impact of a breach, they have a very relevant point of reference for considering the level of expenditure they can justify allocating for security. Building scenarios challenges executives to consider factors that are typically unknown, and invariably they raise the question of ‘how prepared are we for this?’. The issue really boils down to trust and assurance in the capabilities that are in place, and there is no way to build assurance unless testing takes place. This also points towards assessing the effectiveness of legacy investments before deciding what needs to be changed, or whether new layers need to be added.
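The quantitative scenario approach described above can be illustrated with the classic annualized loss expectancy (ALE) formula commonly used in risk assessments: single loss expectancy (asset value multiplied by exposure factor) scaled by the expected annual rate of occurrence. This is a standard framing device, not Dan's specific method, and all figures below are hypothetical:

```python
# Sketch of a quantitative breach scenario using ALE, a standard
# risk-assessment formula. All numbers are hypothetical examples.

def single_loss_expectancy(asset_value, exposure_factor):
    """SLE: expected loss from a single occurrence of the scenario."""
    return asset_value * exposure_factor

def annualized_loss_expectancy(sle, annual_rate_of_occurrence):
    """ALE: SLE scaled by how often the scenario is expected per year."""
    return sle * annual_rate_of_occurrence

# Hypothetical scenario: theft of R&D blueprints via espionage.
asset_value = 10_000_000   # estimated value of the IP at risk
exposure_factor = 0.4      # fraction of that value lost in one breach
aro = 0.25                 # breach expected once every four years

sle = single_loss_expectancy(asset_value, exposure_factor)
ale = annualized_loss_expectancy(sle, aro)
print(f"SLE: ${sle:,.0f}")  # SLE: $4,000,000
print(f"ALE: ${ale:,.0f}")  # ALE: $1,000,000
```

A figure like this gives executives the "point of reference" Dan mentions: annual security spending can be weighed against an annualized expected loss rather than argued for in the abstract.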

A second approach is to show that testing and exercising is the best way to prioritize where to invest in the future. Executives are faced with a situation where IT is demanding more budget, but no one offers guarantees of resilience, so executives have no way of independently verifying whether they are investing in the right controls and measures. It is quite common that firms are investing in measures that are not required and focusing their investment in the wrong areas. They may well be missing the greatest problem, and that problem may not require a new system or technology. So through the demonstration of vulnerabilities, they can not only identify exactly where they should be prioritizing their investment, which to many firms represents a saving; they can also identify a program of medium- to long-term priorities that require attention, which gives them some insight into their future budgetary requirements against a hierarchy of needs.

Robert: To expose static security measures, to defeat new and sophisticated threats, to provide a greater guarantee that vulnerabilities have been proven or uncovered, and to expose the weaknesses of single-faceted security measures, would you advocate the internal creation or the external use of “Red Teams*”?

Dan: The creation of internal Red Teams is very difficult precisely because they are ‘internal’. They are exposed to the internal organizational environment, they are aware of the historical problems, and they can easily be biased in their approach to attacking. This becomes a greater problem over time, as they struggle to bring a fresh approach to each test-and-exercise cycle, and they tend to simulate threats that the firm has anticipated, so there is no real-world element of surprise. Many of their activities boil down to providing quality assurance of other internal teams and conducting penetration testing. Red Teaming differs considerably from penetration testing, particularly when factoring in the testing of physical security and human firewalls for the purpose of simulating APTs.

In order to gain value from a red team, it needs to bring a ‘real-world’ approach that introduces surprise and ‘unknowns’ to corporate defenses to see how they cope, and this is increasingly difficult for an internal team that is neither a specialist red team nor a team that develops its methods and techniques against varied and different types of defenses. There are other internal ‘political’ aspects that become problematic when one [red] team is trying to expose the weaknesses of another [blue] team. An important aspect of the red team is that it keeps up to date with modern attacker methods, and this is very difficult for an internal red team; an example is the skill and time required to write custom malware for red-team purposes. Ultimately, it would be very rare that an organization could justify supporting and financing an internal red team when it typically lacks the resources for comprehensive detection and response capabilities, or for an enhanced program for countering e-espionage.

Robert: Do you believe that studying, analyzing, and integrating lessons learned from attacks on other companies and organizations would benefit a corporation’s cyber defense team? If so, where would one go to find and examine such material?

Dan: In theory I do, but in reality the material is very limited, and therefore the value is limited. While there is already greater sharing of threat intelligence and reporting of security breaches, it is almost inconceivable that a company would make sufficient data and evidence about a breach available for other companies to learn from it. Case studies are useful to illustrate specific points, but they do not illustrate the real nature of a cyber-crisis in a form from which companies can plan response strategies. There is a strong argument for generating learning material, running workshops and master classes, and supporting the learning process, particularly as part of a security awareness campaign or to support a scenario-building process. However, this method of learning never achieves the comprehensive shifts in organizational attitudes and perceptions that are observed after a real crisis, or a simulated crisis. There is no substitute for learning by experience, and some desktop exercises are criticized for being too benign. So I would always recommend conducting a breach simulation that is as close to reality as possible, particularly if you are adopting a pre-emptive forensic approach, because this is the only method that will generate sufficient data for analysis.

Robert: Do you consider system-based or human-enabled cyber threats to be the most prevalent and/or dangerous for most organizations? If you agree that human-enabled cyber threats constitute some part of all e-security and e-sabotage issues, do you consider internal or external human threats to be the more serious?

Dan: This will always be a fascinating question. Many commentators feel that the human threat is the most prevalent, and you need only look as far as Snowden to see what a rogue insider can achieve with the right access and motivation. Whether knowingly or unknowingly, human ‘gateways’ are without doubt the most prevalent vulnerability, and also the most difficult problem to fix. If you adopt a pre-emptive forensic approach to the analysis of human vulnerabilities, you soon realize that it is much more complicated than working with digital evidence and technology performance.

However, a number of breaches have surfaced recently that had been in place for over three years, specifically for the purpose of e-espionage. These threats are higher impact, simply because of the sheer volume of data they have exfiltrated from organizations and their ‘advanced’ ability to persist unnoticed. I would like to think that a malicious human threat would not successfully persist for years without being noticed, but advanced tools for espionage are becoming increasingly sophisticated and will become harder to find in the future. Nevertheless, I am a committed believer that good security needs to be ‘converged’: both the human and the system elements must be addressed, as they are completely interdependent.

*A Cyber Red Team is an independent group that simulates attacks on an organization in order to test and prove the effectiveness of its security measures and controls.


About the author

Robert E. Vanaman, Microcomputer Consulting Professional, Beta-tester at eForensics Magazine

Robert has been a microcomputer consulting professional since 1983. He formed his consulting firm, MicroTraining, in 1985, where he designed RDBMSs and their associated programs. He has instructed at the collegiate level for over a decade and within the business community for over a quarter of a century. He holds an M.S. in Database Systems and an M.B.A., both from the University of Maryland University College (UMUC).


September 5, 2014
© HAKIN9 MEDIA SP. Z O.O. SP. K. 2013