What is Legacy IT?
There are many consequences of running legacy technologies, and they often result from management decisions based on false economies. However, in this article I am only going to focus on the impact on cyber security and how legacy systems may leave your organisation vulnerable to cyber-attacks.
Before considering the cyber security challenges that legacy IT poses, it is worth defining what legacy IT is. Legacy systems are often defined as old methods, technologies, computer systems, or application programs which are outdated or in need of replacement. I am sure that everyone recognises this in the workplace, and there are many reasons why IT departments have to run older technologies.
What impact does Legacy IT have on Cyber Security?
Potentially the most dangerous aspect of legacy technologies is that vendors no longer provide any support for that version of the product. When this happens, even newly discovered security flaws will not be fixed, leaving the systems permanently unpatchable. Resolving this problem is complex, especially when it comes to infrastructure software (e.g. operating systems, databases, etc.) on which mission-critical business applications are running.
Even if you are operating technologies which are fully supported by the vendors, it is critical to apply all security-related patches in a timely manner and to follow good practice. A good example of what happens when you do not is the cyber breach at TalkTalk (a UK Internet Service Provider) in October 2015. It has been widely reported that the hackers obtained customer data using a SQL injection attack, a technique which has been known about for over a decade and could easily have been prevented.
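To illustrate the class of flaw involved (this is a generic sketch, not TalkTalk's actual code, which has never been published), SQL injection typically arises from building queries by string concatenation, and is prevented by parameterised queries:

```python
import sqlite3

# A throwaway in-memory database with one illustrative record
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, email TEXT)")
conn.execute("INSERT INTO customers VALUES (1, 'alice@example.com')")

user_input = "' OR '1'='1"  # a classic injection payload

# Vulnerable: concatenation lets the payload rewrite the query logic
vulnerable = conn.execute(
    "SELECT email FROM customers WHERE email = '" + user_input + "'"
).fetchall()

# Safe: a parameterised query treats the payload as plain data
safe = conn.execute(
    "SELECT email FROM customers WHERE email = ?", (user_input,)
).fetchall()

print(vulnerable)  # returns every row -- the filter was bypassed
print(safe)        # returns no rows -- the payload matched nothing
```

The fix has been standard practice for well over a decade, which is precisely the point: the vulnerability persists in legacy code, not in current knowledge.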
In my opinion, ignoring this problem is not an option because it not only endangers your own organisation but potentially that of your customers, partners and everyone else connected to the Internet. Vulnerable and unpatched systems provide cyber criminals, activists and hackers with a platform to attack others. Previously, I have discussed how to approach this challenge in my article titled Unsupported Software: Determine the road blocks and devise an action plan to remove them.
Technologies seem to change at an ever-increasing pace, largely driven by the technology vendors and the desire of individual consumers to have something new at least every year. However, corporate IT departments are genuinely unable to keep up with this rate of change, and furthermore it is not needed (or desired) by their businesses.
The vendors are driven by their investors to make more money, which requires more sales. Individual consumers have been conditioned by the vendors to expect new products on a frequent basis, and at the very least a new version every year. Given the rush to get products to market, does this mean that these products have not been designed, built and tested as well as they might have been? If so, which elements of the products have suffered? In my almost 30-year career as a technologist (in both development and infrastructure), I have learned that it will not be the functionality desired by end users that is sacrificed, but the security requirements, which are viewed as non-functional.
In the past, all too often security was viewed as a barrier and ignored at the beginning of a project and rather than being designed into the solution, it had to be retrofitted (see Taking a ‘Secure by Design’ approach to technology solutions is key to unlocking value). Unfortunately, when you are forced into retrofitting anything into a solution, you will end up having to make some undesirable compromises and it will cost significantly more.
Upgrading an entire organisation's infrastructure is a complex task which most corporate IT departments can only cope with every four to five years. This is an entirely reasonable approach and makes good business sense, as all key vendors fully support their products for longer than this. However, if the planned upgrade cycles are delayed because of other priorities or financial constraints, this can result in challenges around security vulnerabilities and the availability of patches. What is key here is to know how long the complete infrastructure refresh programme will take, and to start in time to finish before vendor support ends. Many organisations failed to take this approach when upgrading from Windows XP and found themselves with their backs against the wall.
However, this is not the image that the technology vendors and commentators portray of the current situation in large enterprises. They focus on newer technologies and methods such as Agile, Big Data and Cloud, while ignoring the challenges of the existing legacy estates.
It is not unusual for legacy applications to have little or no security built into them. For example, if an application does not have individual logins, it will be extremely difficult, and potentially impossible, to determine which individuals completed which actions. Additionally, many older applications do not log the key actions of their users and so cannot provide an application audit trail, which is essential if the application is processing financial or other business-critical transactions.
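To show what such an audit trail looks like in practice, here is a minimal sketch (the function and field names are my own invention, not taken from any particular application) that records who did what to which record, using Python's standard logging module:

```python
import logging

# A dedicated logger for the audit trail, kept separate from debug logs
audit = logging.getLogger("audit")
handler = logging.StreamHandler()
handler.setFormatter(logging.Formatter("%(asctime)s AUDIT %(message)s"))
audit.addHandler(handler)
audit.setLevel(logging.INFO)

def record_action(user_id: str, action: str, target: str) -> str:
    """Record which individual performed which action on which record,
    so that business-critical transactions can be traced afterwards."""
    entry = f"user={user_id} action={action} target={target}"
    audit.info(entry)
    return entry

record_action("jsmith", "APPROVE_PAYMENT", "invoice-4711")
```

Note that this only works if each user has an individual login: without per-user identity, the `user_id` field is meaningless, which is exactly the gap many legacy applications suffer from.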
Failure to completely decommission old systems which are no longer used also poses a significant cyber security threat. No one cares about such systems because they are no longer used, so it is highly likely that they will not be patched. Over time they become increasingly vulnerable, yet they are still on the network. Common sense says that it should be easy to decommission and remove these systems from the environment, but all too often it just does not happen.
Stop creating more Legacy IT
The most important first step towards resolving the challenges is to stop creating more legacy, otherwise you will be forever chasing your own tail. All new technology implementations need to be forward looking and leverage current technologies.
Technology projects are often complex and start out using current technologies; however, they take so long to deliver that by the time they arrive the technologies are out of date. This is particularly true of software development, but it is not necessarily the sole fault of the development teams. In many organisations, the infrastructure teams are unable or unwilling to provide a clear roadmap of when the underlying technologies will be upgraded to newer versions. On top of this, such roadmaps need to be reliably delivered, so that developers have the confidence that the infrastructure will be available when their applications are ready.
We recommend the creation of a common toolkit of security components, designed to fulfil the requirements detailed in your particular organisation’s security policies and standards. These can then be leveraged by all new applications, which has the following advantages:
- Compliance with Security Policies and Standards – by using these security components, solutions will naturally be compliant because they have been designed to be;
- Improved stability – these security components will be much better tested and exercised as they are being leveraged by an increasing number of solutions and not just one;
- Greater agility – should it be necessary to enhance or modify the Security Policies or Standards, only one set of security components needs to be changed. This reduces both the time-to-market and the associated cost.
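As a hypothetical sketch of what one such shared component might look like (the policy values below are illustrative, not drawn from any real standard), consider a single password-policy check that every application imports rather than reimplements:

```python
import re

# Illustrative policy values -- in practice these would reflect the
# organisation's security standards, maintained in this one place.
MIN_LENGTH = 12

def is_compliant_password(password: str) -> bool:
    """Shared policy check used by every application, so a change to the
    standard is made once here rather than in each codebase."""
    return (
        len(password) >= MIN_LENGTH
        and re.search(r"[A-Z]", password) is not None
        and re.search(r"[a-z]", password) is not None
        and re.search(r"[0-9]", password) is not None
    )

print(is_compliant_password("Tr0ub4dor&3"))      # False: only 11 characters
print(is_compliant_password("CorrectHorse42!"))  # True
```

If the minimum length rises to 16 in a revised standard, one constant changes in one component, and every consuming application is compliant on its next release, which is the agility benefit described above.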
One of the factors helping to prevent more legacy from being created is the trend toward cloud computing, whether private, public or, more likely, some form of hybrid. It is also worth mentioning that corporate IT departments are no longer in control of all the technology solutions being used in their organisation, as highlighted in JC Gaillard’s recent article, 4 Tips for CIOs to deal efficiently with Shadow IT.
Overcoming the Cyber Security Challenges created by Legacy IT
Going back and retrofitting secure solutions into legacy technology is usually extremely complex, although having a common toolkit of security components, as recommended above, will make the change easier. It is therefore necessary to weigh the benefits of making such changes, in terms of appropriately protecting the organisation, against the cost.
If a particular legacy technology has a limited lifetime and is going to be retired, it does not make sense to retrofit security functionality into it. However, this does not mean that you should do nothing: the decommissioning of that legacy technology should be accelerated by looking for alternative solutions (both process and technical) so that it is completely removed from the organisation. Whilst this is being done, it may be necessary to implement controls to mitigate the threats that these legacy technologies pose. Legacy never dies but needs to be murdered, so it is imperative that there is the desire and willingness to make this happen.
In many organisations this will be a huge task, because they have been building legacy for many years. As with all massive tasks, it is all too easy to say that it is just too big and too complex – leaving it for the next person to tackle. This is “King Canute” behaviour, as the associated cyber security threats will not simply go away. The only way to tackle this is one piece of legacy technology at a time, taking each in turn and making the necessary changes. It is always a good idea to get some quick wins early on, if any exist, and to prioritise the functions of the technologies which are most critical to your business.
Find out more about how your business can truly protect its future from cyber threats by contacting Corix Partners. Corix Partners is a Boutique Management Consultancy Firm, focused on assisting CIOs and other C-level executives in resolving Security Strategy, Organisation & Governance challenges.