After more than 50 years of developing mission-critical systems, the IT industry reached a tipping point at some moment during the last decade, with an interesting characteristic: almost every organization on the planet now has systems, application code, and databases that can be labeled "legacy".

Information Technology, if looked upon as an engineering discipline or research field, is very young compared to most other disciplines. Think of construction, chemistry, even brewing: they are all centuries old, and disruptive evolutions within them have slowed to a manageable pace. Not so in Information Technology!

Enterprise IT has witnessed the times of the mainframes (IBM, Siemens, Bull, ICL, Unisys, etc.), of the minis (Wang, DEC, HP, etc.), of terminal processing, of client-server architectures and GUI interfaces, of the Web and the Internet, and, as many agree, is now entering the era of the Cloud. With each of these waves, TP-processing and application-server technologies, database technologies, programming languages, and development environments evolved in a quite disruptive way.

However, a new generation most often did not replace the previous one; it was put into use on top of it. To complicate things even further, all kinds of complex interfaces had to be built between these systems of various ages and generations.

At the same time, what is called the "Consumerization of IT" is putting pressure on companies and IT staff: users want to work the same way during business hours as they do at home, using iPads, Facebook-like social networking, and Google-like searching across the vast company data stores.

IT companies, small and large, have flourished and, too often, disappeared again over the last half century. The record for the largest market capitalization in history has been broken by IT companies: Apple, once almost bankrupt, took the lead from Microsoft in August 2012.
The IT discipline has re-invented itself over and over again during its first 50 years, and "innovation" has been the name of the game. Innovation led to a Darwinian kind of truth: it is not the strongest of the species that survives, nor the most intelligent, but the one most responsive to change.

As mentioned earlier, however, 50 years of innovation has caused almost every company on the planet, large or small, public or private sector, to have a "legacy" issue. At the same time, there is no slowdown in new requests to support their businesses; quite the contrary.

The following conclusions are inescapable:

  • Currently, enterprises spend on average between 60% and 80% of their IT budget on maintaining existing systems instead of on new developments, and half of these costs relate to the time and resources spent researching and adapting legacy systems.
  • There is a growing understanding that the problem, if not dealt with properly, will increase rather than decrease: every newly adopted innovation risks creating a legacy issue of its own.

Apart from the budgetary implications, legacy environments are considered problematic for several other reasons:

  • Sheer lack of platform support from the (original) vendor, or dramatically increased support and maintenance costs.
  • A diminishing skill base (according to Accenture, 40% of legacy IT workers will retire within the next 5 years).
  • A potential lack of agility, demonstrated by a reduced level of new functionality (as the pace of technology obsolescence accelerates), a lack of flexibility in the face of new business initiatives (digital enablement creating new demands), or a lack of native support for modern enhancements (e.g. web services, mobility, etc.).

The result is that companies face a problem, and it is a problem that will not easily go away, if at all. Many approaches to solving it have been undertaken in the past.

A very widely used approach has been to replace the legacy environment by rewriting the applications or by implementing packaged software. Too often, however, these projects – once over a certain size – have not been successful: the business processes had to be adapted to the supporting application instead of the other way around, the correct specifications could never be documented properly or agreed upon, or the implementation simply became too costly.
Eventually, many such projects were even cancelled.

Organizations have also often tried to componentize, wrap, or package the existing legacy code in such a way that it can be exposed to other, newer technologies via a Service-Oriented Architecture (SOA). While this solution enables flexibility and the opportunity to implement modern enhancements, it does not decrease support and maintenance costs, does not allow an unsupported or poorly supported platform to be decommissioned, and does not fully address the diminishing skill base; a sketch of such a wrapper is shown below.
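
As a minimal sketch of what such wrapping amounts to, the following C# program exposes a hypothetical legacy routine through a plain HTTP endpoint. Everything here is an assumption made for illustration: the LegacyAdapter class, the GETCUST program name, and the port are invented, and a real SOA wrapper would typically sit on a full web-service stack rather than a bare HttpListener.

```csharp
// Minimal sketch of an SOA-style wrapper around a legacy routine.
// All names (LegacyAdapter, GETCUST, the port) are illustrative only.
using System;
using System.Net;
using System.Text;

class LegacyAdapter
{
    // Stand-in for the connector that actually invokes the legacy
    // program on the original platform (e.g. via a TP-monitor bridge).
    public static string Call(string program, string commarea)
    {
        // A real implementation would marshal 'commarea' into the
        // fixed-layout record the legacy program expects.
        return $"{program} executed with [{commarea}]";
    }
}

class ServiceWrapper
{
    static void Main()
    {
        var listener = new HttpListener();
        listener.Prefixes.Add("http://localhost:8080/customer/");
        listener.Start();
        Console.WriteLine("Legacy wrapper listening on /customer/ ...");

        while (true)
        {
            var ctx = listener.GetContext();
            // Translate the modern request into a legacy call ...
            string id = ctx.Request.QueryString["id"] ?? "UNKNOWN";
            string result = LegacyAdapter.Call("GETCUST", id);
            // ... and the legacy reply back into an HTTP response.
            byte[] body = Encoding.UTF8.GetBytes(result);
            ctx.Response.ContentType = "text/plain";
            ctx.Response.OutputStream.Write(body, 0, body.Length);
            ctx.Response.Close();
        }
    }
}
```

The key observation is that the legacy program itself is untouched: the wrapper only places a modern entry point in front of it, which is precisely why the maintenance costs and the skills problem remain.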

Some organizations have chosen a "pure re-hosting" solution. These solutions enable a company to move its existing legacy environment to an ostensibly lower-cost platform with as little change as possible; the original environment is then often "emulated" on the new platform.
Such solutions are typically perceived to generate more immediate cost savings, but for many organizations the existing applications are insufficient to meet business needs, and preserving them in this way is not desirable. Moreover, this type of solution does not solve the skill-evaporation issue (which may be tolerable in the short term, but is often a serious issue in the longer term), and it often leads to new recurring (emulation) costs and to vendor lock-in.
Worst of all, this solution leads to one certainty: a second project will be needed one day to really deal with the issue.

Last but not least, there are "transformational re-hosting" solutions, which convert the syntax of the legacy artifacts in order to port them. This often seems to be the proper approach, and it is the one Anubex stands for. This route eventually overcomes all legacy issues and can be executed in a two-phase process, in which the transformation is followed by a subsequent modernization phase that can target full object-oriented refactoring.
Without extra costs, this process solves the skills issue, allows for continuous new functionality, and offers full flexibility and native support for desired modern enhancements.

One could even distinguish between two flavors: one in which the programming language is kept but adapted to a new compiler (e.g. COBOL to COBOL), and one in which the programming language itself is also converted (e.g. COBOL to C#, or Natural to COBOL); the second flavor is illustrated below.
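
To make the second flavor concrete, here is a small, hypothetical illustration of what a language conversion might produce. The COBOL fragment (shown as comments) and its C# counterpart are invented for this example and do not represent the output of any particular tool, Anubex's included.

```csharp
// Hypothetical COBOL fragment and one possible C# rendering after a
// syntax-converting migration. Both sides are invented for illustration;
// real conversion tools differ in naming and structure.
//
//   01 WS-TOTAL      PIC 9(7)V99 VALUE ZERO.
//   01 WS-PRICE      PIC 9(5)V99.
//   01 WS-QUANTITY   PIC 9(4).
//   ...
//   COMPUTE WS-TOTAL = WS-TOTAL + WS-PRICE * WS-QUANTITY.
//   IF WS-TOTAL > 1000
//       PERFORM APPLY-DISCOUNT.

class OrderCalculation
{
    // COBOL's fixed-point fields map naturally onto C#'s decimal type.
    decimal wsTotal = 0m;
    decimal wsPrice;
    int wsQuantity;

    // A straight (phase-one) transformation keeps the procedural shape;
    // the later modernization phase could refactor this into richer
    // object-oriented classes.
    public void AddLine(decimal price, int quantity)
    {
        wsPrice = price;
        wsQuantity = quantity;
        wsTotal = wsTotal + wsPrice * wsQuantity;
        if (wsTotal > 1000m)
            ApplyDiscount();   // was: PERFORM APPLY-DISCOUNT
    }

    void ApplyDiscount()
    {
        wsTotal = wsTotal * 0.95m;  // illustrative 5% discount
    }

    public decimal Total => wsTotal;
}
```

In the first flavor (e.g. COBOL to COBOL), the same fragment would instead stay in COBOL and simply be retargeted at a new compiler and runtime on the new platform.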