It is a rare case indeed that, as developers, we get the opportunity to work on a project that is entirely greenfield across the stack. More often than not, we are working within constraints imposed upon us by systems that are already in place.

These systems are often referred to as 'legacy', a pejorative term implying they are outdated, clunky and no longer fit for purpose. The reality is more complex: the state of a system is typically the culmination of many factors, often non-technical, that lead it to its current, undesirable condition.

Nevertheless, integrating with legacy systems is a necessity on many projects, and often a challenging one: developers may inherit technical debt they must first ascertain and deal with, and may be forced to reverse engineer processes in the absence of meaningful documentation.

Dealing with the challenge of legacy software

So, how do we deal with the challenges that legacy systems throw at us? Let's explore this with an example: upgrading a lumbering architectural monolith to improve its performance.

Indeed, it is commonly stated that legacy systems perform terribly. Poor performance is often a symptom of other underlying issues, such as monolithic chains of dependencies that must run through a convoluted process to produce a desired, but slow, outcome.

One of the ways we tackle this is to identify what can be decoupled from the monolith and abstracted into its own concern. This allows us to start taking baby steps towards migrating functionality to a newer, loosely coupled architecture that is more able to withstand and navigate some of the common issues and pitfalls of its predecessor.
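One way to sketch this decoupling is to place each extracted concern behind an interface, so callers no longer depend on the monolith directly and the implementation can be swapped as migration progresses. The following is a minimal illustration; the invoice concern, field names and classes are all hypothetical, not taken from any particular system.

```python
from abc import ABC, abstractmethod


class InvoiceSource(ABC):
    """One concern carved out of the monolith, defined as an interface."""

    @abstractmethod
    def fetch_invoice(self, invoice_id: str) -> dict: ...


class LegacyInvoiceSource(InvoiceSource):
    """Stand-in for a call into the existing monolith."""

    def fetch_invoice(self, invoice_id: str) -> dict:
        # In reality this would invoke the legacy system.
        return {"INV_ID": invoice_id, "AMT_PENCE": 1999, "CUST_NM": "Acme Ltd"}


class NewInvoiceSource(InvoiceSource):
    """The replacement service, adopted feature by feature."""

    def fetch_invoice(self, invoice_id: str) -> dict:
        # Same contract, new implementation behind it.
        return {"INV_ID": invoice_id, "AMT_PENCE": 1999, "CUST_NM": "Acme Ltd"}


def get_invoice(source: InvoiceSource, invoice_id: str) -> dict:
    # Callers depend only on the interface, so the implementation
    # can be migrated without touching them.
    return source.fetch_invoice(invoice_id)
```

Because both implementations honour the same contract, a feature can be pointed at the new source when it is ready, one baby step at a time.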

This process of abstraction is no easy task. The challenge lies in standing up a new architecture while keeping the legacy systems up and running in parallel. Project budgets and time constraints often dictate an approach that allows for partial decoupling via a three-tier architecture: a middle layer transforms what is consumed from the existing legacy systems before it is passed to the end user in the presentation layer.
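The middle layer's transformation step can be sketched as a simple translation function. This example assumes a hypothetical legacy payload with terse column names and amounts stored in pence; the field names are illustrative, not from any real system.

```python
def transform_invoice(raw: dict) -> dict:
    """Translate a legacy record into the model the presentation layer
    consumes, so the UI never sees legacy field names or conventions."""
    return {
        "id": raw["INV_ID"],
        "customer": raw["CUST_NM"].strip(),   # legacy data is often padded
        "amount": raw["AMT_PENCE"] / 100,     # pence -> pounds
        "currency": "GBP",
    }


# A row as the legacy system might return it.
legacy_row = {"INV_ID": "INV-042", "CUST_NM": "Acme Ltd  ", "AMT_PENCE": 1999}
print(transform_invoice(legacy_row))
```

Keeping all such translations in the middle tier means legacy quirks are contained in one place, and the presentation layer works against a clean, stable model.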

Targeted Transformation

This approach allows us to pick the features that add the most value for a client's budget and target those bit by bit for transformation.

The other key benefit of this approach is that, by decoupling processes from an incumbent monolith, we hopefully prevent our new system from also becoming legacy. It should be better able to cater for unknown complexity and be easier to maintain, with each concern able to be independently upgraded, amended or replaced in the future.

Hopefully, this example highlights how crucial it is to engineer systems flexible enough to deal with what an increasingly complex technological landscape throws at them. Dealing with legacy systems may be challenging, but by identifying the core fundamentals and abstracting away much of the rest, we at least have an approach that makes them less intimidating to deal with and helps deliver real present and future value for our clients.
