There will always be legacy applications and systems. A legacy is usually a sign of past success, so it is perfectly reasonable to treat it with respect. Unfortunately that is not always what happens, and some of the associated problems appear to be getting worse.
It is normal to associate "legacy" with old batch applications running on mainframe computers. But in the twenty-first century there are also a lot of applications running on Unix systems which are now many years old and in need of modernisation. Some of these systems need replacing totally, but others have intrinsic value, particularly those which are inherently batch oriented, such as print runs, invoicing, etc. In practice the applications running on mainframes are in better shape than those on Unix, because batch processing was always well supported there by efficient schedulers, whereas on Unix systems there was an over-reliance on time-sharing and "shell" scripts. Indeed it is very welcome to note the introduction of third-party products such as Cybermation, which provide operational management across multiple platforms and will help enormously in the job of bringing older systems and new ones together.
The legacy batch systems can be exploited quite easily when mixed with newly developed applications, simply by using transaction messaging software to feed the existing batch input streams. The legacy interactive applications can also be exploited, but it is not so straightforward. Since many of these systems are of enormous value to the business, particularly the mainframe-based ones which exploit transaction processing monitors (CICS in most cases), a lot of tools have been developed to help integrate the new with the old. E-commerce has been a big influence, and the best products today are Application Servers using Java, and J2EE in particular, such as WebLogic and WebSphere. Nevertheless the legacy systems are still of mixed value. Those developed with a modular architecture are the most valuable, a lesson which developers of new applications should note.
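The batch-feeding technique described above can be sketched very simply. In this hypothetical illustration, an in-process queue stands in for real transaction messaging middleware (such as a JMS queue), and the legacy "batch input stream" is modelled as a list of fixed-width records of the kind an old invoicing run might expect; the field names and record layout are invented for the example.

```python
import queue

def to_batch_record(msg):
    """Format a message as an 80-column fixed-width record
    (layout is illustrative, not any real system's format)."""
    rec = f"{msg['account']:<10}{msg['item']:<20}{msg['qty']:>6}"
    return rec.ljust(80)

def drain_to_batch(q, batch_input):
    """Drain queued transaction messages into the legacy
    batch input stream, one fixed-width record per message."""
    while True:
        try:
            msg = q.get_nowait()
        except queue.Empty:
            break
        batch_input.append(to_batch_record(msg))

# New front-end applications post transactions onto the queue...
q = queue.Queue()
q.put({"account": "A1001", "item": "WIDGET", "qty": 12})
q.put({"account": "B2002", "item": "SPROCKET", "qty": 3})

# ...and a small adapter drains them into the batch input,
# leaving the legacy batch run itself untouched.
batch_input = []
drain_to_batch(q, batch_input)
for rec in batch_input:
    print(rec.rstrip())
```

The point of the pattern is that the legacy application sees exactly the input it has always seen; all the new-world complexity is confined to the adapter.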
However, while some valuable progress can be made in extracting further value from the mainframe-type applications, there is a much bigger stumbling block to overcome. Unfortunately, in more recent years a lot of business-critical applications have been developed using the thick client architecture: the bulk of the application logic is executed in a desktop PC, with the database, enhanced by some stored procedures, on the server. This architecture was doomed from the start and only came about because developers had too little experience of building implementable, maintainable systems. The Visual Basic and PowerBuilder generation knew nothing of robust system design, had never heard of a TP monitor, and basically did not know the difference between multiple single-user systems (e.g. e-mail) and multi-user systems (e.g. order processing). Older professionals, who should have known better, caved in and allowed the lure of the GUI interface (which they knew nothing about) to dominate. We completely missed the opportunity of building applications which exploited the best of both worlds: thin GUI clients and robust transaction processing servers. It took IBM forever to provide a means for a Windows client to invoke a CICS transaction, for instance, which simply excused the proliferation of bad design.
But there is now a worse problem. The thick client applications have been developed and implemented in pilot situations. It is always a surprise how long this takes, but while everyone today is talking about Web-enabled applications, the bulk of thick client applications are only now coming into production. And a whole host of problems is piling up, because these systems are failing to scale. They are very difficult to install and even more difficult to maintain. The result is widespread user dissatisfaction, which leads to attempts to modify the code, which in turn, because the architecture is unsuitable, merely introduces further problems.
Many organisations are turning to Citrix to solve this growing legacy problem. While this is a good idea, it only makes maintenance easier; it does not solve the problem, which remains the same nightmare of the thick client application. Citrix confuses the issue by calling its product a "thin client", which it is not: it merely maps the Windows GUI across a network and no more. There is only one real solution. Citrix can be used as a tool to buy time to redesign the application, throwing away the Visual Basic code as soon as possible. The same hardware can then be used to run the new thin client code once it has been developed.
Martin Healey, a pioneer of the development of Intel-based computers and client/server architecture, is a director of a number of IT specialist companies and an Emeritus Professor of the University of Wales.