The rate of advance of technology in the computer industry is remarkable. The software lags a long way behind the hardware, unfortunately, but even that has seen significant advances over recent years. However, the imbalance between the rates of development of software and hardware is causing some difficulties. Coping with it would be relatively easy if applications remained static, but they don't. I would guess that applications and software are advancing at about the same rate, and hardware at least twice as fast.
Superficially this doesn't matter, but in practice it is quite a serious problem. Applications have a long life and they must be supported using the technology available when they are introduced. It follows that nearly all active applications are using obsolete technology! The developers want to run ahead as fast as possible, but they are constrained by the older technology. The only solution to this dilemma would be a major acceleration in the development of software technologies, focusing on automated design and component techniques. Sadly, there are too many vested interests involved for any real hope of progress, so this unfortunate situation is going to plague the IT industry for many years to come.
All applications are designed to make the best possible use of the available technology. Most designers will try to make some allowance for future developments, but in the unpredictable situation described above there is only a limited chance of success. Again the superficial view is that it doesn't matter, because the latest technology can be used for upgrades. This argument fails, however, because new technologies can influence how a system is built, making an older design inappropriate. Consider the following examples and admit that it would have been impossible to design ahead for these changes.
Processing power. In the past programmers had to employ a lot of skill to make an application efficient, because computers were so expensive. Today's PCs have more power than the mainframes of 10 years ago! Today's programmers have gone to the opposite extreme and are mindless of the resources consumed. This is equally wrong. Over-efficient code can only be achieved by using methods which make maintenance a major problem, while unconstrained code grows so big that it is equally difficult to maintain and debug.
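As a minimal sketch of that trade-off (the example is mine, not drawn from any particular application), consider two routines that count the set bits in a 32-bit word. The clever one runs faster, and that is precisely the problem:

    def popcount_clear(x: int) -> int:
        """Readable version: the intent is obvious and easy to maintain."""
        count = 0
        while x:
            count += x & 1
            x >>= 1
        return count

    def popcount_clever(x: int) -> int:
        """Hand-optimised version: faster, but the magic constants mean the
        next maintainer has to reverse-engineer what it does."""
        x = x - ((x >> 1) & 0x55555555)
        x = (x & 0x33333333) + ((x >> 2) & 0x33333333)
        x = (x + (x >> 4)) & 0x0F0F0F0F
        return (x * 0x01010101 >> 24) & 0xFF

Both give the same answer for any 32-bit value; the difference is what they cost the next person who has to change them.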
Graphical User Interfaces. The PC revolution made GUI interfacing economically viable. Unfortunately it created a split culture, mainframer versus PC whiz-kid, so that client/server applications were built on the thick client architecture instead of taking the best of both worlds and combining thin GUI clients with robust server-based business logic. It was another example of the technology getting in the way.
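A rough, invented sketch of the difference follows; the order-acceptance rule, the names and the figures are hypothetical, not taken from any real system. The issue is simply where the business logic lives:

    # Invented order-entry rule, for illustration only.
    CREDIT_LIMITS = {"ACME": 1000.0}        # data that naturally lives on the server
    UNIT_PRICE = 4.50

    def thick_client_submit(customer: str, quantity: int) -> str:
        # Thick client: the business rule is embedded in the desktop GUI, so
        # every rule change means redeploying every PC, and the client needs
        # direct access to the server's data.
        ok = quantity > 0 and quantity * UNIT_PRICE <= CREDIT_LIMITS.get(customer, 0.0)
        return "accepted" if ok else "rejected"

    def server_accept_order(customer: str, quantity: int) -> bool:
        # Thin-client alternative: the GUI only collects input and displays
        # the verdict; this rule runs once, on the server, next to its data.
        return quantity > 0 and quantity * UNIT_PRICE <= CREDIT_LIMITS.get(customer, 0.0)

    def thin_client_submit(customer: str, quantity: int) -> str:
        return "accepted" if server_accept_order(customer, quantity) else "rejected"

The logic is identical in both cases; what differs is how many machines have to be updated when it changes, and how much data has to cross the network to apply it.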
Storage. I hate to admit this, but I remember juggling with 5MB fixed and 5MB removable discs to get an affordable small business system. The cheap PC I write this on has an 80GB disc! GUI applications, for instance, may have been enabled by the Xerox technologies as implemented on Macs and PCs, but they would have been useless without the incredible advances in the capacity and price of disc drives. GUI applications were equally dependent on the development of laser and ink-jet printers. It is difficult today to appreciate that fax was a major breakthrough in the support for graphics. Desktop printers in particular have significantly affected application design, with on-line printing replacing spooling to a central facility in many applications. Note that this meant redeveloping most office applications as well as many data entry and query systems; the technology enforced a drastic change to the applications, like it or not.
Communications. The development of local area networking enabled the client/server architecture. Not surprisingly, it also created one of the biggest performance problems of today: scalability. The effective bandwidth of the network varies with the number of users, so a system which works in a pilot with a few users often fails when rolled out to many. The application programmers should have allowed for network loading in the design of the application, e.g. by choosing a thin-client architecture, but they didn't. They were seduced by the advantages of Rapid Application Development tools and knew nothing of the pitfalls.
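A back-of-envelope sketch makes the point. The figures below (a shared 10Mbit/s LAN, a 200KB thick-client query result against a 10KB thin-client page) are illustrative assumptions of mine, not measurements:

    def per_user_kbps(shared_lan_mbps: float, active_users: int) -> float:
        """Crude model: every active user gets an equal share of the LAN."""
        return shared_lan_mbps * 1000 / active_users

    def seconds_per_screen(payload_kbytes: float, share_kbps: float) -> float:
        """Time to move one screen's worth of data to the client."""
        return payload_kbytes * 8 / share_kbps

    for users in (5, 50, 500):                        # pilot, department, whole site
        share = per_user_kbps(10.0, users)            # assumed shared 10Mbit/s LAN
        thick = seconds_per_screen(200.0, share)      # thick client pulls raw data
        thin = seconds_per_screen(10.0, share)        # thin client gets a prepared page
        print(f"{users:>3} users: thick {thick:6.1f}s, thin {thin:5.1f}s per screen")

The pilot with five users is tolerable either way; at five hundred users the thick client is unusable while the thin client is merely slow.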
Wide Area Networks. Now, with the Internet, we have a whole new set of requirements. The hardware developments have put a computer in the hands of a mass of external users, but not one with software designed for the job. Systems now have to be built around Web browsers, and they have to be integrated with core business systems.
Don’t you wish hardware development would slow down and software speed up?
Martin Healey, pioneer of the development of Intel-based computers and client/server architecture. Director of a number of IT specialist companies and an Emeritus Professor of the University of Wales.