Interconnecting two or more computers to run individual parts of an application is commonplace today. Ever since the advent of PCs and local area networks, a "multi-tier" architecture has taken over from the earlier model of dumb terminals connected to a single central computer.
The immediate benefit of two tiers is the ease with which an interactive graphical user interface can be implemented, and indeed with the current thin-client architecture that is a very good idea. Unfortunately, it has all gone horribly wrong. Before analysing this further, however, note that layers other than the GUI/host split can be usefully employed, notably moving the file system, and probably the database, onto a separate machine (or machines). It is also common now to employ separate machines to run some of the software used to integrate new technologies with legacy systems, Web services being a specific example.
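To make the tier split concrete, here is a minimal sketch in Python of the thin-client idea: the shared application logic and the data sit in the server tier, and the client merely asks for a result and presents it. The order data, URL and handler names are purely illustrative and not taken from any real system.

```python
# Minimal sketch of thin-client separation: the business logic lives in the
# server tier; the client only sends a request and displays the answer.
# Names (OrderHandler, /orders/...) are hypothetical, for illustration only.
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# --- server tier: shared application logic, one copy for all clients ---
ORDERS = {"1001": [19.99, 5.00, 3.50]}   # stand-in for a database tier

class OrderHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        order_id = self.path.rstrip("/").split("/")[-1]
        total = sum(ORDERS.get(order_id, []))         # logic runs here, not on the PC
        body = json.dumps({"order": order_id, "total": total}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):                      # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 8080), OrderHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# --- client tier: a "thin" client only fetches and presents the result ---
with urllib.request.urlopen("http://127.0.0.1:8080/orders/1001") as resp:
    print(json.load(resp))                             # {'order': '1001', 'total': 28.49}
server.shutdown()
```

The point of the split is that the calculation can be changed, scaled or shared without touching the thousands of client PCs; a thick client would have baked that logic into every desktop.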
Unfortunately, there are some serious problems now facing the IT industry with legacy systems. These are not the older batch and mainframe applications, but the thick-client applications developed at the end of the 1990s; they have become legacy problems all too quickly. The problem is that the architecture is wrong: huge pieces of an application that should be shared on a server have been coded to execute in the PC client. Such systems are extremely difficult to scale and develop in the first instance, and a lot of projects never got beyond the prototype phase. They are almost impossible to maintain in anything except simple cases. The dramatic conclusion is that they will have to be scrapped and replaced with thin-client equivalents. This means major rewrites, and it is all too obvious that many organisations are unable or afraid to bite the bullet and are continuing to throw good money after bad to try to justify the waste.
While we should concentrate on sorting out the thick-client mess, we are in danger of being sidetracked into another equally stupid architecture, referred to as peer-to-peer. Simply interpreted, peer-to-peer means two entities of equal standing interacting as though they were one. To my mind this means two or more systems of similar capability, but the phrase was earlier used by Microsoft to mean something different, namely a form of file sharing. Even the earlier "very thick" LANs, which ran all the logic in each and every PC over a common file-sharing system, used a separate machine for the file and print server. That meant n+1 machines for n users, which was acceptable for a lot of users but not for three or four. Thus with Windows 3.11 Microsoft introduced a facility in which any PC on the network could be both a client and a file and print server, called it peer-to-peer networking, and so gave a new meaning to the English language! Because of the frailty of Windows 3.1 this was not a particularly good idea, since every PC crash also crashed a server.
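The following is a minimal sketch, again in Python, of what that redefinition amounts to: every node runs its own tiny "server" and can also act as a client of any other node, so no dedicated n+1 machine is required. The peer names and file contents are invented for illustration; real file and print sharing is of course far more involved.

```python
# Toy illustration of peer-to-peer file sharing: each Peer is simultaneously
# a server (answers requests for its files) and a client (fetches from others).
import socket
import threading

class Peer:
    def __init__(self, port, files):
        self.files = files                       # this node's shared "files"
        self.sock = socket.socket()
        self.sock.bind(("127.0.0.1", port))
        self.sock.listen()
        threading.Thread(target=self._serve, daemon=True).start()

    def _serve(self):                            # server role: answer requests
        while True:
            conn, _ = self.sock.accept()
            name = conn.recv(1024).decode()
            conn.sendall(self.files.get(name, "<not found>").encode())
            conn.close()

    def fetch(self, port, name):                 # client role: ask another peer
        with socket.create_connection(("127.0.0.1", port)) as conn:
            conn.sendall(name.encode())
            return conn.recv(4096).decode()

alice = Peer(9001, {"plan.txt": "ship the thin client"})
bob = Peer(9002, {"notes.txt": "scrap the thick client"})

print(bob.fetch(9001, "plan.txt"))               # bob acts as a client of alice
print(alice.fetch(9002, "notes.txt"))            # alice acts as a client of bob
```

The convenience is obvious; the catch, as with Windows 3.11, is that every node is now a single point of failure for whatever it happens to be serving.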
But now we have the Internet and millions of PCs, all more powerful than the mainframes of ten years ago, looking for something to do that could justify all the hype and expense. It is impossible to cost-justify 75% or more of the PCs in use in industry, but the Internet influence means that all those domestic machines probably should be included in the equation somehow. These machines are powerful because the applications they run need the power; my PC is used for music and photographic editing, which consume vast resources compared with the bulk of desktop applications. Gaming is another consumer of resources in the domestic market, of no relevance in business. Basically, the domestic applications are single-user and the corporate ones multi-user, a vast difference. Corporate desktops can in most cases be economically connected to high-powered servers at high speeds, while the domestic market must continue to make do with low-speed networking. It follows that there will be a lot of developments relevant to one sector and not the other, despite there being a common denominator, the PC.
The domestic market has taken another look at Windows 3.11-style peer-to-peer networking, which is now more practical because of multi-tasking operating systems, a lot more power and a small improvement in reliability, and has applied it across the Internet. The question to be answered next is whether or not this is of any relevance to the corporate user!
Martin Healey, pioneer of the development of Intel-based computers and client/server architecture. Director of a number of IT specialist companies and an Emeritus Professor of the University of Wales.