From the early days of computing there has been a need to manage computer assets, since they represent a significant capital investment. But with the advent of PC networks the problem escalated to a serious level.
Today each user has their own computer, compared with the earlier days of a central computer with dumb terminals. It is difficult to misuse a dumb terminal; it is difficult not to with a PC! Each user has a full-function computer, with storage, an operating system, application programs and, worse, private data. The problem is exacerbated because users today deem it a right to have a PC on their desk, many of them so that they can appear to be working. It would be impossible to cost-justify all those PCs, but they have now acquired a status rating, and few companies can afford to impose realistic limits on who has their own machine, or how many do. Thus the problem of managing the company’s assets has reached frightening proportions.
It would not be such a major problem if the software industry had any standards for establishing licence fees, but there aren’t any. Some vendors charge on the number of active users, but most charge on the number of potential users. It is surprising that this is legal, but it is.
Asset management has a high profile today because of the proliferation of PCs. Management of mainframes and shared servers is in fact equally important, but does not present the same problems, first because there are fewer of them and second because they are managed by IT staff, not users. The focus therefore is inevitably on PC management, but don’t forget the other systems!
The first requirement is an accurate evaluation of what assets are actually in place, covering both hardware and software. Furthermore, this is a moving target as machines are added, retired, relocated and modified, so the management system must be dynamic. Hardware management is far simpler than software management, particularly if no older machines are involved (they had no mechanisms for detecting the hardware configuration). In all cases a server keeps a database of details for each and every machine, including laptops, which create the more difficult problem. Each PC is loaded with a client which performs the local checks and communicates the details to the server. This client can also be used to activate certain changes dictated by the server, such as deleting, adding or updating files. The products on the market today are managed from the server by IT professionals, without any individual end-user involvement.
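The client/server mechanism described above can be sketched in a few lines. This is a minimal illustration, not any vendor’s product: the function name and the idea of printing the payload (rather than transmitting it over a real management protocol) are assumptions for the sake of the example.

```python
# A minimal sketch of the client-side inventory check: gather basic
# machine details of the kind a management server would store in its
# asset database. A real agent would add installed-software scans and
# send the payload over the product's own protocol.
import json
import platform
import socket


def collect_inventory():
    """Gather basic machine details for the central asset database."""
    return {
        "hostname": socket.gethostname(),
        "os": platform.system(),
        "os_version": platform.version(),
        "architecture": platform.machine(),
        "processor": platform.processor(),
    }


if __name__ == "__main__":
    # In a real agent this payload would be posted to the server;
    # here it is simply printed.
    print(json.dumps(collect_inventory(), indent=2))
```

The same client, run on a schedule, keeps the server’s database current as machines are modified or relocated.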
Obviously there is a wide variety in the functionality of asset management products, ranging from relatively simple stand-alone systems to those fully integrated with all other management functions. The sophistication of the database, and of the analysis tools designed to work with it, also varies dramatically.
There are a number of uses for Asset Management Systems (AMS), which often results in a mixture of specialised products. Integrating them is, as usual, a problem due to limited standards, something that is now attracting the interest of the standards bodies.
One of the key historical applications that exploit an AMS is the Help Desk. If help is to be successful, it is essential that an operator can check a user’s system configuration, and update it if necessary, without leaving the centre. It is equally essential to ensure that all the correct versions are in use, a very difficult problem because of all the subsidiary routines, normally DLLs, that are used by an application. It is not sufficient to check just the main .EXE files.
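The version-check idea can be sketched as follows. This is an illustration, not a real product’s method: file hashes stand in for true version resources (reading PE version information would need a Windows-specific library), and the function name and directory layout are assumptions.

```python
# A sketch of the version check described above: fingerprint every
# executable AND every DLL under an application directory, not just the
# main .EXE, so the Help Desk can compare a user's machine against a
# known-good reference installation.
import hashlib
from pathlib import Path


def fingerprint_modules(app_dir, patterns=("*.exe", "*.dll")):
    """Return {relative path: SHA-256 digest} for every EXE and DLL."""
    root = Path(app_dir)
    report = {}
    for pattern in patterns:
        for module in sorted(root.rglob(pattern)):
            digest = hashlib.sha256(module.read_bytes()).hexdigest()
            report[str(module.relative_to(root))] = digest
    return report
```

Comparing two such reports immediately shows which subsidiary files differ between a user’s installation and the reference copy, which is exactly the check an operator needs to run without leaving the centre.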
Martin Healey, a pioneer in the development of Intel-based computers and client/server architecture, is a director of a number of IT specialist companies and an Emeritus Professor of the University of Wales.