I don’t think that there is a single definition of "peer-to-peer" computing. As far as I am concerned it simply means that the resources of one machine connected to a network can be used by another connected machine.
It should also imply that each participant has similar capabilities, for instance that all can be both clients and servers. This is in stark contrast to the classical architecture, in which specific machines provide the services for the other devices: specialised servers and equally specialised clients. That is the long-established terminal/computer architecture, but it was also the architecture adopted by the first PC local area networks, dominated by Novell Netware. Most client/server implementations still use the same basic architecture, although the servers now provide far more functionality than the simple file and print services of Netware, e.g. databases and application servers.
Most servers are in fact normal computers running appropriate software. Since servers use multi-tasking operating systems, most run a mixture of services concurrently, although there is some argument for dedicated servers. With the normal arrangement, an n-user network requires n+1 computers. This makes perfect sense for larger networks, since the server must be significantly more powerful than the clients. The server machine should also be of higher quality, with more expensive features such as redundancy, since it is a single point of failure for the whole system. In the days of Netware, however, this was a problem for smaller networks, because of the extra cost of the server machine. The demand for cheaper small PC-based LANs could be met by making one or more of the PCs both a client and a server. The problem was that the PC DOS operating system and its bastard child, Windows 3.1, was not a protected-mode, multi-tasking system! Microsoft eventually made peer networking a standard feature with Windows 3.11 (Windows for Workgroups), and it worked for simple, lightly used applications, but we all know just how easy it was to crash Windows 3.1 with the simplest GUI application, Word being a prime culprit. Once a client application crashed the machine, its server capability disappeared with it.
By the time Windows 95/98/NT appeared, the cost of PC hardware had fallen so far that there was not much interest left in the peer-to-peer LAN solution. But then came the Internet and a new generation of enthusiastic users, and now there is a reawakening of interest in peer-to-peer computing across the Internet.
The basic idea is to belong to a user group. Each individual’s computer is configured so that certain facilities are declared as available to other users whenever it is connected to the Net. So far the only practical application is file sharing. Personally I think this is a bad idea, prone to misuse and vulnerable to viruses, but I did experiment with the most famous (or infamous) system, Napster. Napster specialised in making files of music available, encoded in MP3 format. When a member connected to the network via Napster, the subset of files declared as Napster files became accessible to any other Napster member who was logged on. Napster provided a directory of the file names of all MP3 files available at any given moment, so its search engine enabled a user to find out whether a specific track was available anywhere.
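The mechanics are worth spelling out. Here is a minimal sketch of a Napster-style central directory as I have just described it, assuming the index server holds only file names and the addresses of the members sharing them; the files themselves pass directly between members, and a member’s entries vanish when they disconnect. All the class and method names and the port numbers are my own hypothetical illustration, not Napster’s actual protocol.

```python
# A minimal sketch of a central-directory file-sharing index.
# The server never stores the files themselves, only who has what;
# transfers happen peer-to-peer between members.

class DirectoryServer:
    def __init__(self):
        self.index = {}  # track name -> set of peer addresses sharing it

    def register(self, peer_addr, shared_files):
        """A member logs on and declares which files are shared."""
        for name in shared_files:
            self.index.setdefault(name, set()).add(peer_addr)

    def unregister(self, peer_addr):
        """A member disconnects: all their entries vanish from the index."""
        for peers in self.index.values():
            peers.discard(peer_addr)

    def search(self, query):
        """Return the peers currently offering tracks matching the query."""
        return {name: sorted(peers)
                for name, peers in self.index.items()
                if query.lower() in name.lower() and peers}


# Usage: two members log on, one searches, then a member disconnects.
server = DirectoryServer()
server.register("peer-a:6699", ["Beatles - Yesterday.mp3"])
server.register("peer-b:6699", ["Beatles - Yesterday.mp3",
                                "Elgar - Nimrod.mp3"])
print(server.search("yesterday"))   # two possible sources
server.unregister("peer-b:6699")
print(server.search("nimrod"))      # {} - the only source has gone
```

The last two lines also show the weakness I ran into below: the moment a member disconnects, every file only they were sharing becomes unavailable.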
The machine I was using was a fast one with a megabit per second link. When I found another user with a file I fancied, I could copy it to my machine (which of course led to all the copyright problems), and other members could copy tracks I had encoded on my machine to their own. The problem with the whole concept then became clear. If the file I fancied was on another machine with a Mbps link, the transfer took about 10 minutes. However, for machines using dial-up modems, most of them in fact, a typical 5-minute sound track took hours to copy. Whenever a member finished his or her searching and disconnected, that server disappeared and any file transfer from it aborted in mid-stream. With a modem connection the whole thing was simply impractical. Until all computers are secure and have permanent, high-speed connections, the idea is going to remain one for enthusiasts only, and by then the application service providers will have got their act together.
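Some back-of-the-envelope arithmetic shows why the modem case was hopeless. Assuming a typical 128 kbit/s MP3 encoding (my assumption, not a measured figure), a 5-minute track is about 4.8 MB. Even at the nominal modem rate that is over ten minutes, and real transfers ran well below nominal; the bottleneck was usually the sender’s uplink, which is also why transfers between nominally fast links took far longer than the raw arithmetic suggests.

```python
# Back-of-the-envelope transfer times for a 5-minute MP3 track.
# Assumptions (mine): 128 kbit/s MP3 encoding and the nominal link
# rates below; achieved throughput was lower still in practice.

TRACK_SECONDS = 5 * 60
MP3_BITRATE = 128_000                     # bits per second
file_bits = TRACK_SECONDS * MP3_BITRATE   # ~38.4 Mbit, i.e. ~4.8 MB

for label, link_bps in [("1 Mbit/s link", 1_000_000),
                        ("56k modem (nominal)", 56_000),
                        ("56k modem (realistic ~33k)", 33_000)]:
    seconds = file_bits / link_bps
    print(f"{label:28s} {seconds / 60:6.1f} minutes")

# 1 Mbit/s link                   0.6 minutes
# 56k modem (nominal)            11.4 minutes
# 56k modem (realistic ~33k)     19.4 minutes
```

Add the contention of several members pulling files at once, plus aborted transfers restarted from scratch whenever a source disconnected, and "hours" for a single track over a modem is entirely plausible.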