look up "clusters"
I have a 4-computer "lab". The coolest thing I can think of doing with these computers is getting them all to work on the same project, ideally without the project's processes having to know that the CPU cycles are being provided by multiple PCs. I am not nearly advanced enough to know how to do this, but I know VMware and MS have the technology to pool networked resources.
I am not requesting an explanation - maybe just a starting point to learn how something like this would work in a Linux environment. I have tried looking up cloud computing, one-to-many computing, and several other options, but cannot find information on the topic even though I am sure the technology exists. If anybody can help with a link, a keyword, or any other info, it would be greatly appreciated.
look up "clusters"
What I am thinking of is more like the PS3 project, which joined the CPU power of many PlayStation 3s to crack an encryption algorithm. In VMware, several machines can be joined together and presented to the operating system as a single host (with a ton of power). That is such a cool concept - many hosts presented as one to the process being operated on.
I also had a suggestion of issuing IRC commands to coordinate the network onto a single task - a very good idea, but once the different PCs connect, how do I present them to my process as one block of CPU and RAM?
If this is not possible in Linux, I understand, but so far everything I can do in Windows can be done better and faster in Linux.
I agree with Barry: clustering is what you are asking for; VMware does something different.
There is an Offensive Security document you might be interested in; it describes creating an instant cluster with John the Ripper and MPI (Message Passing Interface).
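The core idea that setup relies on is rank-based work splitting: each MPI process learns its rank and the cluster size, then claims its own slice of the keyspace, so no coordinator has to hand out work. A rough sketch of that splitting in plain Python (illustration only - real MPI code would call MPI_Comm_rank / MPI_Comm_size, and the helper name here is made up):

```python
# Sketch of the rank-based work splitting MPI gives you: each process
# learns its rank (0..size-1) and claims its own slice of the keyspace.
# Plain Python for illustration -- not MPI's actual API.

def slice_for_rank(total_items, size, rank):
    """Return the (start, stop) half-open range this rank should process."""
    base, extra = divmod(total_items, size)
    # Ranks below `extra` take one extra item so the split stays even.
    start = rank * base + min(rank, extra)
    stop = start + base + (1 if rank < extra else 0)
    return start, stop

if __name__ == "__main__":
    # 10 candidate passwords split across 3 "nodes":
    for rank in range(3):
        print(rank, slice_for_rank(10, 3, rank))
```

Each node runs the same program, so adding a fifth PC later just means launching one more process with a new rank.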
You might want to look at something like OpenMP for the parallel-programming side - though note that OpenMP itself targets shared memory within a single SMP box; to spread the work across your four machines you would pair it with something like MPI between nodes.
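What an OpenMP "parallel for" buys you on one box - a single loop fanned out across all available cores - can be sketched with a Python thread-pool analog (illustrative only; OpenMP itself is a pragma-based API for C/C++/Fortran, and this function name is made up):

```python
# Rough analog of an OpenMP "parallel for" on one SMP machine:
# split one loop's work into chunks and run the chunks concurrently.
from concurrent.futures import ThreadPoolExecutor

def parallel_sum_of_squares(numbers, workers=4):
    """Sum n*n over `numbers`, with the work split across `workers` threads."""
    numbers = list(numbers)
    # Interleaved chunks: worker i takes items i, i+workers, i+2*workers, ...
    chunks = [numbers[i::workers] for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        partials = pool.map(lambda chunk: sum(n * n for n in chunk), chunks)
    return sum(partials)

if __name__ == "__main__":
    print(parallel_sum_of_squares(range(100)))  # prints 328350
```

The same "chunk the loop, merge the partial results" shape is what makes SMP-capable software easy to scale up on a multi-core node.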
Still not underestimating the power...
There is no such thing as bad information - There is truth in the data, so you sift it all, even the crap stuff.