Hello,

I had an idea to do with grid computing, but it may be total garbage. I heard about some clever people who started to 'steal' computation from unsuspecting web sites by hijacking the normal function of a site and co-opting its computations into a different program. If these stories are true, surely we could do the same thing with a bit more civility: set up a bunch of generic 'calculators' across the web which could then be used for grid computing.

The way I imagine the system is this: the program starts by searching the web for calculators; the code is compiled for a 'web-engine', so every single instruction is encoded as an HTTP / CGI / XML request; and all instructions are performed over the web on a shifting pool of calculators. Actually, I found something similar here: http://ausweb.scu.edu.au/aw02/papers/refereed/kelly/paper.html

I wanted to ask about the feasibility of such an idea. For example, if one machine sent all of its instructions to another over a gigabit intranet, how much slower would this be than local computation? Is a gigabit LAN 1/2/3/10/100/1000 orders of magnitude slower than internal CPU communication channels?

The power of an open-source system like this would be if someone like Apache took the idea on board and released it as part of the standard distribution. However, even if every web server on the web were running such a calculator (why not be ambitious), could the system be fast enough?

Naturally there are a lot of issues regarding distribution / allocation / scheduling etc., but before we get into the nasty details, is the idea remotely worth considering? How difficult would it be to make a Java compiler target such a web-engine?

Thanks very much for any feedback,
Dan.
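P.S. To make the 'every instruction is an HTTP/CGI request' part concrete, here is a toy sketch (in Python, with invented operation names and parameter format — nothing here is a real protocol) of what one instruction might look like on the wire, and what a generic 'calculator' endpoint would do with it. The network hop itself is omitted; this just shows the encode/decode/execute round trip.

```python
from urllib.parse import urlencode, parse_qs

# Hypothetical instruction set for a generic web 'calculator'.
# The operation names and the query-string format are invented for illustration.
OPS = {
    "add": lambda a, b: a + b,
    "sub": lambda a, b: a - b,
    "mul": lambda a, b: a * b,
}

def encode_instruction(op, a, b):
    """Client side: encode one instruction as a CGI-style query string,
    as it might appear in an HTTP GET to a remote calculator."""
    return urlencode({"op": op, "a": a, "b": b})

def calculator(query):
    """Server side: a generic calculator endpoint decodes the query
    string, executes the single instruction, and returns the result."""
    params = parse_qs(query)
    op = params["op"][0]
    a, b = float(params["a"][0]), float(params["b"][0])
    return OPS[op](a, b)

# One instruction = one request/response round trip:
request = encode_instruction("add", 2, 3)   # "op=add&a=2&b=3"
print(calculator(request))                  # 5.0
```

A real system would of course batch many instructions per request; paying one HTTP round trip per arithmetic operation is exactly the cost question raised above.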