How Distributed Grid Computing Could Cut Costs and Help the Environment


The dream of distributed computing (or grid computing) is that it can cut the costs of computing and cut carbon emissions. In this post, I aim to explain how it works.

Let us imagine a scenario where both Carl and Daniel have computers. Carl has a computer which is twice as efficient – that is, it costs him half as much to do the same thing on his computer. Let's say it costs £1 in electricity for Carl to run a computer model, and £2 for Daniel. In total, it costs society £3 to run the computer model once for Carl and once for Daniel.

Cost to Carl: £1
Cost to Daniel: £2
Total Cost to Society: £3

With Distributed Computing

Now imagine the same scenario but with one addition: distributed computing. As it costs Carl less money to run the model on his computer than it would cost Daniel, Daniel could pay Carl to run the model for him. Imagine that Daniel paid Carl £1.50. It only costs Carl £1 to run Daniel's model for him, but he has gained £1.50 for his effort, giving him a profit of 50p. Daniel only spends £1.50 to have his model run, as opposed to the £2 which it would have cost him to run the model himself.

Everybody benefits by saving money and the end result is the same: Carl and Daniel have both had their model run.

Cost to Carl: 50p (£1 to run his own model, subtract 50p profit from running Daniel’s model)
Cost to Daniel: £1.50 (He pays Carl £1.50 to run his model for him)
Total Cost to Society: £2
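
To make the arithmetic concrete, here is a minimal sketch of the two scenarios in Python. The variable names and figures are just the ones from the example above, not from any real pricing scheme.

```python
# A minimal sketch of the Carl-and-Daniel arithmetic above.
CARL_COST = 1.00    # £ it costs Carl to run one model on his computer
DANIEL_COST = 2.00  # £ it costs Daniel to run one model on his computer
PRICE = 1.50        # £ Daniel pays Carl to run the model for him

# Without distributed computing: each runs his own model.
total_without = CARL_COST + DANIEL_COST                # £3.00

# With distributed computing: Carl runs both models and Daniel pays him.
carl_net = (2 * CARL_COST) - PRICE                     # £2.00 electricity minus £1.50 received = £0.50
daniel_net = PRICE                                     # £1.50
total_with = carl_net + daniel_net                     # £2.00

print(f"Without trading: £{total_without:.2f}")
print(f"With trading:    £{total_with:.2f} (Carl £{carl_net:.2f}, Daniel £{daniel_net:.2f})")
```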

What assumptions have we made?


Firstly, we assumed there are no costs involved in the transaction itself. Imagine that it costs £2 for Daniel to send a copy of the computer model to Carl and then to receive the results back. That's perfectly conceivable if Daniel had to print out instructions on how to use the model, FedEx them to Carl and wait several weeks to see the results. In that case, the cost of Daniel asking Carl to run the model for him would be £3 (£1 for Carl to run the model on his computer and £2 in transaction costs), and he might as well have run it himself. Real-world transaction costs include slow network connections and incompatibilities between different computer systems. So for distributed computing to work, we need fast, reliable network connections and software compatibility.
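
Folding transaction costs into the earlier arithmetic, here's a small sketch: outsourcing only pays off when the partner's running cost plus the transaction cost stays below the cost of running the job yourself. The function name is my own illustrative choice.

```python
# Sketch: sending the job away only pays off when the partner's running cost
# plus the transaction cost is still less than running it yourself.
def worth_outsourcing(own_cost, partner_cost, transaction_cost):
    """True if outsourcing the job is cheaper overall than running it locally."""
    return partner_cost + transaction_cost < own_cost

# Fast network, tiny transaction cost: worth it.
print(worth_outsourcing(own_cost=2.00, partner_cost=1.00, transaction_cost=0.10))  # True
# The FedEx scenario above: £2 of transaction costs wipes out the saving.
print(worth_outsourcing(own_cost=2.00, partner_cost=1.00, transaction_cost=2.00))  # False
```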

Secondly, would Daniel happily allow Carl to run the model for him? Are there privacy implications, for example? Daniel must be confident that he can allow Carl to run the model for him and be equally confident that Carl couldn't have a peek at the results of his model. After all, there might be trade secrets in there. Similarly, Carl must be confident that Daniel isn't sending him malicious software which could break his computer. For distributed computing to work, there must be a foolproof and hackproof way for Carl and Daniel to trust each other to keep to their side of the bargain.


Thirdly, Daniel must actually be able to cut his costs. Let me explain. It's possible that Daniel has his computer on 24/7 anyway. That is, it'll cost him £2 whether he runs the model or not. If he leaves his computer running at full power but idle and still asks Carl to run the model for him, he essentially pays for the model to be run twice. My computer doesn't dynamically underclock, so whether or not I'm using it, it eats up the same amount of energy. For distributed computing to work, our own computers must make much more efficient use of resources. We need to have thin-client computers with negligible costs.
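
Here's the same point sketched in energy terms, with made-up wattage figures: offloading only saves anything if the machine's power draw actually drops while the job runs elsewhere.

```python
# Sketch: Daniel only saves energy by offloading if his machine's power draw
# actually drops while the job runs elsewhere. Figures are invented.
ACTIVE_WATTS = 120    # draw while running the model locally
RUNTIME_HOURS = 5

for idle_watts in (120, 15):   # a machine that never underclocks vs. a thin client
    saved_wh = (ACTIVE_WATTS - idle_watts) * RUNTIME_HOURS
    print(f"Idle draw {idle_watts} W -> {saved_wh} Wh saved by offloading")
# 120 W idle: 0 Wh saved (paying for the run twice); 15 W idle: 525 Wh saved
```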

The real world

Distributed computing hasn’t taken off yet on any large scale. The three conditions don’t yet exist:

  • We need fast, reliable network connections and software compatibility. This definitely doesn’t exist at the moment: I don’t trust my own network connection to be 99.9999% reliable. It’s OK for downloading files and sending e-mails but it needs to be good enough for me to be able to send entire computer programmes over the network in under a second. Additionally, software isn’t at the stage where it’s “write once, run anywhere”. We need standards, standards and standards.
  • There must be a foolproof and hackproof way for Carl and Daniel to trust each other to keep to their side of the bargain. There is no way I would let anybody run a piece of software on my computer without checking it first. If I had to pre-approve every single piece of software, that adds to the transaction costs I discussed above. Virtual machines are one way we can get around this issue by creating safe ways to isolate software and to track its progress (see the isolation sketch after this list). Still, I'm not sure if there is a secure way to run software on a computer with the confidence that the owner of the computer can't take a peek. And I'm not sure if we'll ever reach the point where people will happily allow third parties to run software on their computer and have no possible way to find out what it's doing.
  • We need to have thin-client computers with negligible costs. I’ve already debunked this one. My computer uses exactly the same amount of power whether it’s active or idle. I don’t believe that people will drop the idea of “a computer on every desk in every home” until they are confident the first two criteria have been met. Only then will they accept owning a thin-client computer.
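
As a rough illustration of the isolation point, here is a sketch of running somebody else's job inside a container with no network access and capped resources, assuming Docker is installed. The image name "untrusted-model" is hypothetical, and note that this only protects the machine's owner from the job; it does nothing to stop the owner peeking at the job's results.

```python
# Rough sketch of running an untrusted job in isolation, assuming Docker is
# installed. "untrusted-model" is a hypothetical image name, and this only
# protects the machine's owner from the job, not the job from the owner.
import subprocess

def run_isolated(image, timeout_seconds=3600):
    """Run a container with no network access and capped CPU and memory."""
    return subprocess.run(
        [
            "docker", "run", "--rm",
            "--network", "none",   # the job can't phone home or attack the LAN
            "--memory", "512m",    # cap memory...
            "--cpus", "1.0",       # ...and CPU so it can't hog the machine
            image,
        ],
        capture_output=True,
        text=True,
        timeout=timeout_seconds,
    )

result = run_isolated("untrusted-model")
print(result.returncode, result.stdout[:200])
```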

It’s already being used…

Last year I worked at a company which employed distributed computing on a smaller scale. They had a small cluster (~20 computers) with identical hardware, each linked with Gigabit Ethernet. Software ran inside virtual machines and those virtual machines moved around between computers depending on the amount of spare capacity each one had.
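
As a toy illustration of that kind of placement logic (not the scheduler the company actually used), here's a greedy sketch that puts each workload on the host with the most spare capacity. All the names and figures are invented.

```python
# Toy sketch of the placement idea: put each workload on the host with the most
# spare capacity. A greedy illustration, not the scheduler they actually used.
def place_workloads(hosts, workloads):
    """hosts: {name: spare capacity}, workloads: list of (name, demand)."""
    spare = dict(hosts)
    placement = {}
    for name, demand in sorted(workloads, key=lambda w: w[1], reverse=True):
        best = max(spare, key=spare.get)     # host with the most room left
        if spare[best] < demand:
            raise RuntimeError(f"No host has room for {name}")
        placement[name] = best
        spare[best] -= demand
    return placement

print(place_workloads({"node-01": 0.8, "node-02": 0.5}, [("vm-a", 0.6), ("vm-b", 0.4)]))
# {'vm-a': 'node-01', 'vm-b': 'node-02'}
```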

They could employ distributed computing because, within their own system, they knew that:

  • They had a reliable intranet connection and, because all the computers were identical, software worked on every single computer.
  • Because they only ran a limited number of programmes and all the computers were under their own control, there were no trust issues.


Distributed and grid computing isn't yet practical on a worldwide scale, but I think we're making progress. Networks are becoming more reliable. Software platforms appear to be becoming more standardised. Virtual machines are coming of age. And our computers are becoming more environmentally conscious, adapting their resource usage to the amount of processing power required.

So there is the blueprint for how we can lower the costs of computing. You might be wondering what that's got to do with the environment. Well, simply replace the £ sign with joules of energy. Just as free-market trade can lead to an efficient allocation of resources in the real world, the trade of computer time in a distributed grid of computers leads to a more efficient allocation of computing resources. And that lowers the energy consumption of computing and its environmental impact.
