Amazing as products like Virtual Server and VMware are, I’ve always felt they are most applicable to people like application developers who need to run incompatible versions of development tools side by side, and that the vision of true virtualisation wasn’t being realised.

Because I am interested in this stuff I subscribed to the RSS feed at virtualization.info, and a few days ago it turned up a treat. Apparently a company called Virtual Iron has come up with a way to present a single virtual system that is actually running across multiple physical machines.

I hope Microsoft is looking at this and seeing how they could apply the idea to their server platform, because it could just be the critical element in the Dynamic Systems Initiative. With the arrival of tools like Visual Studio Team System, which actively encourage development teams to model their deployment environment, it would be excellent for them to be able to cast that model out onto an ocean of computers and have it automatically acquire the underlying computing resources it needs to execute.

It also adds fuel to my argument in this post that developers need to become more aware of concurrency in their application architectures – especially in server applications. Last week, in my INDUSTRIAL STRENGTH .NET class in Melbourne, I predicted that within ten years or so we could easily see cheap systems sporting 30+ processor cores finding their way into the mid-range market (maybe even onto the desktop).
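To make that concrete: a serial loop will never touch those extra cores, while even a crude partitioning of the work across threads will. Here is a minimal sketch (the workload and worker count are just illustrative – in practice you would size the partitions to the machine):

```csharp
using System;
using System.Threading;

class ParallelSum
{
    // Sum a large array by splitting it into one chunk per worker thread.
    // A serial loop over the same data would only ever use a single core.
    static long Sum(int[] data, int workers)
    {
        long[] partials = new long[workers];
        Thread[] threads = new Thread[workers];
        int chunk = data.Length / workers;

        for (int w = 0; w < workers; w++)
        {
            int index = w; // capture a stable copy for the closure
            int start = index * chunk;
            int end = (index == workers - 1) ? data.Length : start + chunk;

            threads[index] = new Thread(() =>
            {
                long sum = 0;
                for (int i = start; i < end; i++)
                    sum += data[i];
                partials[index] = sum; // each thread owns its own slot, so no locking is needed
            });
            threads[index].Start();
        }

        foreach (Thread t in threads)
            t.Join();

        long total = 0;
        foreach (long p in partials)
            total += p;
        return total;
    }
}
```

Pass Environment.ProcessorCount as the worker count and the same code scales from one core to thirty without modification – which is exactly the property serial code lacks.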

If you take that together with this offering from Virtual Iron, we will have enormous amounts of processing power that can be easily deployed – but it can only be leveraged if you know how to write concurrent software. Before long we will probably start to see attributes we can mark our application assemblies up with that declaratively state the concurrency characteristics of our code, although this may only be a temporary measure until runtime optimisers can work across virtual systems that span multiple physical devices.
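Nothing like this exists today, so the following is purely a sketch of the idea – the ConcurrencyCharacteristics attribute and its properties are invented for illustration, not part of any shipping framework:

```csharp
using System;

// Hypothetical attribute – invented for illustration only. The idea is that a
// developer declares how a component behaves under concurrency, and the runtime
// (or a cross-machine optimiser) uses the declaration to decide how widely the
// work can be fanned out.
[AttributeUsage(AttributeTargets.Class | AttributeTargets.Method)]
public class ConcurrencyCharacteristicsAttribute : Attribute
{
    public bool ThreadSafe { get; set; }            // safe to call from many threads at once
    public bool SideEffectFree { get; set; }        // no shared state, so calls can be reordered
    public int MaxDegreeOfParallelism { get; set; } // hint: how far the work can usefully be split
}

// The runtime could schedule this freely across cores – or, with something
// like Virtual Iron underneath, across physical machines.
[ConcurrencyCharacteristics(ThreadSafe = true, SideEffectFree = true, MaxDegreeOfParallelism = 32)]
public class ImageFilter
{
    public byte[] Apply(byte[] pixels)
    {
        // a pure, per-pixel transformation would go here
        return pixels;
    }
}
```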

Interesting times ahead.