The most commonly asked question about utility computing must surely be: what exactly is utility computing?
This is because it is one of those things - convergence and unified messaging being two more - that everyone seems to define in a slightly different way, according to their perspective on the market.
We probably all recognise that it has something to do with paying as you go for computing power, much like a domestic utility bill. But beyond that it is all a rather vague promise of millions saved, billing models revolutionised, infrastructures reinvented and paradigms shifted.
So it's not unfair to ask some of those who claim to have developed workable utility computing solutions exactly what it is they are selling. Then perhaps we will be better placed to judge whether IT really is poised to be sold in a radically different way, or whether we must conclude we've all been treated to a classic bit of industry flash with no substance.
Clear definitions of utility computing have historically been elusive, admits Gordon Suttie, vision and strategy manager at BT. "Utility computing has been called so many different things in its time: the real-time enterprise, the adaptive enterprise, you name it," he says.
"It's no wonder people are asking, 'What is it?'"
Richard Warley, UK managing director of Savvis Communications, a provider of utility services, takes a robust stance with his definition. "It means on-demand access to a pool of massively scalable computing, storage and networking services," he explains.
"Utility computing today means more than 'pay by the drink'. Utility computing gives businesses the opportunity to become more agile by easily and quickly scaling their IT infrastructure up or down as business needs change."
It's not actually as novel as some probably imagine, he points out.
"Blade servers, storage networks and virtual private networks are all examples of utility technology that have been around for years. The real news is that these technologies are now being used to deliver integrated computing services to run applications. You just plug in and go," says Warley.
Utility computing deployments are for real, and undoubtedly are on the rise, says Craig Nunes, senior director of marketing at utility storage pioneer 3PAR.
"Customers are understanding that utility computing is able to address the utilisation and operational efficiency problems of traditional data centre environments," he says.
Dave Roberts, chief technology officer at Inkra Networks, feels that if there has been a lot of confusion over exactly what utility computing means, blame for this should be shared.
"It has become a hot topic, fuelled in part by the press," he says. "This is not to say that the technology does not provide real business benefits."
Roberts defines utility computing as a three-stage track, down which we are still travelling, many of us at different speeds. First, he identifies the 'virtual computing' stage, otherwise called virtualisation, where networks, computers and storage are split into pools of resources.
Next comes 'on-demand computing', where the virtualised infrastructure is linked to a management solution that allows for automation across the data centre.
While he says these two are here today, he believes the final stage of 'utility computing' - an outsourced, 'pay as you go' billing model - is really only in its first stages.
Suttie agrees that full-on utility computing is only now getting to the point of becoming buyable.
"It's at the stage where organisations are defining the architecture of just what can be effectively used," he says. "People are assuming that everything will become a utility, because that's what they've been led to expect. But in reality some things will happen before others. Storage will be among the first."
Suttie warns not to expect everything to change overnight, and adds that not all users will move at the same speed.
"Organisations won't be looking to put everything on a utility model," he says. "And it will be more appropriate for some organisations than others, just as some mobile phone users like 'pay as you go' and others don't."
Dominique Comte, managing director for EMEA at Platform Computing, points out that while technology can change quickly, habits tend to change more slowly. "It's important not to get carried away. It takes time for ideas such as utility computing to take root," he says.
"Utility computing's potential has not been over-hyped. But arguably today's reality has. There is a learning curve here that everyone - vendors, users, resellers - is sitting on. This will take time, as will the development of standards."
Guy Bunker, chief scientist at Veritas, is indignant at persistent claims of hype and, as befits a scientist, attributes the blame to over-enthusiastic marketers. "Like many things," he says, "utility computing was marketed heavily before it was truly ready, and by the time something starts to be delivered, marketing has moved on to the next big thing."
Bunker says he is now seeing Veritas customers starting to deliver IT as a service. "While we are not going to see a complete utility data centre for a while, we are seeing backup, storage provisioning and even disaster recovery all being delivered as a utility," he claims.
If utility computing has been slow to move from theory to reality, there are good reasons, says Michael Hjalsted, EMEA marketing director for servers at Unisys.
"One reason is the lack of clarity about the benefits. Users who are looking into utility computing may discover that they do not need such products after completing an assessment service. Instead, the real benefit may lie in a more consolidated approach," he says.
Hjalsted adds that utility computing models tend to be more expensive than traditional solutions because of the increased flexibility they deliver. "Vendors are not ready to take all the risk for unused capacity, which dramatically increases the pricing," he explains.
He also says ISVs need to support the model with more flexible licence policies, while vendors need to wean themselves off their own definition of utility computing and what it can do. "There is no cross-industry solution available, so current offerings on the market are based around a proprietary solution," Hjalsted says.
The final word on the 'hype or reality' debate goes to John Holden, analyst with the Butler Group. "In common with other technologies, utility computing has probably been over-hyped, but that does not invalidate its relevance for the future," he says.
Holden believes it will prevail as a key end-user choice. "Today's organisations have to operate in a changing and disruptive environment," he says.
"IT has to respond faster than before to accommodate technological and political changes, increased competition and ever-greater customer expectations. In order to cope with these challenges, IT needs an infrastructure that is agile, reduces capital and lifecycle costs, increases the quality of service, and mitigates the risks of over- and under-provisioning."
Holden adds that the early adopters of utility computing include financial services and life sciences organisations, as well as the manufacturing industries.
These and other enterprises, he says, need to consider what parts of their infrastructure would be most appropriate for becoming utilities, and which applications are too strategic to be considered for utility provision, as well as those business applications that could be hosted by a third party.
End-users aside, Holden foresees that utility computing has major implications for the IT industry. "The utility computing infrastructure will have a major effect on the business models present in the industry," he claims.
"For example, application software vendors may well find that they are seeing increased demand for their products, because SMEs will be able to afford the platforms, through utility computing, that were previously not available to them."
Of course, a key utility computing debate is the extent to which large vendors will look to pull the strings, and exactly where they will permit reseller partners to get involved.
Willy Ross, managing director for Europe at DataSynapse, believes there will never be a wholesale migration to a model of buying complete services from large external vendors such as IBM and Hewlett-Packard.
"There will be some routine operations where it makes sense to completely service via an external utility," he says.
"And there will be some cases in which organisations will use external utilities for overflow or temporary capacity.
"The reason for this is simple: external organisations will never understand the business of internal organisations and cannot react in a dynamic manner that adjusts the infrastructure as needed to support the ever-changing needs of the business."
Warley believes utility services represent a significant shift in the marketplace and an opportunity for resellers to leverage their relationships and industry expertise.
"Now is a great time for resellers to grow their businesses by partnering with utility services providers such as us. But the game is changing and some resellers could be left behind," he says.
Roberts believes resellers are probably still unsure about their role. "The key shift in the IT industry is away from the simple sale of boxes and toward the 'monetisation' of services," he says.
"Resellers need to be looking at the value-added services they can add on top of infrastructure products they already sell, and also look to new products that can deliver in-house utility infrastructures for their clients."
Neil Louw, director at Dimension Data, predicts that an important channel subplot will be the fight between VARs and system integrators, as both look to provide utility services.
"I believe the systems integrators will win in the end because they have an understanding of the higher level applications that they're implementing, whereas the service providers will just be responsible for the transport medium - in other words, the infrastructure delivering the services," he says.
Who succeeds best at delivering the services depends in part, of course, on where demand really comes from. Louw says that currently a lot of the customers he is seeing are large FTSE-listed corporates.
He expects SMEs to join the party, but adds that many are quite wary and conservative, and harbour much reluctance about going down the utility route.
"It is certainly a good model for SMEs," Louw says, "because smaller companies have defined costs and also don't have the resources to support the complexity of infrastructure.
"The irony is that it's currently the big guys, with the in-house expertise, that are the biggest consumers of utility computing. However, once the model becomes more accepted it won't be long before the SMEs follow suit."
Anthony Foy, group managing director of Interxion, believes utility-based data centres are popular among big listed firms that feel compelled by legislation to ensure operational resilience.
"To manage resilience in-house is costly in terms of infrastructure and management costs," he says. "Outsourcing based on the utility model provides these firms with peace of mind: the knowledge that compliance demands are being cost-effectively dealt with."
Foy says resilience is equally important to SMEs, and claims the model allows them to outsource too. "Consider those that rely on internet trade, publishers with tight lead times, firms dealing with real-time data. The effect of downtime on these firms cannot be overstated.
"Traditionally, up-front investment in infrastructure has been a barrier to adoption by smaller companies. Utility computing brings down the barrier of up-front investment," he says.
Utility computing's inherent flexibility, as well as its potential resilience, is also stressed by Suttie. "An important aspect of what we offer is that clients can change the profile of the service to suit their current needs," he says.
"If you want a network so you can deliver an intranet across an enterprise, for example, then there's a danger in sizing it for today's needs, not tomorrow's. Our service is all about coping with rapid deployment when it's needed."
Bunker claims that utility computing is as much about principles as products, boosting its applicability across the spectrum of enterprises.
"SMEs can equally well apply utility computing principles to understand where their IT budget is being used and how efficiently they are using it," he says.
"It might be just for simple operations such as backup, rather than for storage provisioning. However, understanding whether there is spare capacity within the environment and where it is, and then being able to make an informed decision as to whether the next hardware purchase is really required, has to be good."
On balance it seems fair to conclude that utility computing is for real, and that while it may have been a long time coming, there are clear signs that it is moving beyond the early adopter phase.
Whether, as predicted by many, it goes on to effectively replace all other models for the delivery of computing power is a moot point. It seems sensible to assume that some end-users will adopt a utility model for some of their needs. But who they turn to for delivery of their utility solution seems to be up in the air.
3PAR (020) 7464 8416
BT Global Solutions (01977) 597 150
DataSynapse (020) 7556 7776
Dimension Data (020) 7561 7000
Inkra (07801) 599 555
Interxion (020) 7375 7000
Platform Computing (01256) 370 500
Savvis Communications (0800) 7288 4743
Veritas (0870) 243 1080