Datacentre cost calculations critical

Too many services providers assess their datacentre cost base in a slipshod manner, warns Zahl Limbuwala

Datacentres are a fact of life for the IT industry, from CIOs to the channel. In particular, services providers rely on their datacentre infrastructure, and on running it as economically as possible to guarantee their revenue and margins.

Yet datacentre infrastructure and IT services in general are becoming more commoditised. Apart from the badge on the front and the consequent price, there is no longer a huge difference between one server and another. At the same time, datacentre costs are steadily increasing.

The Uptime Institute's Data Centre Industry Survey 2013 found that about 75 per cent of the third-party datacentre providers that responded had recently increased their budgets by more than 10 per cent.

Is this a golden age? At first glance, the combination of commoditised hardware and increased budgets might suggest a rise in low-cost, high-capacity datacentres perfectly suited to service provider needs. This is still far from the case.

Too often, datacentres are financial black holes, where money is spent with no real understanding of the eventual effect, or even of whether the investment will be returned.

This can be down to a number of factors. For example, organisations frequently fail to appreciate that a datacentre has many intertwined elements, all of which contribute to its total cost of ownership.

The commoditisation of IT hardware alone, for instance, will do little to reduce the costs incurred by a datacentre's building design and internal layout, both of which add far more to the total than the hardware itself. Similarly, datacentres are often treated as static assets, even though their performance, costs and utilisation vary greatly over their lifetime and they require ongoing investment.

Many channel services providers may be offering services without any real understanding of their cost, except in the broadest possible terms. This inevitably feeds through into pricing.

Without the ability to cost their services accurately, businesses have to base their pricing on a best guess, previous experience, or what the competition charges, none of which will necessarily be in the business's best interests.

It might be tempting to compete purely on cost. However, the giants of the services provider world, such as Google and Amazon, have economies of scale that allow them to drive their prices extremely low.

They have also put great effort into understanding their datacentre infrastructure and activity costs, meaning they can balance their costs precisely against the prices they offer. Competing with them on price alone is virtually impossible.

Services providers should instead learn from these giants: understand the real cost of their services and develop customer pricing models accordingly.

First, it is not enough simply to measure datacentre costs; the provider must also be able to predict them, and to understand the impact of each potential change it makes – whether that is building a new datacentre or simply replacing existing hardware.

It can then see the impact on profitability.
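As a rough illustration of the kind of calculation this requires, here is a deliberately simple sketch in Python. Every figure in it is invented, and the model is far cruder than anything a provider would use in practice; it merely shows how a current estate and a proposed hardware refresh can be compared on the same basis, and what each scenario does to the margin on a fixed-price service.

# Illustrative toy model only: all figures are hypothetical.
def annual_tco(capex, lifetime_years, it_load_kw, pue,
               energy_price_per_kwh, staff_cost, maintenance_rate):
    """A crude annual total cost of ownership for a datacentre estate."""
    depreciation = capex / lifetime_years                          # straight-line depreciation
    energy = it_load_kw * pue * 24 * 365 * energy_price_per_kwh    # facility energy bill
    maintenance = capex * maintenance_rate
    return depreciation + energy + maintenance + staff_cost

# Current estate versus a proposed refresh: newer kit costs more up front
# but draws less power and runs in a more efficient facility.
current = annual_tco(2_000_000, 5, 400, 1.8, 0.12, 300_000, 0.05)
refresh = annual_tco(2_500_000, 5, 300, 1.5, 0.12, 300_000, 0.05)

annual_revenue = 2_000_000  # fixed-price service revenue, also invented
for label, cost in (("current", current), ("refresh", refresh)):
    margin = (annual_revenue - cost) / annual_revenue
    print(f"{label}: annual cost £{cost:,.0f}, margin {margin:.1%}")

Even a model this crude makes the point: the option with the higher capital cost can still be the more profitable one once power, efficiency and maintenance are counted, and that is only visible if the whole cost base is modelled before the decision is made.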

Second, such predictions must be based on careful, accurate calculations that take into account all possible influences on cost. Too often compromises, omissions or best guesses are made in order to simplify the process. Needless to say, this is not a path to accuracy.

Lastly, cost must be understood at a granular level that goes beyond the cost of each component: providers should "normalise" and compare the costs of services, especially the cost of providing one unit of value from each service.

For instance, an email service provider should know the cost of sending a single email or of setting up an inbox.
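To make that concrete, here is an equally simple sketch of how such a unit cost might be derived. Again, every figure, and the 10 per cent provisioning allocation, is invented purely for illustration.

# Illustrative only: hypothetical figures for normalising a service's unit cost.
monthly_costs = {
    "infrastructure": 40_000,  # servers, storage and network share of the email platform
    "facilities":     15_000,  # power, cooling and space apportioned to the service
    "licences":       10_000,
    "operations":     20_000,  # staff time attributed to running the service
}

emails_sent_per_month = 120_000_000
inboxes_set_up_per_month = 5_000
provisioning_share = 0.10  # assumed share of total cost relating to inbox set-up

total_cost = sum(monthly_costs.values())
cost_per_email = total_cost / emails_sent_per_month
cost_per_inbox = (total_cost * provisioning_share) / inboxes_set_up_per_month

print(f"Cost per email sent: £{cost_per_email:.6f}")
print(f"Cost per inbox set up: £{cost_per_inbox:.2f}")

Once a service's cost is expressed per unit of value in this way, it can be tracked over time and compared directly with the price the customer is charged.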

Like any business, a service provider can guarantee its profitability only once it understands its costs. Without this understanding, it is building a business on a black hole.

Zahl Limbuwala is chief executive of Romonet