
DOES THE DATACENTER INDUSTRY NEED A CAPACITY GOD?


Published on 18th June 2014 by Penny Jones


The divide between facilities and IT teams within the data center created some lively debate this week at DatacenterDynamics Converged Santa Clara. This time the conversation was around unused capacity, cost and risk. Judging by the thoughts of those working here on the US West Coast, the overall responsibility for managing these areas is a real ‘hot potato’ that is turning efforts to drive efficiency and reduce costs to mash.

But it appears to be the fault of no single team or technology. What it really boils down to (not intending to serve up another potato pun!) is a lack of education, and the absence of any obvious candidate to assume such a role. IT teams have enough on their plate without learning facilities, and facilities teams the same regarding IT. Finance, meanwhile, often has other parts of the business to think about, despite paying the power bill. But when things go wrong, this hot potato can cause havoc for all the teams involved.

On the evening before the event, a roundtable organized by predictive modeling vendor Future Facilities, hosted by industry advisor Bruce Taylor and attended by a number of industry stalwarts and a handful of newer industry members, discussed the hindrances to capacity planning. Most agreed that the main reason we have stranded capacity in the data center is that the industry has created so many silos – teams working on individual projects inside the facility – that there is rarely anyone tasked with taking on the bigger picture, looking at the farm from the top of the mountain.


Air flow is complicated, and Future Facilities argues that predictive modeling is the only science that can really help when deploying equipment and then maintaining levels of efficiency as data center demands – and the equipment itself – change.


Dr. Jon Koomey, research fellow at the Steyer-Taylor Center for Energy Policy and Finance at Stanford University, said that only when you know the physics of the situation inside the data center, and the effect of the changes you are likely to make in future, can you remove the problem of stranded capacity and, in turn, drive better levels of efficiency through reduced power use.

“The goal, ultimately, is to match energy services demanded with those supplied to deliver information services at the total lowest cost. The only way to do that is to manage stranded capacity that comes from IT deployments that do not match the original design of the facility,” Koomey said.

He likened the situation today to Tetris, drawing on the analogy of the different-shaped blocks in the game.

“IT loads come into the facility in all different shapes, leaving spaces. Those spaces are capacity, so that 5MW IT facility you think you have bought will typically have 30% to 40% unused.”
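To put those figures in concrete terms, here is a minimal back-of-the-envelope sketch in Python. It assumes only the 5MW design capacity and the 30% to 40% stranded range quoted above (the variable names are illustrative, not data from the event):

# Illustrative arithmetic only, using the figures quoted above:
# a nominal 5 MW IT facility with 30% to 40% of its capacity stranded.
design_capacity_mw = 5.0

for stranded_fraction in (0.30, 0.40):
    stranded_mw = design_capacity_mw * stranded_fraction
    usable_mw = design_capacity_mw - stranded_mw
    print(f"{stranded_fraction:.0%} stranded: "
          f"{stranded_mw:.1f} MW lost, {usable_mw:.1f} MW actually usable")

# Output:
# 30% stranded: 1.5 MW lost, 3.5 MW actually usable
# 40% stranded: 2.0 MW lost, 3.0 MW actually usable

In other words, a buyer of a nominal 5MW facility may, on these figures, only ever be able to use 3MW to 3.5MW of it.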

Despite the obvious draw of making maximum use of your data center, many attendees agreed that predictive modeling, and even data center infrastructure management (DCIM) tools that offer more clarity on the individual situation in real time, can be a difficult sell. Once again, the hot potato (of no one being tasked with complete responsibility) often gets in the way.


Mark Thiele, EVP of data center technology at Switch, who has also worked for ServiceMesh, VMware and Brocade, said in most cases there is not a single person in the data center with a vision or understanding of the facility’s entire operations – from design and build to IT, facilities and even economics.

“Today 75 to 80% of all data centers don’t have a holistic person that knows and understands everything about the data center, so the target opportunity for [sale of] these tools is often someone that has no responsibility for managing this in their job description,” Thiele said.

“We also find that a majority of facilities today are still bespoke – they are designed to be repaired after they are created. These are serious thresholds that have to be overcome in the market on the whole.”

But this is a situation the industry has created for itself, according to dinner host and Future Facilities CEO Hassan Moezzi.

“If you go back to IBM, 40 years ago it dominated the mainframe market. At the time, the concept of IBM having a blank cheque for customers was a really painful thing, but everyone accepted it because it was the only way. IBM built the power, cooled the data center and provided the hardware and software, and if anything went wrong with the data center it was all put back on to IBM,” Moezzi said.

Today we have the silos and distributed systems we have asked for. Anyone can buy a computer and plug it into a wall. The shackles have gone, and so too has that one throat to choke – or to sell capacity planning systems to.

Continue Reading this article here: http://www.datacenterdynamics.com/blogs/penny-jones/does-data-center-industry-need-capacity-god 


