It seems like a day doesn’t go by when I don’t read something about Software Defined Data Centers (SDDC). While nobody seems to have settled on an actual definition of what a true SDDC is supposed to do, the overall concept seems to have everybody excited. While I don’t dispute that the SDDC seems to be a logical path for the industry to take, I don’t see many articles quoting any real sales figures, which leads me to believe that many data center operators are taking a “you go first” approach to adoption. This makes sense, since solutions advertised as “all-encompassing” tend to be somewhat confusing when a potential customer just wants to know which server is running the company’s email. While we are all waiting for the Rosetta Stone of SDDC, there are software applications available today that can provide real value in the areas of data center calibration and capacity planning.
Calibrating your data center is a proactive process that enables data center operators to fine-tune their facilities and identify potential operational issues at the component level. A service provider, for example, could use this process to maximize the sellable capacity of their facility or to provide actionable criteria within customer SLAs. The process requires both CFD and component-level modeling tools, and in recent years multiple vendors have emerged to provide this functionality. Here at Compass we use Future Facilities’ 6SigmaDC product for the CFD modeling component and Romonet’s system modeling tool for the TCO component and system-level analytics.
Calibrating a data center is necessary because no two data centers operate exactly alike (except, of course, in our case). The calibration process provides data center operators with specific benchmarks for their facility that can then be used to determine the impact that operational actions, like moving or adding equipment on the raised floor, will have on overall site performance. Calibration begins during the facility’s design phase with the evaluation of multiple floor layout scenarios. The adoption of the final layout model then provides the initial benchmark standards that will be used in calibrating the facility. The calibration effort itself consists of comparing these initial benchmarks to the site’s actual performance during a series of progressive load tests conducted upon completion of the facility’s Level 5 commissioning.
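As a rough illustration of how the adopted layout model becomes the initial benchmark, here is a small Python sketch that picks among modeled layout scenarios and keeps the winner’s figures as the calibration baseline. The scenario names, temperatures and airflow figures are invented; in a real project these numbers come out of the CFD models built during design, not a script.

```python
# Minimal sketch: rank candidate floor-layout scenarios by their modeled cooling
# performance and keep the winner's figures as the initial calibration benchmark.
# Scenario names and metrics are illustrative placeholders.

scenarios = {
    "layout A (pod-based)":     {"max_inlet_temp_c": 24.1, "worst_grille_cfm": 1650},
    "layout B (long rows)":     {"max_inlet_temp_c": 26.4, "worst_grille_cfm": 1480},
    "layout C (mixed density)": {"max_inlet_temp_c": 23.5, "worst_grille_cfm": 1700},
}

# Choose the layout with the lowest modeled maximum cabinet-inlet temperature
best_name, benchmark = min(scenarios.items(),
                           key=lambda item: item[1]["max_inlet_temp_c"])

print(f"Adopted layout: {best_name}")
print(f"Initial benchmarks for calibration: {benchmark}")
```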
The completion of the site’s commissioning efforts is important because it eliminates an assortment of extraneous variables that could affect the values reported during load testing. During the load tests, the site’s performance in a number of areas, including cooling-path characteristics such as the airflow from AHU fans to floor grilles and from the grilles to cabinets, is documented and compared to the initial modeled values to determine whether any variances exist, and whether those deviations are acceptable or require corrective action. This process concludes with the establishment of performance metrics that apply to that data center specifically.
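To make that variance check concrete, here is a minimal Python sketch of how modeled benchmarks might be compared against measured load-test values and flagged when they fall outside an agreed tolerance. The segment names, airflow figures and 10% threshold are all invented for illustration; in practice the comparison is driven by the CFD and system modeling tools rather than a hand-rolled script.

```python
# Minimal sketch: compare modeled (benchmark) airflow against measured load-test
# values and flag deviations outside an agreed tolerance. All figures are
# illustrative, not from any real facility.

TOLERANCE = 0.10  # flag deviations greater than 10% of the modeled value

# Modeled benchmark airflow (CFM) per cooling-path segment
modeled = {
    "AHU-1 -> grille A1": 1800,
    "AHU-1 -> grille A2": 1750,
    "grille A1 -> cabinet 14": 950,
}

# Measured values captured during the progressive load tests
measured = {
    "AHU-1 -> grille A1": 1725,
    "AHU-1 -> grille A2": 1490,
    "grille A1 -> cabinet 14": 940,
}

for segment, expected in modeled.items():
    actual = measured[segment]
    deviation = (actual - expected) / expected
    status = "OK" if abs(deviation) <= TOLERANCE else "INVESTIGATE"
    print(f"{segment}: modeled {expected} CFM, measured {actual} CFM, "
          f"deviation {deviation:+.1%} -> {status}")
```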
Certainly the establishment of performance benchmarks for the data center is a valuable exercise from a knowledge perspective, but the real value of the calibration effort is the resulting ability for operators to continuously model the impact of future site modifications on performance. This continuous modeling capability manifests itself in more effective capacity planning. The ability to proactively analyze the impact of site modifications, such as new cabinet layouts, increased power density or changes to hot aisle/cold aisle configurations, enables important questions to be answered (and costs avoided) by determining the most effective implementation approach before the first physical change is made.
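As a simple illustration of the kind of what-if question continuous modeling answers, the sketch below checks a proposed density increase against a zone’s calibrated capacity before anything is physically moved. The capacities and loads are hypothetical placeholders, and a real analysis would rely on the calibrated CFD model rather than a simple power-sum check.

```python
# Minimal what-if sketch: check proposed changes against the calibrated capacity
# of a zone before any equipment is physically moved or installed.
# Capacities and loads are hypothetical placeholders.

zone_cooling_capacity_kw = 240.0   # calibrated usable capacity for the zone
current_it_load_kw = 175.0         # measured IT load on the zone today

proposed_changes_kw = {
    "add 4 cabinets @ 8 kW": 32.0,
    "densify row C (+3 kW x 6 cabinets)": 18.0,
}

projected_load_kw = current_it_load_kw + sum(proposed_changes_kw.values())
headroom_kw = zone_cooling_capacity_kw - projected_load_kw

print(f"Projected zone load: {projected_load_kw:.1f} kW "
      f"of {zone_cooling_capacity_kw:.1f} kW capacity")
if headroom_kw < 0:
    print(f"Plan exceeds calibrated capacity by {-headroom_kw:.1f} kW; rework the layout.")
else:
    print(f"Plan fits with {headroom_kw:.1f} kW of headroom.")
```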
Aside from the practical value of using currently available software tools for calibration and continuous modeling, these tools can also help operators prepare for a software-defined future. Developing an ongoing understanding of the actions that affect operations provides a foundation of knowledge that can pave the way for the more effective implementation of a “comprehensive software solution” in the future.