A holistic approach to data centers could result in millions of dollars of savings and a far smaller carbon footprint for the ever-expanding universe of information technology.
That’s the promise of research conducted by Binghamton colleagues Kanad Ghose, a professor of computer science, and Bahgat Sammakia, a professor of mechanical engineering and director of the University’s New York State Center of Excellence in Small Scale Systems Packaging and Integration, or S3IP.
“The amount of energy we spend on running our data centers in the U.S. is about 2.5 percent of the total national energy expenditure,” Ghose said. “That doesn’t sound like a big number, but it’s enough to power a couple of good-sized cities for most of the year.”
The statistics are “sobering,” Ghose said. The number of data centers is growing rapidly because of increasing demand for online services for everything from medical records to shopping.
“The unfortunate fact is there’s a lot of waste in this,” Ghose said. “All data centers are ‘overprovisioned.’ They’re designed to handle the peak loads. And most of the time, they operate at 40 to 60 percent of that. When data centers run at a lower than peak load, the energy efficiency is very poor.”
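To see why running below peak hurts so much, consider a rough back-of-the-envelope model (the numbers below are illustrative assumptions, not figures from the project): a server that idles at a large fraction of its peak power draw delivers far less useful work per watt at 40 to 60 percent utilization than at full load.

```python
# Illustrative sketch (assumed wattages, not measured data): servers draw a
# large share of their peak power even when idle, so efficiency -- useful
# work delivered per watt -- falls off sharply below peak utilization.

IDLE_WATTS = 200.0   # assumed draw at 0% utilization
PEAK_WATTS = 400.0   # assumed draw at 100% utilization

def power_draw(utilization):
    """Power in watts, assuming a linear ramp from idle to peak."""
    return IDLE_WATTS + (PEAK_WATTS - IDLE_WATTS) * utilization

def relative_efficiency(utilization):
    """Work per watt, normalized so a fully loaded server scores 100%."""
    full_load = 1.0 / PEAK_WATTS                 # work per watt at 100% load
    partial = utilization / power_draw(utilization)
    return partial / full_load

for u in (0.4, 0.6, 1.0):
    print(f"{u:.0%} load: {power_draw(u):.0f} W, "
          f"{relative_efficiency(u):.0%} of peak efficiency")
```

Under these assumed numbers, a server at 40 percent load still burns 70 percent of its peak power, so it delivers only about 57 percent of the work per watt it would at full load.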
There are also inherent inefficiencies, in part because most servers run the Linux operating system, which lacks strong power-management support for server workloads.
That’s just the bad news from the information technology side. Then there’s the cooling.
“The cooling solutions are also overprovisioned,” Ghose said. “Data centers run hot because a lot of machines are packed into a small space. The loads in a data center fluctuate, and you cannot track that changing load fast enough in a cooling system, so you end up playing it safe. There’s an enormous amount of waste.”
Most facilities use chilled water, and raising or lowering the water temperature by even 5 degrees takes considerable time. New York state alone spends close to $600 million on utility costs for running its data centers. Half goes to powering the computers; the other half is spent on cooling. And utility costs continue to rise.
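That sluggishness is easy to picture with a simple first-order model (the time constant below is an assumption for illustration, not plant data): the chilled-water loop drifts toward a new setpoint exponentially, so a 5-degree change is a matter of tens of minutes, not seconds.

```python
# Rough illustration of why chilled water cannot track a fluctuating IT load:
# the loop temperature approaches a new setpoint exponentially, so a 5-degree
# change takes many minutes. The 15-minute time constant is an assumed value.

import math

def loop_temperature(t_minutes, start=50.0, setpoint=45.0, tau_minutes=15.0):
    """First-order response of the chilled-water loop toward its setpoint."""
    return setpoint + (start - setpoint) * math.exp(-t_minutes / tau_minutes)

for t in (5, 15, 30, 45):
    print(f"after {t:2d} min: {loop_temperature(t):.1f} degrees")
```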
Most researchers focus on smart workload management when they talk about “green” data centers, but Ghose and Sammakia say that’s not enough. They’re looking for a comprehensive solution: spreading the workload across all the machines and planning the workload allocation and the cooling budget together, in advance. Ultimately, it means managing cooling activities and workload activities synergistically.
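The article doesn’t describe the team’s actual algorithms, but the core idea can be sketched in a few lines (hypothetical rack names, job demands and wattages): pack jobs onto as few racks as possible, then plan each rack’s cooling budget from the heat that placement is expected to generate, rather than cooling every rack for the worst case.

```python
# Minimal sketch of coordinating workload placement with cooling planning.
# All names and numbers are hypothetical; this is not the Binghamton design.

from dataclasses import dataclass, field

@dataclass
class Rack:
    name: str
    capacity: float            # maximum compute load the rack can take
    load: float = 0.0
    jobs: list = field(default_factory=list)

def place_jobs(racks, jobs):
    """Greedy consolidation: put each job on the busiest rack with room."""
    for job_name, demand in sorted(jobs, key=lambda j: -j[1]):
        candidates = [r for r in racks if r.load + demand <= r.capacity]
        if not candidates:
            raise RuntimeError(f"no capacity for {job_name}")
        target = max(candidates, key=lambda r: r.load)
        target.load += demand
        target.jobs.append(job_name)

def cooling_budget(rack, watts_per_unit_load=350.0, margin=0.2):
    """Plan cooling from the heat this placement will produce,
    plus a safety margin, instead of provisioning for peak load."""
    return rack.load * watts_per_unit_load * (1.0 + margin)

racks = [Rack("rack-A", capacity=10), Rack("rack-B", capacity=10)]
jobs = [("web", 4.0), ("db", 3.0), ("batch", 2.0)]
place_jobs(racks, jobs)
for r in racks:
    print(r.name, r.jobs, f"cooling budget: {cooling_budget(r):.0f} W")
```

In this toy run all three jobs fit on one rack, so the second rack needs no cooling budget at all; a real controller would also have to respect thermal limits and the slow response of the chilled-water loop, which is where just-in-time cooling comes in.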
Just-in-time provisioning of IT resources and just-in-time cooling are the keys here, said Ghose, who expects to set up an experimental data center with Sammakia and other collaborators soon. Companies such as Emerson Network Power and IBM have already expressed interest in the project.
Sammakia, who’s also the University’s executive director of economic development, said the test facility will give a boost to companies in the region and beyond. “Over the next five years, this will help us create hundreds of local jobs and attract companies to the area,” he said. “It will allow New York state and national companies to showcase their energy-efficiency projects.
“It’s a test facility, but at the same time it’s a real, operational data center. Each company will come in with its latest and greatest equipment.”
Ghose said an innovation that results in an energy reduction of, say, 15 percent could make a big splash. He believes their solution could result in savings of more than 25 percent. And the lessons drawn from data centers could pay off for desktop computers as well.
“The writing’s on the wall,” Ghose said. “Unless we address this now, things will become worse. Most server vendors are trying to pack more into the same space and making the problem worse. What makes it worse is the amount of heat you produce in one cubic foot of space. That’s going up significantly because things are becoming smaller and faster. And of course there’s a carbon footprint, the more energy you spend.”