In January, Facebook users spent more than 10.5 billion minutes a day accessing the site by computer alone, according to the company's IPO filing. That takes a lot of energy. Most data centers (large hubs of servers that handle bank transactions, cloud-based email services, and friend requests) devote around one-third of their energy consumption to building operations. Having leased space in such facilities, Facebook wanted its first data center of its own to maximize energy efficiency. Working with Sheehan Partners and AlfaTech Consulting, the company rethought every piece of equipment, from circuit boards to air handling. Thanks to an evaporative cooling system, a custom power-distribution system, and a backyard solar array, the new data center devotes just over one-fifteenth of its power to operations.
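The fractions above correspond to the data-center industry's standard efficiency metric, Power Usage Effectiveness (total facility power divided by power delivered to the computing equipment), though the article does not name it. A quick sketch of the arithmetic, assuming the stated fractions are of total facility power:

```python
def pue(overhead_fraction):
    """Power Usage Effectiveness: total facility power / IT power.

    If a fraction f of total power goes to non-IT building
    operations (cooling, lighting, power conversion), the IT
    equipment receives (1 - f) of the total, so PUE = 1 / (1 - f).
    """
    return 1.0 / (1.0 - overhead_fraction)

# Typical leased facility: about one-third of power to operations
print(round(pue(1 / 3), 2))   # 1.5

# The new center: just over one-fifteenth to operations
print(round(pue(1 / 15), 2))  # 1.07
```

A PUE of 1.0 would mean every watt drawn from the grid reaches the servers; conventional facilities at the time commonly ran near 1.5 or higher.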
To do away with power-guzzling air-conditioning towers, Facebook located the 333,400-square-foot facility in the high desert of central Oregon, where humidity stays low and summer temperatures peak at 90 degrees. Clad in corrugated steel and enclosed by a wall of precast-concrete panels, Sheehan Partners' design functions as a giant cooling system. Large fan walls in the mechanical penthouse push dry desert air through filters; next, misters send fine sprays of water into the air. When the water evaporates, the air temperature drops in a process called evaporative cooling. This arrangement takes advantage of advances in the operation of servers, which now run comfortably at 80 degrees, warmer than the former standard of 68 to 72 degrees.
The architects used earth-toned concrete panels for the perimeter wall and landscaped with large rocks salvaged during construction. The center's southwest corner houses a small office area with conference rooms and two courtyards with glazed walls, which bring daylight into the compound-like structure.
"The building is designed around the layout of servers in their racks in rows," Sheehan says, explaining that the length of the rows is based on the most efficient airflow through them. Facebook's custom servers accept a higher voltage than standard equipment, eliminating extra transformers and the energy loss they create. And instead of a central uninterruptible power supply (another source of waste), each server has its own small power supply that accepts both alternating and direct current.
As a result, the LEED Gold Certified data center's operating costs are 24 percent lower than at the company's leased data centers. A second Prineville center is under construction, with two more under way in North Carolina and one in Sweden. To share its success, and its model, Facebook publishes the nonproprietary portions of its technical specs through its open-source design initiative, the Open Compute Project. "Four or five years ago, there was some public discussion about the idea that data centers were going to become huge energy hogs," says Sheehan. "This project is the answer to that concern."
Lamar Anderson is based in San Francisco and frequently contributes to RECORD.
Completion Date: April 2011
Total construction cost: withheld
Personnel in architect's firm who should receive special credit:
Engineers: Alfa Tech (m/e/p)
Metal panels: Metal Sales Manufacturing
Exterior lighting: Se’Lux, Cooper, Bega