In a blog post on Thursday, the social giant announced that the facility — located in Forest City, North Carolina — is online and serving live traffic. This comes 16 months after Facebook first broke ground on the site.
In those 16 months, Facebook says, nearly 2,000 people spent more than 1.2 million hours working on the facility. “It’s no mean feat to bring a service like Facebook to more than 845 million people around the world,” read the blog post from George Henry, the operations manager for the new data center. “And our innovative data centers play a big role in making that possible.”
Tom Furlong, Facebook’s vice president of site operations, tells Wired that the data center has arrived two months ahead of the original schedule. “We took two months out of the construction window because from a business stance, we needed it,” he says.
Like fellow cloud data giants Google, Amazon, and Microsoft, Facebook is designing and building its own data centers in an effort to keep up with the exponential growth of its web services — and to reduce the costs of running these massive computing facilities. The company’s first custom-built facility — located in Prineville, Oregon — went live at the beginning of 2011.
As part of its effort to reduce power consumption in the data center, Facebook has also designed its own servers, and the North Carolina facility marks the debut of its second generation of web server designs, which the company is sharing with the rest of the world through the Open Compute Project. Unlike other web giants such as Google and Amazon, Facebook does not view its hardware as a competitive advantage, believing that if it shares its designs with the rest of the world, it can improve them — and drive down the cost.
According to Facebook, its Forest City facility will have a power usage effectiveness (PUE) of 1.06 to 1.08. PUE is the ratio of the total power consumed by a data center to the power delivered to its computing equipment. A PUE of 1.0 would mean that every watt entering the facility reaches the equipment, with none lost to cooling, power distribution, or lighting.
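To make the metric concrete, here is a minimal sketch of the PUE calculation in Python. The power figures are purely illustrative, not Facebook’s actual load:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power divided by
    the power delivered to IT (computing) equipment."""
    return total_facility_kw / it_equipment_kw

# Hypothetical numbers: if a facility draws 10,700 kW in total and
# 10,000 kW of that reaches the servers, the PUE is 1.07 — i.e.,
# about 7% overhead for cooling, power distribution, and lighting.
print(round(pue(10_700, 10_000), 2))  # 1.07
```

At a PUE in Facebook’s claimed 1.06–1.08 range, overhead is only 6 to 8 percent of total power, versus the much higher overhead typical of conventional data centers.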
The facility is the first test of the Open Compute Project’s outdoor-air cooling designs in an environment where temperature and humidity conditions are considered outside the range of typical data center operations. In other words, North Carolina gets much steamier in the summer than the Oregonian high desert. “The real question is whether we could make it work in a more humid environment,” Furlong says. “This will be proof of concept that we can run an outside-air economizing system in a humid location.” (Caleb Garling)