Design News is part of the Informa Markets Division of Informa PLC


An inside look at today's data centers


After the terrorist attack on the World Trade Center and the subsequent loss of power to lower Manhattan, Wall Street firms needed access to information residing in a Minneapolis data center in order to restart trading. The facility, run by Exchange Resources, a subscription service for stock exchange members, houses data belonging to subscribers. It is normally kept "cold," staffed only by security and facility-management personnel. IT people were obviously needed to ramp up the facility, but no one had imagined the problems they would face getting from New York to Minneapolis quickly. They eventually arrived by train and hired cars, but later than desired.

"Throughout the over 30-year history of data centers, it's really only been the unexpected and unforeseeable type of item that causes trouble," says Paul Magnussen, head of Magnus U.S. Inc. (Minneapolis), a leading data center contractor. "For example, earthquakes taught us that we needed power from more than one utility substation and telecommunications connections from more than one fiber-optics supplier." In light of the World Trade Center disaster, he says that additional IT people may be employed in remote backup facilities in the future.

So what are they? Data centers, sometimes called "server farms," are physical facilities that host data from one or more organizations, offering greater reliability, security, and connectivity than can be guaranteed elsewhere. They started as banks of mainframe computers spewing out punch cards, growing gradually in popularity until the dot-com boom of the 1990s. Today some 750 data centers in North America contain dense rows of rack-mounted servers and routers, ranging in size from 5,000 to 400,000 ft², and they confidently promise clients over 99% uninterrupted operation.

Some, like Exchange Resources, host backed-up information ready to go "live." Others, such as giant Exodus (Santa Clara, CA) offer co-location services, where companies own their own hardware and software, and use the data center for its carefully engineered environment. Yet others, like Agiliti (Bloomington, MN), provide managed services.

Exodus has many facilities, and customers that include all of Microsoft's Internet businesses, Yahoo, Google, and eBay. "Exodus mostly leases space, along with fire suppression, heating and air-conditioning, electrical grounding, seismic and structural considerations, firewalls, redundant systems, consulting, and stacking gear," says Chris Hardin, global operations director of Exodus Operations for Microsoft.

In contrast, Agiliti purchases hardware, database software, and operating systems, and runs customer applications. Typical users include small companies, independent software vendors testing new programs, and divisions of large companies. "Organizations that want the safety of a data center, and don't want to focus on infrastructure matters use our services," says Director of Marketing George Hadjiyanis. He notes that some individual company data centers dwarf many of the commercial centers.

That safety comes from clean, reliable power supplies, very sophisticated data and physical security, and a thermally protected environment for critical, heat-sensitive electronics.

Smooth, clean power. Data centers have raised floors, under which electrical wiring, fiber-optic connections, and HVAC systems run. The buildings tend to be located in major metropolitan areas to obtain redundant supplies of power and fiber-optic connections.


Since utility-provided electricity is subject to surges, spikes, and variations that can cause data losses or data corruption, data centers need to condition it to produce a smooth, even sine wave.

"To provide reliable electricity, data centers convert utility ac power to dc with rectifiers, and then invert it back to ac. This process helps to obtain current that produces a very even sine wave," says Frank Nash, market segment manager for data center facilities at Square D (Nashville, TN). The company supplies electrical control equipment to many data centers.

Even with multiple supplies, power sometimes goes down-necessitating further backup. Generators (usually diesel) supply minutes or hours of power. Uninterruptible power supplies (UPS) provide short-term power for the 10 to 30 second delay between loss of utility power and generator startup. They also cleanse incoming utility power.
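The bridging role of the UPS described above amounts to a simple sizing calculation. The sketch below is illustrative only; the load, gap duration, and safety margin are assumed values, not figures from the article.

```python
# Rough sizing sketch for UPS bridge energy (illustrative numbers only).
# A UPS must carry the full critical load during the 10-30 s gap
# between loss of utility power and generator startup.

def ups_bridge_energy_wh(load_kw: float, gap_seconds: float,
                         margin: float = 2.0) -> float:
    """Energy (watt-hours) the UPS must deliver to cover the gap."""
    hours = gap_seconds / 3600.0
    return load_kw * 1000.0 * hours * margin

# Example: a 500 kW critical load and a worst-case 30 s generator start,
# with a 2x margin for battery aging and a failed first start attempt.
energy = ups_bridge_energy_wh(load_kw=500.0, gap_seconds=30.0)
print(f"{energy:.0f} Wh")  # prints "8333 Wh"
```

The small energy number explains why flywheel-based rotary UPS systems, which store only seconds to minutes of energy, are viable for this role.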

Backup configurations vary. Some centers have generators for full backup, plus additional units for emergency support. Very critical data, such as the financial data backed up for the New York Stock Exchange, may be protected by full redundancy.

The most common UPS systems are static, using solid (gel) batteries in large numbers. "The batteries are tied together and charge when used, just like car batteries," says Nash. "These systems often need special rooms and use a lot of valuable space. They can also degrade and fail." Nash favors space-saving dynamic, or rotary, UPS systems, which use flywheels to store energy kinetically and generators to convert it briefly back to electricity.

Keeping things cool. Sensitive electronics have to stay cool or face damage. Cooling systems provide consistent temperatures that are allowed to vary only over a small range. The trick lies in getting the cooled and humidity-controlled air to the equipment.

"Vapor barriers in the building are just as important as consistent temperatures," says Paul Magnussen, whose firm builds vapor barriers into data centers. "If it's too dry you get static, which can damage data. Excessive humidity creates moisture, which can harm hardware."


Data centers typically use one of two approaches to cool cabinets, which can carry heat loads of up to 6,000W. On the enclosure at left, which has solid or partially perforated doors and sides, air is pulled in at the bottom and is exhausted at the top through forced convection. The enclosure at right has perforated doors to allow air to be pulled horizontally through the enclosure.

Computer enclosures provide the next line of defense. "Air coming in through the floor brings cool air to computer cabinets, which channel it and push out the hot air," says Brian Mordick, product manager, data communications, for Hoffman Enclosures (Anoka, MN). To keep hot air from rising into the next floor of a multi-story facility, each level has a closed cooling loop that feeds the warm air back into the system to be cooled and recycled.

From a heat dissipation perspective, it's not only the density of the enclosures that's a problem, but also the rising number of servers per cabinet. "Some of the new, so-called blade servers measure only 1.75 inches tall, which in theory means you could have upwards of 8,000W per cabinet. Practically, we're seeing on the order of 5,000 to 6,000W today-but the number is increasing," says Felix Klebe, product manager, climate control for Rittal Corp. (Springfield, OH).
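Klebe's "upwards of 8,000W" ceiling follows from simple arithmetic: a 1.75-inch server occupies one rack unit (1U), and a common full-height cabinet holds 42U. The per-server wattage below is an assumed illustrative figure, not one quoted in the article.

```python
# Back-of-envelope check of the per-cabinet heat figures quoted above.
# Assumptions: a 42U cabinet and roughly 190 W per blade server.

RACK_UNIT_IN = 1.75          # one rack unit ("U") of height, in inches
USABLE_RACK_UNITS = 42       # a common full-height cabinet

def cabinet_watts(server_height_u: int, watts_per_server: float) -> float:
    """Total heat load if the cabinet is packed with identical servers."""
    servers = USABLE_RACK_UNITS // server_height_u
    return servers * watts_per_server

# 42 one-U blade servers at ~190 W each lands near the theoretical
# 8,000 W ceiling mentioned in the article.
print(cabinet_watts(1, 190.0))  # prints "7980.0"
```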

As servers get thinner, enclosures are also getting deeper, which exacerbates cooling problems. To help beat the heat, Rittal offers a server-style cabinet with fully perforated front and rear doors and roof to efficiently dissipate the heat. A server fan assembly mounts to the rear door of the cabinet to assist in exhausting the heat horizontally. Hoffman is taking a different tack, employing solid doors and sides to direct cold air up and force the hot air out, as if the cabinet were a chimney.
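Whichever airflow path a cabinet uses, the heat load dictates how much air must move through it. A widely used rule of thumb (an assumption here, not a figure from the article) is CFM ≈ 3.16 × watts ÷ temperature rise in °F, derived from the heat capacity of sea-level air.

```python
# Airflow needed to carry a cabinet's heat away, using the common
# rule of thumb CFM = 3.16 * watts / delta_T (degrees Fahrenheit).
# The 3.16 constant assumes sea-level air density.

def required_cfm(heat_watts: float, delta_t_f: float) -> float:
    """Cubic feet per minute needed to hold a given inlet-to-exhaust rise."""
    return 3.16 * heat_watts / delta_t_f

# A 6,000 W cabinet held to a 20 F rise needs nearly 1,000 CFM,
# which is why fan assemblies and perforated doors matter.
print(round(required_cfm(6000, 20)))  # prints "948"
```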

Safe and secure. Data centers provide firewalls for their customers' data security, and a number of very intensive measures for physical security. Exodus customers lease from 56 to 10,000 square feet of computer space and can come and go 24/7. Hardin says that customers need Exodus badges, which get scanned, as does the back of the customer's hand. "The back of the hand is as individual as fingerprints," he says. "It may sound gory, but to make sure no one can gain entry by using a severed hand, the security equipment also scans the pulse."

The company has 300 cameras operating around the clock at each of its facilities. Former FBI and CIA members who pioneered research into computer hacking are now on staff at Exodus, continuously refining security arrangements for both the physical premises and the data. In addition to firewalls, Exodus offers its customers intrusion detection and hacker tracking.

Financial data requires even greater security, and data centers built to house such data are designed to be bomb-proof. However, no one seems ready yet to answer the question raised by the attack on the World Trade Center: what to do in the event of a building collapse.

