10 common data center surprises

Technologies and best practices provide a path for better preparation.


By BD+C Staff | March 21, 2012

A list of 10 common surprises for data center/IT managers was released at AFCOM Data Center World Spring. The list includes information on a surprising cause of data center downtime, what data center managers might not know about that next server refresh, and the growing trend sneaking up on virtually every IT manager.

Ten common surprises:

  1. Those high-density predictions finally are coming true: After rapid growth early in the century, projections of double-digit rack densities have been slow to come to fruition. Average densities hovered between 6.0 and 7.4 kW per rack from 2006 to 2009, but the most recent Data Center Users’ Group (DCUG) survey predicted average rack densities will reach 12.0 kW within three years. That puts a premium on adequate UPS capacity and power distribution, as well as cooling to handle the corresponding heat output (see the capacity sketch after this list).
  2. Data center managers will replace servers three times before they replace UPS or cooling systems: Server refreshes happen approximately every three years. Cooling and UPS systems are expected to last much longer—sometimes decades. That means the infrastructure organizations invest in today must be able to support—or, more accurately, scale to support—servers that may be two, three or even four generations removed from today’s models. What does that mean for today’s data center manager? It makes it imperative that today’s infrastructure technologies have the ability to scale to support future needs. Modular solutions can scale to meet both short- and long-term requirements.
  3. Downtime is expensive: Everyone understands downtime is bad, but the actual costs associated with an unplanned outage are stunning. According to a Ponemon Institute study, an outage can cost an organization an average of about $5,000 per minute. That’s $300,000 in just an hour (see the worked cost example after this list). The same study indicates the most common causes of downtime are UPS battery failure and exceeding UPS capacity. Avoid those problems by investing in the right UPS—adequately sized to support the load—and proactively monitoring and maintaining batteries.
  4. Water and the data center do not mix – but we keep trying: The first part of this probably isn’t a surprise. Sensitive IT equipment does not respond well to water. However, the aforementioned Ponemon study indicates 35% of all unplanned outages are a result of some type of water incursion. These aren’t just leaky valves; in fact, many water-related outages are the result of a spilled coffee or diet soda. The fix: Check those valves, but more importantly, check the drinks at the door.
  5. New servers use more power than old servers: Server consolidation and virtualization can shrink server inventory by huge numbers, but that doesn’t exactly equate to huge energy savings. New virtualized servers, especially powerful blade servers, can consume four or five times as much energy as those from the preceding generation (although they usually do it more efficiently). The relatively modest savings at the end of that consolidation project may come as a surprise (see the consolidation example after this list). There is no fix for this, but prepare for it by making sure the infrastructure is adequate to support the power and cooling needs of these new servers.
  6. Monitoring is a mess: IT managers have more visibility into their data centers than ever before, but accessing and making sense of the data that comes with that visibility can be a daunting task. According to an Emerson Network Power survey of data center professionals, data center managers use, on average, at least four different software platforms to manage their physical infrastructure. More than 40% of those surveyed say they produce three or more reports for their supervisors every month, and 34% say it takes three hours or more to prepare those reports. The solution? Move toward a single monitoring and management platform. Today’s DCIM solutions can consolidate that information and proactively manage the infrastructure to improve energy and operational efficiency and even availability.
  7. The IT guy is in charge of the building’s HVAC system: The gap between IT and Facilities is shrinking, and the lion’s share of the responsibility for both pieces is falling on IT professionals. Traditionally, IT and data center managers have had to work through Facilities when they need more power or cooling to support increasing IT needs. That process is being streamlined, thanks in large part to those aforementioned DCIM solutions that increase visibility and control over all aspects of a building’s infrastructure. Forward-thinking data center managers are developing a DCIM strategy to help them understand this expansion of their roles and responsibilities.
  8. That patchwork data center needs to be a quilt: In the past, data center managers freely mixed and matched components from various vendors because those systems worked together only tangentially. That is changing. The advent of increasingly intelligent, dynamic infrastructure technologies and monitoring and management systems has increased the amount of actionable data across the data center, delivering real-time modeling capabilities that enable significant operational efficiencies. IT and infrastructure systems still can work independently, but to truly leverage the full extent of their capabilities, integration is imperative.
  9. Data center on demand is a reality: The days of lengthy design, order and deployment delays are over. Today there are modular, integrated, rapidly deployable data center solutions for any space. Integrated, virtually plug-and-play solutions that combine rack, server, power, and cooling components can be installed easily in a closet or conference room. On the larger end, containerized data centers can be used to quickly establish a network or to add capacity to an existing data center. The solution to most problems is a phone call away.
  10. IT loads vary – a lot: Many industries see extreme peaks and valleys in their network usage. Financial institutions, for example, may see heavy use during traditional business hours and virtually nothing overnight. Holiday shopping and tax seasons also can create unusual spikes in IT activity. Businesses that depend on their IT systems during these times need the capacity to handle those peaks, but often end up operating inefficiently during the valleys. A scalable infrastructure with intelligent controls can adjust to those highs and lows to ensure efficient operation (see the scaling sketch after this list). BD+C
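
The capacity math behind surprise No. 1 is easy to make concrete. The sketch below uses the survey figures quoted above (7.4 kW and 12.0 kW per rack); the 100-rack room and the one-for-one power-to-cooling ratio are illustrative assumptions, not figures from the article.

```python
# Illustrative capacity math for surprise No. 1. The 7.4 kW and
# 12.0 kW per-rack figures are the survey numbers cited above; the
# 100-rack room and 1:1 power-to-cooling ratio are assumptions.
RACKS = 100

today_kw_per_rack = 7.4    # top of the 2006-2009 average range
future_kw_per_rack = 12.0  # DCUG three-year projection

today_load_kw = RACKS * today_kw_per_rack    # 740 kW
future_load_kw = RACKS * future_kw_per_rack  # 1,200 kW

# Essentially every watt delivered to IT gear is rejected as heat,
# so cooling capacity must grow in step with the electrical load.
print(f"IT load today:         {today_load_kw:,.0f} kW")
print(f"IT load at 12 kW/rack: {future_load_kw:,.0f} kW")
print(f"Added load to power and cool: {future_load_kw - today_load_kw:,.0f} kW")
```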
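Surprise No. 3 is simple arithmetic, and it is worth running once. Here is a minimal worked example using the Ponemon figure of roughly $5,000 per minute; the outage durations are arbitrary.

```python
# Worked example of the downtime arithmetic in surprise No. 3,
# using the Ponemon Institute average of roughly $5,000 per minute.
COST_PER_MINUTE = 5_000  # dollars; Ponemon study average

def outage_cost(minutes: float) -> float:
    """Estimated cost of an unplanned outage of the given length."""
    return minutes * COST_PER_MINUTE

for minutes in (1, 10, 60, 240):
    print(f"{minutes:>4}-minute outage: ${outage_cost(minutes):,.0f}")
# The 60-minute row reproduces the $300,000-per-hour figure above.
```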
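The modest savings in surprise No. 5 also fall out of simple numbers. In the sketch below, only the four-to-five-times energy multiplier comes from the article; the 100-server fleet, the 5:1 consolidation ratio, and the 500 W legacy draw are invented for illustration.

```python
# Hypothetical consolidation scenario for surprise No. 5. Only the
# "four or five times as much energy" multiplier comes from the
# article; fleet size, ratio, and wattage are assumptions.
old_servers = 100
old_watts_each = 500                 # assumed legacy server draw

new_servers = old_servers // 5       # 5:1 consolidation
new_watts_each = old_watts_each * 4  # new blades draw ~4x per box

old_total_kw = old_servers * old_watts_each / 1000  # 50 kW
new_total_kw = new_servers * new_watts_each / 1000  # 40 kW

print(f"Server count cut by {1 - new_servers / old_servers:.0%}")       # 80%
print(f"Power draw cut by only {1 - new_total_kw / old_total_kw:.0%}")  # 20%
```

Under these assumed numbers the fleet shrinks by 80% while the power draw falls by only 20%, which is why the end-of-project savings can feel underwhelming.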
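Finally, surprise No. 10 is about capacity that tracks the load. Here is one minimal sketch of the idea, assuming a hypothetical 50 kW UPS module, a 20% headroom rule, and an invented daily load profile (none of these figures come from the article).

```python
import math

# Sketch of surprise No. 10: modular capacity that follows a swinging
# load instead of running one full-size system at low utilization.
# Module size, headroom, and the load profile are all assumptions.
MODULE_KW = 50  # capacity of one hypothetical UPS module

def modules_needed(load_kw: float, headroom: float = 1.2) -> int:
    """Smallest module count that covers the load plus headroom."""
    return max(1, math.ceil(load_kw * headroom / MODULE_KW))

# A day with a business-hours peak and an overnight valley.
for hour, load_kw in [(3, 40), (10, 180), (14, 210), (22, 60)]:
    n = modules_needed(load_kw)
    utilization = load_kw / (n * MODULE_KW)
    print(f"{hour:02d}:00  {load_kw:>3} kW -> {n} module(s), "
          f"{utilization:.0%} loaded each")
```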
