
Data center

From Wikipedia, the free encyclopedia


ARSAT data center (2014)

A data center (American English)[1] or data centre (British English)[2][note 1] is a building, a dedicated space within a building, or a group of buildings[3] used to house computer systems and associated components, such as telecommunications and storage systems.[4][5]
Since IT operations are crucial for business continuity, a data center generally includes redundant or backup components and infrastructure for power supply, data communication connections, environmental controls (e.g., air conditioning, fire suppression), and various security devices. A large data center is an industrial-scale operation using as much electricity as a small town.[6]
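To see why redundant components matter, consider a rough availability model (a minimal sketch in Python; the 99% per-component availability is an illustrative assumption, not a figure from this article):

```python
# Availability of one component vs. two redundant components in parallel.
# Assumes independent failures and an illustrative 99% per-component availability.

def parallel_availability(component_availability: float, copies: int) -> float:
    """Probability that at least one of `copies` independent components is up."""
    return 1.0 - (1.0 - component_availability) ** copies

single = 0.99                                # one power path
redundant = parallel_availability(0.99, 2)   # two independent paths

HOURS_PER_YEAR = 24 * 365
print(f"single path downtime:    {(1 - single) * HOURS_PER_YEAR:.1f} h/year")   # ~87.6 h
print(f"redundant path downtime: {(1 - redundant) * HOURS_PER_YEAR:.1f} h/year") # ~0.9 h
```

Duplicating a component turns its failure probability into the product of two independent failure probabilities, which is why redundancy features so heavily in data center design.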

History

NASA mission control computer room c. 1962

Data centers have their roots in the huge computer rooms of the 1940s, typified by ENIAC, one of the earliest examples of a data center.[7][note 2] Early computer systems,
complex to operate and maintain, required a special environment in which to operate.
Many cables were necessary to connect all the components, and methods to
accommodate and organize these were devised such as standard racks to mount
equipment, raised floors, and cable trays (installed overhead or under the elevated
floor). A single mainframe required a great deal of power and had to be cooled to avoid
overheating. Security became important – computers were expensive, and were often
used for military purposes.[7][note 3] Basic design guidelines for controlling access to the
computer room were therefore devised.
During the boom of the microcomputer industry, and especially during the 1980s, users
started to deploy computers everywhere, in many cases with little or no care about
operating requirements. However, as information technology (IT) operations started to
grow in complexity, organizations grew aware of the need to control IT resources. The
availability of inexpensive networking equipment, coupled with new standards for network structured cabling, made it possible to use a hierarchical design that put the
servers in a specific room inside the company. The use of the term "data center", as
applied to specially designed computer rooms, started to gain popular recognition about
this time.[7][note 4]
The boom of data centers came during the dot-com bubble of 1997–2000.[8][note 5] Companies needed fast Internet connectivity and non-stop operation to deploy
systems and to establish a presence on the Internet. Installing such equipment was not
viable for many smaller companies. Many companies started building very large
facilities, called Internet data centers (IDCs),[9] which provide enhanced capabilities,
such as crossover backup: "If a Bell Atlantic line is cut, we can transfer them to ... to
minimize the time of outage."[9]
The term cloud data centers (CDCs) has been used.[10] Data centers are typically expensive to build and maintain.[8][note 6] Increasingly, the distinction between these terms has almost disappeared and they are being integrated into the term "data center".[11]

Requirements for modern data centers

Racks of telecommunications equipment in part of a data center

Modernization and data center transformation enhance performance and energy efficiency.[12]
Information security is also a concern, and for this reason, a data center has to offer a
secure environment that minimizes the chances of a security breach. A data center
must, therefore, keep high standards for assuring the integrity and functionality of its
hosted computer environment.
Industry research company International Data Corporation (IDC) puts the average age
of a data center at nine years old. Gartner, another research company, says data centers older than seven years are obsolete.[13] The growth in data (163 zettabytes by
2025[14]) is one factor driving the need for data centers to modernize.
Focus on modernization is not new: concern about obsolete equipment was voiced in 2007,[15] and in 2011 Uptime Institute was concerned about the age of the equipment
therein.[note 7] By 2018, concern had shifted once again, this time to the age of the staff: "data center staff are aging faster than the equipment."
Meeting standards for data centers
The Telecommunications Industry Association's Telecommunications Infrastructure
Standard for Data Centers specifies the minimum requirements for telecommunications
infrastructure of data centers and computer rooms, including single-tenant enterprise data centers and multi-tenant Internet hosting data centers. The topology proposed in
this document is intended to be applicable to any size data center.
Telcordia GR-3160, NEBS Requirements for Telecommunications Data Center
Equipment and Spaces, provides guidelines for data center spaces within
telecommunications networks, and environmental requirements for the equipment
intended for installation in those spaces. These criteria were developed jointly by
Telcordia and industry representatives. They may be applied to data center spaces
housing data processing or Information Technology (IT) equipment. The equipment may
be used to:

 Operate and manage a carrier's telecommunication network
 Provide data center based applications directly to the carrier's customers
 Provide hosted applications for a third party to provide services to their customers
 Provide a combination of these and similar data center applications
Data center transformation
Data center transformation takes a step-by-step approach through integrated projects
carried out over time. This differs from a traditional method of data center upgrades that
takes a serial and siloed approach.[20] The typical projects within a data center transformation initiative include standardization/consolidation, virtualization, automation, and security.

 Standardization/consolidation: Reducing the number of data centers and avoiding server sprawl (both physical and virtual) often includes replacing aging data center equipment, and is aided by standardization.
 Virtualization: Lowers capital and operational expenses and reduces energy consumption. Virtualized desktops can be hosted in data centers and rented out on a subscription basis. Investment bank Lazard Capital Markets estimated in 2008 that 48 percent of enterprise operations would be virtualized by 2012. Gartner views virtualization as a catalyst for modernization.
 Automating: Automating tasks such as provisioning, configuration, patching, release management, and compliance is needed, not just when facing fewer skilled IT workers (see the sketch after this list).
 Securing: Protection of virtual systems is integrated with existing security of physical
infrastructures.[31]
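As a rough illustration of the automation item above, the following sketch drives routine patching from a host inventory (a minimal sketch; the host names and the `apply_patches` helper are hypothetical, and key-based SSH with a Debian-style package manager is an illustrative assumption):

```python
import subprocess

# Hypothetical inventory of servers due for patching.
INVENTORY = ["app-01.example.net", "app-02.example.net", "db-01.example.net"]

def apply_patches(host: str) -> bool:
    """Run the OS package update on a remote host over SSH.

    Assumes key-based SSH access and apt-based hosts; both are
    illustrative assumptions, not part of any standard cited here.
    """
    result = subprocess.run(
        ["ssh", host, "sudo apt-get update && sudo apt-get -y upgrade"],
        capture_output=True, text=True,
    )
    return result.returncode == 0

def main() -> None:
    # Patch each host in turn and report failures for human follow-up.
    failed = [h for h in INVENTORY if not apply_patches(h)]
    if failed:
        print("patching failed on:", ", ".join(failed))
    else:
        print("all hosts patched")

if __name__ == "__main__":
    main()
```

In practice such tasks are usually delegated to configuration-management tooling, but the inventory-plus-idempotent-task structure is the same.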
Machine room
The term "Machine Room" is at times used to refer to the large room within a Data
Center where the actual Central Processing Unit is located; this may be separate from
where high-speed printers are located. Air conditioning is most important in the machine
room.
Aside from air conditioning, there must be monitoring equipment, one type of which is to detect water prior to flood-level situations. One company, Water Alert, has had share-of-mind in this niche for several decades. As of 2018, it has two competing manufacturers (Invetex, Hydro-Temp) and three competing distributors (Longden, Northeast Flooring,[note 8] Slayton[note 9]).
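A minimal polling sketch of such under-floor water detection follows (the sensor map and `read_sensor` function are hypothetical placeholders; real products such as those named above have their own interfaces):

```python
import random
import time

# Hypothetical map of under-floor leak sensors to their locations.
SENSORS = {"sensor-01": "under CRAC unit 1", "sensor-02": "cable tray, row B"}

def read_sensor(sensor_id: str) -> bool:
    """Return True if the sensor reports water contact.

    Stand-in for a vendor-specific interface (dry contact, SNMP, serial);
    here it simulates a dry floor with a rare random alarm for demo purposes.
    """
    return random.random() < 0.001

def poll(interval_seconds: int = 30, cycles: int = 10) -> None:
    # Check every sensor on a fixed interval and alarm on first contact,
    # well before water reaches flood level.
    for _ in range(cycles):
        for sensor_id, location in SENSORS.items():
            if read_sensor(sensor_id):
                print(f"ALARM: water detected {location} ({sensor_id})")
        time.sleep(interval_seconds)

if __name__ == "__main__":
    poll()
```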
Raised floor

Perforated cooling floor tile.

Main article: Raised floor


A raised floor standards guide named GR-2930 was developed by Telcordia
Technologies, a subsidiary of Ericsson.[38]
Although the first raised floor computer room was made by IBM in 1956,[39] and raised floors have "been around since the 1960s",[40] it was in the 1970s that they became more common in computer centers, allowing cool air to circulate more efficiently.[41][42]
The first purpose of the raised floor was to allow access for wiring.[39]
Lights out
The "lights-out"[43] data center, also known as a darkened or a dark data center, is a data
center that, ideally, has all but eliminated the need for direct access by personnel,
except under extraordinary circumstances. Because of the lack of need for staff to enter
the data center, it can be operated without lighting. All of the devices are accessed and
managed by remote systems, with automation programs used to perform unattended
operations. In addition to the energy savings, reduction in staffing costs and the ability
to locate the site further from population centers, implementing a lights-out data center
reduces the threat of malicious attacks upon the infrastructure.[44][45]
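As an illustration of the remote, unattended operation described above, here is a minimal health-polling sketch (the host list and ports are illustrative assumptions; production lights-out sites rely on dedicated out-of-band management rather than this simplistic TCP check):

```python
import socket

# Hypothetical (host, port) management endpoints, e.g. BMC web interfaces.
ENDPOINTS = [("rack1-bmc.example.net", 443), ("rack2-bmc.example.net", 443)]

def is_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Crude liveness check: can we open a TCP connection to the endpoint?"""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def sweep() -> None:
    # One monitoring pass; a real lights-out system would run this on a
    # schedule and trigger automated remediation instead of just printing.
    for host, port in ENDPOINTS:
        status = "up" if is_reachable(host, port) else "DOWN"
        print(f"{host}:{port} {status}")

if __name__ == "__main__":
    sweep()
```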

Data center levels and tiers


The two organizations in the United States that publish data center standards are
the Telecommunications Industry Association (TIA) and the Uptime Institute.
International standards EN 50600 and ISO/IEC 22237 Information technology — Data centre facilities and infrastructures

 Class 1 single path solution
 Class 2 single path with redundancy solution
 Class 3 multiple paths providing a concurrent repair/operate solution
 Class 4 multiple paths providing a fault tolerant solution (except during maintenance)
Telecommunications Industry Association
Main article: TIA-942

The Telecommunications Industry Association's TIA-942 standard for data centers, published in 2005 and updated four times since, defined four infrastructure levels.[46]

 Level 1 - basically a server room, following basic guidelines
 Level 4 - designed to host the most mission-critical computer systems, with fully redundant subsystems and the ability to operate continuously for an indefinite period during primary power outages
Uptime Institute – Data center Tier Classification Standard
Four tiers are defined by the Uptime Institute standard (a rough downtime comparison is sketched after this list):

 Tier I: is described as BASIC CAPACITY and must include a UPS
 Tier II: is described as REDUNDANT CAPACITY and adds redundant power and cooling
 Tier III: is described as CONCURRENTLY MAINTAINABLE and ensures that ANY component can be taken out of service without affecting production
 Tier IV: is described as FAULT TOLERANT, allowing any production capacity to be insulated from ANY type of failure
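The practical difference between the tiers is tolerated downtime. The availability percentages below are the figures commonly associated with the Uptime tiers (an illustrative assumption; they do not appear in this article):

```python
# Commonly cited availability targets per Uptime tier (illustrative assumption).
TIER_AVAILABILITY = {
    "Tier I": 0.99671,
    "Tier II": 0.99741,
    "Tier III": 0.99982,
    "Tier IV": 0.99995,
}

HOURS_PER_YEAR = 24 * 365

for tier, availability in TIER_AVAILABILITY.items():
    downtime_hours = (1.0 - availability) * HOURS_PER_YEAR
    print(f"{tier}: {availability:.3%} uptime ~ {downtime_hours:.1f} h downtime/year")
```

Run as-is, this prints roughly 28.8 hours of downtime per year for Tier I down to about half an hour for Tier IV, which is the gap the redundancy and fault-tolerance requirements above are meant to close.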

Data center design


The field of data center design has been growing for decades in various directions,
including new construction big and small along with the creative re-use of existing
facilities, like abandoned retail space, old salt mines and war-era bunkers.

 a 65-story data center has already been proposed[47]
 the number of data centers as of 2016 had grown beyond 3 million USA-wide, and more than triple that number worldwide[8]
Local building codes may govern the minimum ceiling heights and other parameters.
Some of the considerations in the design of data centers are:
A typical server rack, commonly seen in colocation

 size - one room of a building, one or more floors, or an entire building, which can hold 1,000 or more servers[48]
 space, power, cooling, and costs in the data center.[49]

CRAC Air Handler

 Mechanical engineering infrastructure - heating, ventilation and air conditioning (HVAC); humidification and dehumidification equipment; pressurization.[50] A cooling-airflow sizing sketch follows this list.
 Electrical engineering infrastructure design - utility service planning; distribution, switching and bypass from power sources; uninterruptible power source (UPS) systems; and more.[50][51]
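As an illustration of the HVAC item above: nearly all electrical power drawn by IT equipment leaves as heat, and the required cooling airflow follows from the sensible-heat relation Q = m_dot * cp * delta_T (a minimal sketch; the 100 kW load and 12 K air temperature rise are illustrative assumptions):

```python
# Size the cooling airflow for an IT heat load using the sensible-heat relation
#   Q = m_dot * cp * delta_T  =>  m_dot = Q / (cp * delta_T)

AIR_CP = 1005.0        # specific heat of air, J/(kg*K)
AIR_DENSITY = 1.2      # kg/m^3 at roughly room conditions

def required_airflow_m3_per_s(it_load_watts: float, delta_t_kelvin: float) -> float:
    """Volumetric airflow needed to carry away the IT heat load."""
    mass_flow = it_load_watts / (AIR_CP * delta_t_kelvin)  # kg/s
    return mass_flow / AIR_DENSITY                         # m^3/s

flow = required_airflow_m3_per_s(100_000, 12.0)
print(f"{flow:.1f} m^3/s ({flow * 2118.88:.0f} CFM)")  # ~6.9 m^3/s, ~14,600 CFM
```

The same arithmetic, run in reverse, is how a given CRAC unit's rated airflow bounds the IT load a room can support at a chosen supply/return temperature difference.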
