What Is a Data Center?
At its simplest, a data center is a physical room, building, or facility that houses the IT infrastructure an organization uses to build, run, and deliver its critical applications and services, and to store and manage the data associated with them. A data center's design is based on a network of computing and storage resources that enable the delivery of shared applications and data; key components include routers, switches, firewalls, storage systems, servers, and application-delivery controllers.
History of data centers
Data centers date back to the 1940s. The US military's Electronic Numerical Integrator and Computer (ENIAC), completed in 1945 at the University of Pennsylvania, is an early example: its massive machines required dedicated space to house and operate.
Over the years, computers became smaller and required less physical space. In the 1990s, microcomputers arrived, drastically reducing the footprint needed for IT operations. The microcomputers that began filling old mainframe computer rooms became known as "servers," and the rooms themselves became known as "data centers."
The advent of cloud computing in the early 2000s significantly disrupted the traditional data center landscape. Cloud services allow organizations to access computing resources on-demand, over the internet, with pay-per-use pricing—enabling the flexibility to scale up or down as needed.
In 2006, Google launched the first hyperscale data center in The Dalles, Oregon. This hyperscale facility currently occupies 1.3 million square feet of space and employs a staff of approximately 200 data center operators.[1]
A study from McKinsey & Company projects the data center industry to grow at 10 percent a year through 2030, with global spending on the construction of new facilities reaching USD 49 billion.[2]