July 05, 2016
A data center, filled with fiber optic cables (such as LC to LC fiber cables), optical modules, and many other optical products, is the brain of a company and the place where its most critical processes run. Today we are going to talk about what data centers contain and how they are operated.
Although we have mentioned above that a data center is made up of many optics, there is another important component: the large-scale computer systems built around them. Computers, of course, require electricity, as well as protection from theft and from accidental or intentional manipulation of the hardware. Put simply, one has to safeguard data centers against external influences and provide them with sufficient cooling. After all, there is a lot of powerful hardware sitting in one place.
In addition to these "hard" factors, one must also take into consideration organizational measures, such as periodic backups that ensure operability. As a rule, the more extensive and critical the hardware and software become, the more time and effort are required to provide optimal protection.
For that reason, a data center preferably consists of a well-constructed, sturdy building that houses servers, storage devices, cables (such as OM3 patch cables), and a connection to the Internet. In addition, the center also has a large amount of equipment for supplying power and cooling, and often automatic fire extinguishing systems. Next, let's look at how a data center works in terms of these two aspects: power supply and cooling.
The data center is connected to two separate grid sectors operated by the local utility company. If one sector fails, the second ensures that power is still supplied.
In addition, the data center has 13 diesel generators, which are housed in a separate building. Together, they can produce a total of 29 megawatts, an output sufficient to cover the data center's electricity demand in an emergency. The diesel motors are configured for continuous operation and are kept in a preheated state so that they can be started up quickly in the event of an incident. An outage in just one of the external grid sectors is enough to start the generators automatically.
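As a quick sanity check on these figures, here is a minimal sketch of the per-generator output and the capacity left when a unit fails. It assumes all 13 units are identical, which the article does not actually state:

```python
# Rough sketch of the emergency-power arithmetic above. The per-unit
# output is derived from the article's totals; equal sizing is assumed.

NUM_GENERATORS = 13
TOTAL_OUTPUT_MW = 29.0
per_unit_mw = TOTAL_OUTPUT_MW / NUM_GENERATORS  # roughly 2.23 MW each

def capacity_after_failures(failed_units: int) -> float:
    """Remaining generator capacity (MW) after some units fail."""
    return (NUM_GENERATORS - failed_units) * per_unit_mw

print(f"Per unit: {per_unit_mw:.2f} MW")
print(f"After 1 failure: {capacity_after_failures(1):.1f} MW")
```

Even with one generator out of service, roughly 26.8 MW would remain, which is the kind of margin that makes diesel backup credible as an emergency supply.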
Both the local utility company and the diesel generators deliver electricity with a voltage of 20 kilovolts (kV), which is then transformed in the data center to 220 or 380 volts.
Within the data center, block batteries ensure that all running applications can continue for 15 minutes. This backup bridges the gap between a total utility blackout and the moment the diesel generators take over.
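The battery sizing behind this is simple energy arithmetic: the batteries must carry the full load for the whole bridging interval. A minimal sketch, using a hypothetical 5 MW load since the article does not give the data center's actual draw:

```python
# Back-of-the-envelope UPS battery sizing for the 15-minute bridge
# described above. The 5 MW load is an illustrative assumption.

BRIDGE_MINUTES = 15

def bridge_energy_mwh(load_mw: float, minutes: float = BRIDGE_MINUTES) -> float:
    """Energy (MWh) the batteries must supply to bridge the gap."""
    return load_mw * minutes / 60.0

print(bridge_energy_mwh(5.0))  # 5 MW for 15 min -> 1.25 MWh
```

In practice one would also derate for battery aging and inverter losses, but the linear load-times-duration estimate is the starting point.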
The uninterruptible power supply (UPS) also ensures that the quality remains constant. It compensates for voltage and frequency fluctuations and thereby effectively protects sensitive computer electronic components and systems.
A redundantly designed power supply system is another feature of the data center. This enables one to perform repairs on one network, for example, without having to turn off servers, databases, or electrical equipment.
Several servers or storage units have multiple, redundant power supply units, which transform the supply voltage from the two grid sectors to the operating voltage. This ensures that a failure of one or two power supply units does not cause any problems.
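The value of those redundant power supply units can be illustrated with a toy reliability model: if each unit fails independently with some probability over a given period, the chance that all of a server's units fail together shrinks geometrically. The failure probabilities below are made-up examples, not vendor figures:

```python
# Toy illustration of PSU redundancy, assuming independent failures.

def all_fail_probability(p_single: float, num_units: int) -> float:
    """Probability that every redundant PSU fails, assuming independence."""
    return p_single ** num_units

# With a 1% per-unit failure chance, two PSUs fail together only
# about 0.01% of the time.
print(all_fail_probability(0.01, 2))
```

Real failures are not perfectly independent (a common fault can take out both feeds), which is exactly why the units are fed from two separate grid sectors.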
As we know, electronic components generate heat during operation. To keep the data center running smoothly, cooling is essential, and because of the concentrated computing power, the cost of doing so is considerable.
As a result, servers are installed in racks, which basically resemble specially standardized shelves. They are laid out so that two rows of racks face each other, thereby creating an aisle from which the front side of the server is accessible. The aisles are covered above and closed off at the ends by doors. Cool air set to a temperature of 24 to 26°C is blown in through holes in the floor, flows through the racks, and dissipates the heat emitted by the servers.
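The underlying physics of this cold-aisle arrangement is a simple heat balance: the airflow through a rack must carry away its heat at a given air temperature rise. A minimal sketch, where the 10 kW rack load and 10 K rise are illustrative assumptions rather than figures from the article:

```python
# Cold-aisle heat balance: volumetric airflow needed to remove a rack's
# heat at a given air temperature rise. Rack load and rise are assumed.

AIR_CP = 1005.0    # specific heat of air, J/(kg*K)
AIR_DENSITY = 1.2  # kg/m^3 near room temperature

def airflow_m3_per_s(heat_w: float, delta_t_k: float) -> float:
    """Volumetric airflow (m^3/s) to remove heat_w at a delta_t_k rise."""
    mass_flow = heat_w / (AIR_CP * delta_t_k)  # kg/s
    return mass_flow / AIR_DENSITY

# A 10 kW rack with a 10 K air temperature rise needs ~0.83 m^3/s.
print(f"{airflow_m3_per_s(10_000, 10):.2f} m^3/s")
```

This is why aisle containment matters: any cold air that bypasses the racks is airflow (and fan energy) spent without removing heat.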
At higher outside temperatures, the air-conditioning systems are cooled with water, made possible by six turbo-cooling units. They are not all used to cool the data center, given that some are used as reserve units. Should a cooling system fail, the time until the backup unit is operational must be covered. To that end, 300,000 liters of ice-cold water (4°C) are available to absorb the heat from the air-conditioning systems during this period.
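How long such a water reserve lasts follows from its heat capacity. A rough sketch, where the allowable warm-up (4°C to about 18°C) and the 3 MW heat load are illustrative assumptions, not figures from the article:

```python
# Rough estimate of how long 300,000 L of 4 C water can absorb heat
# while a backup chiller spins up. Warm-up limit and load are assumed.

WATER_CP = 4186.0         # J/(kg*K)
RESERVE_LITERS = 300_000  # about 300,000 kg of water

def buffer_minutes(heat_load_w: float, warmup_k: float) -> float:
    """Minutes the water reserve can absorb heat_load_w while warming."""
    energy_j = RESERVE_LITERS * WATER_CP * warmup_k
    return energy_j / heat_load_w / 60.0

# 3 MW load, water warming by 14 K -> roughly 98 minutes.
print(f"{buffer_minutes(3e6, 14.0):.0f} min")
```

Under these assumptions the reserve covers well over an hour, comfortably more than the minutes needed to bring a standby turbo-cooling unit online.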
To top it off, the turbo-cooling units also have to dissipate heat. There are 18 heat exchangers on the data center’s roof for this purpose, which release hot air into the environment.
If the outside temperature is above 26°C, the heat exchangers are sprinkled with water to make heat dissipation more effective through evaporative cooling. The large amounts of water consumed in the summer are covered by waterworks allocated to the data center. The municipal water supply system provides a reserve supply in this case and acts as a failsafe.
After reading this article, we hope you have an overview of the data center. In any case, the data center is so important to an enterprise that it shapes how strong an offering can be provided to customers. Some day, the data center may even become synonymous with the network operations center (NOC), a restricted-access area containing automated systems that constantly monitor server activity, web traffic, and network performance. We are really looking forward to it!
Posted by: fernxu123 at 03:00 AM