The fence is set back at least 15 m from the building (TIA-942 Tier 3 requires a minimum of 9.8 m).
Separate driveways to the site and separate parking lots for staff and visitors.
The entrance gate is controlled from the security station.
The building is shielded against electromagnetic interference (verified by measurement).
External walls of the building are made of sandwich panels and internal walls are made of bricks.
A 50 cm raised floor is installed in the server rooms.
The raised floor withstands a static load of 4 t/m².
Client access is controlled at the gate and at the entrance door.
Separate rooms for servers, network equipment, and the distribution board.
Two new transformers are installed on site. Four lines run from the transformers to the building and two lines run from the transformers to the city grid.
Switchboards are duplicated in a 2N configuration and manufactured by Schneider Electric.
Switchboard operation is monitored remotely. If utility power is interrupted, four Caterpillar diesel generators (520 kW each) take over. On-site fuel is sufficient for 48 hours of continuous operation.
Two APC Symmetra PX UPS units. Each UPS is internally redundant at the N+1 level; with two units installed, 2(N+1) redundancy is achieved. These are best-in-class UPS units: 96% efficiency, modular expandability, easy maintenance, and high reliability. Batteries can supply power for 10 minutes with the Data Centre at full load.
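A quick sanity check of the backup-power figures above. The arithmetic uses only numbers stated in this document (4 × 520 kW generators, 48 hours of fuel):

```python
# Backup-power figures from the specification above.
GEN_UNITS = 4
GEN_KW_EACH = 520            # rating of each Caterpillar diesel generator
FUEL_HOURS = 48              # on-site fuel autonomy
UPS_RIDE_THROUGH_MIN = 10    # battery autonomy at full load

total_gen_kw = GEN_UNITS * GEN_KW_EACH            # aggregate generator capacity
fuel_energy_kwh = total_gen_kw * FUEL_HOURS       # energy deliverable on stored fuel

print(total_gen_kw)       # 2080 (kW of generator capacity)
print(fuel_energy_kwh)    # 99840 (kWh before refuelling)
```

The 10-minute UPS ride-through exists to bridge the gap between a utility outage and the generators reaching stable output, which takes well under a minute for standby diesel sets.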
If needed, one to four power distribution units (PDUs) with 23 sockets each can be installed in a server cabinet. With at least two PDUs, they can be fed from separate UPS units. All sockets can be switched remotely, and energy consumption (kWh) can be metered per socket.
APC ACRC502 in-row cooling units in a 2N configuration. The solution ensures stable cooling over the entire height of the server cabinet and is easily extended as the power draw (kW) of each cabinet grows. Piping circuits are connected to separate chillers.
The chillers use turbo free-cooling technology and achieve a 1:10 efficiency coefficient (standard chillers achieve about 1:3): for every 1 kW of electricity consumed, they deliver 10 kW of cooling.
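The stated efficiency coefficient is the chiller's coefficient of performance (COP): cooling delivered per unit of electrical input. A minimal illustration of the 1:10 versus 1:3 figures:

```python
def cooling_output_kw(electrical_kw: float, cop: float) -> float:
    """Cooling delivered for a given electrical input at a given
    coefficient of performance (the 1:x ratio in the text)."""
    return electrical_kw * cop

turbo_freecooling = cooling_output_kw(1.0, 10)  # 10.0 kW of cooling per 1 kW in
standard_chiller = cooling_output_kw(1.0, 3)    # 3.0 kW of cooling per 1 kW in
```

In other words, at the same cooling load a turbo free-cooling chiller draws roughly a third of the electricity a standard chiller would.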
The DCIM system continuously monitors the entire cooling system. It not only gives a detailed picture of the processes taking place but also provides smart device management, ensures continuous operation, and keeps power consumption low.
PUE of cooling systems is 1.12!
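PUE (Power Usage Effectiveness) is total facility power divided by the power delivered to IT equipment. A sketch of the arithmetic behind a 1.12 figure; the 1000 kW IT load is an illustrative assumption, not a figure from this document:

```python
def pue(total_facility_kw: float, it_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT power."""
    return total_facility_kw / it_kw

# Illustrative loads (assumed, not from the specification):
it_load_kw = 1000.0
overhead_kw = 120.0   # cooling and other non-IT consumption

print(round(pue(it_load_kw + overhead_kw, it_load_kw), 2))  # 1.12
```

At PUE 1.12, each kW of IT load adds only 0.12 kW of overhead; a facility at the common industry figure of around 1.6 would add 0.6 kW.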
IT equipment is supplied with air at 21–25 °C and 45–55% relative humidity. Temperature variation never exceeds 5 °C per hour.
For uninterrupted connectivity, two fibre-optic entry points run into the Data Centre through different walls and connect to Baltneta's communications backbone network.
The entry points terminate in separate communications facilities.
The data centre uses Juniper network switching equipment with Virtual Chassis technology, which combines up to 8 physical switches into a single virtual switch with one management plane and the aggregate capacity of all members. Core switches provide 40 Gbps uplinks and 10 Gbps distribution ports; distribution switches provide 10 Gbps uplinks and 1 Gbps distribution ports.
The network routing is performed by Brocade routers.
For direct connections to customers' offices, more than 200 fibres (expandable to 600) run into the Data Centre along different routes.
Automatic fire extinguishing with inert gas (a 1:1 nitrogen–argon mixture).
2N cylinder redundancy (a fire can be extinguished even while empty cylinders are being refilled).
Reserve cylinders are always connected to the system rather than stored.
Inert gas does not damage IT equipment.
An overpressure relief system is installed.
Server and infrastructure facilities are built on the building-within-a-building approach.
The area and facilities are monitored by 36 video cameras.
The entire site and all buildings are the property of Baltneta – no tenants or other activities.
An armed security guard is on duty around the clock.
An additional security team arrives in less than 10 minutes.
Access is authorized with a card and a PIN code.
Only authorized persons are allowed to enter the area.
Separate parking lots for employees and visitors.
The Data Centre is set back 15 metres from public areas.
Maintenance areas (electricity, gas, communications) are isolated from collocation areas.
Server cabinet opening detectors.
A register of delivered and removed equipment is kept.
42U APC AR3150 server cabinets (750 × 1070 mm).
Cabinet doors are perforated and lockable.
All cabinets are closed on the sides.
Cabinets are arranged in rows forming a hot aisle.
The hot aisle is enclosed with ceiling panels and doors.
In total, there are 96 server cabinets and 12 switching-equipment cabinets.
Up to 4 PDUs may be installed in one cabinet.
Optionally, a cabinet can be equipped with door-opening detectors, web cameras, impact and vibration sensors, monitors, and cable-management combs.
The DCIM (Data Centre Infrastructure Management) system provides data centre monitoring, data collection, administration, and capacity management.
Proactive monitoring of the Data Centre helps avoid unplanned downtime, supports smooth expansion, and tracks changes in climate and power supply.
DCIM also lets clients log in with their assigned credentials to view video from the camera installed in their server cabinet, as well as the power and cooling supplied to their servers, power consumption, and more.
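A minimal sketch of the kind of per-cabinet check a client-facing DCIM view performs. The field and function names are assumptions for illustration, not the actual DCIM API; the alert thresholds are the environmental limits stated above (21–25 °C, 45–55% RH):

```python
from dataclasses import dataclass

@dataclass
class CabinetReading:
    """One snapshot of a client's cabinet (illustrative fields only)."""
    temperature_c: float
    humidity_pct: float
    power_kw: float

def out_of_band(r: CabinetReading) -> list[str]:
    """Return alert messages for readings outside the published bands."""
    alerts = []
    if not 21 <= r.temperature_c <= 25:
        alerts.append("temperature outside 21-25 C")
    if not 45 <= r.humidity_pct <= 55:
        alerts.append("humidity outside 45-55% RH")
    return alerts

# A reading within both bands raises no alerts:
print(out_of_band(CabinetReading(23.0, 50.0, 3.2)))  # []
```

In a real deployment the readings would be polled from the DCIM system over the client login described above, and alerts would feed the proactive-monitoring workflow.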