Colocation centre
A colocation centre (also spelled co-location, or shortened to colo), also known as a "carrier hotel", is a type of data centre where equipment, space, and bandwidth are available for rental to retail customers.
Configuration
Many colocation providers sell to a wide range of customers, from large enterprises to small companies.[1] Typically, the customer owns the information technology (IT) equipment and the facility provides power and cooling. Customers retain control over the design and usage of their equipment, but daily management of the data centre and facility is overseen by the multi-tenant colocation provider.[2]
- Cabinets – A cabinet is a locking unit that holds a server rack. In a multi-tenant data centre, servers within cabinets share raised-floor space with other tenants, in addition to sharing power and cooling infrastructure.[3]
- Cages – A cage is dedicated server space within a traditional raised-floor data centre; it is surrounded by mesh walls and entered through a locking door. Cages share power and cooling infrastructure with other data centre tenants.
- Suites – A suite is a dedicated, private server space within a traditional raised-floor data centre; it is fully enclosed by solid partitions and entered through a locking door. Suites may share power and cooling infrastructure with other data centre tenants, or have these resources provided on a dedicated basis.
- Modules – Data centre modules are purpose-engineered modules and components designed to offer scalable data centre capacity. They typically use standardized components, which makes them easier and cheaper to add, integrate or retrofit into existing data centres.[4] In a colocation environment, a data centre module is a data centre within a data centre, with its own steel walls and security protocol, and its own cooling and power infrastructure. "A number of colocation companies have praised the modular approach to data centers to better match customer demand with physical build outs, and allow customers to buy a data center as a service, paying only for what they consume."[5]
Building features
Buildings with data centres inside them are often easy to recognize by the amount of cooling equipment located outside or on the roof.[6]
Colocation facilities have many other special characteristics:
- Clean-agent gaseous fire suppression systems are sometimes installed to suppress a fire earlier than the fire sprinkler system would. Passive fire protection elements include the installation of fire walls around the space, so that a fire can be restricted to a portion of the facility for a limited time if the active fire protection systems fail or are not installed.
- 23-inch racks for telecommunications equipment
- Cabinets and cages for physical access control over tenants' equipment. Depending on one's needs, a cabinet can house one or more racks.[7]
- Overhead or underfloor cable racks (trays) and fibreguide, with power cables usually run on a separate rack from data cabling
- Air conditioning is used to control the temperature and humidity in the space. ASHRAE recommends temperature and humidity ranges that balance optimal electronic equipment conditions against environmental concerns.[8] The electrical power used by the electronic equipment is converted to heat, which is rejected to the ambient air in the data centre space. Unless the heat is removed, the ambient temperature will rise, resulting in electronic equipment malfunction. By controlling the space air temperature, the server components at the board level are kept within the manufacturer's specified temperature and humidity range. Air conditioning systems help keep equipment space humidity within acceptable parameters by cooling the return air below the dew point. With too much humidity, water may begin to condense on internal components. If the atmosphere is too dry, ancillary humidification systems may add water vapour to the space to avoid static electricity discharge problems, which can damage components.
- Low-impedance electrical ground
- Few, if any, windows
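The humidity control described above hinges on the dew point: cooling-coil or component surfaces colder than the dew point of the surrounding air will condense water. As a minimal illustrative sketch (not taken from the article or from ASHRAE's published envelopes), the dew point can be estimated from dry-bulb temperature and relative humidity with the Magnus approximation:

```python
import math

def dew_point_c(temp_c: float, rh_pct: float) -> float:
    """Estimate dew point (deg C) from dry-bulb temperature (deg C) and
    relative humidity (%) using the Magnus approximation."""
    a, b = 17.625, 243.04  # Alduchov-Eskridge coefficients
    gamma = math.log(rh_pct / 100.0) + a * temp_c / (b + temp_c)
    return b * gamma / (a - gamma)

# Example: return air at 24 deg C and 50% relative humidity.
# Surfaces colder than this dew point will condense water.
print(round(dew_point_c(24.0, 50.0), 1))  # ≈ 12.9
```

At saturation (100% relative humidity) the dew point equals the air temperature, which is a quick sanity check on the formula.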
Colocation data centres are often audited to prove that they attain certain standards and levels of reliability; the most commonly seen are SSAE 16 SOC 1 Type I and Type II (formerly SAS 70 Type I and Type II) and the tier systems of the Uptime Institute and the TIA. For service organizations, SSAE 16 calls for a description of the organization's "system", which is far more detailed and comprehensive than SAS 70's description of "controls".[9] Other data centre compliance standards include the Health Insurance Portability and Accountability Act (HIPAA) audit and the PCI DSS standard.
Power
Colocation facilities generally have backup power systems, such as uninterruptible power supplies and standby generators, so that customer equipment keeps running through utility outages.
Some customers choose to use equipment that is powered directly by 48 VDC (nominal) battery banks. This may provide better energy efficiency, and may reduce the number of parts that can fail, though the reduced voltage greatly increases necessary current, and thus the size (and cost) of power delivery wiring. An alternative to batteries is a motor–generator connected to a flywheel and diesel engine.
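The trade-off between 48 VDC and higher-voltage distribution is straightforward arithmetic: for the same delivered power, current scales inversely with voltage, and conductor size (and cost) scales with current. The cabinet power figure below is a hypothetical example chosen for illustration, not a value from the article:

```python
def supply_current_a(power_w: float, voltage_v: float) -> float:
    """Current (A) a feed must carry to deliver a given power (W)
    at a given supply voltage (V)."""
    return power_w / voltage_v

# Hypothetical 5 kW cabinet: a 48 VDC feed must carry roughly four
# times the current of a 208 VAC feed, so its conductors must be
# correspondingly larger and more expensive.
print(round(supply_current_a(5000, 48)))   # 104
print(round(supply_current_a(5000, 208)))  # 24
```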
Many colocation facilities can provide redundant A and B power feeds to customer equipment, and high-end servers and telecommunications equipment can often have two power supplies installed.
Colocation facilities are sometimes connected to multiple sections of the utility power grid for additional reliability.
Internal connections
Colocation facility owners have differing rules regarding cross-connects between their customers, some of whom may be carriers. These rules may allow customers to run such connections at no charge, or to order them for a monthly fee. They may allow customers to order cross-connects to carriers, but not to other customers. Some colocation centres feature a "meet-me room" where the carriers housed in the centre can exchange traffic with one another.[10]
Most peering points sit in colocation centres, and because of the high concentration of servers inside larger colocation centres, most carriers are interested in bringing direct connections to such buildings. In many cases, a large Internet exchange point will be hosted inside a colocation centre, where customers can connect for peering.[11]
See also
- Carrier-neutral data center
References
- ^ Pashke, Jeff. "Going Open – Software vendors in transition". 451 Research. Archived from the original on 6 December 2016. Retrieved 6 March 2016.
- ^ "Colocation: Managed or unmanaged?". 7L Networks. Retrieved 6 March 2016.
- ^ "Colocation Benefits And How To Get Started". Psychz Networks. Retrieved 18 February 2015.
- ^ DCD Intelligence "Assessing the Cost: Modular versus Traditional Build", October 2013 Archived 7 October 2014 at the Wayback Machine
- ^ Rath, John (October 2011). "DCK Guide to Modular Data Centers: The Modular Market". Data Center Knowledge. Archived from the original on 6 October 2014. Retrieved 1 October 2014.
- ^ Examples can be seen at http://www.datacentermap.com/blog/data-centers-from-the-sky-174.html
- ^ "How Much Space Will I Need in a Colocation Center?". ATI Solutions Inc. Retrieved 20 November 2018.
- ^ "Thermal Guidelines for Data Processing Environments, 3rd Ed. - ASHRAE Store".
- ^ "SSAE 16 Compliance". Colocation America. Archived from the original on 20 October 2020. Retrieved 24 August 2021.
- ^ ISSN 1059-1028. Retrieved 15 September 2021.
- ^ "Learn About Colocation Benefits And How To Get Started". Psychz.Net.
External links
- Build Or Colocate? The ROI Of Your Next Data Center Archived 6 October 2014 at the Wayback Machine
- DCK Guide To Modular Data Centers: The Modular Market Archived 6 October 2014 at the Wayback Machine