Modern computing appears to be undergoing an unstoppable trend towards the cloud. Facilitated by ubiquitous “smart” devices and high-speed data links, both mobile and fixed-line, we are increasingly seeing data processing and storage occur in globally dispersed data centres. These are expanding at a phenomenal rate, in both number and capacity, to cope with demand.
You might not know it, but practically all of us are heavy users of the cloud. Your Gmail, Facebook, Twitter, Google Docs, Office 365, Instagram, Snapchat, Android/Apple apps and so on all rely on a resilient network of systems run by the biggest names in global technology. The lead industry players are Amazon, Microsoft, Apple and Google, but there are many more. These companies operate cloud platforms and Content Delivery Networks (CDNs): inter-networked infrastructure, offered as a service, that other businesses build on to deliver your online experience. Your most recent Netflix show, for instance, may well have been served from the CDN facility closest to your geographic location.
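To make the idea of the “closest facility” concrete, here is a minimal sketch of picking an edge location by great-circle distance to the user. The site names and coordinates are invented for illustration, and real CDNs weigh latency, load and cost rather than pure distance, so treat this as an assumption-laden toy rather than any provider’s actual routing logic.

```python
from math import radians, sin, cos, asin, sqrt

# Hypothetical edge locations: name -> (latitude, longitude)
EDGE_SITES = {
    "dublin": (53.35, -6.26),
    "frankfurt": (50.11, 8.68),
    "virginia": (38.95, -77.45),
    "singapore": (1.35, 103.82),
}

def haversine_km(a, b):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))

def nearest_edge(user_location):
    """Return the edge site geographically closest to the user."""
    return min(EDGE_SITES, key=lambda site: haversine_km(user_location, EDGE_SITES[site]))

print(nearest_edge((51.5, -0.13)))  # a user in London -> 'dublin'
```

In practice providers resolve this with DNS-based or anycast routing rather than an explicit lookup like this, but the geographic intuition is the same.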
There is a multitude of data centre designs, but what they all have in common is a massive need for power: not only to run the servers but also to provide adequate cooling. Keeping computers, networking and other equipment at optimal temperatures is such a large input cost that great attention and resources have been devoted both to incremental efficiencies (Google, for example, have found that running facilities warmer than tradition dictates does not increase failure rates) and to revolutionary change. Innovation in this area is clearly a win-win: costs are reduced, and lower demand benefits the environment through reduced emissions from power generation. Although the likes of Google and Microsoft have done an admirable job of offsetting carbon emissions and supporting renewable energy projects, it is clearly better to eliminate the demand in the first place.
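One common way the industry quantifies this cooling overhead is power usage effectiveness (PUE): total facility energy divided by the energy delivered to the IT equipment itself. The metric is not named in the article, so the sketch below, with made-up figures, is purely illustrative.

```python
def pue(total_facility_kwh, it_equipment_kwh):
    """Power Usage Effectiveness: 1.0 would mean zero cooling/overhead energy."""
    return total_facility_kwh / it_equipment_kwh

# Hypothetical figures for one month of operation
conventional = pue(total_facility_kwh=1_500_000, it_equipment_kwh=1_000_000)  # 1.5
sea_cooled   = pue(total_facility_kwh=1_100_000, it_equipment_kwh=1_000_000)  # 1.1

print(f"Conventional PUE: {conventional:.2f}, sea-cooled PUE: {sea_cooled:.2f}")
```

Every fraction shaved off the PUE is energy that never has to be generated, which is exactly the “eliminate demand in the first place” argument made above.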
As reported in Motherboard (Oberhaus, 2018), Microsoft have deployed the pilot phase of an ingenious project called Natick: a data centre placed under the sea off the coast of Scotland, using the cold surrounding waters for cooling. As an added benefit, the nearby islands already run on 100% renewable energy, from tidal and wind sources, so the module's power requirements are met entirely by renewables.
Credit: Microsoft
As detailed in Motherboard:
Microsoft just sent its first self-sufficient, waterproof data center to the bottom of the ocean floor near the Orkney Islands in Scotland, the company announced on Tuesday. About the size of a shipping container, the tubular data center holds 12 racks loaded with 864 servers and is attached to a large triangular weight that anchors it to the seabed over 100 feet beneath the ocean surface.
In contrast to warehouse-style data centres, moreover, a Natick-style system may well be more resilient to natural disasters such as hurricanes or floods. Other foreseeable advantages are reduced land use for cloud infrastructure and natural proximity to cities, the largest of which generally lie on the coast. Deployment lead times may also be shortened.
The signs are that the project may herald a new way of delivering cloud computing resources. With data centres consuming over 3 percent of the global energy supply and accounting for around 2 percent of carbon emissions (The Independent, 2016), the environmental benefits could be significant indeed.
🌊
References
The Independent, 2016 (Bawden, D.). “Global warming: Data centres to consume three times as much energy in next decade, experts warn”. [Online]. Accessed: 06.08.18.
Motherboard, 2018 (Oberhaus, D.). “Microsoft Just Put a Data Center on the Bottom of the Ocean”. [Online]. Accessed: 06.08.18. https://motherboard.vice.com/en_us/article/pavq99/microsoft-project-natick-submarine-data-center