Mini Data Centers: IT Decentralizes Again

  • Published: June 26, 2018
  • Categories: Cloud, ICT, Enterprise Network

As the growth of the Internet of Things (IoT) pushes more computing to the edge of the network, mini data centers are becoming an important trend that ICT executives should be aware of.

Historically, computing has changed in cycles. Over the last few decades, the technology industry has shifted its computing model from centralized computing (mainframes, for example) to decentralized (client-server and PCs) and back to a centralized, cloud-based model. Now the rise of the Internet of Things, with its billions of connected objects, promises to force a different kind of change on the centralized architecture of today's cloud.

Connected cars and other intelligent devices that create and process real-time analytics promise to overwhelm existing network bandwidth and data storage. Even the most massive data center will not be able to cope with data received from billions of cars, medical devices, sensors and other objects. It will make sense to process data generated by cars in Stuttgart in data centers in Stuttgart. Even with the ultra-high-speed links and private backbones that many cloud providers are building, latency will remain an issue.
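The latency argument comes down to simple physics: propagation delay grows with distance, and no backbone upgrade can beat the speed of light. A back-of-envelope sketch (the fiber speed and distances are illustrative assumptions, not figures from this article):

```python
# Rough lower bound on round-trip propagation delay over optical fiber.
# Assumption: signals in fiber travel at roughly 200,000 km/s
# (about two thirds of the speed of light in vacuum).
FIBER_SPEED_KM_PER_S = 200_000

def round_trip_ms(distance_km: float) -> float:
    """Minimum round-trip propagation delay in milliseconds."""
    return 2 * distance_km / FIBER_SPEED_KM_PER_S * 1000

# A car in Stuttgart reaching a nearby edge data center (~100 km away)
# versus a distant central one (~6,000 km, e.g. across the Atlantic):
print(round_trip_ms(100))    # 1.0 ms
print(round_trip_ms(6000))   # 60.0 ms
```

These are best-case figures before queuing, routing and processing overhead, which is why placing compute near the data source pays off for latency-sensitive workloads.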

Another factor driving this trend is Big Data. ICT executives are just starting to acknowledge that data analytics capable of identifying their company's market and customer opportunities, as well as internal inefficiencies, are the big prize from the next generation of computing. To generate these analytics, organizations currently rely on large and complex clusters of servers. But these clusters tend to suffer processing bottlenecks arising from data pipelining, indexing and extract-transform-load (ETL) processes. Centralized infrastructures work well enough to generate analytics that rely on static or historical data. But the analytics that generate the highest-value, most actionable insight, the kind that provides competitive advantage, will be those that can correlate historical information with current IoT data. Centralized infrastructure is less suitable for this purpose.
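One way to picture this correlation of historical and live data at the edge: compare each incoming IoT reading against a baseline learned centrally, and forward only the deviations. A minimal sketch (all names and values here are hypothetical, not from the article):

```python
# Illustrative only: correlate historical baselines with live IoT readings
# at the edge, so only anomalies need to travel to the central cluster.

# Baseline values assumed to have been learned from historical data centrally.
HISTORICAL_BASELINE = {
    "engine_temp_c": 90.0,
    "tire_pressure_bar": 2.4,
}

def flag_anomalies(reading: dict, tolerance: float = 0.15) -> dict:
    """Return the subset of a live reading that deviates from its baseline
    by more than `tolerance` (as a fraction), i.e., what is worth forwarding."""
    flagged = {}
    for key, value in reading.items():
        baseline = HISTORICAL_BASELINE.get(key)
        if baseline and abs(value - baseline) / baseline > tolerance:
            flagged[key] = value
    return flagged

live = {"engine_temp_c": 112.0, "tire_pressure_bar": 2.35}
print(flag_anomalies(live))  # {'engine_temp_c': 112.0}
```

The design point is that the unremarkable majority of readings never leaves the edge, which is exactly the bandwidth and bottleneck relief the paragraph above describes.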

Some of the major cloud providers, such as Google and Amazon, have built their own networks for maximum throughput. But smaller companies' traffic must compete with Netflix and YouTube, which together consume more than 40 percent of Internet bandwidth (during peak hours, Netflix alone has been reported to account for as much as 70 percent of Internet traffic).

Enter mini data centers, a recent trend designed to address latency, bandwidth and processing-speed challenges. Mini data centers are replicable, standardized data centers located near where the data they process arises. While prefabricated, modular data centers have been around for a while, mini data centers add simplified management along with high levels of security and reliability. Their modularized servers, with built-in environmental monitoring and security features, are factory-tested so they can be up and running at edge locations quickly.

Author: George Nistor, Senior ICT Sales and Business Development, Deutsche Telekom AG