For years, industry pundits have played the Grim Reaper (not the White Walkers in Game of Thrones) when it comes to forecasting the demise of the data center. For example, TechRepublic declared “the data center is toast,” following a Gartner report last year that predicted 80% of enterprises will have shut down their traditional data centers by 2025, compared to just 10% today.

At last year’s Gartner IT Infrastructure, Operations and Cloud Strategies Conference, the industry watcher offered another dark forecast: “by 2022, more than half of enterprise-generated data will be created and processed outside of data centers.” As organizations increasingly adopt “cloud first” strategies, new data center models and distributed hybrid IT solutions will emerge to shape next-gen data center infrastructures.

In a new Gartner report, “The Future of Enterprise Data Centers—What’s Next,” senior research director Henrique Cecci recommends incorporating a combination of on-premises, cloud, edge, colocation and hosting services to better adapt to business needs for resiliency, flexibility and adaptability.

Let’s face it, the best chance for the survival of enterprise data centers is locating workloads and applications where they make the most sense and can deliver the best results for the business. No wonder Gartner, among others, sees a surge in the number of micro data centers.

As edge data centers become more prevalent, the concept of the micro data center makes more sense. Instead of monolithic IT infrastructures, compute will run in series of distributed mini racks with smaller footprints and greater operational efficiency. One of our valued partners, Schneider Electric, is bullish on this concept for a variety of reasons, including the ability to optimize energy utilization.

Recently, Kevin Brown, CTO of Schneider Electric’s Secure Power division, shared the prospect of micro data center energy savings with Larry Dignan at ZDNet. While power consumption is often the wild card in any data center scenario, micro data centers can be more power-efficient. In fact, Brown asserts that micro data centers are complementary to the cloud because they can handle compute for localized data from devices, sensors and customers without transporting it back and forth. Ultimately, this improves energy efficiency while boosting resiliency.
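As a rough illustration of that data-transport argument, here is a minimal Python sketch that aggregates sensor readings locally and ships only a compact summary upstream, rather than streaming every raw sample back to a central data center. The sample counts, payload sizes and function names are made-up assumptions for illustration, not anything from Schneider Electric or ZDNet.

```python
# Toy illustration of the edge-compute argument: process raw sensor samples
# locally and send only a small summary upstream, instead of shipping every
# sample to a central data center. Sample counts and sizes are hypothetical.

import json
import random
import statistics

def read_sensor_samples(n: int = 10_000) -> list[float]:
    """Stand-in for reading a batch of local sensor data (hypothetical source)."""
    return [random.gauss(21.0, 0.5) for _ in range(n)]

def summarize(samples: list[float]) -> dict:
    """Reduce a batch to the handful of values the central site actually needs."""
    return {
        "count": len(samples),
        "mean": round(statistics.mean(samples), 3),
        "max": round(max(samples), 3),
        "min": round(min(samples), 3),
    }

samples = read_sensor_samples()
raw_bytes = len(json.dumps(samples).encode())        # what streaming everything would cost
summary_bytes = len(json.dumps(summarize(samples)).encode())  # what the edge actually sends
print(f"raw payload:     {raw_bytes:,} bytes")
print(f"summary payload: {summary_bytes:,} bytes")
```

The point of the toy numbers is simply that the bytes (and therefore the network and energy) saved scale with how much of the processing stays local.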

Brown also theorizes that at some point the edge will consume more power than centralized data centers. In fact, he cites an example in which a company moved compute offsite only to incur higher power costs because of the network build-out it required. So it’s quite likely that micro and edge data centers will ultimately drive up spending on power components, and potentially overall energy consumption.

To alleviate these escalating capacity pressures and costs, Schneider and VPS have embraced Software Defined Power (SDP) solutions that leverage AI and machine learning to gather data from connected power systems and deliver real-time insights into power usage. Currently, VPS is working with an ecosystem of partners to embed much-needed software intelligence in their power components so that power distribution can be reallocated based on IT workload demands.
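To make that idea concrete, the sketch below shows, in Python, the kind of control loop SDP implies: poll the power draw on each rack, reclaim unused budget from lightly loaded racks, and grant it to racks nearing their cap in priority order. The class names, thresholds and telemetry fields are illustrative assumptions, not VPS’s or Schneider Electric’s actual API.

```python
# Illustrative sketch of a software-defined power rebalancing step.
# All names and thresholds (Rack, budget_watts, headroom) are hypothetical;
# they do not reflect any actual VPS or Schneider Electric interface.

from dataclasses import dataclass

@dataclass
class Rack:
    name: str
    budget_watts: float      # power currently allocated to this rack
    draw_watts: float        # most recent measured draw (from telemetry)
    priority: int            # higher = more critical workload

def rebalance(racks: list[Rack], headroom: float = 0.9) -> None:
    """Shift unused budget from lightly loaded racks to racks nearing their cap."""
    donors = [r for r in racks if r.draw_watts < headroom * r.budget_watts]
    takers = sorted(
        (r for r in racks if r.draw_watts >= headroom * r.budget_watts),
        key=lambda r: r.priority,
        reverse=True,
    )
    spare = sum(r.budget_watts - r.draw_watts / headroom for r in donors)
    for donor in donors:
        donor.budget_watts = donor.draw_watts / headroom   # reclaim the slack
    for taker in takers:
        if spare <= 0:
            break
        grant = min(spare, 0.2 * taker.budget_watts)       # cap each adjustment
        taker.budget_watts += grant
        spare -= grant

# In a real deployment this loop would read telemetry from PDUs/UPSs and push
# new caps back to them; here it just operates on in-memory objects.
racks = [
    Rack("edge-01", budget_watts=5000, draw_watts=4800, priority=3),
    Rack("edge-02", budget_watts=5000, draw_watts=2100, priority=1),
]
rebalance(racks)
for r in racks:
    print(f"{r.name}: budget {r.budget_watts:.0f} W, draw {r.draw_watts:.0f} W")
```

In practice the interesting part is the telemetry and prediction layer feeding a loop like this; the rebalancing itself is simple once the software can see, in real time, what every power component is doing.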

SDP is gaining traction, especially as the debate rages over what data centers of the future will look like. Ultimately, it won’t matter where the compute resides, but how best to support mission-critical applications while optimizing energy utilization. Amid all the speculation, one thing is for sure: The days of overspending on and overprovisioning power capacity should be over.

What’s on the horizon for your future data center?