When I think about what’s going on today regarding the future of data centers, I can’t help but reflect on a classic moment from the 1967 movie “The Graduate,” when young Benjamin Braddock (played by an equally young Dustin Hoffman) is given some unsolicited life/career advice: “Just one word,” our protagonist is told. “Plastics. There’s a great future in plastics. Will you think about it?”

Some half a century later, if you updated that conversation, the savvy counsel could easily be a different single word: Software. There’s a great future in software. Think about it: everyone from Bill Gates to Larry Ellison to Diane Greene on down would heartily agree. It’s an especially good time to be involved in software, particularly as it relates to data centers.

While legacy data centers have traditionally been seen as monolithic structures filled with rack after rack after rack of hardware, that picture is increasingly antiquated and misguided. Cloud computing, and more recently micro data centers, have changed the way the game is played. For those who haven’t yet accepted the changes, now is the time to get on board, before it really is too late. It’s prime time to rethink, and that rethinking touches just about every aspect of the data center.

In an insightful Data Center Frontier article on rethinking redundancy, Rich Miller poses the question: “Is Culture Part of the Problem?” Yes, indeed. For decades, the culture of the data center has been all about keeping the lights on and the uptime on the up and up. Achieving redundancy traditionally has meant overprovisioning everything, and arguably no aspect is more affected than power.

While there’s no doubt that the compute infrastructure of today’s state-of-the-art data centers has improved dramatically compared with just a decade ago, the budget required for power and cooling has gone up. Data centers typically spend between $6 million and $20 million per megawatt on power and cooling, roughly 40 percent of a facility’s budget. That’s far too much.
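To make those figures concrete, here is a minimal back-of-the-envelope sketch. The 10 MW facility size is a hypothetical assumption chosen only for illustration; the per-megawatt costs and the 40 percent budget share are the figures cited above.

```python
# Back-of-the-envelope math using the figures cited above. The 10 MW
# facility size is a hypothetical assumption, used only for illustration.
CAPACITY_MW = 10
COST_PER_MW_LOW = 6_000_000    # low end of the $6M-$20M per-megawatt range
COST_PER_MW_HIGH = 20_000_000  # high end of that range
BUDGET_SHARE = 0.40            # power and cooling as ~40% of facility budget

for cost_per_mw in (COST_PER_MW_LOW, COST_PER_MW_HIGH):
    power_cooling = CAPACITY_MW * cost_per_mw
    implied_total = power_cooling / BUDGET_SHARE
    print(f"${cost_per_mw / 1e6:.0f}M/MW -> power & cooling: ${power_cooling / 1e6:.0f}M, "
          f"implied total facility budget: ${implied_total / 1e6:.0f}M")
```

At the low end that works out to $60 million for power and cooling alone; at the high end, $200 million. Either way, it is a very large line item to leave unmanaged.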

No longer should data center management reflexively throw more hardware into the mix any time reliability is in question. It’s time to rethink that reflex. Even the Uptime Institute seems willing to amend its vaunted Tier Classification System, which has been around since the mid-1990s and has come to be regarded as the gold standard. Says Miller: “Now even Uptime acknowledges the growing importance of software, and how it can be a game changer for data center design.”

So what’s holding back full-scale change? It’s easy to point at money (or the lack of it), because that’s easier than admitting that resistance to cultural change is part of the problem. Again, it’s time for rethinking. Devise a plan that clearly aligns your data center costs with your business. Embracing Software Defined Power makes it possible to take full advantage of advances in areas such as AI, machine learning, automation and analytics. As long as the consensus remains that it’s perfectly acceptable for 40 to 80 percent of a data center’s power infrastructure to sit unused as “stranded power,” the industry is moving forward in word only. Everyone knows actions speak louder than words.
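To show what that stranded-power range means in capital terms, here is a minimal sketch. The provisioned capacity, observed peak load and per-megawatt cost are hypothetical assumptions, not figures from the article; only the $6 million to $20 million per-megawatt range and the 40 to 80 percent stranded-power range come from the text above.

```python
# Illustrative stranded-power estimate. The provisioned capacity, observed
# peak draw and per-megawatt cost below are hypothetical assumptions, not
# figures from the article.
PROVISIONED_MW = 10.0        # power infrastructure built out
OBSERVED_PEAK_MW = 4.0       # highest IT load actually drawn (hypothetical)
COST_PER_MW = 10_000_000     # within the $6M-$20M per-megawatt range cited earlier

stranded_mw = PROVISIONED_MW - OBSERVED_PEAK_MW
stranded_share = stranded_mw / PROVISIONED_MW
stranded_capital = stranded_mw * COST_PER_MW

print(f"Stranded power: {stranded_mw:.1f} MW ({stranded_share:.0%} of provisioned capacity)")
print(f"Capital tied up in unused power infrastructure: ${stranded_capital / 1e6:.0f}M")
```

Under these assumed numbers, 60 percent of the provisioned power, representing tens of millions of dollars of infrastructure, sits idle, which falls squarely within the 40 to 80 percent range cited above and shows why the status quo is so costly.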