The information technology industry seems to constantly reinvent itself. Processing and computing speeds are getting faster, storage capacities are getting bigger and cheaper, and our endpoint devices are becoming more functional.
This constant reinvention also has its downsides. It could be argued that the IT industry has entered a vicious cycle of legacy obsolescence, in which firms are forced to update through the expensive ‘rip and replace’ method, replacing their entire storage solution with a new one. The core issue is that customers lack an alternative or substitute, which hardly reflects the (in theory) perfect competition that customers essentially demand. Since when did we start simply accepting what the legacy storage vendors offer us? When did we stop looking for alternatives?
It would be wrong to view hardware and software as competitors, as if they were fighting for power; rather, one could say that they are interdependent. Some of the biggest IT companies globally are software and hardware companies alike. Innovations on the two sides tend to be related: a new development on one side often enables an innovation on the other.
“It would be wrong to view hardware and software as competitors, as if they were fighting for power; rather, one could say that they are interdependent”
A legacy storage system usually consists of a dual-controller storage array or a single NAS head designed to service a database-heavy environment with limited file sharing. In the prime of legacy storage, processors were costly and storage software was not scalable. Small, incremental changes to such systems were manageable because each system was monolithic, but as soon as demand for more capacity or performance arose, a system upgrade was unavoidable. To expand such a system, one had to upgrade to an entirely new one (the ‘rip and replace’ mentioned above) or add another, entirely separate system.
For too long, the only real advances by market-dominating storage vendors have been new methods of charging their customers. For too long these vendors have bundled software and hardware in a proprietary way, locking customers in and then conveniently overcharging for procurement, service and upgrades.
Obviously this business model is not suited to providing what customers actually require: a cost-effective way to store their data and keep it safe.
The Change and the Outcome
Legacy storage vendors have long had the upper hand in storage infrastructure, but software-defined storage is finally changing the game. Today the amount of data generated is exploding, and all of it needs to be stored effectively: up to ninety percent of all existing data is said to have been produced in the last two years alone. Datacenters are struggling with the consequences of such growth while IT managers simultaneously fight to make do with restricted budgets.
In today’s virtualized and collaborative data centers, change is continuous, so the storage environment must be flexible enough to adapt without requiring a large-scale upgrade or an outage. This is where software-defined storage (SDS) comes into play: a storage approach in which the software that controls storage-related tasks is decoupled from the physical hardware. This separation allows companies to buy heterogeneous storage hardware without worrying about interoperability issues, under- or over-utilized storage resources, or manual oversight of those resources. The solution provides a range of functions spread across standard server hardware, including deduplication, thin provisioning, snapshots, replication, and additional backup and restore capabilities. In short, it is a cost-effective storage solution with the ability to scale according to the requirements of a company.
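Deduplication, one of the data services just listed, is easy to picture with a toy sketch. The following is not any vendor's implementation, merely a minimal Python illustration (class name and block size are hypothetical) of content-addressed block storage, where identical blocks are kept only once:

```python
import hashlib


class DedupStore:
    """Toy content-addressed store: identical blocks are stored only once."""

    def __init__(self):
        self.blocks = {}   # block hash -> block data, stored once
        self.files = {}    # file name -> ordered list of block hashes

    def write(self, name, data, block_size=4):
        hashes = []
        for i in range(0, len(data), block_size):
            block = data[i:i + block_size]
            digest = hashlib.sha256(block).hexdigest()
            self.blocks.setdefault(digest, block)  # dedup: skip known blocks
            hashes.append(digest)
        self.files[name] = hashes

    def read(self, name):
        # Reassemble the file from its referenced blocks.
        return b"".join(self.blocks[h] for h in self.files[name])


store = DedupStore()
store.write("a.txt", b"ABCDABCD")   # two identical 4-byte blocks
store.write("b.txt", b"ABCDEFGH")   # shares the "ABCD" block with a.txt
print(len(store.blocks))            # 2 unique blocks stored instead of 4
```

Real SDS products apply the same principle at much larger block sizes and scale, but the saving comes from exactly this kind of reference counting rather than from any particular piece of hardware.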
“In short it is a cost-effective storage solution with the ability to scale according to the requirements of a company”
Remember that vicious upgrade cycle mentioned earlier? Today we can avoid much of the cost incurred inside those cycles by adopting software-defined storage. The overall idea is relatively straightforward: instead of purchasing a branded product that bundles hardware and software, clients can select the best hardware and the best software separately and combine them into the most efficient solution for that company’s needs. SDS thus allows you to select inexpensive standard hardware (or keep the hardware you are already operating) and use it together with enterprise-level software.
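The scale-out alternative to rip and replace can also be sketched in a few lines. This is a deliberately simplified illustration, with hypothetical node names, of the idea that an SDS pool grows by attaching commodity (or existing) hardware to one logical pool rather than by replacing the whole system:

```python
class StoragePool:
    """Toy SDS pool: capacity grows by adding nodes, not replacing the pool."""

    def __init__(self):
        self.nodes = []  # (node name, capacity in GB) of attached hardware

    def add_node(self, name, capacity_gb):
        # Scale out: attach any vendor's hardware to the same logical pool.
        self.nodes.append((name, capacity_gb))

    def total_capacity(self):
        return sum(cap for _, cap in self.nodes)


pool = StoragePool()
pool.add_node("old-nas-01", 2000)      # keep existing hardware in service
pool.add_node("commodity-x86", 8000)   # expand with cheap standard servers
print(pool.total_capacity())           # 10000 GB, with no rip and replace
```

The point of the sketch is the design choice: because the pooling logic lives in software, the existing 2 TB system keeps contributing capacity alongside the new nodes instead of being written off.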
In modern data centers, storage workloads increasingly require scalable environments to run demanding enterprise applications, and this is where firms can reap the advantages of SDS: it can manage these environments effectively in terms of cost, flexibility and performance. To put these benefits into practice, firms need an SDS approach that combines intelligent data services with predictive analytics across any primary or secondary storage hardware. The outcome? More economic value from existing environments.
Storage may be seen as the groundwork of any datacenter infrastructure deployment. If so, software-defined storage is fundamentally altering that foundation, enabling a more flexible, cost-effective and scalable datacenter infrastructure.
SDS is still considered to be in its early stages, yet the industry is showing aggressive growth. With current pressure on IT budgets and a simultaneous need to increase capacity, IT managers can no longer fall back on the “safe” option of spending the full budget, year after year, on legacy solutions – nor can the legacy vendors continue with old ‘rip and replace’ strategies, not when the competition is moving forward with the solution of the future.