Disruption in Data Storage – Nuances & Opportunities


In a world where most of our technology needs are delivered as cloud-based services and AI knows more about us than our friends do, it is only apt to believe that the underlying infrastructure supporting this digital intrusion is ripe for disruptive change.

We are talking about the storage solutions that help store, organize, process, and safeguard nearly 2.45 quintillion bytes of data created worldwide in a single day. The data storage segment has seen a slew of disruptions over the past few years.

Evidently, one of the most prolific shifts in the way storage solutions are viewed has been the growth of Software-Defined Storage (SDS). SDS opened up a whole new dimension, allowing enterprises to manage their explosive data storage needs with software running on local or cloud data centers.

So, what fueled the need for further disruptive innovations in the data storage space?

With roughly 4.57 billion people, or 59% of the global population, being active Internet users, the amount of data generated in the coming years may well call for new numerical prefixes.

The growth of AI and Machine Learning has contributed significantly to the need for enterprises to store and analyze ever-bigger volumes of data generated across their business. It is becoming clear that the more data enterprises can crunch, the greater the efficiency of the insights developed by AI systems.

The connected digital economy, wherein consumers have more of their devices connected to the Internet, is only proving to be the icing on the cake. With smartphones, wearables, smart televisions, intelligent home speakers, and even connected cars becoming more mainstream, several experimental and conceptual innovations in data storage have finally made it into real-world use cases.

Let’s have a look at the top three disruptions in data storage that bring about new opportunities for enterprises to actualize their digital aspirations.

Composable Architecture

Composable Architecture refers to an approach of managing a digital infrastructure through software via a web-based interface. By digital infrastructure in this context, we mean the computing hardware, storage, and network fabric. Each of these can be abstracted from its actual physical location and managed through an API that unifies the execution of applications over these decoupled infrastructure components. Each device or component in a composable architecture can have its own independent existence, thereby eliminating the risk of the entire system failing if one or more components face a technical issue.

Storage is a vital part of composable architecture, and enterprises have the freedom to extend any component of the architecture independently, for example storage alone, to keep pace with the increased data being generated. Western Digital, for instance, claims that the composable approach lowers Total Cost of Ownership (TCO) by over 40% compared to the hyperscale solutions enterprises have been relying on for a while now.
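To make the idea concrete, below is a minimal, purely illustrative Python sketch of composing decoupled compute, storage, and fabric resources into one logical system and then extending storage on its own. The Resource and ComposedSystem classes are hypothetical names invented for this example and do not represent any vendor's actual management API.

```python
from dataclasses import dataclass, field

# Hypothetical resource descriptors: each component exists independently
# and is addressed through the management layer, not by physical location.
@dataclass
class Resource:
    kind: str          # "compute", "storage", or "fabric"
    capacity: int      # cores, GiB, or Gbit/s depending on kind
    healthy: bool = True

@dataclass
class ComposedSystem:
    name: str
    resources: list = field(default_factory=list)

    def add(self, resource: Resource) -> None:
        # Storage (or any other component) can be extended on its own,
        # without touching compute or the network fabric.
        self.resources.append(resource)

    def usable_capacity(self, kind: str) -> int:
        # A failed component only removes its own capacity; the rest of
        # the composed system keeps running.
        return sum(r.capacity for r in self.resources
                   if r.kind == kind and r.healthy)

# Compose a logical system from independent pools, then grow storage alone.
system = ComposedSystem("analytics-cluster")
system.add(Resource("compute", capacity=64))
system.add(Resource("storage", capacity=2048))
system.add(Resource("storage", capacity=2048))   # scale storage independently
print(system.usable_capacity("storage"))          # 4096 (GiB, illustrative)
```

In a real composable platform, this composition would happen through the vendor's web or REST management interface rather than in-process objects, but the principle of addressing and scaling each component independently is the same.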

Computational Storage Services

Moving large amounts of data from a storage mechanism to the processing host or system is often a performance bottleneck in the enterprise world. The concept of Computational Storage came into existence to resolve this problem. It focuses on bringing data and computational processing together as elements associated with the storage unit. Integrating computing resources directly with storage improves application efficiency: it facilitates parallel computation and lowers the latency that traditional settings incur when data is shuttled between memory and computing resources. Such processing power and speed could prove highly influential in today's digital-friendly enterprise sector.
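As a rough, purely conceptual illustration of why this matters (the figures and helper functions below are made-up assumptions, not benchmarks or any vendor's API), compare shipping an entire dataset to the host for filtering against letting the storage device filter in place and return only the matches:

```python
# Conceptual comparison of data movement with and without computational storage.
# All sizes are made-up illustrative numbers.

RECORDS = 10_000_000          # records held on the storage device
RECORD_SIZE = 200             # bytes per record
MATCH_RATIO = 0.01            # fraction of records the query actually needs

def host_side_filter() -> int:
    """Traditional path: move every record to the host, then filter there."""
    return RECORDS * RECORD_SIZE                    # bytes over the interconnect

def push_down_filter() -> int:
    """Computational storage path: the device filters in place and returns
    only the matching records to the host."""
    matches = int(RECORDS * MATCH_RATIO)
    return matches * RECORD_SIZE                    # bytes over the interconnect

moved_host = host_side_filter()
moved_csd = push_down_filter()
print(f"host-side filter moves {moved_host / 1e9:.1f} GB")   # ~2.0 GB
print(f"pushed-down filter moves {moved_csd / 1e6:.1f} MB")  # ~20.0 MB
```

Under these assumed numbers, pushing the filter down to the device cuts the data crossing the interconnect by two orders of magnitude, which is the essence of the computational storage argument.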

NVMe over Fabrics

Non-Volatile Memory Express (NVMe) brought about a massive shift in the way storage devices and technologies were standardized in the past. A consortium of industry leaders in the storage sector came together to redefine and standardize the interface for NVM devices connected to a PCIe bus. The main aim was to reduce the storage medium's dependency on the CPU so that the CPU could be better utilized by the core application. NVMe over Fabrics (NVMe-oF) is the protocol that allows the seamless transfer of storage management commands to remote NVM devices. This gives enterprises the flexibility of accessing powerful remote storage services while placing a lighter memory-management load on the host CPU. As more brands involved in the manufacture of storage devices come on board, NVMe-oF will soon become the dominant force in distributed memory management that enterprises cannot ignore.
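To give a feel for what addressing remote storage under NVMe-oF involves, the sketch below models the connection parameters a host needs (transport, target address, port, and the NVMe Qualified Name of the remote subsystem) alongside some back-of-envelope latency arithmetic. The class, the NQN string, and all latency figures are illustrative assumptions for this example, not measurements or a real driver interface.

```python
from dataclasses import dataclass

@dataclass
class NvmeOfTarget:
    """Parameters a host needs to reach a remote NVMe subsystem over a fabric."""
    transport: str      # e.g. "tcp" or "rdma"
    address: str        # IP address of the target
    service_id: int     # transport port (4420 is the conventional NVMe-oF port)
    nqn: str            # NVMe Qualified Name identifying the remote subsystem

# Illustrative, assumed latency figures (microseconds) -- not measurements.
LOCAL_NVME_US = 80          # assumed local NVMe read latency
FABRIC_OVERHEAD_US = 20     # assumed added round-trip cost of the fabric

target = NvmeOfTarget(
    transport="tcp",
    address="192.0.2.10",                        # documentation-range address
    service_id=4420,
    nqn="nqn.2014-08.org.example:storage-pool1"  # illustrative NQN
)

remote_latency = LOCAL_NVME_US + FABRIC_OVERHEAD_US
print(f"{target.nqn} via {target.transport}: ~{remote_latency} us per read "
      f"(vs ~{LOCAL_NVME_US} us locally, under the assumptions above)")
```

The point of the arithmetic is simply that, under these assumed numbers, the fabric adds only a modest overhead on top of local NVMe access, which is what makes pooling NVM devices remotely attractive while the host CPU stays free for the core application.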

The world is witnessing major changes. There is the dawn of 5G, wherein the Internet will see more traffic than ever before from a barrage of devices ranging from smartphones to refrigerators and automobiles that go online. The pandemic has driven enterprises into a fully remote working mode, with thousands of employees accessing enterprise data, tools, and technology stacks from homes and remote locations. Devices, systems, and tools are generating massive volumes of data. Handling data of this magnitude requires quite a different storage approach from traditional practices. Enterprises will need to focus on building solutions that incorporate newer storage innovations into their data management policies to enable sustainable growth toward their business aspirations. The three disruptors covered here could well form part of that select set.




Calsoft is an ISV-preferred product engineering services partner in the Storage, Networking, Virtualization, Cloud, IoT, and Analytics domains.
