Dynamic and autonomous data infrastructure is now a reality
It’s not breaking news that the demand for data continues to grow exponentially. Businesses need ever more data to create value and compete, and data velocity, availability, security, and cost all play a crucial role in whether they succeed.
Translating these requirements into the world of IT, especially for companies that operate at scale, becomes a massive challenge steeped in complexity!
We have the privilege to serve the needs of some of the largest and most demanding enterprises in the world and have a direct lens into the struggle that IT professionals go through to operate and manage data at vast scale.
First, IT professionals have the enormous challenge of figuring out how to architect agile data infrastructures that can handle the requirements of the world they know today. Next, they have to be ready for unknown future requirements and rapidly adopt new storage technologies which, when used in a timely and efficient way, can deliver a positive impact on business outcomes. And last but not least, they have to efficiently manage the life-cycle of data across hardware acquisition and obsolescence.
Whether customers use traditional enterprise storage arrays or more flexible software-defined solutions, they still require a lot of planning and management to make them work cohesively and deliver on expectations. Mistakes are nearly unavoidable and can be incredibly costly to the business.
So the question is: is it possible to relieve IT of the burden of many of these planning and management efforts?
The outcome would be that IT could become more data-agile and efficient, save significant costs, and unlock critical IT skills that are currently trapped in managing the infrastructure so they can help create business value instead.
When considering the traditional approach of managing storage, the answer is NOT REALLY…
Legacy storage vendors continue to make storage products simpler and more scalable. These are important efforts, of course, but they are no longer enough. Incrementalism has run its course, at least for larger enterprises!
It takes re-imagining how data could be managed to eliminate all of these complexities and achieve optimum data velocity, availability, and cost. The public cloud has addressed many of the challenges just articulated: it is simple, scalable, and elastic. However, the public cloud is not for everyone, nor for everything. For companies that operate at scale in many business sectors, cost, latency, security, and data governance can make the cloud prohibitive.
Creating an architecture that solves these issues is what Datera’s innovators set their aim at, and it required looking at the problem in a broader context.
I sometimes compare the thinking behind what Datera is delivering to the kind of transformation we all experienced moving from traditional cell phones to smartphones.
When the innovators behind the smartphone developed its operating system, they did not start with the idea of building a better cell phone. They imagined a new paradigm. The phone suddenly became a platform that could integrate and radically simplify multiple functions, not just make and receive calls.
The Datera team followed similar thinking. They looked at the problem from a higher-level construct: how to help customers manage their data more efficiently, not just how to build a better storage product. They tore down storage barriers through a radically simple idea: define storage by its data, driven automatically by applications.
The end result is a hybrid-cloud, software-defined platform for block and object data that enables a data infrastructure that is heterogeneous, dynamic, and autonomous.
This is achieved through three distinct layers:
- Software-Defined Storage Layer – Making the infrastructure Heterogeneous, for block and object data
  - This software-defined layer provides traditional storage functionality: high performance with low latency, high availability, scalability, security, and enterprise data services such as data efficiency and protection.
  - A unique capability in this layer is the ability to concurrently and transparently manage any validated class of media and server. This gives customers the flexibility to adopt new technologies on the fly and implement a heterogeneous hardware infrastructure that optimizes for both performance and cost, all automatically via policies. Datera’s deep insights platform provides additional visibility to aid in that decision-making process.
  - Why it matters: Customers no longer need to compromise or deploy multiple products to optimize for performance or cost. Different classes of storage media, present and future, can be added and coexist in the same infrastructure seamlessly and transparently. Grow, mix, and match using any brand and server generation approved by Datera in the Hardware Compatibility List (HCL). Everything is easily and abstractly managed.
- Data Management Layer – Making the infrastructure Dynamic
  - Data is placed on the appropriate storage media based upon policies. Should your requirements change, or new technologies become available that could benefit the business, a change in policy can be executed in minutes and the data will begin moving “live” to the new destination.
  - Why it matters: The most immediate and valuable benefit is that these capabilities eliminate all of the planning and effort associated with data migrations and hardware obsolescence. When servers reach end of life, they are simply de-commissioned from the cluster and new servers are joined. Everything is executed extremely simply and live. But it goes beyond that: companies can now cut MONTHS spent preparing to add new workloads or change existing ones. They can eliminate the waste of over-provisioning, take advantage of new media technologies rapidly, or, as the value of data changes, move data to more cost-effective destinations. All this happens live and through simple policy changes. Infrastructure can now be used in the most efficient and optimized way.
- Programmable Layer – Making the infrastructure Autonomous and easy to program
  - Every function of the solution is exposed through an API-first model.
  - Customers can define the needs of their data through simple policies, and the system will automatically implement them across the heterogeneous infrastructure.
  - Automated provisioning against pre-defined policies enables self-service (see the sketch after this list for what such a policy-driven workflow could look like).
  - Why it matters: Intelligent algorithms work 24/7, maximizing performance, data security, and overall efficiency across hundreds of workloads. Critical IT skills are now able to focus on more value-creating efforts rather than constantly watching and managing the infrastructure.
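To make the API-first, policy-driven model above concrete, here is a minimal sketch of what such a workflow could look like, assuming a generic REST-style interface. The endpoint paths, field names, and values below are hypothetical, chosen purely for illustration; they are not Datera’s actual API, which is documented separately.

```python
import requests

# Hypothetical endpoint and credentials -- actual API paths, names,
# and fields will differ; consult the product documentation.
BASE_URL = "https://datera-cluster.example.com/api"
AUTH = ("admin", "secret")

# 1. Define a storage policy that describes what the data needs,
#    not which hardware it lives on. The platform maps the policy
#    onto whatever validated media classes exist in the cluster.
gold_policy = {
    "name": "gold-low-latency",
    "replica_count": 3,
    "media_class": "all-flash",   # preferred class of validated media
    "iops_max": 50000,
    "encryption": True,
}
requests.post(f"{BASE_URL}/policies", json=gold_policy, auth=AUTH).raise_for_status()

# 2. Self-service provisioning: an application team requests a volume
#    against the pre-defined policy; placement is decided automatically.
volume = {
    "name": "orders-db-vol01",
    "size_gb": 2048,
    "policy": "gold-low-latency",
}
requests.post(f"{BASE_URL}/volumes", json=volume, auth=AUTH).raise_for_status()

# 3. Making the infrastructure dynamic: when requirements change (for
#    example, the data becomes colder), a single policy update moves the
#    data live to a more cost-effective media class -- no migration project.
requests.patch(
    f"{BASE_URL}/policies/gold-low-latency",
    json={"media_class": "hybrid"},
    auth=AUTH,
).raise_for_status()
```

The key idea this sketch illustrates is that applications and operators interact only with policies and volumes; which servers and media classes actually hold the data is decided, and continuously re-decided, by the platform itself.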
Bottom line:
– Datera has gone well beyond just delivering a better storage platform.
– Datera’s passionate engineers have re-imagined the way data can be managed, eliminating most of the complexities that IT professionals have to deal with as they manage data at scale.
– For the first time, IT professionals have the ability to architect a software-defined data infrastructure that is heterogeneous, dynamic, and autonomous, can run the most demanding applications, and enables the business to be more agile, efficient, and cost-effective.
Please listen to our accompanying podcast on this blog’s theme, as Datera leaders discuss the real-world advances from our data platform.
For more information, we recommend reading our white papers:
Built for Autonomous Operations
Built for Continuous Availability
We can schedule a demo at any time. Please reach us at sales@datera.io and share any specific capability you would like to learn more about. We look forward to the opportunity!