Increasingly Cloudy with a 100% Chance of On-Prem

Everyone is talking about the move to cloud: private cloud; hybrid cloud; public cloud. What is your cloud strategy? Microservices, CI/CD, SaaS? All good questions but missing a key point.

For most companies, there is an existing on-premises environment that required blood, sweat and tears to get right. And, it is working just fine – even thriving.

No, this isn’t NIH or “not seeing the future” – it acknowledges that tectonic shifts happen over time, not instantaneously. During these shifts life gets much harder, and there is a duality: managing two IT approaches simultaneously, often with the same staff and budget – or less.

If you are lucky, maybe you will get a merger/acquisition thrown in just to keep things interesting. Just ask Steve Philpott from Western Digital, whom I watched deftly manage the simultaneous integration of three companies while rationalizing 3,000 on-prem and cloud applications. Interesting indeed.

There was a day when you asked around, trying to figure out what the right application and the right infrastructure was for your business. There was a process: you would plan, meet with vendors and eventually commit to a relationship.

My wife of 31 years occasionally asks my thoughts on the current dating culture. I must admit I am perplexed. My 23-year-old daughter met her boyfriend on a dating app and it seems to be going well. In fact, today most young people that are dating meet via dating apps.

Dating App

There was a day when dating involved you asking around if your friends knew someone, or someone you knew setting you up with their lonely second cousin (don’t ask).

You met for coffee, tried to figure out if they were right for you and eventually you committed to a relationship. No more!

Today you answer a few questions, and poof, you are matched with people who, according to a sophisticated but undisclosed algorithm, are perfect for you.

Swipe left, swipe right… serial killer? You’re not quite sure how it works, but hey, you have a date Friday night! Don’t worry, I’m sure it will be fine…

Today when you roll out or upgrade an application, you are faced with the question of whether to deploy it on-prem or in the cloud, and whether to deploy it as an application or consume it as a service (SaaS).


Go down the SaaS path and you answer a few questions; suddenly, according to a sophisticated but undisclosed algorithm, you are matched with the application of your dreams.

You’re not quite sure how it works, but hey, you have an application up and running – you can get to your date Friday night! Don’t worry, I’m sure it will be fine…

Managing conventional IT and cloud-based IT environments creates inherent complexity because there is change – even if for the better.

To make matters worse, there are multiple cloud approaches, private, hybrid and public – which is right for you? When? And for what applications?

To alleviate some of the inevitable pressure, you need a bridge: something common across all these potential deployment models that reduces complexity and even facilitates adoption when the time is right for your business.

I refer to these IT bridges as a common substrate. In this case, the substrate is responsible for one of your most important assets, your data. Ensuring you have the right data in the right place at the right time used to be the role of the application, but in today’s world your data transcends the application.

Your data is created, analyzed, poked and prodded by many applications. It is compressed, encrypted, replicated, deduplicated, distributed. Kafka’d, Hadoop’d, and Mongo’d. It is stored in lakes, clouds, silos. The point is that your data is the common substrate of your business – not your applications.

To help you transcend on-prem and cloud IT models, a common data services platform can be a boon to effectively managing your IT environment, whether it is on-prem, or cloud, or, most likely, a fluid balance of both.


The Datera Data Services Platform was designed to serve this common data substrate. Put simply, it adapts to your environment at the rate your business evolves while radically simplifying the curation and stewardship of your data.

Being new to Datera, one of the first things I did was sit down with the founding architects and ask them why they did what they did. You know what I found? It was on purpose!

From the beginning, the team set out to leverage the strengths and resolve issues they observed when creating distributed systems in the past. I am happiest working with people who have also spent some time at the school of hard knocks. Character building, I think they call it.

Yes, Datera is software-defined storage. Yes, it is scale-out. Yes, it is high performance. This used to be what was considered innovative and this is where the Datera architects cut their teeth. But more is possible – and necessary – to be the common data substrate for hybrid IT.

You need to:

  • Raise the level of interface to be outcome driven at the data level
  • Provide a self-managing system via closed loop QoS and autonomic tuning
  • Enable early and rapid deployment when information is sparse, with analytics to inform you when changes are beneficial
  • Make it easily programmable via the common interfaces that are used in customers’ varied deployment models
  • Embrace the natural and inevitable asymmetry that comes from growing amounts and diverse use of data and the exploitation of new technology
  • Assume that the environment is multi-tenant and multi-customer in how resources are allocated and managed and how information is visualized
  • Build for both speed, when speed matters, and cost, for when cost matters

After doing all these things, they went a step further by adding the ability to express objectives. You tell the system what you want, and it figures out how to deliver the cost, performance, resiliency and locality.


We all get smarter over time. Just tell it what you want now, and later, when you are smarter and have more information, you can change your mind. This is a game changer!

You can express these objectives in the terms that make sense for your changing environment.

Use VMware? Great. OpenStack? Fine. Kubernetes? Awesome. CLI, REST APIs? Interface in all the ways that make sense for your business.
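As a purely illustrative sketch – the field names, values and revision here are invented, not Datera's actual API – expressing an objective and later changing your mind might look something like this:

```python
import json

# Hypothetical objective: describe the outcome you want, not the method.
objective = {
    "app_instance": "orders-db",
    "performance": {"iops_min": 50_000, "latency_ms_max": 1},
    "resiliency": {"replicas": 3, "snapshot_interval": "15m"},
    "placement": {"media": "any", "locality": "on-prem-or-cloud"},
}

# Later, when you are smarter and have more information, you revise it:
objective["performance"]["iops_min"] = 100_000
objective["placement"]["media"] = "nvme"

# The payload a REST client would hand to a (hypothetical) policy endpoint.
payload = json.dumps(objective, indent=2)
```

The same declaration could just as easily be driven from a VMware plugin, an OpenStack or Kubernetes storage class, or a CLI – the point is that you state the goal once and revise it as you learn.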

Sounds cool, but how does it work?

The Datera distributed control plane resolves your objectives into a set of policies that are given to a high-performance data plane as key-value stores for execution of the policy. The data plane then maps key-value pairs onto the current physical hardware to deliver performance, reliability and accessibility. Software on the individual nodes, built from commodity infrastructure, utilizes resource-specific capabilities depending on the type of storage, CPU, memory and networking to optimize for:

  • Transformation – protection, compression, encryption, deduplication…
  • Placement – NVM, SSD, HDD, Cloud…
  • Functionality – snapshot, replication, copy data management…

Telemetry information is gathered and communicated back to the control plane for cloud-based analytics, enabling the control plane to send adjustments to the data plane to meet the current objectives across the various tenants of the system. That, and a few million lines of code – but at least I disclosed the high-level algorithm!
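To make the control-plane/data-plane split concrete, here is a toy sketch of the loop described above. Every function, node name and threshold is invented for illustration; the real system is, as noted, a few million lines of code:

```python
# Toy sketch: objectives resolve to policies, policies drive placement,
# telemetry feeds back so the control plane can adjust.

def resolve_objective(objective):
    """Control plane: turn an outcome ('fast' / 'cheap') into a policy."""
    if objective == "fast":
        return {"tier": "ssd", "replicas": 3, "compress": False}
    return {"tier": "hdd", "replicas": 2, "compress": True}

def place(key, policy, nodes):
    """Data plane: map a key onto physical nodes per the policy."""
    eligible = [n for n in nodes if n["tier"] == policy["tier"]]
    eligible.sort(key=lambda n: n["load"])  # least-loaded first
    return [n["name"] for n in eligible[: policy["replicas"]]]

def adjust(policy, telemetry):
    """Closed loop: telemetry flows back; the control plane adapts."""
    if telemetry["p99_latency_ms"] > 5 and policy["tier"] != "ssd":
        policy = {**policy, "tier": "ssd"}  # promote to faster media
    return policy

nodes = [
    {"name": "n1", "tier": "ssd", "load": 0.7},
    {"name": "n2", "tier": "ssd", "load": 0.2},
    {"name": "n3", "tier": "hdd", "load": 0.1},
]

policy = resolve_objective("cheap")
replicas = place("volume-42", policy, nodes)      # lands on HDD
policy = adjust(policy, {"p99_latency_ms": 12})   # too slow: promote
replicas = place("volume-42", policy, nodes)      # now lands on SSD
```

The decision about *what* to deliver lives in the control plane, away from the data path; the data plane only executes the resulting policy.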

The separation of concerns between the control plane, data plane and local resource management enables the Datera system to evolve and adapt – and you need that. In large scale systems the distribution of decision making is critical. The Datera Data Services Platform ensures that global decisions are evaluated and made in the control plane, away from the data path, and enforced in the data plane while delegating local decisions to the infrastructure where activities are taking place.

Embracing change and asymmetry is unique to Datera and is a fundamental architectural premise that delivers great customer value.

Change and asymmetry can come in many forms as you manage and exploit your data, including: adding new capabilities or services; changes in policy; fault management; addition of tenants; and scaling the physical infrastructure.

This innovation is enabled by having two key-value-store mapping contexts provided by the control plane to the data plane – the current map and the future map – where I am and where I am going.

When a change in the system occurs, a future map is created by the control plane and given to the data plane for execution. Transactions that occur in the system, such as writing new data, are evaluated against the current and future maps and data is placed and optimized as the union of the two objectives (current and future).

Over a short period of time the system converges the current and future maps by moving or transforming data non-disruptively. Once complete, the data plane notifies the control plane and the (now redundant) future map is discarded.
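A minimal sketch of the two-map idea, with invented data structures (the real maps are key-value stores over physical hardware, not Python dicts):

```python
# Current map: where data is. Future map: where it is going.
current_map = {"vol-1": "node-a", "vol-2": "node-a"}
future_map  = {"vol-1": "node-a", "vol-2": "node-b"}  # change: move vol-2

stores = {"node-a": {"vol-1": "x", "vol-2": "y"}}

def write(volume, data):
    """While both maps exist, a write satisfies the union of the two."""
    targets = {current_map[volume], future_map[volume]}
    for node in targets:
        stores.setdefault(node, {})[volume] = data

def converge():
    """Background migration: move data until current catches up to future."""
    global current_map, future_map
    for vol, dst in future_map.items():
        src = current_map[vol]
        if src != dst:
            stores.setdefault(dst, {})[vol] = stores[src][vol]
    current_map = dict(future_map)
    future_map = None  # now redundant: discarded

write("vol-2", "y2")   # lands on both node-a and node-b, non-disruptively
converge()             # vol-2 now lives where the future map said
```

New writes never race the migration, because they already land in both places; once the maps match, the future map simply disappears.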

Interesting, but what does it do for you?

You set objectives, and the Datera system carries out your objectives – radically simple.

Hybrid Cloud, Multi-Cloud

In a hybrid IT model, at any point in time, you have projects moving applications or data from on-prem to cloud or vice-versa. You have new applications wanting access to historical data for analytics and machine learning.

You have legacy applications that will continue to run in their existing model but have expanding data needs. You have all these, all the time – something is always changing.

Underneath all this transition (i.e. chaos) is your data. An application may be transient and/or temporal, but seldom is the data. Data has weight, it has gravity. Chances are when you move an application between on-prem and cloud the data will go through a transformation.

You will need to move and often replicate data to enable this movement of applications. Similarly, to provide access for the purposes of analytics or machine learning, training data needs to be cleaned, distilled or transformed. All of this involves functionality that is cumbersome when done manually and is much better done programmatically.
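A sketch of what "programmatically" might mean here – every function below is an invented stand-in, not a real Datera call:

```python
# Hypothetical pipeline: snapshot -> replicate -> transform, scripted
# instead of performed by hand.

def snapshot(volume):
    """Take a point-in-time copy so the move doesn't disturb the source."""
    return {"volume": volume, "kind": "snapshot"}

def replicate(snap, target):
    """Ship the copy to another deployment model (e.g. a cloud region)."""
    return {**snap, "location": target}

def transform(dataset, cleaner):
    """Clean or distill training data on the way in."""
    return [cleaner(record) for record in dataset]

copy = replicate(snapshot("orders-db"), target="cloud-us-west")
training_data = transform([" a ", "b", " c "], str.strip)
```

Once the steps are functions, the whole move is repeatable – which matters when, as above, something is always changing.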

And don’t forget, there is urgency.

Exploitation of data is a competitive advantage. Urgency often short circuits thorough planning, which means you need a data services substrate that lets you:


  • Specify the goal, not the method
  • Deploy quickly and without risk
  • Change easily, when better informed
  • Span all your deployment models
  • Adapt to change, which is constant

Sound familiar? Don’t worry, I’m sure it will be fine…