Yesterday was Day 1 of the first SDDC Symposium hosted by Datera, the leading data-defined storage software platform, with fellow leaders in networking, storage, compute, and solutions: Cumulus, HPE, Intel, Mellanox, and Scality. The virtual event was conceived to give you the information and tools you need to build or expand a modern, software-defined data center and see the amazing set of technologies that can make this happen.
Highlights from Day 1 are below, and look for the Day 2 recap on Friday that will include technical sessions on how to scale data across private, hybrid and multi-cloud, and how to automate data-driven enterprise private clouds, as well as a technical keynote on the principles of building a software-defined infrastructure. See the schedule online and register here to receive access to videos on-demand.
Day 1 Keynote Highlights
To call this a keynote would be a misnomer; it was more like an SDDC Late Night Talk Show with host Mark Peters of ESG, featuring special guests Guy Churchward, CEO of Datera, and Tom Sabol, Storage Sales Leader, HPE.
- Data-defined is the hallmark of this era. Peters and Churchward bantered about the relevance of the terms software-defined and even AI-driven, agreeing that both are supporting concepts for a modern infrastructure that is defined by data. Organizations and vendors must free data from the infrastructure so it can meet future, on-demand needs not yet envisioned. A software-defined approach is the best route to make that happen.
- Technology is no longer an either/or choice; it’s ‘yes, and.’ In the past, organizations were forced to choose between capabilities—either this or that. With today’s technology, we are firmly in the era of ‘yes, and,’ which means there is no need to compromise. Yes, you can have application performance and reliability, and you can have an infrastructure that’s flexible enough to enable faster development cycles and scale as demands grow.
- Lock-in is the new legacy. “Cloud is a deployment model, not a company or a place,” said Churchward. Business is hybrid and dynamic, and IT needs to change to fit the new operating and delivery model. Lock-in via a storage array, media type or software constraint is just a ‘new legacy’ boat anchor that will weigh down organizations, so open is more important than ever to generate optionality.
- “Coalitions are needed to move the industry forward.” ESG’s Peters emphasized the benefit to end users of the informal SDDC ‘coalition’ that Datera is driving with the sponsors of this event, showing where and how they fit together, complement one another, form reference architectures, and break through the ‘marketing speak’ that often gets in the way of technology success. Sabol shared an example: the recently introduced HPE Datera Cloud Kit, which he championed from HPE, Datera, and Mellanox. It was conceived with one of his customers, brought back to the factory, battle-tested by HPE as part of its HPE Complete partner program, and supported by the coalition from Level 1 customer support up through on-premises troubleshooting.
- A majority of enterprises use SDS today and are growing their use of and reliance on it moving forward. Peters previewed brand new ESG research showing how extensively software-defined storage is used: 55% of organizations are using SDS at some level, and 72% report that they are committed to SDS as their long-term storage strategy. Churchward cautioned that SDS as a term has been overblown in the past decade, and clarified key differences in implementation: “All clouds are SDS. VMware vSAN is SDS, but is homogeneous to VMware. HCI is also SDS, but has constraints at scale. And then there’s technology like Datera’s that is heterogeneous and unbound.”
- What’s next for the SDDC? More automation to optimize openness and data freedom. Sabol noted how critical telemetry analytics on each node and data unit would be in the future, and Churchward emphasized that avoiding vendor lock-in for multi-cloud environments will be crucial for organizations to maintain data freedom.
Day 1 Technical Sessions Highlights
Below are highlights from Wednesday’s technical sessions, and you can register to receive the on-demand playlist on the SDDC Symposium site:
- In “Networking with Elasticity and Advanced Analytics,” John Kim of Mellanox discussed how organizations want agility and automation in the network, but also need to prioritize control—often for security and privacy reasons—as well as availability and performance. This is achievable with the right hardware design coupled with a software-defined approach.
- In “Automating Open Networks for Efficiency and Scale,” Cumulus engineer Pete Lumbis suggests that if your network doesn’t spark joy, you can do better by moving toward an open networking system that is agile, resilient, scalable, and operationally efficient.
- In “Intelligent Infrastructure and Software-Driven Architectures,” HPE’s Chris Tinker explains how companies can capitalize on emerging technologies in their own data centers by first answering three core questions: What are your business requirements? What are your workload requirements? And what are the specific application requirements? Chris said that no “unicorn solutions” exist that can solve every problem, but that HPE exists to bridge the gap between its own IP and partner technologies for the SDDC.
Thanks for joining Day 1 of the virtual Symposium—register here to watch Day 2 and receive sessions on-demand.