A new age in cloud innovation: From edge to core to supercloud – SiliconANGLE
When it comes to cloud, there has been no shortage of major developments in recent years.
The latest megatrend, supercloud, is just one in a long line of revolutions that Jay Cuthrell (pictured), partner at IBM Consulting, has had a front-row view of. It goes back to “edge to core to cloud” and the world of “many clouds, and hybrid clouds, public-private clouds,” according to Cuthrell.
“Supercloud really represents that super structure around all that. What we’re going to see is all the innovation that might have been taking place in one area, boom. Let’s distribute it everywhere simultaneously,” he said. “Very coherent, consistent experiences. Developers. It’s going to be a playland, an absolute playland.”
Cuthrell spoke with theCUBE industry analysts John Furrier and Dave Vellante at the Supercloud 3: Security, AI and the Supercloud event, during an exclusive broadcast on theCUBE, SiliconANGLE Media’s livestreaming studio. They discussed the evolution of cloud, the impacts of artificial intelligence and more.
The next wave
It has taken a while to get to where things are today, with experiences that are substantially similar, if not identical, across estates. It’s about getting to those carrier-grade experiences that one just takes for granted, according to Cuthrell.
“It’s just going to work,” he said. “The interoperability is there. The pervasiveness, I think you maybe heard it called the ubiquitous workload substrate in the past. You’ve got to have that, because a developer is going to focus in an area to get something accomplished. And you’ve got to normalize all that mess. You’ve got to make it simple and consistent.”
All that rallying around supercloud is going to open up the next wave of things that can’t even be imagined yet, according to Cuthrell. Society may have seen it in science fiction, but now it becomes transactional.
These days, open source has also opened up a new dynamic. Hyperscaler clouds are running with elasticity, and AI is on the horizon. So how does one rewrite distributed computing?
“If you look at memory-centric, distributed, aggregated architectures, you know, going back and forth like an accordion, then there’s the question of, as a developer, how am I going to take something that ran on, let’s call it a silo — it was a silo of excellence somewhere,” he said. “But now, I’ve got to put that out in the middle of nowhere, and it’s got to be deterministic power, weight, cooling and geometry.”
One needs to be able to run that data workload locally, send back the telemetry and the events that are relevant and pertinent to what’s trying to be accomplished, according to Cuthrell. That’s only possible once one has moved beyond certain points of view.
“‘Well, you know, this is really the private cloud, or the private cloud versus the public cloud,’ versus the ‘I’m going to only do repatriation for these things,’ or these contrived … arguments that really are going to be opened up a little bit with FinOps,” he said. “Because you’re going to bring cloud fiscal responsibility to a lot of decisions that are being made.”
From there, organizations are going to look at what’s fit for purpose at the edge, according to Cuthrell. That’s because those low-latency experiences for a consumer or for industrial, and all the notions of open virtual radio access networks with 5G — all those promises and spectral efficiencies — are coming together. “Going back to the old days … it’s convergence,” he said.
Will new economic models emerge?
Advancements in AI pose an interesting question: Are there going to be new economic models that emerge out of this that disrupt the economics of the data center and then eventually seep their way in? Many people in 2010 were asking what 2020 might look like. Today, everyone’s asking what 2030 will look like. It’s important to look at patterns, according to Cuthrell.
“You look at the patterns and adopting what was the best practice or what was proven in this one area. And you start to move the adjacencies, the white spaces. And I think that the story becomes, how do I take what was carrier-grade trusted and true?” he said. “And you’re putting that into an enterprise setting, because most enterprises, whether they realize it or not, are going to have to become internal service providers themselves.”
Enterprises will need to deliver that consistency of service-level agreements for their constituents, for the end users in their own company and for their downstream customers, according to Cuthrell. So, how might enterprises figure out how to stay relevant in the industry for the next 10 years? The digital transformation, as people call it, is effectively about embracing all of the things that one sees that are distributed.
“But the companies that package that, put that together — like, you just had VMware on, I think they’re a great normalizer of all these clouds — they’re enabling a supercloud behavior. I would say Red Hat, that’s a great example of open source,” he said. “Some of these other stories as they come together, you’re going to see where it enables these developers to do things more consistently than they’ve ever been able to do before.”
Here’s the complete video interview with Jay Cuthrell, part of SiliconANGLE’s and theCUBE’s coverage of the Supercloud 3: Security, AI and the Supercloud event: