
A Decade of Cloud Native: From CNCF, to the Pandemic, to AI


This week marks my 5th anniversary of working at The New Stack. When I started, in the final week of March 2020, I had to quickly get used to the “cloud native” world. What was Kubernetes, and why did everyone keep talking about it? What did “serverless” mean, exactly? What did the acronym CI/CD stand for? What were “microservices”?

It wasn’t as if I was a tech neophyte — I’d founded the pioneering tech blog ReadWriteWeb in 2003 and ran it until 2012. After RWW, I continued to immerse myself in internet technology (particularly the open web), but I no longer kept track of enterprise trends like cloud computing and Docker containers. So I completely missed ‘the new new thing’ of the mid-2010s.

The Launch of CNCF in 2015

But one of my ex-RWW colleagues didn’t miss out. Alex Williams founded The New Stack in 2014, after noticing that Docker and container technology were fundamentally changing enterprise IT. Then when The Linux Foundation announced the Cloud Native Computing Foundation (CNCF) on July 21, 2015, Alex and the early TNS team were perfectly positioned to be one of the default media companies for this movement (similar to how ReadWriteWeb was in the right place at the right time when Web 2.0 took off around 2005).

The launch of CNCF ushered in the lucrative “cloud native” era, which the new organization defined as “applications or services that are container-packaged, dynamically scheduled and micro services-oriented.” Kubernetes, which itself turned 10 last June, was the key project at CNCF’s launch. Described at the time as an “open source cluster scheduler,” Kubernetes was donated to the foundation by Google as a seed technology.

CNCF website 2015; via Wayback Machine

Time to Scale-Out

After a few weeks settling in at TNS from March 2020, I had gotten used to the new terms and acronyms. I was even ready to fire off some takes about cloud native! My debut post in April declared that if the 2000s “was when networking evolved, and the 2010s was all about compute, then the 2020s will see a revolution in scale-out data.”

Scale-out what now? Oh, I mean “a revolution in the data layer,” I helpfully clarified in the opening paragraph. Actually, I’d cribbed that from DataStax chief strategy officer Sam Ramji, who had used similar wording at a recent TNS webinar. As for “scale-out,” that was another one of those strange new terms I had to get used to; it meant adding more power to an application by adding more machines. “Scale-out is, of course, a cornerstone of the cloud native world we now live in,” I added sagely.
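For readers new to the term, “scale-out” (horizontal scaling) is easiest to see in Kubernetes itself. As an illustrative sketch — the application name and image here are my hypothetical examples, not from any source in this article — a Deployment’s `replicas` field is the knob: raising it adds more copies of the same application across more machines, rather than giving one instance a bigger machine.

```yaml
# Illustrative Deployment manifest: scale-out means raising `replicas`,
# so Kubernetes schedules more identical Pods across the cluster.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: web            # hypothetical app name
spec:
  replicas: 5          # scale-out: five copies instead of one bigger one
  selector:
    matchLabels:
      app: web
  template:
    metadata:
      labels:
        app: web
    spec:
      containers:
        - name: web
          image: nginx:1.25   # stand-in workload
```

Scale-up, by contrast, would mean giving a single instance more CPU or memory.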

Of course, I was winging it. But I was also starting to understand where the 2020s were headed: the “data layer” would be key to computing over the coming decade. Ramji even mentioned AI and machine learning during the session. Obviously this was before generative AI burst onto the scene a couple of years later, but Ramji kind of nailed it when he said that over the next ten years, “there’s an opportunity to make data really easy, really manageable, and create a playground for apps of the future, which will include AI and ML apps.”

The Pandemic Bounce in 2020

The other notable thing about the time I started at TNS? It was a matter of days after my country (New Zealand) went into a full lockdown due to COVID-19. As many of you will remember, March 2020 was when the pandemic ramped up, leading to lockdowns in many parts of the world. We all had to get used to working from home (WFH soon became a new trending acronym). Actually, The New Stack was already a virtual company and I myself had worked from home for about 15 years by that point. So WFH wasn’t new to me — but it was to millions of other people, and that shift suddenly escalated demand for cloud native technology.

As TNS contributor Mark Hinkle wrote in June 2020: “The current economic climate has lit a fire under many IT and application teams to move aggressively to the cloud. These moves were already well underway before COVID-19, but now they’ve been accelerated.”

A popular catchphrase during the pandemic years was “digital transformation,” which for enterprise IT departments usually meant shifting to cloud native technologies. The New Stack’s Lawrence Hecht reported in May 2021 that “more than two-thirds (68%) of IT professionals believe their 500+ employee company’s use of Kubernetes increased as a result of the pandemic” (he was citing numbers from the Portworx by Pure Storage 2021 Kubernetes Adoption Survey).

Even though Zoom fatigue had well and truly set in by March 2021, the cloud native ecosystem continued to grow stronger. There was a lot of talk about “multicloud” that year, even leading to a proposal by Berkeley professor Ion Stoica (also co-founder of Databricks and Anyscale) to turn cloud computing into a true utility. He and fellow professor Scott Shenker coined the term “sky computing” — meaning a layer above cloud platforms, to enable interoperability between clouds from big players like Google, Microsoft and AWS.

Sky computing is just one of the ongoing efforts to make cloud native technology easier for developers to use. In recent years we’ve also seen the emergence of web-based technologies, like WebAssembly and cloud developer platforms. These have further strengthened and democratized the cloud native toolset (the web being the ultimate democratic internet platform).

Where Is Cloud Native Now, in 2025?

Needless to say, AI has revolutionized the cloud native ecosystem over the past couple of years — just as it has done to every other technology sector. After the emergence of generative AI in 2022, almost all cloud native products have now integrated AI.

The CNCF’s flagship conference, KubeCon, is just around the corner. Take a look at this sample of scheduled sessions to see the impact AI has had on cloud native:

  • Production-Ready LLMs on Kubernetes: Patterns, Pitfalls, and Performance
  • Orchestrating AI Models in Kubernetes: Deploying Ollama as a Native Container Runtime
  • Optimizing Training Performance for Large Language Model (LLM) in Kubernetes
  • A Practical Guide To Benchmarking AI and GPU Workloads in Kubernetes
  • AI Pipelines With OPEA: Best Practices for Cloud Native ML Operations

AI is mostly used in the cloud native world to “underpin large-scale AI infrastructure,” as CNCF CTO Chris Aniszczyk put it in his 2024 year in review post. But it’s also being used to automate many cloud native tools — see this recent CNCF blog post about “how AI-driven approaches are transforming log management tools into ‘intelligent assistants.’”

Given that cloud native technologies both power AI and are increasingly powered by AI, it’s safe to say that the cloud native ecosystem has adapted well to the AI era we’re all now living through.


