Exploring AI’s Growth: KubeCon + CloudNativeCon Europe 2024

Delve into operationalizing AI workloads, observability, platform engineering, security, and optimization.

March 27, 2024


KubeCon Europe 2024 recently concluded in Paris. Arun Gupta, VP and GM of Open Ecosystem at Intel, shares insights into the robustness of the open-source community as a driving force for advances at the intersection of AI and cloud-native technologies.

KubeCon + CloudNativeCon Europe is one of the premier open source developer conferences in the world, and it just wrapped up on Friday in Paris (March 19-22, 2024). Developers, IT professionals, C-level leaders, and open source aficionados came together to share key learnings and new innovations and to discuss the future of cloud native computing. The show highlighted some of the most popular projects in the cloud native ecosystem, including Kubernetes, Istio, and OpenTelemetry.

I have attended the event for the last 8 years, and the last 2 years with our Intel team. This year, I participated in two sessions:

  1. Cloud-Native LLM (large language model) Deployments Made Easy Using LangChain
  2. Savoir Faire: Cloud Native Technical Leadership

It’s an honor to be part of the open source community and to have Intel’s support. I also serve as the elected chair of the Governing Board for both the Cloud Native Computing Foundation (CNCF) and the Open Source Security Foundation (OpenSSF).

After the first day of the show, it was clear that AI was taking center stage. This is a significant shift from a year ago at KubeCon Amsterdam, where AI was mentioned only on the fringes, and a natural progression from KubeCon Chicago about six months ago, where the AI story first began to emerge. In Paris, the community’s best and brightest all explored the impact AI is having on open source, including Jorge Palma of Microsoft, Ricardo Rocha of CERN, and Susan Wu of Google (along with Patrick Ohly and Cathy Zhang of Intel).

There was also star power from the AI world, with leaders such as Timothee Lacroix, co-founder of Mistral AI, Paige Bailey from Google DeepMind, and Jeffrey Morgan, founder of Ollama, all talking about bridging the gap between cloud native and AI. Beyond the keynotes, there were hundreds of sessions, trainings, office hours, and other events showcasing a brilliant fusion of topics: AI, networking, platform engineering, software development, security, hackathons, DEI, and more.


Top 5 Highlights From KubeCon 2024

What stood out? Here are my top 5 highlights from the conference and why I think they matter.

1. Record attendance: The largest KubeCon ever

With more than 12,000 developers attending, this was the largest KubeCon ever. The sessions and exhibition floor were packed with attendees, and there was a lot of energy. This level of participation reinforces the ubiquity of cloud native platforms and shows the business opportunities they create as open source becomes more sustainable.

2. Cloud native advantages in accelerating AI innovation

Cloud native provides several advantages for AI: packaging a model and its dependencies as a container image, developing with Kubernetes on the desktop and then scaling out on large clusters in the cloud or data center, and resource management that ensures CPU and memory are allocated appropriately for models to run properly.
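To make the packaging and resource-management points concrete, here is a minimal, hypothetical Kubernetes manifest for a containerized model server. The image name, labels, and resource figures are illustrative assumptions only, not a recommendation:

```yaml
# Hypothetical Deployment for a model server packaged as a container image.
# The model and its dependencies are baked into the image, so the same
# artifact runs on a desktop cluster or a large cloud/data center cluster.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: llm-inference
spec:
  replicas: 2
  selector:
    matchLabels:
      app: llm-inference
  template:
    metadata:
      labels:
        app: llm-inference
    spec:
      containers:
      - name: server
        image: registry.example.com/llm-server:latest  # illustrative image name
        resources:
          requests:          # the scheduler guarantees at least this much
            cpu: "4"
            memory: 16Gi
          limits:            # the container is throttled or evicted beyond this
            cpu: "8"
            memory: 24Gi
```

Setting explicit requests and limits is what lets the scheduler place the model on a node that can actually run it, which is the resource-management advantage described above.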

However, there are gaps between a data scientist's software lifecycle and how these cloud native principles are applied. The show highlighted a strong intent to bridge this gap, and we should see a lot of activity in that space throughout the rest of this year.

3. AI workload management

The community is realizing that while GPUs can be used for AI training, the supply shortage can be restricting. In response, a trend is emerging where CPUs are used for inferencing and for preprocessing training data before handing it over to GPUs. I had many discussions with attendees about how specific processors can enable better AI performance. In fact, the latest processor technology can deliver higher inference performance and lower latency on LLMs across a range of parameter sizes, offloading work from scarce GPUs.
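The CPU-to-GPU hand-off described above is essentially a producer/consumer pipeline. Below is a minimal, standard-library-only Python sketch of that pattern; the `accelerator_consumer` stage merely stands in for the GPU step, and all names and data here are hypothetical:

```python
import queue
import threading

def preprocess(record):
    """CPU-side preprocessing stage (tokenization, normalization, etc.)."""
    return record.strip().lower().split()

def cpu_preprocessor(records, batch_queue):
    """Runs on the CPU: prepares batches and hands them to the accelerator stage."""
    for record in records:
        batch_queue.put(preprocess(record))
    batch_queue.put(None)  # sentinel: no more batches

def accelerator_consumer(batch_queue, results):
    """Stands in for the GPU training/inference step that consumes ready batches."""
    while True:
        batch = batch_queue.get()
        if batch is None:
            break
        results.append(len(batch))  # placeholder for the real accelerator work

records = ["Hello KubeCon ", "Cloud Native AI"]
batch_queue = queue.Queue(maxsize=8)  # bounded queue applies backpressure to the CPU side
results = []

producer = threading.Thread(target=cpu_preprocessor, args=(records, batch_queue))
consumer = threading.Thread(target=accelerator_consumer, args=(batch_queue, results))
producer.start(); consumer.start()
producer.join(); consumer.join()
print(results)  # token counts per preprocessed record
```

The bounded queue lets the CPU preprocessing stage apply backpressure rather than outrun the accelerator, which is the same reason pipelines split work this way at cluster scale.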

4. UXL foundation: Enabling multi-vendor AI workloads

Running AI workloads on GPUs also requires custom libraries and sophisticated packaging, both of which are often vendor-controlled. Customers want the ability to move their AI workloads on Kubernetes without being locked into a vendor. This is where the UXL Foundation (Unified Acceleration Foundation) shines.

As an evolution of the oneAPI initiative, it aims to build a multi-architecture, multi-vendor software ecosystem for all accelerators. This will allow developers to move AI workloads without getting locked into a vendor. This came up during multiple interviews and hallway discussions at KubeCon this year.

5. Maturing ecosystem: Observability, security, and optimization

Certain themes clearly emerged during my walk around the exhibition floor. These were focused on observability, platform engineering, security, and optimization. This range of topics shows the maturity of the platform. Additionally, with Kubernetes as the de facto compute platform, growth and development in these areas will help operationalize AI workloads.

The event also included a variety of experiences. From a fun run to the KubeCrawl to the Cloud Native Learning Lounge, there was something for everyone. I participated in CloudNativeHacks (a joint collaboration between CNCF and the United Nations to further the delivery of the Sustainable Development Goals) and in Kids Day, where multiple workshops were available to inspire the next generation of developers. And, of course, the “hallway track,” where you bump into people in the hallway, is always the best.

As both an active member of the community and a representative of a vendor in the technology space, I find this event important because an open ecosystem creates a level playing field that allows multiple players to compete. Solving global challenges requires global and diverse participation, and open source inherently enables that.

The CNCF community is very welcoming, diverse and inclusive. I’ve been involved with this community for several years; it has been an exciting and humbling experience to see it grow. I’m quite excited about the future of cloud native technology and all the challenges that this community is going to solve. I’ll be at future KubeCon events – please reach out to say hello.

Finally, congratulations to the CloudNativeHacks winners: Team Urban Unity, Team Forester, and Team Potato. It was amazing to see the kind of impact that a few individuals can make in just over 20 hours of hacking.

What did you find interesting at KubeCon + CloudNativeCon Europe 2024? Let us know on Facebook, X, and LinkedIn. We’d love to hear from you!

Image Source: Shutterstock


Arun Gupta

VP and GM of Open Ecosystem, Intel

Arun Gupta is Vice President and General Manager of Open Ecosystem initiatives at Intel Corporation. He is a strategist, advocate, and practitioner who has spent two decades helping companies such as Apple and Amazon embrace open-source principles. Arun is currently chairperson of the Cloud Native Computing Foundation Governing Board.