DevOps for AI: running LLMs in production with Kubernetes and KubeFlow
Aarno Aukia - 6 months ago
Explore the essentials of deploying and managing large language models (LLMs) in production environments using Kubernetes and KubeFlow. As AI and LLMs transition from experimental phases to business-critical applications, this session provides best practices, architectural design insights, and hands-on demonstrations to streamline AI workflows, ensure scalability, and maintain reliability. Ideal for developers and DevOps professionals, this talk will enhance your AI deployment strategies and operational efficiency in real-world business scenarios.