DevOps for AI: running LLMs in production with Kubernetes and Kubeflow
Aarno Aukia - 2 months ago
Explore the essentials of deploying and managing large language models (LLMs) in production using Kubernetes and Kubeflow. As AI and LLMs move from experimental phases to business-critical applications, this session offers best practices, architectural design insights, and hands-on demonstrations to streamline AI workflows, ensure scalability, and maintain reliability. Aimed at developers and DevOps professionals, this talk will sharpen your AI deployment strategies and operational efficiency in real-world business scenarios.
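As a rough illustration of the kind of deployment the session discusses, here is a minimal sketch of serving an LLM inference container on Kubernetes. The container image, model identifier, port, and resource figures are placeholder assumptions for illustration, not details taken from the talk:

```yaml
# Hypothetical sketch: a minimal Deployment plus Service for an LLM
# inference server. Image, model id, and resources are placeholders.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: llm-server
spec:
  replicas: 1
  selector:
    matchLabels:
      app: llm-server
  template:
    metadata:
      labels:
        app: llm-server
    spec:
      containers:
        - name: server
          image: example/llm-inference:latest      # placeholder image
          args: ["--model", "example-org/example-model"]  # placeholder model id
          ports:
            - containerPort: 8000
          resources:
            limits:
              nvidia.com/gpu: 1   # schedule onto a node with one free GPU
---
apiVersion: v1
kind: Service
metadata:
  name: llm-server
spec:
  selector:
    app: llm-server
  ports:
    - port: 80
      targetPort: 8000
```

In practice the talk's themes of scalability and reliability would layer on top of a base like this, for example via a HorizontalPodAutoscaler, readiness probes, and Kubeflow pipelines for the surrounding model lifecycle.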