Do you thrive on solving complex technical challenges in AI infrastructure?
Are you ready to architect the future of AI at the edge?
Join the Akamai Inference Cloud Team!
The Akamai Inference Cloud team is part of Akamai's Cloud Technology Group. We build AI platforms for efficient, compliant, and high-performing applications. These platforms support customers in running inference models and empower developers to create advanced AI solutions effectively. #AIC
Partner with the best
The Principal Software Engineer will lead the design of a globally distributed AI inference platform with OpenAI-compatible endpoints. Responsibilities include making technical decisions, setting architectural patterns, and managing large-scale inference workloads. This position requires broad experience, extensive knowledge of AI/ML systems, and a track record of delivering production AI platforms.
Responsibilities:
- Architecting and designing the core infrastructure of Akamai Inference Cloud, ensuring it is performant, compliant, economical, and explainable.
- Providing thought leadership on AI inference optimization and model serving architectures, and establishing technical standards for the platform.
- Making critical technical decisions on frameworks, tools, and approaches for AI workload deployment and global traffic orchestration.
- Mentoring engineers and collaborating with cross-functional teams to define technical requirements and elevate team capabilities.
- Contributing hands-on to critical-path development, code reviews, and architectural implementations.

Requirements: Python, machine learning, inference frameworks, LLMs, Kubernetes, cloud-native, AI tools.

Additionally: Sport subscription, private healthcare, international projects, free coffee, gym, bike parking, in-house trainings, modern office, no dress code.