Do you thrive on technical leadership and building cutting-edge AI systems?
Are you ready to drive innovation at the intersection of AI and edge computing?
Join the Akamai Inference Cloud Team!
The Akamai Inference Cloud team is part of Akamai's Cloud Technology Group. We build AI platforms for efficient, compliant, and high-performing applications. These platforms support customers in running inference models and empower developers to create advanced AI solutions effectively. #AIC
Partner with the best
As a Senior II Software Engineer Lead, you will lead the design and implementation of vital AI platform components. Responsibilities include building and optimizing systems for OpenAI-compatible endpoints and managing inference workloads across regions. You will mentor engineers, shape technical decisions, and ensure scalable, high-quality solutions. Expertise in AI/ML systems and exceptional problem-solving abilities are essential.
Responsibilities:
- Leading the design and implementation of critical platform components for Akamai Inference Cloud, ensuring performance, scalability, and reliability.
- Driving technical decisions for your domain, selecting appropriate tools, frameworks, and approaches for AI inference workloads.
- Mentoring and guiding engineers on the team through code reviews, design discussions, and technical problem-solving.
- Implementing and optimizing containerized AI workloads with hardware-specific optimizations and integrating inference frameworks at scale.
- Collaborating across teams to define technical requirements, contribute to platform standards, and ensure operational excellence.

Requirements: Python, AI inference, LLM, Cloud, Kubernetes, Docker, DevOps, CI/CD, IaaS, GPU, hardware acceleration

Additionally: Sport subscription, Private healthcare, International projects, Free coffee, Gym, Bike parking, In-house trainings, Modern office, No dress code.