
Power AI Everywhere with Intel & Lablup

Lablup partners with Intel to bring Intel® Gaudi® AI Accelerators to Backend.AI, an AI infrastructure operating platform built for scale.

Learn more about Backend.AI
About Intel

Intel and AI acceleration

Intel has been at the center of High Performance Computing (HPC) for decades. Gaudi AI Accelerators handle deep learning training and inference, while Xeon Scalable Processors power data centers worldwide.

The oneAPI ecosystem gives developers a single programming model across different hardware, and Intel Tiber AI Cloud provides on-demand access to Gaudi instances.

Strategic Partnership

Backend.AI and Intel Gaudi

Backend.AI’s Sokovan™ orchestrator distributes workloads across Intel Gaudi’s compute engines to reduce bottlenecks. It schedules model training and inference down to memory allocation and parallelism, so Gaudi operates closer to its rated throughput.

The full stack for AI

Intel

Gaudi AI Accelerators

Purpose-built accelerators for deep learning training and inference at scale

Xeon Scalable Processors

Enterprise-grade CPUs powering data centers and AI workloads worldwide

oneAPI Ecosystem

Open, unified programming model for heterogeneous computing

Intel Tiber AI Cloud

Cloud platform providing on-demand access to Intel AI hardware

Lablup

Sokovan™ Orchestrator

Workload distribution across Gaudi compute engines

Container-level Virtualization (beta)

Patented technology for fine-grained accelerator resource management

Multi-tenant AI Platform

Secure, isolated environments for teams sharing AI infrastructure

End-to-end MLOps

Complete lifecycle management from experimentation to production

Intel’s AI hardware and Lablup’s platform software, combined in a single stack

What our leaders say

Lablup unlocks the full potential of Intel Gaudi AI Accelerators with Backend.AI, an AI infrastructure operating platform. Backend.AI's orchestrator, Sokovan, intelligently distributes workloads across Intel Gaudi's compute engines, significantly reducing bottlenecks. Every model training and inference task is precisely orchestrated, from memory usage to parallelism, maximizing customers' AI performance. Backend.AI doesn't just run on Gaudi; it unleashes Gaudi.

Jeongkyu Shin, CEO of Lablup Inc.

The partnership between Intel and Lablup represents a perfect fusion of hardware excellence and software intelligence. Together, we’re redefining what’s possible for GenAI and LLMs, delivering not just performance but unprecedented efficiency at scale. Whether in the cloud or your data center, this collaboration brings the AI industry exactly what it needs right now: choice without compromise.

Arijit Bandyopadhyay, CTO of Enterprise Analytics & AI, Head of Strategy - Enterprise & Cloud (CSV Group) at Intel Corporation
News & Blog

Our journey with Intel

Click each item to read the press release or blog post.

Resource Center

Learn more about Backend.AI & Intel

Your AI infrastructure, built on Intel Gaudi.

Talk to our team about deploying Backend.AI on Gaudi 2 and Gaudi 3.

Lablup

© Lablup Inc. All rights reserved.

KR Office: 8F, 577, Seolleung-ro, Gangnam-gu, Seoul, Republic of Korea
US Office: 3003 N First St, Suite 221, San Jose, CA 95134