
LocalAI is the open-source AI engine. Run any model - LLMs, vision, voice, image, video - on any hardware. No GPU required.

LocalAI is an open-source AI engine that runs any model on any hardware without requiring a GPU. It provides drop-in compatibility with OpenAI and Anthropic APIs while supporting 36+ backends and working across NVIDIA, AMD, Intel, Apple Silicon, and CPU-only systems.
LocalAI is designed for developers, enterprises, and organizations seeking to run AI models privately and efficiently on their own infrastructure. It's ideal for teams prioritizing data sovereignty, cost control, and the ability to work with any hardware configuration.
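Because LocalAI exposes an OpenAI-compatible API, existing OpenAI clients can point at it by swapping the base URL. A minimal sketch using only the Python standard library; the `localhost:8080` default port and the model name are assumptions here, so substitute whatever model you have loaded:

```python
# Minimal sketch of calling LocalAI's OpenAI-compatible
# /v1/chat/completions endpoint with the Python standard library.
# The base URL (LocalAI's default port) and the model name are
# assumptions -- adjust them to your own deployment.
import json
import urllib.request

BASE_URL = "http://localhost:8080/v1"  # assumed default LocalAI address

def build_chat_request(model: str, prompt: str) -> dict:
    """Build the JSON payload the chat completions endpoint expects."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def chat(model: str, prompt: str) -> str:
    """POST a chat request and return the assistant's reply text."""
    payload = json.dumps(build_chat_request(model, prompt)).encode()
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # OpenAI-style responses carry the reply under choices[0].message
    return body["choices"][0]["message"]["content"]
```

The same drop-in compatibility means official OpenAI SDKs work as well: point their `base_url` at the LocalAI server instead of rewriting any application code.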

Get up and running with Kimi-K2.5, GLM-5, MiniMax, DeepSeek, gpt-oss, Qwen, Gemma and other models.

Web UI for training and running open models like Gemma 4, Qwen3.5, DeepSeek, and gpt-oss locally.

A list of Free Software network services and web applications that can be hosted on your own servers.

Fair-code workflow automation platform with native AI capabilities. Combine visual building with custom code, self-host or cloud, 400+ integrations.

User-friendly AI Interface (Supports Ollama, OpenAI API, ...)

Virtual whiteboard for sketching hand-drawn-style diagrams.

The Postgres development platform. Supabase gives you a dedicated Postgres database to build your web, mobile, and AI applications.

High performance self-hosted photo and video management solution.

The world’s fastest framework for building websites.

A fancy self-hosted monitoring tool.

Open Source Continuous File Synchronization