30+ frameworks for building AI applications — explained simply with examples and use cases.
LangChain: The most popular framework for building LLM apps. Chains, agents, memory, retrievers — everything you need.
LlamaIndex: Data framework for building RAG apps. Best-in-class for document-heavy AI applications.
Ollama: Run any open-source LLM locally with one command. Mac, Linux, Windows. REST API built in.
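The built-in REST API can be called with nothing but the Python standard library. This sketch assumes a local Ollama server is running on its default port 11434 and that the example model (`llama3.2` here, purely illustrative) has already been pulled.

```python
# Sketch: call a local Ollama server's /api/generate endpoint.
# Assumes `ollama serve` is running and the model has been pulled.
import json
import urllib.request


def ollama_generate(prompt, model="llama3.2", host="http://localhost:11434"):
    """Send a non-streaming generate request and return the response text."""
    payload = {"model": model, "prompt": prompt, "stream": False}
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]


# Usage (requires a running server):
# print(ollama_generate("Why is the sky blue?"))
```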
AutoGen: Multi-agent conversation framework by Microsoft. Build collaborative AI agent teams.
CrewAI: Coordinate AI agent crews with role-based delegation. Simpler than AutoGen, with a great developer experience.
Transformers: Hugging Face's library — load and run thousands of open models with a few lines of Python.
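"A few lines of Python" is not an exaggeration; the `pipeline` helper is the usual entry point. This sketch assumes `transformers` and a backend such as `torch` are installed, and the first run downloads a small default sentiment model from the Hugging Face Hub.

```python
# Minimal Transformers usage via the pipeline helper.
# First call downloads a small default sentiment-analysis model.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")
result = classifier("Running open models locally is easy.")[0]
print(result["label"], round(result["score"], 3))
```

Swapping the task string (e.g. `"text-generation"`, `"summarization"`) or passing `model="..."` selects any of the thousands of Hub models with the same interface.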
LangGraph: Build stateful, cyclical agent workflows. Sister project to LangChain — graph-based control flow.
DSPy: Stanford's framework for programming (not prompting) language models. Automatically compiles your programs into optimized prompts.
Vercel AI SDK: TypeScript toolkit for building AI-powered web apps. Streaming, tool use, multi-provider support.
Chroma: Open-source embedding database. The simplest way to build RAG apps locally with persistent storage.
Pinecone: Production-grade managed vector database. Auto-scaling, low latency, enterprise security.
Weaviate: Open-source vector DB with hybrid search, multi-tenancy, and built-in embedding modules.
Qdrant: High-performance vector database written in Rust. Self-hosted or managed cloud option.
vLLM: High-throughput LLM inference engine. Its PagedAttention algorithm delivers up to 24x higher serving throughput than standard Hugging Face Transformers.
LM Studio: Desktop app to discover, download, and run local LLMs with a beautiful chat UI.
PyTorch: Industry-standard deep learning framework. The foundation of most modern AI research.
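For a flavor of the core workflow, here is a minimal model definition and a single training step. It assumes `torch` is installed; the layer sizes and data are arbitrary placeholders.

```python
# Sketch: a tiny PyTorch model and one gradient-descent step.
import torch
from torch import nn

model = nn.Sequential(nn.Linear(4, 16), nn.ReLU(), nn.Linear(16, 1))
opt = torch.optim.SGD(model.parameters(), lr=0.1)

x, y = torch.randn(8, 4), torch.randn(8, 1)  # placeholder batch

pred = model(x)                       # forward pass
loss = nn.functional.mse_loss(pred, y)
loss.backward()                       # autograd computes gradients
opt.step()                            # optimizer updates the weights
print(loss.item())
```

Everything larger — transformers, diffusion models, RL — is built from this same forward/backward/step loop.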
TensorFlow: Google's end-to-end ML platform. Production-ready with TF Serving and TF Lite for mobile/edge.
Model Context Protocol (MCP): Anthropic's open standard for connecting AI models to external tools, APIs, and data sources.
More frameworks added every week. Submit yours →