## A practical note on AI frameworks for building AI-native products
### 1) Provider SDKs. Start here for most products
**[OpenAI SDK](https://platform.openai.com/docs/overview)** (Python / TypeScript)
The default choice for production. Direct control, stable APIs, strong support for tool calling, structured outputs, multimodal, and streaming.
**[Anthropic SDK](https://docs.anthropic.com/)** (Python / TypeScript)
Equally production-ready. Often chosen for long-context workflows, reasoning-heavy tasks, or a preference for a specific model's behavior.
Use provider SDKs when you want minimal abstraction and predictable behavior in production.
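To make "minimal abstraction" concrete, here is a small sketch of direct provider-SDK usage. It assumes the `openai` Python package is installed and `OPENAI_API_KEY` is set; the model name is a placeholder. Building the request payload as plain data keeps the exact request visible, loggable, and testable without any framework in between.

```python
def build_request(prompt: str) -> dict:
    """Build the chat-completion payload explicitly, so the exact
    request sent to the provider is easy to inspect and test."""
    return {
        "model": "gpt-4o-mini",  # placeholder: use a model your account supports
        "messages": [
            {"role": "system", "content": "You are a concise assistant."},
            {"role": "user", "content": prompt},
        ],
    }


def ask(prompt: str) -> str:
    # Imported lazily so this module loads even where the SDK
    # is not installed (e.g. in tests of the payload builder).
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.chat.completions.create(**build_request(prompt))
    return response.choices[0].message.content
```

The same shape works with the Anthropic SDK by swapping the client and payload; the point is that the orchestration layer is your own code, not a framework.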
### 2) App-layer frameworks. Use only when orchestration is real
**[Vercel AI SDK](https://sdk.vercel.ai/)** (TypeScript)
Best for web apps. Excellent streaming UX and tight React and Next.js integration. Fastest path from chat UI to production.
**[LangChain](https://www.langchain.com/)** (Python / TypeScript)
Useful when you need complex workflows, tool routing, agent loops, or memory patterns. Powerful but easy to overbuild. Keep it thin.
**[LlamaIndex](https://www.llamaindex.ai/)** (Python / TypeScript)
Best-in-class for RAG. Indexing, retrieval, and document pipelines are the core strength. Pair with a provider SDK for reasoning.
Rule of thumb.
If the product is knowledge-heavy, start with LlamaIndex.
If the product is workflow-heavy, consider LangChain lightly or roll your own.
### 3) Tool interoperability. Where the ecosystem is going
**[MCP – Model Context Protocol](https://modelcontextprotocol.io/)**
A standard way to expose tools and data to models and agents. Useful when you expect multiple agents, clients, or model providers and want clean, vendor-agnostic tool interfaces.
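The value of a vendor-agnostic interface is that one tool definition can be handed to any client. In MCP, a tool is described by a name, a description, and a JSON Schema for its input; the sketch below shows that shape (field names follow the MCP tool listing; the tool itself is hypothetical).

```python
# A hypothetical tool described in the MCP style: name, description,
# and a JSON Schema declaring what input the tool accepts.
search_tickets = {
    "name": "search_tickets",
    "description": "Search support tickets by keyword.",
    "inputSchema": {
        "type": "object",
        "properties": {
            "query": {"type": "string", "description": "Search terms."},
            "limit": {"type": "integer", "default": 10},
        },
        "required": ["query"],
    },
}
```

Because the schema is plain data, the same definition can back an MCP server, an OpenAI tool call, or an Anthropic tool call with only thin adapters.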
### 4) Automation, workflows, and AI ops glue
**[n8n](https://n8n.io/)**
Best for connecting AI to existing SaaS tools. Email, Slack, CRMs, databases. Low code, fast iteration. Ideal for ops, GTM, and internal tooling.
**[Codewords](https://www.codewords.ai/)** (by Agemo)
Best for AI-native workflows where logic, state, and decisions are driven by the model. Feels closer to software than automation. Useful when prompts, tools, and branching behavior evolve over time.
Rule of thumb.
Use n8n when AI is a step in a broader business process.
Use Codewords when the process itself is AI-driven.
### 5) Local development and testing
**[LM Studio](https://lmstudio.ai/)**
Good for running local models and testing prompts without cloud calls. Useful for privacy-sensitive work and fast iteration.
### 6) Raw API access and debugging
**[cURL](https://curl.se/)**
Essential for debugging, reproducing issues, and validating requests. Not a framework, but always part of the stack.
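When debugging, it helps to reproduce the exact request an SDK sends. The stdlib sketch below builds (but does not send) the same POST a `curl` invocation would hit OpenAI's chat-completions endpoint with; the model name and API key are placeholders.

```python
import json
import urllib.request

# The same body a curl -d '...' would carry.
payload = json.dumps({
    "model": "gpt-4o-mini",  # placeholder model name
    "messages": [{"role": "user", "content": "ping"}],
}).encode()

req = urllib.request.Request(
    "https://api.openai.com/v1/chat/completions",
    data=payload,
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer $OPENAI_API_KEY",  # placeholder key
    },
    method="POST",
)
# To actually send it: urllib.request.urlopen(req)
```

Inspecting the request object (or the equivalent `curl -v` output) is often the fastest way to tell whether a bug lives in your code, the SDK, or the provider.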
### How this usually looks in practice
- Core product logic. Provider SDK plus thin custom orchestration.
- RAG-heavy products. LlamaIndex plus provider SDK.
- Web apps with streaming chat. Vercel AI SDK plus provider SDK.
- AI-native internal tools or agents. Codewords plus provider SDK.
- Ops and automation. n8n.
Build with fewer abstractions than you think you need. Add frameworks only when the complexity is already there.