Welcome to LlamaIndex.TS

LlamaIndex.TS is a data framework for building production-ready LLM applications in TypeScript and JavaScript. Use it to integrate large language models with your own data through powerful patterns like RAG (Retrieval-Augmented Generation), agents, and workflows.

Quickstart

Get started in minutes with your first RAG application

Core Concepts

Understand the key concepts behind LlamaIndex.TS

Examples

Explore comprehensive examples for all features

API Reference

Browse the complete API documentation

What Makes LlamaIndex.TS Special?

LlamaIndex.TS is designed to be lightweight, flexible, and production-ready for building LLM applications across any JavaScript runtime.

Multi-Runtime Support

Run your LLM applications anywhere JavaScript runs:
  • Node.js >= 20
  • Deno
  • Bun
  • Nitro
  • Vercel Edge Runtime (with some limitations)
  • Cloudflare Workers (with some limitations)
Browser support is currently limited because browsers lack AsyncLocalStorage-like APIs.

Key Features

RAG Made Easy

Build powerful retrieval-augmented generation systems with simple, composable APIs for indexing, querying, and chat.
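
As a sketch of that indexing-and-querying flow (assuming the `llamaindex` package is installed and an `OPENAI_API_KEY` is set; exact method names may differ slightly between versions, so check the Quickstart for yours):

```typescript
import { Document, VectorStoreIndex } from "llamaindex";

// Wrap raw text in a Document, embed it into an in-memory
// vector index, then ask a question over the indexed data.
const document = new Document({
  text: "LlamaIndex.TS is a data framework for LLM applications.",
});
const index = await VectorStoreIndex.fromDocuments([document]);

const queryEngine = index.asQueryEngine();
const response = await queryEngine.query({
  query: "What is LlamaIndex.TS?",
});
console.log(response.toString());
```

The same index can also back a chat engine for multi-turn conversations instead of one-shot queries.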

Agent Workflows

Create sophisticated agentic systems with the modern @llamaindex/workflow package for multi-step reasoning and tool usage.
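
A minimal agent sketch, assuming the `@llamaindex/workflow` and `@llamaindex/openai` packages and an OpenAI API key; the tool here (`getWeather`) is hypothetical and exists only for illustration:

```typescript
import { agent } from "@llamaindex/workflow";
import { tool } from "llamaindex";
import { openai } from "@llamaindex/openai";
import { z } from "zod";

// A hypothetical tool the agent may choose to call.
const weatherTool = tool({
  name: "getWeather",
  description: "Get the current weather for a city",
  parameters: z.object({ city: z.string() }),
  execute: ({ city }) => `It is sunny in ${city}.`,
});

// The agent loops between the LLM and its tools until it
// can produce a final answer.
const weatherAgent = agent({
  llm: openai({ model: "gpt-4o-mini" }),
  tools: [weatherTool],
});

const result = await weatherAgent.run("What's the weather in Lisbon?");
console.log(result.data);
```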

Modular Architecture

Install only what you need. Provider packages for LLMs, embeddings, and vector stores keep your bundle size minimal.
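
In practice that means installing the core package and then only the provider packages you actually use (package names as published on npm at the time of writing):

```shell
# Core framework
npm install llamaindex

# Add providers à la carte
npm install @llamaindex/openai      # OpenAI LLM + embeddings
npm install @llamaindex/workflow    # agent workflows
```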

Multiple LLM Support

Works with OpenAI, Anthropic, Groq, Gemini, Llama, Mistral, and many more providers out of the box.

Supported LLM Providers

LlamaIndex.TS ships integrations for the major LLM providers, including:
  • OpenAI (GPT-4, GPT-3.5)
  • Anthropic (Claude)
  • Google (Gemini)
  • Groq
  • Meta Llama models (Llama 2, Llama 3, Llama 3.1, via hosted providers)
  • MistralAI
  • Fireworks
  • DeepSeek
  • ReplicateAI
  • TogetherAI
  • HuggingFace
  • DeepInfra

Use Cases

LlamaIndex.TS powers a wide range of LLM applications:
1. Question Answering: Build systems that answer questions over your documents, knowledge bases, or databases using RAG.

2. Conversational Agents: Create chatbots and assistants that can use tools, access your data, and maintain context across conversations.

3. Document Analysis: Extract insights from PDFs, emails, transcripts, and other unstructured data sources.

4. Workflow Automation: Orchestrate multi-step AI workflows with agents that can reason, plan, and execute complex tasks.

Community and Support

Join our growing community of developers building with LlamaIndex.TS:

GitHub

Star the repo, report issues, and contribute

Discord

Join our Discord community for support and discussions

Twitter

Follow @llama_index for updates and announcements

Playground

Try LlamaIndex.TS interactively in your browser

Open Source

LlamaIndex.TS is fully open source and MIT licensed. We welcome contributions from the community!
Check out the contributing guide to get started.

Next Steps

Installation

Set up LlamaIndex.TS for your runtime environment

Quickstart

Build your first RAG application in 5 minutes