# Installation and Setup
We support Node.js versions 18, 20, and 22, with experimental support for Deno, Bun, and Vercel Edge Functions.
## Installation from NPM

Install with your package manager of choice:

```shell
# npm
npm install llamaindex

# Yarn
yarn add llamaindex

# pnpm
pnpm add llamaindex
```
## Environment variables
Our examples use OpenAI by default. You can use other LLMs via their APIs; if you would prefer to use local models, check out our local LLM example.
To use OpenAI, you'll need to get an OpenAI API key and then make it available as an environment variable:

```shell
export OPENAI_API_KEY="sk-......" # Replace with your key
```

To have it loaded automatically every time, add it to your `.zshrc`/`.bashrc`.
WARNING: do not check your OpenAI key into version control. GitHub automatically invalidates OpenAI keys that are accidentally committed.
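Before running any examples, you can sanity-check that the key is visible to your Node.js process. This is a minimal sketch using only standard Node.js APIs; it does not call OpenAI or validate the key itself:

```typescript
// Check whether OPENAI_API_KEY is set in the current environment.
// This only confirms the variable exists; it does not verify the key works.
const hasKey = Boolean(process.env.OPENAI_API_KEY);
console.log(hasKey ? "OPENAI_API_KEY is set" : "OPENAI_API_KEY is missing");
```

If the variable is missing, double-check which shell startup file (`.zshrc` or `.bashrc`) your terminal actually sources.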
## What next?
- The easiest way to get started is to build a full-stack chat app with create-llama.
- Try our other getting started tutorials.
- Learn more about the high-level concepts behind how LlamaIndex works.
- Check out our many examples of LlamaIndex.TS in action.