Get started
This tutorial will show you how to instrument a simple RAG application that consists of a retrieval step to fetch data and an LLM call to OpenAI to answer the user's question based on that data.

1. Install Dependencies
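The code samples in this tutorial assume the langsmith and openai Python packages are installed; a minimal install command might look like:

```shell
pip install -U langsmith openai
```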
2. Create an API key
To create an API key, head to the LangSmith settings page, then click + API Key.

3. Set up environment variables
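A typical setup exports a tracing flag and the API keys as environment variables. This is a sketch: the variable names below are the ones the LangSmith and OpenAI SDKs commonly read, and the placeholder values must be replaced with your own keys.

```shell
# Enable tracing and provide credentials (replace the placeholders).
export LANGSMITH_TRACING=true
export LANGSMITH_API_KEY="<your-langsmith-api-key>"
export OPENAI_API_KEY="<your-openai-api-key>"
```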
This example uses OpenAI, but you can adapt it to use any LLM provider. If you're using Anthropic, use the Anthropic wrapper to trace your calls. For other providers, use the traceable wrapper.

4. Define your application
We will instrument a simple RAG application for this tutorial, but feel free to use your own code if you'd like - just make sure it has an LLM call!

Application Code
5. Trace OpenAI calls
The first thing you might want to trace is all of your OpenAI calls. LangSmith makes this easy with the wrap_openai (Python) or wrapOpenAI (TypeScript) wrappers. All you have to do is modify your code to use the wrapped client instead of using the OpenAI client directly.

6. Trace entire application
You can also use the traceable decorator (Python or TypeScript) to trace your entire application instead of just the LLM calls.
