Structured outputs powered by LLMs.

Designed for simplicity, transparency, and control.

Available in Python, TypeScript, Ruby, PHP, and Elixir, with support for OpenAI, Anthropic, LiteLLM, Cohere, and more.

pip install -U instructor

import instructor
from pydantic import BaseModel
from openai import OpenAI

# Define your desired output structure
class UserInfo(BaseModel):
    name: str
    age: int

# Patch the OpenAI client
client = instructor.from_openai(OpenAI())

# Extract structured data from natural language
user_info = client.chat.completions.create(
    model="gpt-3.5-turbo",
    response_model=UserInfo,
    messages=[{"role": "user", "content": "John Doe is 30 years old."}],
)

print(user_info.name)
#> John Doe
print(user_info.age)
#> 30
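
The same pattern works with other providers. A minimal sketch for Anthropic, assuming the anthropic package is installed and ANTHROPIC_API_KEY is set (the model name is illustrative):

import instructor
from pydantic import BaseModel
from anthropic import Anthropic

# Define your desired output structure
class UserInfo(BaseModel):
    name: str
    age: int

# Patch the Anthropic client
client = instructor.from_anthropic(Anthropic())

# Extract structured data from natural language
user_info = client.messages.create(
    model="claude-3-5-sonnet-20240620",  # illustrative model name
    max_tokens=1024,
    response_model=UserInfo,
    messages=[{"role": "user", "content": "John Doe is 30 years old."}],
)

print(user_info.name)
#> John Doe
print(user_info.age)
#> 30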


Trusted by engineers from

⛓️ Langflow

Why use Instructor

1. Powered by type hints

Instructor is powered by Pydantic, which is powered by type hints. Schema validation and prompting are controlled by type annotations: there is less to learn, less code to write, and it integrates with your IDE.

from pydantic import BaseModel

class UserInfo(BaseModel):
    name: str
    age: int
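
Because the schema is plain type hints, malformed data fails fast. A minimal sketch of the validation Pydantic performs (standard Pydantic behavior, using the UserInfo model above):

from pydantic import ValidationError

try:
    UserInfo(name="John Doe", age="not a number")
except ValidationError as e:
    print(e)  # reports that age is not a valid integer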

2. Customizable

Pydantic is highly customizable. You can define your own validators, custom error messages, and more.

from typing import Annotated
from pydantic import BaseModel, BeforeValidator
from instructor import llm_validator

class QuestionAnswer(BaseModel):
    question: str
    answer: Annotated[
        str,
        BeforeValidator(
            # client is an Instructor-patched OpenAI client, as shown above
            llm_validator("don't say objectionable things", client=client)
        ),
    ]
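
Plain Pydantic validators work as well. A minimal sketch of a custom field validator with its own error message (illustrative, using the Pydantic v2 API):

from pydantic import BaseModel, field_validator

class UserInfo(BaseModel):
    name: str
    age: int

    @field_validator("age")
    @classmethod
    def check_age(cls, v: int) -> int:
        # A failed check raises a ValidationError with this message
        if not 0 <= v <= 150:
            raise ValueError("age must be between 0 and 150")
        return v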

3. Ecosystem

Pydantic is the most widely used data validation library for Python with over 100M downloads a month. It's used by FastAPI, Typer, and many other popular libraries.


How to use Instructor

Pydantic over Raw Schema

Say goodbye to complex abstractions and hello to simplicity that scales. Define your schema once with Pydantic and let Instructor handle the rest.


from typing import List, Literal
from pydantic import BaseModel, Field


class Property(BaseModel):
    name: str = Field(description="name of property in snake case")
    value: str

class Character(BaseModel):
    """
    Any character in a fictional story
    """
    name: str
    age: int
    properties: List[Property]
    role: Literal['protagonist', 'antagonist', 'supporting']

class AllCharacters(BaseModel):
    characters: List[Character] = Field(description="A list of all characters in the story")
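
The JSON Schema itself never has to be written by hand; it can be generated from the Pydantic model. A minimal sketch (Pydantic v2), approximating what Instructor sends to the LLM as the tool/function definition:

import json

print(json.dumps(AllCharacters.model_json_schema(), indent=2))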


Easy to try and install

Get all the functionality of the base client with some smart enhancements out of the box using just two parameters: response_model and max_retries.

import instructor
from openai import OpenAI
from pydantic import BaseModel

# Patch the OpenAI client with Instructor
client = instructor.from_openai(OpenAI())

class UserDetail(BaseModel):
    name: str
    age: int

# Function to extract user details
def extract_user() -> UserDetail:
    user = client.chat.completions.create(
        model="gpt-4-turbo-preview",
        response_model=UserDetail,
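        # max_retries pairs with response_model (see the text above); the value is illustrative
        max_retries=3,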
        messages=[
            {"role": "user", "content": "Extract Jason is 25 years old"},
        ]
    )
    return user
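
Calling the function returns a validated UserDetail instance:

user = extract_user()
print(user.name)
#> Jason
print(user.age)
#> 25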


Key features

Response Models

Specify Pydantic models to define the structure of your LLM outputs

Flexible Backends

Seamlessly integrate with various LLM providers beyond OpenAI

Retry Management

Easily configure the number of retry attempts for your requests
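
A minimal sketch of how retry management and validation fit together: when a response fails Pydantic validation, Instructor feeds the error back to the model and re-asks, up to max_retries times (values and model name are illustrative):

import instructor
from openai import OpenAI
from pydantic import BaseModel, field_validator

client = instructor.from_openai(OpenAI())

class UserDetail(BaseModel):
    name: str
    age: int

    @field_validator("name")
    @classmethod
    def name_is_uppercase(cls, v: str) -> str:
        # A failed check becomes a ValidationError that is sent back to the
        # LLM as feedback before the next attempt.
        if v != v.upper():
            raise ValueError("name must be in uppercase, e.g. 'JASON'")
        return v

user = client.chat.completions.create(
    model="gpt-4-turbo-preview",
    response_model=UserDetail,
    max_retries=3,  # maximum number of re-ask attempts
    messages=[{"role": "user", "content": "Extract Jason is 25 years old"}],
)

print(user.name)
#> JASON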