Instructor

To use Instructor to generate structured outputs, you need to wrap the OpenAI client with both Instructor and Braintrust. It's important to apply Braintrust's wrap_openai first (directly to the OpenAI client, before patching with Instructor), because it relies on the low-level usage info and headers returned by the OpenAI call to log metrics to Braintrust.

trace-instructor.py
```python
import instructor
from braintrust import init_logger, load_prompt, wrap_openai
from openai import OpenAI

logger = init_logger(project="Your project name")


def run_prompt(text: str):
    # Replace with your project name and slug
    prompt = load_prompt("Your project name", "Your prompt name")

    # wrap_openai will make sure the client tracks usage of the prompt.
    client = instructor.patch(wrap_openai(OpenAI()))

    # Render with parameters
    return client.chat.completions.create(
        **prompt.build(input=text), response_model=MyResponseModel
    )
```
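
`MyResponseModel` above stands in for whatever Pydantic model describes your structured output. As a minimal sketch, assuming a hypothetical prompt that extracts a person's name and age from free text, the model and call might look like this:

```python
from pydantic import BaseModel


class MyResponseModel(BaseModel):
    # Hypothetical fields -- shape these to match what your prompt extracts
    name: str
    age: int


# Instructor returns a validated MyResponseModel instance, and the wrapped
# client logs the underlying OpenAI call (usage, latency) to Braintrust.
result = run_prompt("Alice is 29 years old and lives in Toronto.")
print(result.name, result.age)
```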
