
marvin.engine.language_models.openai

OpenAIChatLLM

format_messages

Format Marvin message objects into a prompt compatible with the LLM.
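
A minimal sketch of what this formatting typically looks like; the `Message` class and its fields below are illustrative stand-ins for Marvin's message objects, not the library's actual types.

```python
from dataclasses import dataclass


@dataclass
class Message:
    """Illustrative stand-in for a Marvin message object."""
    role: str      # e.g. "system", "user", or "assistant"
    content: str


def format_messages(messages: list[Message]) -> list[dict]:
    """Convert message objects into the role/content dicts the OpenAI chat API expects."""
    return [{"role": m.role, "content": m.content} for m in messages]


formatted = format_messages([
    Message(role="system", content="You are a helpful assistant."),
    Message(role="user", content="Hello!"),
])
# formatted == [
#     {"role": "system", "content": "You are a helpful assistant."},
#     {"role": "user", "content": "Hello!"},
# ]
```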

run async

Calls an OpenAI LLM with a list of messages and returns the response.
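As a sketch of the underlying call, the snippet below uses the current openai-python async client (`AsyncOpenAI`); the model name and defaults are assumptions, and Marvin's actual `run` method layers its own configuration (model settings, API keys, etc.) on top.

```python
from openai import AsyncOpenAI

client = AsyncOpenAI()  # reads OPENAI_API_KEY from the environment


async def run(messages: list[dict], model: str = "gpt-4o-mini") -> str:
    """Send formatted messages to the model and return the assistant's reply text."""
    response = await client.chat.completions.create(model=model, messages=messages)
    return response.choices[0].message.content
```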

OpenAIStreamHandler

handle_streaming_response async

Accumulate chunk deltas into a full response. Returns the full message. Passes partial messages to the callback, if provided.
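
A sketch of that accumulation loop, again using the openai-python async client with `stream=True`; the function signature and callback shape here are assumptions for illustration, not Marvin's exact API.

```python
from typing import Callable, Optional

from openai import AsyncOpenAI

client = AsyncOpenAI()


async def handle_streaming_response(
    messages: list[dict],
    model: str = "gpt-4o-mini",
    callback: Optional[Callable[[str], None]] = None,
) -> str:
    """Accumulate streamed chunk deltas into the full message, emitting partials via `callback`."""
    stream = await client.chat.completions.create(
        model=model, messages=messages, stream=True
    )
    buffer = ""
    async for chunk in stream:
        delta = chunk.choices[0].delta.content or ""
        buffer += delta
        if callback is not None:
            callback(buffer)  # pass the partial message accumulated so far
    return buffer
```

Returning the buffer at the end mirrors the documented behavior: partial messages flow through the callback as chunks arrive, while the caller still receives the complete message once the stream finishes.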