
openai_tracer

ChatCompletion

Bases: ChatCompletion

Wrapper of `openai.types.chat.ChatCompletion` with `ImpactsOutput`.

ChatCompletionChunk

Bases: ChatCompletionChunk

Wrapper of `openai.types.chat.ChatCompletionChunk` with `ImpactsOutput`.

OpenAIInstrumentor()

Instrumentor initialized by EcoLogits to automatically wrap all OpenAI calls.

Source code in `ecologits/tracers/openai_tracer.py`:

```python
def __init__(self) -> None:
    self.wrapped_methods = [
        {
            "module": "openai.resources.chat.completions",
            "name": "Completions.create",
            "wrapper": openai_chat_wrapper,
        },
        {
            "module": "openai.resources.chat.completions",
            "name": "AsyncCompletions.create",
            "wrapper": openai_async_chat_wrapper,
        },
    ]
```
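Each registry entry follows the `wrapt` convention, where a wrapper receives `(wrapped, instance, args, kwargs)`. As a rough, self-contained sketch of how such a registry can be applied, here is a simplified stand-in for `wrapt.wrap_function_wrapper` (the `wrap_method` helper and the demo names are hypothetical, for illustration only):

```python
import importlib
from typing import Any, Callable


def wrap_method(module_name: str, qualified_name: str, wrapper: Callable) -> None:
    """Patch `Class.method` so every call goes through a wrapt-style wrapper.

    Simplified stand-in for `wrapt.wrap_function_wrapper`; illustration only.
    """
    module = importlib.import_module(module_name)
    class_name, method_name = qualified_name.split(".")
    cls = getattr(module, class_name)
    original = getattr(cls, method_name)

    def patched(self: Any, *args: Any, **kwargs: Any) -> Any:
        # wrapt wrappers receive (wrapped, instance, args, kwargs),
        # where `wrapped` is the original method bound to the instance.
        bound = original.__get__(self, cls)
        return wrapper(bound, self, args, kwargs)

    setattr(cls, method_name, patched)
```

An instrumentor would then loop over `wrapped_methods` and call `wrap_method(m["module"], m["name"], m["wrapper"])` for each entry.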

openai_chat_wrapper(wrapped, instance, args, kwargs)

Function that wraps an OpenAI answer with computed impacts

Parameters:

| Name | Type | Description | Default |
|------|------|-------------|---------|
| `wrapped` | `Callable` | Callable that returns the LLM response | required |
| `instance` | `Completions` | Never used - for compatibility with `wrapt` | required |
| `args` | `Any` | Arguments of the callable | required |
| `kwargs` | `Any` | Keyword arguments of the callable | required |

Returns:

| Type | Description |
|------|-------------|
| `Union[ChatCompletion, Stream[ChatCompletionChunk]]` | A wrapped `ChatCompletion` or `Stream[ChatCompletionChunk]` with impacts |

Source code in `ecologits/tracers/openai_tracer.py`:

```python
def openai_chat_wrapper(
    wrapped: Callable,
    instance: Completions,
    args: Any,
    kwargs: Any
) -> Union[ChatCompletion, Stream[ChatCompletionChunk]]:
    """
    Function that wraps an OpenAI answer with computed impacts

    Args:
        wrapped: Callable that returns the LLM response
        instance: Never used - for compatibility with `wrapt`
        args: Arguments of the callable
        kwargs: Keyword arguments of the callable

    Returns:
        A wrapped `ChatCompletion` or `Stream[ChatCompletionChunk]` with impacts
    """
    if kwargs.get("stream", False):
        return openai_chat_wrapper_stream(wrapped, instance, args, kwargs)
    else:
        return openai_chat_wrapper_non_stream(wrapped, instance, args, kwargs)
```
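Because `kwargs.get("stream", False)` defaults to `False`, a plain `create(...)` call without a `stream` argument takes the non-streaming path. A minimal sketch of this dispatch, using hypothetical stand-in handlers in place of the real `openai_chat_wrapper_stream` and `openai_chat_wrapper_non_stream`:

```python
from typing import Any, Callable


def stream_handler(wrapped: Callable, instance: Any, args: Any, kwargs: Any) -> str:
    # Stand-in for the streaming path.
    return f"stream:{wrapped(*args, **kwargs)}"


def non_stream_handler(wrapped: Callable, instance: Any, args: Any, kwargs: Any) -> str:
    # Stand-in for the non-streaming path.
    return f"plain:{wrapped(*args, **kwargs)}"


def chat_wrapper(wrapped: Callable, instance: Any, args: Any, kwargs: Any) -> str:
    # Same dispatch shape as openai_chat_wrapper: route on the `stream` kwarg.
    if kwargs.get("stream", False):
        return stream_handler(wrapped, instance, args, kwargs)
    return non_stream_handler(wrapped, instance, args, kwargs)
```

For example, `chat_wrapper(lambda **kw: "r", None, (), {"stream": True})` routes to the streaming handler, while an empty `kwargs` dict routes to the non-streaming one.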

openai_async_chat_wrapper(wrapped, instance, args, kwargs) async

Function that wraps an OpenAI answer with computed impacts in async mode

Parameters:

| Name | Type | Description | Default |
|------|------|-------------|---------|
| `wrapped` | `Callable` | Async callable that returns the LLM response | required |
| `instance` | `AsyncCompletions` | Never used - for compatibility with `wrapt` | required |
| `args` | `Any` | Arguments of the callable | required |
| `kwargs` | `Any` | Keyword arguments of the callable | required |

Returns:

| Type | Description |
|------|-------------|
| `Union[ChatCompletion, AsyncStream[ChatCompletionChunk]]` | A wrapped `ChatCompletion` or `AsyncStream[ChatCompletionChunk]` with impacts |

Source code in `ecologits/tracers/openai_tracer.py`:

```python
async def openai_async_chat_wrapper(
    wrapped: Callable,
    instance: AsyncCompletions,
    args: Any,
    kwargs: Any,
) -> Union[ChatCompletion, AsyncStream[ChatCompletionChunk]]:
    """
    Function that wraps an OpenAI answer with computed impacts in async mode

    Args:
        wrapped: Async callable that returns the LLM response
        instance: Never used - for compatibility with `wrapt`
        args: Arguments of the callable
        kwargs: Keyword arguments of the callable

    Returns:
        A wrapped `ChatCompletion` or `AsyncStream[ChatCompletionChunk]` with impacts
    """
    if kwargs.get("stream", False):
        return openai_async_chat_wrapper_stream(wrapped, instance, args, kwargs)
    else:
        return await openai_async_chat_wrapper_base(wrapped, instance, args, kwargs)
```
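Note the asymmetry in the async version: only the non-streaming branch is awaited, which suggests the streaming handler returns an async iterator directly rather than a coroutine. A self-contained sketch of that pattern, again with hypothetical stand-in handlers rather than the actual EcoLogits ones:

```python
import asyncio
from typing import Any, AsyncIterator, Callable


def stream_handler(
    wrapped: Callable, instance: Any, args: Any, kwargs: Any
) -> AsyncIterator[str]:
    # A regular function that builds and returns an async generator:
    # there is nothing to await here, so the stream branch needs no `await`.
    async def gen() -> AsyncIterator[str]:
        for chunk in ("a", "b"):
            yield chunk

    return gen()


async def non_stream_handler(
    wrapped: Callable, instance: Any, args: Any, kwargs: Any
) -> Any:
    # The non-streaming path must await the async `create` call itself.
    return await wrapped(*args, **kwargs)


async def async_chat_wrapper(
    wrapped: Callable, instance: Any, args: Any, kwargs: Any
) -> Any:
    # Same dispatch shape as openai_async_chat_wrapper.
    if kwargs.get("stream", False):
        return stream_handler(wrapped, instance, args, kwargs)
    return await non_stream_handler(wrapped, instance, args, kwargs)
```

Awaiting the outer wrapper then yields either the final response object (non-streaming) or an async iterator that can be consumed with `async for` (streaming).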