mistralai_tracer_v0
ChatCompletionResponse

Bases: `ChatCompletionResponse`

Wrapper of `mistralai.models.chat_completion.ChatCompletionResponse` with `ImpactsOutput`.
ChatCompletionStreamResponse

Bases: `ChatCompletionStreamResponse`

Wrapper of `mistralai.models.chat_completion.ChatCompletionStreamResponse` with `ImpactsOutput`.
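The pattern behind both wrapper classes is a subclass that keeps every field of the provider's response model and adds an impacts attribute. A minimal sketch using plain dataclasses instead of the real `mistralai` and EcoLogits models (all class and field names below are illustrative stand-ins, not the actual EcoLogits schema):

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class ChatCompletionResponse:
    # stand-in for mistralai.models.chat_completion.ChatCompletionResponse
    id: str
    content: str


@dataclass
class ImpactsOutput:
    # stand-in for EcoLogits' impacts container (fields are illustrative)
    energy_kwh: float
    gwp_kgco2eq: float


@dataclass
class WrappedChatCompletionResponse(ChatCompletionResponse):
    # subclass keeps all provider fields and adds the computed impacts
    impacts: Optional[ImpactsOutput] = None


resp = WrappedChatCompletionResponse(
    id="cmpl-1",
    content="Hello",
    impacts=ImpactsOutput(energy_kwh=0.001, gwp_kgco2eq=0.0005),
)
```

Because the wrapper subclasses the provider type, existing code that reads `resp.content` keeps working unchanged.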
MistralAIInstrumentor()

Instrumentor initialized by EcoLogits to automatically wrap all MistralAI calls.

Source code in `ecologits/tracers/mistralai_tracer_v0.py`
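An instrumentor of this kind replaces the client's chat methods with wrappers that share the `(wrapped, instance, args, kwargs)` signature documented below. A self-contained sketch of that patching pattern with a fake client (the classes and the `"impacts"` key here are illustrative, not EcoLogits' real implementation, which wraps the actual `mistralai` client):

```python
import functools


class FakeChatClient:
    """Stand-in for MistralClient (illustrative)."""

    def chat(self, prompt):
        return {"content": f"echo: {prompt}"}


def chat_wrapper(wrapped, instance, args, kwargs):
    # same (wrapped, instance, args, kwargs) convention as the wrappers below
    response = wrapped(*args, **kwargs)
    response["impacts"] = "computed impacts would go here"
    return response


class FakeInstrumentor:
    """Sketch of the instrumentor pattern: swap the method for a wrapper."""

    def instrument(self):
        original = FakeChatClient.chat

        @functools.wraps(original)
        def patched(self, *args, **kwargs):
            # bind the original method to the instance, then delegate
            return chat_wrapper(original.__get__(self), self, args, kwargs)

        FakeChatClient.chat = patched


FakeInstrumentor().instrument()
out = FakeChatClient().chat("hi")
```

After `instrument()` runs, every `chat()` call transparently goes through the wrapper; callers never change their code.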
mistralai_chat_wrapper(wrapped, instance, args, kwargs)

Function that wraps a MistralAI answer with computed impacts.

Parameters:

Name | Type | Description | Default
---|---|---|---
`wrapped` | `Callable` | Callable that returns the LLM response | *required*
`instance` | `MistralClient` | Never used; kept for signature compatibility | *required*
`args` | `Any` | Arguments of the callable | *required*
`kwargs` | `Any` | Keyword arguments of the callable | *required*

Returns:

Type | Description
---|---
`ChatCompletionResponse` | A wrapped `ChatCompletionResponse` with impacts

Source code in `ecologits/tracers/mistralai_tracer_v0.py`
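The sync wrapper's job is simple: call through to the original function, then attach impacts to the response before returning it. A runnable sketch, where `estimate_impacts` is a placeholder for EcoLogits' real impact model (the dict fields and the toy linear formula are assumptions for illustration only):

```python
def chat_wrapper_sketch(wrapped, instance, args, kwargs):
    """Illustrative sync wrapper: call through, then attach impacts."""
    response = wrapped(*args, **kwargs)
    response["impacts"] = estimate_impacts(response["output_tokens"])
    return response


def estimate_impacts(output_tokens: int) -> dict:
    # toy linear model: NOT EcoLogits' methodology, just a placeholder
    return {"energy_kwh": output_tokens * 1e-6}


def fake_llm_call(prompt: str) -> dict:
    # stand-in for the real MistralAI chat call
    return {"content": prompt.upper(), "output_tokens": len(prompt.split())}


result = chat_wrapper_sketch(fake_llm_call, None, ("hello world",), {})
```

The `instance` argument is accepted but never read, which matches the "never used" note in the table above.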
mistralai_chat_wrapper_stream(wrapped, instance, args, kwargs)

Function that wraps a MistralAI answer with computed impacts in streaming mode.

Parameters:

Name | Type | Description | Default
---|---|---|---
`wrapped` | `Callable` | Callable that returns the LLM response | *required*
`instance` | `MistralClient` | Never used; kept for signature compatibility | *required*
`args` | `Any` | Arguments of the callable | *required*
`kwargs` | `Any` | Keyword arguments of the callable | *required*

Returns:

Type | Description
---|---
`Iterable[ChatCompletionStreamResponse]` | An iterable of wrapped `ChatCompletionStreamResponse` with impacts

Source code in `ecologits/tracers/mistralai_tracer_v0.py`
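In streaming mode the wrapper cannot wait for the full response, so it re-yields each chunk as it arrives while keeping a running count. A sketch of that generator pattern (the chunk fields and per-token estimate are illustrative assumptions, not EcoLogits' accounting):

```python
def stream_wrapper_sketch(wrapped, instance, args, kwargs):
    """Illustrative streaming wrapper: re-yield chunks with a running
    impacts estimate computed from the tokens seen so far."""
    token_count = 0
    for chunk in wrapped(*args, **kwargs):
        token_count += chunk.get("tokens", 0)
        # attach an updated estimate to every chunk (toy formula)
        chunk["impacts"] = {"energy_kwh": token_count * 1e-6}
        yield chunk


def fake_stream(n: int):
    # stand-in for the real streaming chat call
    for i in range(n):
        yield {"index": i, "tokens": 1}


chunks = list(stream_wrapper_sketch(fake_stream, None, (3,), {}))
```

Because the wrapper is itself a generator, the caller's `for chunk in ...` loop works exactly as it did against the unwrapped stream.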
async mistralai_async_chat_wrapper(wrapped, instance, args, kwargs)

Function that wraps a MistralAI answer with computed impacts in async mode.

Parameters:

Name | Type | Description | Default
---|---|---|---
`wrapped` | `Callable` | Async callable that returns the LLM response | *required*
`instance` | `MistralAsyncClient` | Never used; kept for signature compatibility | *required*
`args` | `Any` | Arguments of the callable | *required*
`kwargs` | `Any` | Keyword arguments of the callable | *required*

Returns:

Type | Description
---|---
`ChatCompletionResponse` | A wrapped `ChatCompletionResponse` with impacts

Source code in `ecologits/tracers/mistralai_tracer_v0.py`
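The async variant follows the same call-through-then-attach shape as the sync wrapper, except the wrapped callable must be awaited. A sketch under the same illustrative assumptions (fake client, toy impacts dict):

```python
import asyncio


async def async_chat_wrapper_sketch(wrapped, instance, args, kwargs):
    """Illustrative async wrapper: await the call, then attach impacts."""
    response = await wrapped(*args, **kwargs)
    response["impacts"] = {"energy_kwh": response["output_tokens"] * 1e-6}
    return response


async def fake_async_llm(prompt: str) -> dict:
    await asyncio.sleep(0)  # simulate network I/O
    return {"content": prompt[::-1], "output_tokens": 1}


result = asyncio.run(
    async_chat_wrapper_sketch(fake_async_llm, None, ("abc",), {})
)
```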
async mistralai_async_chat_wrapper_stream(wrapped, instance, args, kwargs)

Function that wraps a MistralAI answer with computed impacts in streaming and async mode.

Parameters:

Name | Type | Description | Default
---|---|---|---
`wrapped` | `Callable` | Callable that returns the LLM response | *required*
`instance` | `MistralAsyncClient` | Never used; kept for signature compatibility | *required*
`args` | `Any` | Arguments of the callable | *required*
`kwargs` | `Any` | Keyword arguments of the callable | *required*

Returns:

Type | Description
---|---
`AsyncGenerator[ChatCompletionStreamResponse, None]` | An async generator of wrapped `ChatCompletionStreamResponse` with impacts

Source code in `ecologits/tracers/mistralai_tracer_v0.py`
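The async streaming variant combines the two previous patterns: it is an async generator that consumes the wrapped async stream with `async for` and re-yields each chunk with impacts attached. A sketch (chunk shape and running count are illustrative assumptions):

```python
import asyncio
from typing import Any, AsyncGenerator, Callable


async def async_stream_wrapper_sketch(
    wrapped: Callable, instance: Any, args: tuple, kwargs: dict
) -> AsyncGenerator[dict, None]:
    """Illustrative async streaming wrapper: re-yield each chunk with a
    running estimate (a sketch, not EcoLogits' real impact model)."""
    tokens = 0
    async for chunk in wrapped(*args, **kwargs):
        tokens += 1
        chunk["impacts"] = {"tokens_so_far": tokens}
        yield chunk


async def fake_async_stream(n: int):
    # stand-in for the real async streaming chat call
    for i in range(n):
        await asyncio.sleep(0)
        yield {"index": i}


async def collect():
    return [
        c
        async for c in async_stream_wrapper_sketch(
            fake_async_stream, None, (2,), {}
        )
    ]


chunks = asyncio.run(collect())
```

Callers iterate it with `async for`, just as they would the unwrapped MistralAI stream.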