# mistralai_tracer_v1

## ChatCompletionResponse

Bases: `ChatCompletionResponse`

Wrapper of `mistralai.models.ChatCompletionResponse` with `ImpactsOutput`.
## CompletionChunk

Bases: `CompletionChunk`

Wrapper of `mistralai.models.CompletionChunk` with `ImpactsOutput`.
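The wrapper classes above extend the provider's response models with an extra impacts field. A minimal sketch of that pattern, using stand-in dataclasses rather than the actual EcoLogits or MistralAI types:

```python
# Illustrative sketch only (not the EcoLogits implementation): a response
# model is subclassed to carry the computed impacts alongside the original
# response data. All class names here are stand-ins.
from dataclasses import dataclass
from typing import Optional


@dataclass
class ChatCompletionResponse:
    """Stand-in for mistralai.models.ChatCompletionResponse."""
    id: str
    content: str


@dataclass
class ImpactsOutput:
    """Stand-in for EcoLogits' impacts container."""
    energy_kwh: float
    gwp_kgco2eq: float


@dataclass
class WrappedChatCompletionResponse(ChatCompletionResponse):
    # Extra field added by the wrapper; the original fields are untouched,
    # so the wrapped object is still usable wherever the base type is.
    impacts: Optional[ImpactsOutput] = None
```

Because the wrapper subclasses the original model, existing code that consumes the base response keeps working unchanged.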
## MistralAIInstrumentor()

Instrumentor initialized by EcoLogits to automatically wrap all MistralAI calls.

Source code in `ecologits/tracers/mistralai_tracer_v1.py`
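To make the instrumentor's role concrete, here is a minimal, self-contained sketch of the idea (the names and the impact figure are illustrative, not the actual EcoLogits internals): the instrumentor replaces the client's chat method with a wrapper that computes impacts and attaches them to the response.

```python
# Sketch only: a toy "instrumentor" that monkeypatches a dummy client so
# every response carries an impacts estimate. EcoLogits does this for the
# real Mistral client; the client class and formula here are hypothetical.
from types import SimpleNamespace


class DummyMistralClient:
    """Stand-in for the real Mistral client."""

    def chat(self, prompt: str) -> SimpleNamespace:
        return SimpleNamespace(content=f"echo: {prompt}", impacts=None)


class MistralAIInstrumentor:
    """Patches DummyMistralClient.chat so responses carry impacts."""

    def instrument(self) -> None:
        original = DummyMistralClient.chat

        def wrapper(self, prompt):
            response = original(self, prompt)
            # Hypothetical impact estimate based on output length.
            response.impacts = {"energy_kwh": 1e-4 * len(response.content)}
            return response

        DummyMistralClient.chat = wrapper
```

After `instrument()` runs, every call to `chat()` transparently returns a response with an `impacts` attribute filled in.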
## mistralai_chat_wrapper(wrapped, instance, args, kwargs)

Function that wraps a MistralAI answer with computed impacts.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `wrapped` | `Callable` | Callable that returns the LLM response | required |
| `instance` | `Mistral` | Never used - present for compatibility | required |
| `args` | `Any` | Arguments of the callable | required |
| `kwargs` | `Any` | Keyword arguments of the callable | required |

Returns:

| Type | Description |
|---|---|
| `ChatCompletionResponse` | A wrapped response with computed impacts |

Source code in `ecologits/tracers/mistralai_tracer_v1.py`
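The `(wrapped, instance, args, kwargs)` signature is the convention used by function-wrapping libraries such as `wrapt`: `wrapped` is the original callable, `instance` the bound object, and the arguments are forwarded untouched. A self-contained sketch of such a wrapper (the impact figure and attached fields are hypothetical, not EcoLogits' actual computation):

```python
# Sketch of a wrapper with the signature documented above: call the real
# API via `wrapped`, then attach a hypothetical impacts estimate to the
# response before returning it.
import time
from types import SimpleNamespace
from typing import Any, Callable


def mistralai_chat_wrapper(
    wrapped: Callable[..., Any], instance: Any, args: Any, kwargs: Any
) -> Any:
    start = time.perf_counter()
    response = wrapped(*args, **kwargs)  # call the original API method
    latency = time.perf_counter() - start
    # Hypothetical impacts estimate attached to the response object.
    response.impacts = {"latency_s": latency, "energy_kwh": 1e-4}
    return response
```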
## mistralai_chat_wrapper_stream(wrapped, instance, args, kwargs)

Function that wraps a MistralAI answer with computed impacts in streaming mode.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `wrapped` | `Callable` | Callable that returns the LLM response | required |
| `instance` | `Mistral` | Never used - present for compatibility | required |
| `args` | `Any` | Arguments of the callable | required |
| `kwargs` | `Any` | Keyword arguments of the callable | required |

Returns:

| Type | Description |
|---|---|
| `Iterable[CompletionEvent]` | A wrapped stream of completion events with computed impacts |

Source code in `ecologits/tracers/mistralai_tracer_v1.py`
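In streaming mode the wrapper cannot return a single response; instead it re-yields the chunks from the wrapped call. A sketch of that shape, with a hypothetical running estimate attached to each chunk:

```python
# Sketch of the streaming variant: a generator that iterates over the
# wrapped call's chunks and attaches a (hypothetical) running impacts
# estimate to each one before yielding it.
from types import SimpleNamespace
from typing import Any, Callable, Iterable


def mistralai_chat_wrapper_stream(
    wrapped: Callable[..., Iterable[Any]], instance: Any, args: Any, kwargs: Any
) -> Iterable[Any]:
    token_count = 0
    for chunk in wrapped(*args, **kwargs):
        token_count += 1
        # Hypothetical estimate that grows as the stream progresses.
        chunk.impacts = {"tokens_so_far": token_count}
        yield chunk
```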
## async mistralai_async_chat_wrapper(wrapped, instance, args, kwargs)

Function that wraps a MistralAI answer with computed impacts in async mode.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `wrapped` | `Callable` | Async callable that returns the LLM response | required |
| `instance` | `Mistral` | Never used - present for compatibility | required |
| `args` | `Any` | Arguments of the callable | required |
| `kwargs` | `Any` | Keyword arguments of the callable | required |

Returns:

| Type | Description |
|---|---|
| `ChatCompletionResponse` | A wrapped response with computed impacts |

Source code in `ecologits/tracers/mistralai_tracer_v1.py`
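The async variant follows the same logic, except the wrapped callable is a coroutine function and must be awaited. A sketch, again with a hypothetical impacts figure:

```python
# Sketch of the async variant: identical to the sync wrapper except that
# the original call is awaited. The impacts value is illustrative only.
import asyncio
from types import SimpleNamespace
from typing import Any, Callable


async def mistralai_async_chat_wrapper(
    wrapped: Callable[..., Any], instance: Any, args: Any, kwargs: Any
) -> Any:
    response = await wrapped(*args, **kwargs)  # await the original API call
    response.impacts = {"energy_kwh": 1e-4}    # hypothetical estimate
    return response
```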
## async mistralai_async_chat_wrapper_stream(wrapped, instance, args, kwargs)

Function that wraps a MistralAI answer with computed impacts in streaming and async mode.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `wrapped` | `Callable` | Callable that returns the LLM response | required |
| `instance` | `Mistral` | Never used - present for compatibility | required |
| `args` | `Any` | Arguments of the callable | required |
| `kwargs` | `Any` | Keyword arguments of the callable | required |

Returns:

| Type | Description |
|---|---|
| `AsyncGenerator[CompletionEvent, None]` | A wrapped async stream of completion events with computed impacts |

Source code in `ecologits/tracers/mistralai_tracer_v1.py`
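The last variant combines the async and streaming cases: the wrapper is itself an async generator that re-yields each chunk from the wrapped call. A sketch with a hypothetical per-chunk estimate:

```python
# Sketch of the async streaming variant: iterate the wrapped call's chunks
# with `async for` and re-yield each one with a (hypothetical) running
# impacts estimate attached.
import asyncio
from types import SimpleNamespace
from typing import Any, AsyncGenerator, Callable


async def mistralai_async_chat_wrapper_stream(
    wrapped: Callable[..., Any], instance: Any, args: Any, kwargs: Any
) -> AsyncGenerator[Any, None]:
    token_count = 0
    async for chunk in wrapped(*args, **kwargs):
        token_count += 1
        chunk.impacts = {"tokens_so_far": token_count}
        yield chunk
```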