# Mistral AI

!!! warning "Deprecation of Mistral AI v0"

    Versions of the Mistral AI Python client below 1.0.0 will no longer be supported by EcoLogits. See the official migration guide from v0 to v1.

!!! warning "Lack of transparency"

    Some models released by Mistral AI are not open-weights, and there is no public information on the inference infrastructure. As a result, the environmental impacts of these models are estimated with lower precision.
This guide focuses on the integration of EcoLogits with the official Mistral AI Python client.

Official links:

- Repository: [mistralai/client-python](https://github.com/mistralai/client-python)
- Documentation: [docs.mistral.ai](https://docs.mistral.ai)
## Installation

To install EcoLogits along with all necessary dependencies for compatibility with the Mistral AI client, use the `mistralai` extra-dependency option as follows:
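```shell
pip install ecologits[mistralai]
```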
This installation command ensures that EcoLogits is set up with the specific libraries required to interface seamlessly with Mistral AI's Python client.
## Chat Completions

### Example

Integrating EcoLogits with your applications does not alter the standard outputs from the API responses. Instead, it enriches them by adding the `Impacts` object, which contains detailed environmental impact data.
```python
from ecologits import EcoLogits
from mistralai import Mistral

# Initialize EcoLogits
EcoLogits.init()

client = Mistral(api_key="<MISTRAL_API_KEY>")

response = client.chat.complete(  # (1)!
    messages=[
        {"role": "user", "content": "Tell me a funny joke!"}
    ],
    model="mistral-tiny"
)

# Get estimated environmental impacts of the inference
print(response.impacts)
```

1. Use `client.chat` for Mistral AI v0.
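Beyond printing the whole object, you can read individual impact criteria. Below is a minimal sketch assuming the `energy` and `gwp` attributes (each exposing a `value` and a `unit`) from the EcoLogits impacts data model; refer to the EcoLogits documentation for the full list of criteria.

```python
# Minimal sketch: attribute names (energy, gwp) and their value/unit fields
# are assumed from the EcoLogits impacts data model. Note that a value can
# be a point estimate or a min-max range.
print(f"Energy consumption: {response.impacts.energy.value} {response.impacts.energy.unit}")
print(f"Global warming potential: {response.impacts.gwp.value} {response.impacts.gwp.unit}")
```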
The asynchronous client works the same way with `complete_async`:

```python
import asyncio
from ecologits import EcoLogits
from mistralai import Mistral

# Initialize EcoLogits
EcoLogits.init()

client = Mistral(api_key="<MISTRAL_API_KEY>")

async def main() -> None:
    response = await client.chat.complete_async(  # (1)!
        messages=[
            {"role": "user", "content": "Tell me a funny joke!"}
        ],
        model="mistral-tiny"
    )
    # Get estimated environmental impacts of the inference
    print(response.impacts)

asyncio.run(main())
```

1. Use `client.chat` for Mistral AI v0.
### Streaming example
In streaming mode, the impacts are calculated incrementally, which means you don't need to sum the impacts from each data chunk. Instead, the impact information in the last chunk reflects the total cumulative environmental impacts for the entire request.
```python
from ecologits import EcoLogits
from mistralai import Mistral

# Initialize EcoLogits
EcoLogits.init()

client = Mistral(api_key="<MISTRAL_API_KEY>")

stream = client.chat.stream(  # (1)!
    messages=[
        {"role": "user", "content": "Tell me a funny joke!"}
    ],
    model="mistral-tiny"
)

for chunk in stream:
    # Get cumulative estimated environmental impacts of the inference
    print(chunk.data.impacts)  # (2)!
```

1. Use `client.chat_stream` for Mistral AI v0.
2. Use `chunk.impacts` for Mistral AI v0.
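Because each chunk carries the running total, a common pattern is to keep only the last chunk and read the totals once the stream is exhausted. A minimal sketch, assuming a freshly created v1 stream as above:

```python
last_chunk = None
for chunk in stream:
    last_chunk = chunk  # impacts are cumulative, so the last chunk holds the totals

if last_chunk is not None:
    # Total estimated environmental impacts for the entire request
    print(last_chunk.data.impacts)
```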
The asynchronous streaming client follows the same pattern with `stream_async`:

```python
import asyncio
from ecologits import EcoLogits
from mistralai import Mistral

# Initialize EcoLogits
EcoLogits.init()

client = Mistral(api_key="<MISTRAL_API_KEY>")

async def main() -> None:
    stream = await client.chat.stream_async(  # (1)!
        messages=[
            {"role": "user", "content": "Tell me a funny joke!"}
        ],
        model="mistral-tiny"
    )
    async for chunk in stream:
        # Get cumulative estimated environmental impacts of the inference
        if hasattr(chunk.data, "impacts"):
            print(chunk.data.impacts)  # (2)!

asyncio.run(main())
```

1. Use `client.chat_stream` for Mistral AI v0.
2. Use `chunk.impacts` for Mistral AI v0.