5.9. Jupyter AI example with Mistral

%load_ext jupyter_ai_magics
%config AiMagics.default_language_model = "mistralai:mistral-large-latest"

%env MISTRAL_API_KEY=xxxxxxxxxxxxxx
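
Hardcoding the key with %env is convenient for a demo, but it stays in the saved notebook. A minimal alternative sketch, assuming an interactive kernel and that the mistralai provider reads MISTRAL_API_KEY from the environment:

import os
from getpass import getpass

# Ask for the key only when it is not already set, so it never
# appears in the notebook source
if "MISTRAL_API_KEY" not in os.environ:
    os.environ["MISTRAL_API_KEY"] = getpass("Mistral API key: ")
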
%%ai -f math
Generate the 2D heat equation in LaTeX surrounded by `$$`

\(\displaystyle \frac{\partial u}{\partial t} = \alpha \left( \frac{\partial^2 u}{\partial x^2} + \frac{\partial^2 u}{\partial y^2} \right) \)
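
The model set with %config above is only a default; passing a provider:model ID right after %%ai overrides it for that single cell. A minimal sketch (mistralai:mistral-small-latest is just an illustration, taken from the %ai list output further down):

%%ai mistralai:mistral-small-latest -f math
Generate the 1D wave equation in LaTeX surrounded by `$$`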

%%ai -f code
Load the "titanic.csv" file. Do a univariate analysis.
Write code to show the relevant plots.
Use a single figure to make the plots using subplots.

AI generated code inserted below ⬇️

import pandas as pd
import matplotlib.pyplot as plt
import seaborn as sns

# Load the dataset
titanic = pd.read_csv('titanic.csv')

# Select relevant columns for univariate analysis
columns = ['Age', 'Fare', 'Pclass', 'SibSp', 'Parch']

# Create a figure with subplots
fig, axes = plt.subplots(nrows=3, ncols=2, figsize=(15, 10))
axes = axes.flatten()

# Plot each column
for i, col in enumerate(columns):
    sns.histplot(titanic[col], kde=True, ax=axes[i])
    axes[i].set_title(f'Distribution of {col}')

# Hide the unused sixth subplot and adjust layout
axes[-1].set_visible(False)
plt.tight_layout()
plt.show()

[Figure: one figure of subplots showing the distribution (histogram with KDE) of Age, Fare, Pclass, SibSp, and Parch]

5.9.1. The END

%ai list

| Provider | Environment variable | Set? | Models |
| --- | --- | --- | --- |
| ai21 | AI21_API_KEY |  | ai21:j1-large, ai21:j1-grande, ai21:j1-jumbo, ai21:j1-grande-instruct, ai21:j2-large, ai21:j2-grande, ai21:j2-jumbo, ai21:j2-grande-instruct, ai21:j2-jumbo-instruct |
| gpt4all | Not applicable. | N/A | gpt4all:ggml-gpt4all-j-v1.2-jazzy, gpt4all:ggml-gpt4all-j-v1.3-groovy, gpt4all:ggml-gpt4all-l13b-snoozy, gpt4all:mistral-7b-openorca.Q4_0, gpt4all:mistral-7b-instruct-v0.1.Q4_0, gpt4all:gpt4all-falcon-q4_0, gpt4all:wizardlm-13b-v1.2.Q4_0, gpt4all:nous-hermes-llama2-13b.Q4_0, gpt4all:gpt4all-13b-snoozy-q4_0, gpt4all:mpt-7b-chat-merges-q4_0, gpt4all:orca-mini-3b-gguf2-q4_0, gpt4all:starcoder-q4_0, gpt4all:rift-coder-v0-7b-q4_0, gpt4all:em_german_mistral_v01.Q4_0 |
| huggingface_hub | HUGGINGFACEHUB_API_TOKEN |  | See https://huggingface.co/models for a list of models. Pass a model's repository ID as the model ID; for example, huggingface_hub:ExampleOwner/example-model. |
| mistralai | MISTRAL_API_KEY | ✅ | mistralai:open-mistral-7b, mistralai:open-mixtral-8x7b, mistralai:open-mixtral-8x22b, mistralai:mistral-small-latest, mistralai:mistral-medium-latest, mistralai:mistral-large-latest, mistralai:codestral-latest |
| ollama | Not applicable. | N/A | See https://www.ollama.com/library for a list of models. Pass a model's name; for example, deepseek-coder-v2. |
| qianfan | QIANFAN_AK, QIANFAN_SK |  | qianfan:ERNIE-Bot, qianfan:ERNIE-Bot-4 |
| togetherai | TOGETHER_API_KEY |  | togetherai:Austism/chronos-hermes-13b, togetherai:DiscoResearch/DiscoLM-mixtral-8x7b-v2, togetherai:EleutherAI/llemma_7b, togetherai:Gryphe/MythoMax-L2-13b, togetherai:Meta-Llama/Llama-Guard-7b, togetherai:Nexusflow/NexusRaven-V2-13B, togetherai:NousResearch/Nous-Capybara-7B-V1p9, togetherai:NousResearch/Nous-Hermes-2-Yi-34B, togetherai:NousResearch/Nous-Hermes-Llama2-13b, togetherai:NousResearch/Nous-Hermes-Llama2-70b |

Aliases and custom commands:

| Name | Target |
| --- | --- |
| gpt2 | huggingface_hub:gpt2 |
| gpt3 | openai:davinci-002 |
| chatgpt | openai-chat:gpt-3.5-turbo |
| gpt4 | openai-chat:gpt-4 |
| ernie-bot | qianfan:ERNIE-Bot |
| ernie-bot-4 | qianfan:ERNIE-Bot-4 |
| titan | bedrock:amazon.titan-tg1-large |
| openrouter-claude | openrouter:anthropic/claude-3.5-sonnet:beta |
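
The aliases above can be used in place of a full provider:model ID, and new ones can be added with %ai register. A hedged sketch that registers a shorthand for the model used in this example (the alias name mistral-large is arbitrary, and this assumes the installed version of jupyter_ai_magics supports the register subcommand):

%ai register mistral-large mistralai:mistral-large-latest

%%ai mistral-large -f markdown
Summarize the univariate analysis above in two sentences.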