6. Time series analysis with AI#

Marc Buffat, Mechanical Engineering Department, UCB Lyon 1

time series

import tensorflow as tf
%matplotlib inline
import numpy as np
import matplotlib.pyplot as plt
# title font
plt.rc('font', family='serif', size='18')
from IPython.display import display,Markdown
# AI
import sklearn as sk
import tensorflow as tf
_uid_ = 12345
def serie_temp(N,a0=1.0,a1=0.5,a2 = 0.4, a3=0.1):
    # data / days
    np.random.seed(_uid_)
    # time series
    Ts = np.arange(N, dtype=int)
    ys = np.array([ a0*np.sin(2*np.pi*x/180) + a1*np.cos(2*np.pi*x/15) \
         + a2*x/360  for x in range(N)]) \
           + a3*np.random.normal(size=N,scale=0.2)
    return Ts,ys

6.1. Objectives#

We study a time-dependent system \(Y(t)\) and want to forecast its evolution, i.e. predict its future values from its past ones.

A time series \(Y(t)\) is commonly decomposed into trend, seasonality and noise:

\[Y(t) =T(t)+S(t)+\epsilon(t)\]
  • trend \(T(t)\) = long-term evolution

  • seasonality \(S(t)\) = periodic phenomenon

  • noise \(\epsilon(t)\) = random component

6.1.1. Methods#

classical methods (linear time-series modelling):

  • exponential smoothing,

  • regression models (linear regression, non-parametric models… ),

  • SARIMA models

AI-based methods:

  • random forests,

  • LSTM recurrent neural networks
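Among the classical methods, simple exponential smoothing is easy to sketch directly (illustrative code, not part of the original notebook; `alpha` is the smoothing factor):

```python
import numpy as np

def exp_smooth(y, alpha=0.3):
    """Simple exponential smoothing: s[t] = alpha*y[t] + (1-alpha)*s[t-1]."""
    s = np.empty(len(y))
    s[0] = y[0]
    for t in range(1, len(y)):
        s[t] = alpha*y[t] + (1 - alpha)*s[t-1]
    return s

s = exp_smooth(np.array([1.0, 2.0, 2.0, 3.0]))
# s[-1] serves as the one-step-ahead forecast of the next value
```

Each smoothed value is a geometrically weighted average of all past observations, with recent values weighted most.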

6.2. Data generation#

  • Time series \(Y = Y(t)\)

  • N measurements at a regular interval \(\Delta t\)

    • data array ys

      \[ys[i] = Y(i\Delta t)\]
    • time array ts (for the analysis)

      \[ts[i] = i\Delta t\]

Test cases:

  1. simple periodic series

  2. bi-periodic series (modulation)

  3. with a long-term trend

  4. with noise

# build the time series
# simplest periodic case
Ts,ys = serie_temp(1000,a0=0,a1=0.5,a2=0.0,a3 = 0.)
# bi-periodic case
#Ts,ys = serie_temp(1000,a0=1.0,a1=0.5,a2=0.0,a3=0.0)
# + trend
#Ts,ys = serie_temp(1000,a0=1.0,a1=0.5,a2=0.2,a3=0.0)
# + noise
Ts,ys = serie_temp(1000,a0=1.0,a1=0.5,a2=0.2,a3=0.3)
plt.figure(figsize=(12,8))
plt.subplot(1,2,1)
plt.plot(Ts[:],ys)
plt.xlabel("day")
plt.title("time series");
plt.subplot(1,2,2)
plt.plot(Ts[:100],ys[:100])
plt.xlabel("day");
../../_images/bf9b80bd688eda840d0243d4f61002379bb6d6331c1d67c910d6515d340fa59d.png

6.3. Data preparation#

data windowing:

choose a window of the nav previous days to predict nap values (i.e. over nap days)

  • nav size of the history window (before)

  • nap size of the prediction window (after)

  • N number of windows

  • t0 start date of the prediction

def dataset(Ts,ys,nav,nap,N,t0):
    # window of the nav previous days used to predict nap values (i.e. over nap days)
    # nav size of the history window (before)
    # nap size of the prediction window (after)
    # N number of windows
    # t0 start date of the prediction
    # 
    t1 = t0 - N - nav - nap
    print(f"training on {N} windows of {nav}-{nap} days between day {t1} and day {t0}")
    # 
    X  = np.zeros((N,nav))
    y  = np.zeros((N,nap))
    t  = np.zeros(N,dtype=int)
    # build the data set
    for i in range(N):
        X[i,:] = ys[t1+i:t1+i+nav]
        y[i]   = ys[t1+i+nav:t1+i+nav+nap]
        t[i]   = Ts[t1+i+nav]
    return X,y,t
# N windows: 14 days -> 7 days of prediction starting from day t0
nav = 14
nap = 7
#N  = 200
#t0 = 300
N = 400
t0 = 600
X,y,t = dataset(Ts,ys,nav,nap,N,t0)
training on 400 windows of 14-7 days between day 179 and day 600
X.shape, y.shape, t.shape
((400, 14), (400, 7), (400,))
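An equivalent, loop-free way to build such windows uses NumPy's `sliding_window_view`; the sketch below is an alternative construction on a toy series, not the notebook's implementation:

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

ys_toy = np.arange(30, dtype=float)       # toy series
nav, nap = 14, 7
W = sliding_window_view(ys_toy, nav + nap)  # one row per window of nav+nap days
Xw, yw = W[:, :nav], W[:, nav:]             # split into history / prediction parts
```

Each row of `W` is a contiguous slice of the series, so the split reproduces the `X`/`y` pairs built by the loop above.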
def plot_dataset():
    plt.figure(figsize=(14,6))
    plt.subplot(1,2,1)
    plt.plot(t-nav,X[:,0])
    plt.plot(t,y[:,0])
    plt.xlabel("day")
    plt.ylabel("y")
    plt.title("training data")
    plt.subplot(1,2,2)
    plt.plot(np.arange(t[0]-nav,t[0]+nap),ys[t[0]-nav:t[0]+nap],'--')
    plt.plot(np.arange(t[0]-nav,t[0]),X[0,:],'or')
    plt.plot(np.arange(t[0],t[0]+nap),y[0,:],'xg')
    plt.plot(np.arange(t[-1]-nav,t[-1]+nap),ys[t[-1]-nav:t[-1]+nap],'--')
    plt.plot(np.arange(t[-1]-nav,t[-1]),X[-1,:],'or')
    plt.plot(np.arange(t[-1],t[-1]+nap),y[-1,:],'xg')
    plt.xlabel("day")
    plt.title("first/last window");
    return
plot_dataset()
../../_images/d3ab5bfeb0f4bad6b29fbeee0fbae3990913353745ba27072e90fc94e31e28db.png

6.4. Scikit Learn RandomForest#

a “random forest” of decision trees

  • predicts one value at a time

random forest
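Note that `RandomForestRegressor` can also be fitted with several outputs at once (multi-output regression), an alternative to the recursive one-value-at-a-time scheme used in this notebook; a minimal sketch on synthetic data:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
Xd = rng.normal(size=(100, 14))   # 100 windows of 14 past values
Yd = rng.normal(size=(100, 7))    # 7 future values per window
clf = RandomForestRegressor(n_estimators=20, random_state=0).fit(Xd, Yd)
pred = clf.predict(Xd[:1])        # all 7 future values in a single call
```

The multi-output fit avoids feeding predictions back as inputs, at the cost of one independent tree ensemble criterion over all horizons.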

6.5. Neural networks: LSTM / RNN#

LSTM = Long Short-Term Memory

  • recurrent neural network (RNN)

  • activation function: prevents the output from blowing up (tanh)

  • numerical gradient-descent method (\(\alpha\) = learning rate) \( w_{k+1} = w_k - \alpha \nabla_w F \)

  • EPOCH = number of epochs used for training

The number of epochs is a hyperparameter that sets how many times the learning algorithm passes through the whole training data set

  1. Model of a computational neuron

../../_images/neuroneformel-1.png

the output \(y\) is a non-linear function of the inputs (f = activation function)

\[ y = f(\sum_i w_i x_i + b) \]

the coefficients \(w_i, b\) are obtained by minimizing an error \(Err = || y_{pred} - \hat{y} ||\) over a training data set \(\hat{y}\), using minimization (gradient) algorithms
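The formal neuron above fits in a few lines of NumPy (illustrative sketch; names are ours):

```python
import numpy as np

def neuron(x, w, b, f=np.tanh):
    """Formal neuron: y = f(sum_i w_i*x_i + b)."""
    return f(np.dot(w, x) + b)

y_out = neuron(np.array([1.0, -1.0]), np.array([0.5, 0.5]), 0.0)
# the weighted sum is 0 here, so y_out = tanh(0) = 0
```

Training amounts to adjusting `w` and `b` so that `y_out` matches the labeled targets.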

  2. Layered neural network

../../_images/reseau_neuronne.png
  3. Recurrent neural networks (processing of temporal sequences)

../../_images/reseau-RNN.png
\[ y^t = f(\sum_i w_i x^t_i + b + \sum_j r_j y^{t-1}_j) \]
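One step of this recurrence can be sketched in NumPy (toy weights, purely illustrative): the output at time t depends on the current input and, through the recurrent weights, on the previous output.

```python
import numpy as np

def rnn_step(x_t, y_prev, W, R, b, f=np.tanh):
    """One step of a simple recurrent layer: y_t = f(W x_t + R y_{t-1} + b)."""
    return f(W @ x_t + R @ y_prev + b)

W, R, b = np.eye(2), 0.5*np.eye(2), np.zeros(2)
y1 = rnn_step(np.array([1.0, 0.0]), np.zeros(2), W, R, b)
y2 = rnn_step(np.array([0.0, 0.0]), y1, W, R, b)  # non-zero only through the memory of y1
```

Even with a zero input at the second step, `y2` is non-zero: the recurrent term carries the past forward.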

6.5.1. RNN architectures#

images/Architecture-RNN.jpg

6.5.2. The difficulty of training a recurrent network#

a classic simple recurrent network, made of a recurrent layer followed by a dense layer:

../../_images/RNNsimple.png

It contains three weight matrices: W, R and V, where R is the matrix of recurrent weights. Training the network therefore consists in learning these three matrices from a base of labeled examples.

The gradient-based minimization algorithm for neural networks is known as error backpropagation. It propagates the gradient of the error backwards through the successive weight layers of the network, from the last layer to the first.

Unfortunately, in recurrent networks the recurrence cycle (matrix R) prevents this algorithm from being applied directly.

6.5.3. Solution: backpropagation through time#

The solution is to work on the unrolled version of the network, which eliminates the cycles.

We therefore approximate the recurrent network by a network unrolled K times (K = depth = number of internal hidden layers, typically 10 to 100), as shown in the following figure for K = 2:

../../_images/RNNdeplie.png

Warning

  • Since the unrolled network is deeper, the vanishing gradient problem is more severe during training: the network is harder to train because the error signal tends to die out as it gets closer to the lower layers.

It is therefore important to use every available strategy against this phenomenon: Batch Normalization, dropout, L1 and L2 regularization, etc.

  • Because the weights of the recurrent layer are duplicated, recurrent networks are also prone to another phenomenon, called gradient explosion: error gradients of norm greater than 1 are repeatedly multiplied through the unrolled layers.

A simple and effective way to avoid this is to monitor the gradient norm and cap it whenever it becomes too large (known as gradient clipping).
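Clipping by norm can be sketched in a few lines of NumPy (Keras optimizers expose the same idea through their `clipnorm` argument):

```python
import numpy as np

def clip_by_norm(g, max_norm=1.0):
    """Rescale the gradient g whenever its norm exceeds max_norm."""
    n = np.linalg.norm(g)
    return g*(max_norm/n) if n > max_norm else g

g = clip_by_norm(np.array([3.0, 4.0]))   # original norm is 5, capped to 1
```

The direction of the gradient is preserved; only its length is limited.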

6.5.4. The LSTM neuron: Long Short-Term Memory#

To model very long-term dependencies, recurrent neural networks must be given the ability to maintain a state over a long period of time.

This is the purpose of LSTM (Long Short-Term Memory) cells, which have an internal memory called the cell. The cell can maintain a state for as long as necessary; it consists of a numerical value that the network can drive depending on the situation.

../../_images/RNN_LSTM.png

the memory cell is driven by three control gates, which can be seen as valves:

  • the input gate decides whether the input should modify the content of the cell

  • the forget gate decides whether the content of the cell should be reset to 0

  • the output gate decides whether the content of the cell should influence the neuron's output

The mechanism of the three gates is strictly identical. The opening/closing of the valve is modelled by an activation function f, usually a sigmoid, applied to a weighted sum of the inputs, the outputs and the cell, with gate-specific weights.
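The gating mechanism can be sketched with a sigmoid acting as a valve on the cell value (toy example; the gate pre-activations are hard-coded here instead of being computed from weighted sums):

```python
import numpy as np

def sigmoid(z):
    return 1.0/(1.0 + np.exp(-z))

c_prev, candidate = 2.0, 5.0
forget = sigmoid(-10.0)   # valve almost closed: discard the old cell value
inp    = sigmoid(10.0)    # valve almost open: accept the new candidate
c_new  = forget*c_prev + inp*candidate   # cell update, close to 5.0
```

With gate values near 0 or 1, the cell either erases or copies information; intermediate values blend old and new content.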

To compute the output \(y^t\), the network therefore uses the input \(x^t\), the hidden states \(h^{t-1}\) (\(x^{t-1},x^{t-2}\), from unrolling the recurrence), which represent the short-term memory, and the states of the memory cells \(c^{t-1}\), which represent the long-term memory.

Like any other neuron, LSTM neurons are generally used in layers. In that case, the outputs of all the neurons are fed back as inputs to all the neurons.

Given all the connections needed to drive the memory cell, LSTM layers are roughly four times “heavier” than simple recurrent layers (one weight set per gate plus the candidate), which are themselves roughly twice as heavy as classical dense layers.

LSTM layers should therefore be used sparingly!
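The relative cost can be checked by counting parameters for one layer with `n_in` inputs and `n` units, biases included (a sketch based on the standard parameter-count formulas; for a dense-like ratio of 2 we assume `n_in` is comparable to `n`):

```python
n_in, n = 14, 14
dense      = (n_in + 1)*n        # input weights + biases
simple_rnn = (n_in + n + 1)*n    # adds the recurrent matrix R
lstm       = 4*(n_in + n + 1)*n  # four gate/candidate weight sets
```

With `n_in = n`, the simple recurrent layer costs about twice the dense layer, and the LSTM layer exactly four times the simple recurrent one.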

6.6. Implementation#

6.6.1. Training the RandomForest#

  • scikit-learn

from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor
from sklearn.neighbors import KNeighborsRegressor
from sklearn.metrics   import r2_score
# choice of the algorithm
clf = RandomForestRegressor()
#clf = KNeighborsRegressor()
#clf = LinearRegression()
Xlearn = X.copy()
ylearn = y[:,0]
clf.fit(Xlearn,ylearn)
RandomForestRegressor()
print("score = {:2d}%".format(int(100*clf.score(Xlearn, ylearn))))
yp = clf.predict(Xlearn)
print("R2 = {:4.2f}".format(r2_score(ylearn,yp)))
score = 99%
R2 = 1.00
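Note that both scores above are computed on the training windows. A held-out split gives a more honest estimate of generalization; a sketch on synthetic data, using `train_test_split` (which this notebook imports later for the LSTM part):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
Xd = rng.normal(size=(200, 14))
yd = Xd[:, -1] + 0.1*rng.normal(size=200)   # target close to the last feature
Xtr, Xte, ytr, yte = train_test_split(Xd, yd, test_size=0.25, random_state=0)
rf = RandomForestRegressor(n_estimators=50, random_state=0).fit(Xtr, ytr)
r2_test = rf.score(Xte, yte)                # generalization R2
```

The test R2 is typically below the near-perfect training score reported above.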
def plot_pred():
    plt.figure(figsize=(10,6))
    plt.plot(Ts[t2:t2+nap],ypred,'x')
    plt.plot(Ts[t2-nav:t2],Xpred[0],'--o')
    plt.plot(Ts[t2-nav:t2+nap],ys[t2-nav:t2+nap],'--')
    plt.xlabel("jour")
    plt.title(f"prediction over {nap} days starting from day {t2}");
    return
# prediction starting from t2
t2 = t0 
Xpred  = np.array([ys[t2-nav:t2]])
ypred  = np.zeros(nap)
Xp     = Xpred.copy()
ypred[0] = clf.predict(Xp)[0]
for i in range(1,nap):
    Xp[0,:-i] = Xpred[0,i:]
    Xp[0,-i:] = ypred[:i]
    ypred[i] = clf.predict(Xp)[0]
Xpred.shape, ypred.shape
((1, 14), (7,))
plot_pred()
../../_images/79f480d398aa718a733d769ad75cd2cabf127fd10181be7eaea7c6f649cc6515.png

6.6.2. LSTM RNN implementation#

  • TensorFlow Keras RNN library

#Machine learning
from sklearn import preprocessing
import tensorflow as tf
import statsmodels as st
from statsmodels.tsa.seasonal import STL
from sklearn.model_selection  import train_test_split
Xlearn = X.copy()
ylearn = y.copy()
Xlearn = Xlearn.reshape(X.shape[0], nav, 1)
ylearn = ylearn.reshape(y.shape[0], nap, 1)
Xlearn.shape, ylearn.shape
((400, 14, 1), (400, 7, 1))
# number of training epochs (window of size nav)
#EPOQUE = 300
EPOQUE = 200
#EPOQUE = 50
# neural-network model: 4 layers (nav, nav, nap, nap), the first one an LSTM
# with no explicit activation Keras uses activation='linear', i.e. a(x)=x; 'relu' can also be tested
modele_lstm = tf.keras.models.Sequential([
    tf.keras.layers.LSTM(nav),
    tf.keras.layers.Dense(nav,activation='tanh'),
    tf.keras.layers.Dense(nap,activation='tanh'),
    tf.keras.layers.Dense(nap)
])
# model configuration (least-squares minimization: mse loss)
modele_lstm.compile(optimizer='adam', metrics=['mae'], loss='mse')
print(EPOQUE)
200
# run the model training
import time
time_start = time.time()
modele_lstm.fit(Xlearn, ylearn, epochs=EPOQUE, verbose = True)
print('training phase: {:.2f} seconds'.format(time.time()-time_start))
Epoch 1/200
13/13 ━━━━━━━━━━━━━━━━━━━━ 1s 4ms/step - loss: 0.7063 - mae: 0.6963
Epoch 2/200
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.6106 - mae: 0.6560
...
Epoch 118/200
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0193 - mae: 0.1089
Epoch 119/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0225 - mae: 0.1153

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0200 - mae: 0.1110 
Epoch 120/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0170 - mae: 0.1056

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0173 - mae: 0.1046 
Epoch 121/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 1s 93ms/step - loss: 0.0142 - mae: 0.0961

 7/13 ━━━━━━━━━━━━━━━━━━━━ 0s 10ms/step - loss: 0.0178 - mae: 0.1063

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 9ms/step - loss: 0.0180 - mae: 0.1069 
Epoch 122/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0262 - mae: 0.1293

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0188 - mae: 0.1080 
Epoch 123/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0161 - mae: 0.0998

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0173 - mae: 0.1037 
Epoch 124/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 1s 101ms/step - loss: 0.0189 - mae: 0.1058

 7/13 ━━━━━━━━━━━━━━━━━━━━ 0s 10ms/step - loss: 0.0185 - mae: 0.1064 

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 10ms/step - loss: 0.0181 - mae: 0.1059
Epoch 125/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0208 - mae: 0.1205

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0187 - mae: 0.1100 
Epoch 126/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0193 - mae: 0.1061

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0178 - mae: 0.1043 
Epoch 127/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0166 - mae: 0.1022

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0183 - mae: 0.1060 
Epoch 128/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0236 - mae: 0.1201

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0185 - mae: 0.1067 
Epoch 129/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0205 - mae: 0.1159

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0179 - mae: 0.1067 
Epoch 130/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0148 - mae: 0.0970

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0168 - mae: 0.1020 
Epoch 131/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0152 - mae: 0.0983

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0165 - mae: 0.1017 
Epoch 132/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0125 - mae: 0.0883

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0166 - mae: 0.1014 
Epoch 133/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0196 - mae: 0.1135

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0181 - mae: 0.1074 
Epoch 134/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0168 - mae: 0.1024

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0183 - mae: 0.1064 
Epoch 135/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0147 - mae: 0.0936

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0170 - mae: 0.1027 
Epoch 136/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 1s 103ms/step - loss: 0.0143 - mae: 0.0948

 6/13 ━━━━━━━━━━━━━━━━━━━━ 0s 10ms/step - loss: 0.0161 - mae: 0.1008 

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 8ms/step - loss: 0.0169 - mae: 0.1030 
Epoch 137/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0192 - mae: 0.1143

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0172 - mae: 0.1060 
Epoch 138/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0196 - mae: 0.1059

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0179 - mae: 0.1051 
Epoch 139/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0210 - mae: 0.1151

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0178 - mae: 0.1048 
Epoch 140/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0217 - mae: 0.1179

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0192 - mae: 0.1097 
Epoch 141/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0205 - mae: 0.1165

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0177 - mae: 0.1061 
Epoch 142/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0219 - mae: 0.1157

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0180 - mae: 0.1061 
Epoch 143/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0163 - mae: 0.0996

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0177 - mae: 0.1058 
Epoch 144/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0156 - mae: 0.0958

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0170 - mae: 0.1022 
Epoch 145/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0176 - mae: 0.1045

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0165 - mae: 0.1012 
Epoch 146/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0144 - mae: 0.0984

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0172 - mae: 0.1049 
Epoch 147/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0186 - mae: 0.1072

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0185 - mae: 0.1074 
Epoch 148/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0152 - mae: 0.1001

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0158 - mae: 0.0992 
Epoch 149/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 1s 101ms/step - loss: 0.0119 - mae: 0.0863

 6/13 ━━━━━━━━━━━━━━━━━━━━ 0s 10ms/step - loss: 0.0138 - mae: 0.0919 

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 9ms/step - loss: 0.0156 - mae: 0.0983 

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 10ms/step - loss: 0.0157 - mae: 0.0987
Epoch 150/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0115 - mae: 0.0833

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0180 - mae: 0.1041 
Epoch 151/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0161 - mae: 0.1037

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0193 - mae: 0.1090 
Epoch 152/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0184 - mae: 0.1096

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0182 - mae: 0.1082 
Epoch 153/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0163 - mae: 0.0944

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0168 - mae: 0.1012 
Epoch 154/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0200 - mae: 0.1148

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0174 - mae: 0.1052 
Epoch 155/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0173 - mae: 0.1006

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0168 - mae: 0.1024 
Epoch 156/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0231 - mae: 0.1236

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0185 - mae: 0.1076 
Epoch 157/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0159 - mae: 0.1019

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0158 - mae: 0.1003 
Epoch 158/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0202 - mae: 0.1147

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0191 - mae: 0.1094 
Epoch 159/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0193 - mae: 0.1098

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0179 - mae: 0.1052 
Epoch 160/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0173 - mae: 0.1071

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0164 - mae: 0.1028 
Epoch 161/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 1s 104ms/step - loss: 0.0148 - mae: 0.0973

 7/13 ━━━━━━━━━━━━━━━━━━━━ 0s 10ms/step - loss: 0.0159 - mae: 0.0991 

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 8ms/step - loss: 0.0162 - mae: 0.1003 
Epoch 162/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0167 - mae: 0.1003

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0161 - mae: 0.1005 
Epoch 163/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0116 - mae: 0.0844

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0157 - mae: 0.0995 
Epoch 164/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0131 - mae: 0.0914

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0152 - mae: 0.0981 
Epoch 165/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 1s 97ms/step - loss: 0.0167 - mae: 0.1030

 7/13 ━━━━━━━━━━━━━━━━━━━━ 0s 10ms/step - loss: 0.0161 - mae: 0.1010

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 8ms/step - loss: 0.0161 - mae: 0.1008 
Epoch 166/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0153 - mae: 0.0948

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0160 - mae: 0.0992 
Epoch 167/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0146 - mae: 0.0958

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0157 - mae: 0.0990 
Epoch 168/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0134 - mae: 0.0907

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0156 - mae: 0.0977 
Epoch 169/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0163 - mae: 0.1028

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0169 - mae: 0.1024 
Epoch 170/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0126 - mae: 0.0858

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0156 - mae: 0.0976 
Epoch 171/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0169 - mae: 0.1055

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0168 - mae: 0.1032 
Epoch 172/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0159 - mae: 0.1001

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0169 - mae: 0.1020 
Epoch 173/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0168 - mae: 0.0984

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0160 - mae: 0.0992 
Epoch 174/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0172 - mae: 0.1053

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0165 - mae: 0.1025 
Epoch 175/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0119 - mae: 0.0877

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0145 - mae: 0.0954 
Epoch 176/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0138 - mae: 0.0893

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0155 - mae: 0.0985 
Epoch 177/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 1s 97ms/step - loss: 0.0156 - mae: 0.0969

 7/13 ━━━━━━━━━━━━━━━━━━━━ 0s 9ms/step - loss: 0.0169 - mae: 0.1027 

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 9ms/step - loss: 0.0171 - mae: 0.1034
Epoch 178/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0206 - mae: 0.1148

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0163 - mae: 0.1011 
Epoch 179/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0198 - mae: 0.1066

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0167 - mae: 0.1004 
Epoch 180/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0218 - mae: 0.1248

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0170 - mae: 0.1046 
Epoch 181/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0156 - mae: 0.1010

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0158 - mae: 0.0995 
Epoch 182/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0162 - mae: 0.0999

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0165 - mae: 0.1006 
Epoch 183/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0148 - mae: 0.0991

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0157 - mae: 0.0990 
Epoch 184/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0136 - mae: 0.0904

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0159 - mae: 0.0989 
Epoch 185/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0189 - mae: 0.1089

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0173 - mae: 0.1043 
Epoch 186/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0166 - mae: 0.1005

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0177 - mae: 0.1052 
Epoch 187/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0164 - mae: 0.1027

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0160 - mae: 0.0998 
Epoch 188/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0153 - mae: 0.1018

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0157 - mae: 0.0988 
Epoch 189/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0151 - mae: 0.0935

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0158 - mae: 0.0999 
Epoch 190/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0116 - mae: 0.0850

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0155 - mae: 0.0986 
Epoch 191/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 1s 100ms/step - loss: 0.0161 - mae: 0.1000

 7/13 ━━━━━━━━━━━━━━━━━━━━ 0s 9ms/step - loss: 0.0187 - mae: 0.1090  

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 8ms/step - loss: 0.0186 - mae: 0.1081
Epoch 192/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0153 - mae: 0.0973

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0183 - mae: 0.1067 
Epoch 193/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0162 - mae: 0.1028

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0176 - mae: 0.1053 
Epoch 194/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0182 - mae: 0.1044

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0162 - mae: 0.1002 
Epoch 195/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0146 - mae: 0.0940

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0171 - mae: 0.1031 
Epoch 196/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0143 - mae: 0.0977

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0174 - mae: 0.1049 
Epoch 197/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 1s 97ms/step - loss: 0.0142 - mae: 0.0956

 7/13 ━━━━━━━━━━━━━━━━━━━━ 0s 10ms/step - loss: 0.0172 - mae: 0.1050

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 8ms/step - loss: 0.0172 - mae: 0.1045 
Epoch 198/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0190 - mae: 0.1086

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0175 - mae: 0.1051 
Epoch 199/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0162 - mae: 0.1005

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0186 - mae: 0.1083 
Epoch 200/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0139 - mae: 0.0950

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0176 - mae: 0.1046 
training phase: 17.49 seconds
modele_lstm.summary()
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ lstm (LSTM)                     │ (None, 14)             │           896 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 14)             │           210 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense_1 (Dense)                 │ (None, 7)              │           105 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense_2 (Dense)                 │ (None, 7)              │            56 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 3,803 (14.86 KB)
 Trainable params: 1,267 (4.95 KB)
 Non-trainable params: 0 (0.00 B)
 Optimizer params: 2,536 (9.91 KB)
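The parameter counts in the summary can be checked by hand: an LSTM layer with `u` units and input dimension `d` has `4*(u*(d+u) + u)` weights (four gates, each with input weights, recurrent weights and a bias), and a Dense layer with `n_in` inputs and `n_out` outputs has `n_in*n_out + n_out`. A quick check, with the layer sizes read off the summary above:

```python
# Parameter-count check for the layers listed in the summary above.
def lstm_params(units, input_dim):
    # 4 gates; each gate has input weights, recurrent weights and a bias
    return 4 * (units * (input_dim + units) + units)

def dense_params(n_in, n_out):
    # weight matrix plus bias vector
    return n_in * n_out + n_out

p_lstm = lstm_params(14, 1)    # 896
p_d0 = dense_params(14, 14)    # 210
p_d1 = dense_params(14, 7)     # 105
p_d2 = dense_params(7, 7)      # 56
total = p_lstm + p_d0 + p_d1 + p_d2
print(total)  # 1267 trainable parameters, matching the summary
```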
# predictions on the training set and R2 score
ypred = modele_lstm.predict(Xlearn, verbose=True)
print(Xlearn.shape, ypred.shape)
Ylearn = ylearn.reshape(ylearn.shape[0], nap)
print("R2 score {:.2f}".format(r2_score(Ylearn, ypred)))
print("model evaluate loss/mae")
modele_lstm.evaluate(Xlearn, ylearn)
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 9ms/step
(400, 14, 1) (400, 7)
R2 score 0.98
model evaluate loss/mae
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 3ms/step - loss: 0.0157 - mae: 0.0985
[0.015791576355695724, 0.09871227294206619]
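The R² score reported above measures how much of the variance of the targets is explained by the model: it is 1 minus the ratio of residual variance to total variance. A minimal NumPy version of the same idea (the small arrays here are illustrative, not the notebook's data; for multi-output targets `sklearn.metrics.r2_score` averages over outputs):

```python
import numpy as np

def r2(y_true, y_pred):
    # coefficient of determination: 1 - SS_res / SS_tot
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

y_true = np.array([1.0, 2.0, 3.0, 4.0])
y_pred = np.array([1.1, 1.9, 3.2, 3.8])
print(r2(y_true, y_pred))  # 0.98
```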
# prediction starting from t2
t2 = t0
Xpred = np.array([ys[t2-nav:t2]]).reshape(1, nav, 1)
ypred = modele_lstm.predict(Xpred, verbose=True)
print(Xpred.shape, ypred.shape)
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 24ms/step
(1, 14, 1) (1, 7)
Xpred = Xpred.reshape(1,nav,)
ypred = ypred.reshape(nap)
plot_pred()
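The cell above predicts `nap = 7` future values from the `nav = 14` most recent samples in a single call. To forecast further ahead, a common approach is to slide the window: append each block of predictions to the history and predict again. A sketch of this rolling forecast, using a stand-in persistence model where the real code would call `modele_lstm.predict`:

```python
import numpy as np

nav, nap = 14, 7  # window length and prediction horizon, as above

def rolling_forecast(history, predict_fn, n_blocks):
    """Repeatedly predict nap steps and feed them back as new history."""
    ys = list(history)
    out = []
    for _ in range(n_blocks):
        window = np.array(ys[-nav:]).reshape(1, nav, 1)
        yp = predict_fn(window).reshape(nap)
        out.extend(yp)
        ys.extend(yp)
    return np.array(out)

# stand-in model: repeats the last observed value nap times (persistence)
persistence = lambda X: np.repeat(X[0, -1, 0], nap).reshape(1, nap)
fc = rolling_forecast(np.arange(nav, dtype=float), persistence, n_blocks=2)
print(fc.shape)  # (14,)
```

Note that errors accumulate with each feedback step, so the quality of such multi-block forecasts degrades with the horizon.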

6.7. Bibliography#

6.8. End#