9. Time series analysis with AI#

Marc Buffat, Dept. of Mechanics, UCB Lyon1

time series

import tensorflow as tf
%matplotlib inline
import numpy as np
import matplotlib.pyplot as plt
# title font
plt.rc('font', family='serif', size='18')
from IPython.display import display,Markdown
# AI libraries
import sklearn as sk
import tensorflow as tf
_uid_ = 12345
def serie_temp(N,a0=1.0,a1=0.5,a2 = 0.4, a3=0.1):
    # daily data: long period (180 d), short period (15 d), linear trend, noise
    np.random.seed(_uid_)
    # time series
    Ts = np.arange(N, dtype=int)
    ys = np.array([ a0*np.sin(2*np.pi*x/180) + a1*np.cos(2*np.pi*x/15) \
         + a2*x/360  for x in range(N)]) + \
           a3*np.random.normal(size=N,scale=0.2)
    return Ts,ys

9.1. Objectives#

We study a time-dependent system \(Y(t)\) and want to predict its evolution, i.e. forecast its future values based on its past values.

A time series \(Y(t)\) is commonly decomposed into trend, seasonality and noise:

\[Y(t) =T(t)+S(t)+\epsilon(t)\]
  • trend \(T(t)\) = long-term evolution

  • seasonality \(S(t)\) = periodic component

  • noise \(\epsilon(t)\) = random component

9.1.1. Methods#

classical methods (linear time-series models):

  • exponential smoothing,

  • regression models (linear regression, non-parametric models… ),

  • SARIMA models

AI-based methods:

  • random forests,

  • recurrent neural networks (LSTM)

9.2. Data generation#

  • Time series \(Y = Y(t)\)

  • N measurements at a regular interval \(\Delta t\)

    • data array ys

      \[ys[i] = Y(i\Delta t)\]
    • array ts (for the analysis)

      \[ts[i] = i\Delta t\]

test cases

  1. simple periodic series

  2. bi-periodic series (modulation)

  3. with a long-term trend

  4. with noise

# build the time series
# simplest periodic case
Ts,ys = serie_temp(1000,a0=0,a1=0.5,a2=0.0,a3 = 0.)
# bi-periodic case
#Ts,ys = serie_temp(1000,a0=1.0,a1=0.5,a2=0.0,a3=0.0)
# + trend
#Ts,ys = serie_temp(1000,a0=1.0,a1=0.5,a2=0.2,a3=0.0)
# + noise
Ts,ys = serie_temp(1000,a0=1.0,a1=0.5,a2=0.2,a3=0.3)
plt.figure(figsize=(12,8))
plt.subplot(1,2,1)
plt.plot(Ts[:],ys)
plt.xlabel("day")
plt.title("time series");
plt.subplot(1,2,2)
plt.plot(Ts[:100],ys[:100])
plt.xlabel("day")

9.3. Data preparation#

windowing the data:

choose a window of the nav previous days to predict nap values (i.e. over nap days)

  • nav: size of the history window (before)

  • nap: size of the prediction window (after)

  • N: number of windows

  • t0: start date of the prediction

def dataset(Ts,ys,nav,nap,N,t0):
    # window of the nav previous days used to predict the next nap values
    # nav: size of the history window (before)
    # nap: size of the prediction window (after)
    # N: number of windows
    # t0: start date of the prediction
    # 
    t1 = t0 - N - nav - nap
    print(f"training on {N} windows of {nav}-{nap} days between day {t1} and day {t0}")
    # 
    X  = np.zeros((N,nav))
    y  = np.zeros((N,nap))
    t  = np.zeros(N,dtype=int)
    # build the dataset
    for i in range(N):
        X[i,:] = ys[t1+i:t1+i+nav]
        y[i]   = ys[t1+i+nav:t1+i+nav+nap]
        t[i]   = Ts[t1+i+nav]
    return X,y,t
# N windows: 14 days -> 7 days of prediction starting from day t0
nav = 14
nap = 7
#N  = 200
#t0 = 300
N = 400
t0 = 600
X,y,t = dataset(Ts,ys,nav,nap,N,t0)
training on 400 windows of 14-7 days between day 179 and day 600
X.shape, y.shape, t.shape
((400, 14), (400, 7), (400,))
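The same windowing can also be built without an explicit loop, using NumPy's `sliding_window_view`. A vectorized sketch on a small demo series (the names `ys_demo`, `Xw`, `yw` are illustrative, not from the notebook):

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

nav, nap = 14, 7
ys_demo = np.sin(2 * np.pi * np.arange(100) / 15)   # demo series of 100 days

# every window of length nav+nap, one row per starting day
W = sliding_window_view(ys_demo, nav + nap)         # shape (100-nav-nap+1, nav+nap)
Xw, yw = W[:, :nav], W[:, nav:]                     # history / target split
```

Each row of `Xw` holds nav consecutive days and the matching row of `yw` the nap days that follow, exactly like `X` and `y` above.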
def plot_dataset():
    plt.figure(figsize=(14,6))
    plt.subplot(1,2,1)
    plt.plot(t-nav,X[:,0])
    plt.plot(t,y[:,0])
    plt.xlabel("day")
    plt.ylabel("y")
    plt.title("training data")
    plt.subplot(1,2,2)
    plt.plot(np.arange(t[0]-nav,t[0]+nap),ys[t[0]-nav:t[0]+nap],'--')
    plt.plot(np.arange(t[0]-nav,t[0]),X[0,:],'or')
    plt.plot(np.arange(t[0],t[0]+nap),y[0,:],'xg')
    plt.plot(np.arange(t[-1]-nav,t[-1]+nap),ys[t[-1]-nav:t[-1]+nap],'--')
    plt.plot(np.arange(t[-1]-nav,t[-1]),X[-1,:],'or')
    plt.plot(np.arange(t[-1],t[-1]+nap),y[-1,:],'xg')
    plt.xlabel("day")
    plt.title("first/last window");
    return
plot_dataset()

9.4. Scikit-learn RandomForest#

a “random forest” of decision trees

  • predicts 1 value at a time

random forest

9.5. Neural network: LSTM / RNN#

LSTM = Long Short-Term Memory

  • recurrent network (RNN)

  • activation function: prevents the output from blowing up (tanh)

  • gradient descent method (\(\alpha\) = learning rate, \(F_w\) = gradient of the error)

    \[ w_{k+1} = w_k - \alpha F_w \]

  • EPOCH = number of training epochs

The number of epochs is a hyperparameter that defines how many times the learning algorithm runs through the entire training dataset.
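The gradient update above can be illustrated on a one-dimensional toy error \(F(w) = (w-2)^2\), whose gradient is \(F_w = 2(w-2)\) (a made-up example, not the notebook's loss):

```python
# gradient descent w_{k+1} = w_k - alpha * F'(w_k) on F(w) = (w - 2)^2
alpha = 0.1            # learning rate
w = 0.0                # initial weight
for _ in range(100):   # 100 update steps
    grad = 2.0 * (w - 2.0)   # F'(w)
    w = w - alpha * grad
# w converges toward the minimizer w* = 2
```

Each step shrinks the distance to the minimum by a factor \(1 - 2\alpha\), so after 100 steps `w` is numerically indistinguishable from 2.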

  1. Model of a computational neuron

../../_images/neuroneformel-1.png

the output \(y\) is a nonlinear function of the inputs (f = activation function)

\[ y = f(\sum_i w_i x_i + b) \]

the coefficients \(w_i, b\) are obtained by minimizing an error \(Err = || y_{pred} - \hat{y} ||\) over a training database \(\hat{y}\), using (gradient-based) minimization algorithms
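The neuron formula above translates directly into NumPy (a minimal sketch; the input and weight values are arbitrary):

```python
import numpy as np

def neuron(x, w, b, f=np.tanh):
    # y = f(sum_i w_i * x_i + b)
    return f(np.dot(w, x) + b)

# example: 2 inputs, tanh activation
y_out = neuron(np.array([1.0, 2.0]), np.array([0.5, -0.25]), 0.1)
```

Here the weighted sum is \(0.5 \cdot 1 - 0.25 \cdot 2 + 0.1 = 0.1\), so the output is \(\tanh(0.1)\).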

  1. Layered neural network

../../_images/reseau_neuronne.png
  1. Recurrent neural networks (processing of temporal sequences)

../../_images/reseau-RNN.png
\[ y^t = f(\sum_i w_i x^t_i + b + \sum_j r_j y^{t-1}_j) \]

9.5.1. RNN networks#

images/Architecture-RNN.jpg

9.5.2. The difficulty of training a recurrent network#

a classical simple recurrent network, made of a recurrent layer followed by a dense layer:

../../_images/RNNsimple.png

It contains three weight matrices: W, R and V, where R is the matrix of recurrent weights. Training the network therefore amounts to learning these three matrices from a base of labeled examples.

However, gradient-based minimization for neural networks relies on an algorithm called error backpropagation. This algorithm propagates the gradient of the error backwards through the successive weight layers of the network, from the last layer up to the first.

Unfortunately, in the case of recurrent networks, the recurrence cycle (matrix R) prevents a direct application of this algorithm.

9.5.3. Solution: backpropagation through time#

The solution to this problem is to use the unrolled version of the network, which eliminates the cycles.

We therefore approximate the recurrent network by a network unrolled K times (K = depth = number of internal hidden layers, typically 10 to 100), as shown in the following figure with K=2:

../../_images/RNNdeplie.png

Warning

  • Since the unrolled network is deeper, the vanishing-gradient problem is more severe during training: the network is harder to train because the error tends to vanish as it gets closer to the lower layers.

It is therefore important to use every available strategy against this phenomenon: Batch Normalization, dropout, L1 and L2 regularization, etc.

  • Because the weights of the recurrent layer are duplicated, recurrent networks are also subject to another phenomenon called gradient explosion: an error gradient whose norm is greater than 1.

A simple and effective way to avoid this is to test that norm and cap it when it becomes too large (this is called gradient clipping).
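Clipping by norm can be sketched in a few lines of NumPy (in Keras the same effect is obtained with the optimizer's `clipnorm` argument, e.g. `tf.keras.optimizers.Adam(clipnorm=1.0)`; the function name below is illustrative):

```python
import numpy as np

def clip_by_norm(g, max_norm=1.0):
    # rescale the gradient vector so that ||g|| <= max_norm
    n = np.linalg.norm(g)
    return g if n <= max_norm else g * (max_norm / n)

g_big = clip_by_norm(np.array([3.0, 4.0]), max_norm=1.0)   # norm 5 -> rescaled
g_ok  = clip_by_norm(np.array([0.3, 0.4]), max_norm=1.0)   # norm 0.5 -> unchanged
```

The direction of the gradient is preserved; only its length is capped.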

9.5.4. The LSTM neuron: Long Short-Term Memory#

To model very long-term dependencies, recurrent neural networks must be given the ability to maintain a state over a long period of time.

This is the purpose of LSTM (Long Short-Term Memory) cells, which have an internal memory called the cell. The cell can maintain a state for as long as necessary; it consists of a numerical value that the network can control depending on the situation.

../../_images/RNN_LSTM.png

the memory cell is controlled by three gates, which can be seen as valves:

  • the input gate decides whether the input should modify the content of the cell

  • the forget gate decides whether the content of the cell should be reset to 0

  • the output gate decides whether the content of the cell should influence the neuron’s output

The mechanism of the three gates is strictly identical. The opening/closing of each valve is modeled by an activation function f, usually a sigmoid, applied to a weighted sum of the inputs, the outputs and the cell state, with gate-specific weights.

To compute the output \(y^t\), we therefore use the input \(x^t\), the hidden states \(h^{t-1}\) (\(x^{t-1},x^{t-2}\)) (unrolling of the recurrence), which represent the short-term memory, and the memory-cell states \(c^{t-1}\), which represent the long-term memory.
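The three gates and the two memories can be sketched as a single time step in NumPy (an illustrative, untrained cell: the weight shapes, dictionary layout and random initialization are assumptions for the example, not the Keras implementation):

```python
import numpy as np
rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM time step: gates i/f/o and cell update."""
    z = np.concatenate([x, h_prev])                     # input + short-term memory
    i = sigmoid(W['i'] @ z + b['i'])                    # input gate
    f = sigmoid(W['f'] @ z + b['f'])                    # forget gate
    o = sigmoid(W['o'] @ z + b['o'])                    # output gate
    c = f * c_prev + i * np.tanh(W['c'] @ z + b['c'])   # long-term memory (cell)
    h = o * np.tanh(c)                                  # hidden state / output
    return h, c

nx, nh = 3, 4   # input size, number of LSTM units
W = {k: rng.normal(size=(nh, nx + nh)) * 0.1 for k in 'ifoc'}
b = {k: np.zeros(nh) for k in 'ifoc'}
h, c = lstm_step(rng.normal(size=nx), np.zeros(nh), np.zeros(nh), W, b)
```

Each gate is a sigmoid "valve" in (0, 1), applied elementwise; the cell `c` is carried from step to step, while `h` is the short-term output.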

Like any neuron, LSTM neurons are generally used in layers. In that case, the outputs of all the neurons are fed back as inputs to all the neurons.

Given all the connections needed to control the memory cell, LSTM layers are markedly “heavier” than simple recurrent layers (four gate blocks instead of one), which are themselves heavier than classical dense layers.

LSTM layers should therefore be used sparingly!
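This relative cost can be checked by counting trainable parameters with the standard Keras formulas (the helper names below are illustrative; e.g. an LSTM layer of 14 units on a 1-feature input has 896 parameters):

```python
def dense_params(nx, nh):
    # weights (nx * nh) + biases (nh)
    return nh * nx + nh

def simple_rnn_params(nx, nh):
    # input weights + recurrent weights (nh * nh) + biases
    return nh * nx + nh * nh + nh

def lstm_params(nx, nh):
    # four gate blocks (input, forget, output, cell), each sized like a SimpleRNN
    return 4 * simple_rnn_params(nx, nh)

n_lstm = lstm_params(1, 14)   # an LSTM(14) layer on a 1-feature input
```

The recurrent weights add an \(n_h \times n_h\) block per gate, which is why LSTM layers dominate the parameter count of small models like the one below.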

9.6. Implementation#

9.6.1. RandomForest training#

  • scikit-learn

from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor
from sklearn.neighbors import KNeighborsRegressor
from sklearn.metrics   import r2_score
# choice of algorithm
clf = RandomForestRegressor()
#clf = KNeighborsRegressor()
#clf = LinearRegression()
Xlearn = X.copy()
ylearn = y[:,0]
clf.fit(Xlearn,ylearn)
RandomForestRegressor()
print("score = {:2d}%".format(int(100*clf.score(Xlearn, ylearn))))
yp = clf.predict(Xlearn)
print("R2 = {:4.2f}".format(r2_score(ylearn,yp)))
score = 99%
R2 = 1.00
def plot_pred():
    plt.figure(figsize=(10,6))
    plt.plot(Ts[t2:t2+nap],ypred,'x')
    plt.plot(Ts[t2-nav:t2],Xpred[0],'--o')
    plt.plot(Ts[t2-nav:t2+nap],ys[t2-nav:t2+nap],'--')
    plt.xlabel("day")
    plt.title(f"prediction over {nap} days starting from day {t2}");
    return
# recursive prediction starting from t2: each predicted value
# is shifted into the input window used for the next prediction
t2 = t0 
Xpred  = np.array([ys[t2-nav:t2]])
ypred  = np.zeros(nap)
Xp     = Xpred.copy()
ypred[0] = clf.predict(Xp)[0]
for i in range(1,nap):
    Xp[0,:-i] = Xpred[0,i:]
    Xp[0,-i:] = ypred[:i]
    ypred[i] = clf.predict(Xp)[0]
Xpred.shape, ypred.shape
((1, 14), (7,))
plot_pred()

9.6.2. LSTM RNN implementation#

  • TensorFlow Keras RNN library

#Machine learning
from sklearn import preprocessing
import tensorflow as tf
import statsmodels as st
from statsmodels.tsa.seasonal import STL
from sklearn.model_selection  import train_test_split
Xlearn = X.copy()
ylearn = y.copy()
Xlearn = Xlearn.reshape(X.shape[0], nav, 1)
ylearn = ylearn.reshape(y.shape[0], nap, 1)
Xlearn.shape, ylearn.shape
((400, 14, 1), (400, 7, 1))
#Number of training epochs (window of size nav)
#EPOQUE = 300
EPOQUE = 200
#EPOQUE = 50
# neural network model: 4 layers (LSTM(nav), Dense(nav), Dense(nap), Dense(nap)), the first one LSTM
# with no activation given: activation='linear', i.e. a(x)=x; otherwise try 'relu'
modele_lstm = tf.keras.models.Sequential([
    tf.keras.layers.LSTM(nav),
    tf.keras.layers.Dense(nav,activation='tanh'),
    tf.keras.layers.Dense(nap,activation='tanh'),
    tf.keras.layers.Dense(nap)
])
#Model configuration (least-squares minimization: mse loss)
modele_lstm.compile(optimizer='adam', metrics=['mae'], loss='mse')
print(EPOQUE)
200
# start training the model
import time
time_start = time.time()
modele_lstm.fit(Xlearn, ylearn, epochs=EPOQUE, verbose = True)
print('training phase: {:.2f} seconds'.format(time.time()-time_start))
Epoch 1/200
13/13 ━━━━━━━━━━━━━━━━━━━━ 2s 5ms/step - loss: 0.7139 - mae: 0.7046
Epoch 2/200
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.6053 - mae: 0.6510
...
Epoch 118/200
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0204 - mae: 0.1130
Epoch 119/200
 6/13 ━━━━━━━━━━━━━━━━━━━━ 0s 10ms/step - loss: 0.0205 - mae: 0.1130

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 8ms/step - loss: 0.0204 - mae: 0.1134 
Epoch 120/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0192 - mae: 0.1124

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0199 - mae: 0.1123 
Epoch 121/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0155 - mae: 0.0963

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0182 - mae: 0.1064 
Epoch 122/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0212 - mae: 0.1217

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0192 - mae: 0.1111 
Epoch 123/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0181 - mae: 0.1044

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0187 - mae: 0.1073 
Epoch 124/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 1s 103ms/step - loss: 0.0158 - mae: 0.1042

 6/13 ━━━━━━━━━━━━━━━━━━━━ 0s 10ms/step - loss: 0.0179 - mae: 0.1055 

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 9ms/step - loss: 0.0187 - mae: 0.1077 
Epoch 125/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0166 - mae: 0.1028

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0194 - mae: 0.1090 
Epoch 126/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0236 - mae: 0.1190

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0201 - mae: 0.1116 
Epoch 127/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 1s 97ms/step - loss: 0.0202 - mae: 0.1110

 7/13 ━━━━━━━━━━━━━━━━━━━━ 0s 9ms/step - loss: 0.0188 - mae: 0.1083 

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 8ms/step - loss: 0.0189 - mae: 0.1088
Epoch 128/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - loss: 0.0138 - mae: 0.0912

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0185 - mae: 0.1066 
Epoch 129/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.0138 - mae: 0.0938

12/13 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.0169 - mae: 0.1029 

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.0172 - mae: 0.1036
Epoch 130/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.0155 - mae: 0.0973

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.0173 - mae: 0.1033 
Epoch 131/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 83ms/step - loss: 0.0236 - mae: 0.1214

11/13 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.0215 - mae: 0.1159 

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.0211 - mae: 0.1149
Epoch 132/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0194 - mae: 0.1139

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.0186 - mae: 0.1088 
Epoch 133/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - loss: 0.0175 - mae: 0.1071

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0185 - mae: 0.1078 
Epoch 134/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0223 - mae: 0.1241

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.0186 - mae: 0.1094 
Epoch 135/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0124 - mae: 0.0898

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0173 - mae: 0.1048 
Epoch 136/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 0.0193 - mae: 0.1090

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0194 - mae: 0.1096 
Epoch 137/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0240 - mae: 0.1253

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0199 - mae: 0.1116 
Epoch 138/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0150 - mae: 0.0989

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0180 - mae: 0.1050 
Epoch 139/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0240 - mae: 0.1255

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0186 - mae: 0.1072 
Epoch 140/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0211 - mae: 0.1196

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0177 - mae: 0.1061 
Epoch 141/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 1s 98ms/step - loss: 0.0145 - mae: 0.0943

 6/13 ━━━━━━━━━━━━━━━━━━━━ 0s 10ms/step - loss: 0.0163 - mae: 0.0996

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 10ms/step - loss: 0.0167 - mae: 0.1015
Epoch 142/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0162 - mae: 0.1018

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0181 - mae: 0.1061 
Epoch 143/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0198 - mae: 0.1087

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0189 - mae: 0.1084 
Epoch 144/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0156 - mae: 0.0981

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0177 - mae: 0.1047 
Epoch 145/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0165 - mae: 0.1030

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.0177 - mae: 0.1051 
Epoch 146/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 1s 93ms/step - loss: 0.0181 - mae: 0.1046

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0179 - mae: 0.1062 
Epoch 147/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0164 - mae: 0.1008

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0175 - mae: 0.1034 
Epoch 148/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0145 - mae: 0.0972

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0170 - mae: 0.1023 
Epoch 149/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 21ms/step - loss: 0.0165 - mae: 0.0950

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0174 - mae: 0.1033 
Epoch 150/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 1s 85ms/step - loss: 0.0182 - mae: 0.1084

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0178 - mae: 0.1055 

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.0179 - mae: 0.1056
Epoch 151/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 21ms/step - loss: 0.0209 - mae: 0.1165

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.0181 - mae: 0.1074 
Epoch 152/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 80ms/step - loss: 0.0156 - mae: 0.0999

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0164 - mae: 0.1007 
Epoch 153/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0164 - mae: 0.1066

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0177 - mae: 0.1062 
Epoch 154/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - loss: 0.0157 - mae: 0.0980

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0166 - mae: 0.1024 
Epoch 155/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0148 - mae: 0.0939

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0171 - mae: 0.1026 
Epoch 156/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0161 - mae: 0.1018

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0173 - mae: 0.1035 
Epoch 157/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0161 - mae: 0.0980

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0172 - mae: 0.1036 
Epoch 158/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0203 - mae: 0.1068

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0178 - mae: 0.1039 
Epoch 159/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0152 - mae: 0.0969

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0159 - mae: 0.1000 
Epoch 160/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0194 - mae: 0.1115

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0182 - mae: 0.1077 
Epoch 161/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0151 - mae: 0.0969

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0166 - mae: 0.1023 
Epoch 162/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0198 - mae: 0.1054

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0158 - mae: 0.0976 
Epoch 163/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0150 - mae: 0.0994

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0168 - mae: 0.1035 
Epoch 164/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 21ms/step - loss: 0.0152 - mae: 0.0978

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0156 - mae: 0.0990 
Epoch 165/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0163 - mae: 0.1041

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0175 - mae: 0.1047 
Epoch 166/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 1s 102ms/step - loss: 0.0160 - mae: 0.1008

 7/13 ━━━━━━━━━━━━━━━━━━━━ 0s 10ms/step - loss: 0.0159 - mae: 0.0996 

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 10ms/step - loss: 0.0162 - mae: 0.1003
Epoch 167/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0148 - mae: 0.0985

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0172 - mae: 0.1040 
Epoch 168/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 1s 101ms/step - loss: 0.0144 - mae: 0.0921

 7/13 ━━━━━━━━━━━━━━━━━━━━ 0s 10ms/step - loss: 0.0169 - mae: 0.1020 

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 10ms/step - loss: 0.0167 - mae: 0.1017
Epoch 169/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.0161 - mae: 0.0992

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0164 - mae: 0.1015 
Epoch 170/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0117 - mae: 0.0853

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0160 - mae: 0.1005 
Epoch 171/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0142 - mae: 0.0945

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0164 - mae: 0.1016 
Epoch 172/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0120 - mae: 0.0886

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0167 - mae: 0.1023 
Epoch 173/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0161 - mae: 0.1020

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0162 - mae: 0.1009 
Epoch 174/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0156 - mae: 0.0952

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0156 - mae: 0.0972 
Epoch 175/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0203 - mae: 0.1146

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0182 - mae: 0.1063 
Epoch 176/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0188 - mae: 0.1098

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0168 - mae: 0.1030 
Epoch 177/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0137 - mae: 0.0913

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0155 - mae: 0.0977 
Epoch 178/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 1s 99ms/step - loss: 0.0171 - mae: 0.1038

 6/13 ━━━━━━━━━━━━━━━━━━━━ 0s 10ms/step - loss: 0.0167 - mae: 0.1013

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 9ms/step - loss: 0.0165 - mae: 0.1012 
Epoch 179/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0180 - mae: 0.1027

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0164 - mae: 0.1010 
Epoch 180/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0152 - mae: 0.0941

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0156 - mae: 0.0980 
Epoch 181/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 1s 102ms/step - loss: 0.0190 - mae: 0.1100

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0164 - mae: 0.1023  

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.0163 - mae: 0.1021
Epoch 182/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 69ms/step - loss: 0.0148 - mae: 0.0960

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0159 - mae: 0.0992 
Epoch 183/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0160 - mae: 0.0976

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0170 - mae: 0.1031 
Epoch 184/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 1s 102ms/step - loss: 0.0162 - mae: 0.0955

 6/13 ━━━━━━━━━━━━━━━━━━━━ 0s 10ms/step - loss: 0.0164 - mae: 0.0996 

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 8ms/step - loss: 0.0163 - mae: 0.1001 
Epoch 185/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.0150 - mae: 0.0951

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0164 - mae: 0.1007 

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.0164 - mae: 0.1006
Epoch 186/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 69ms/step - loss: 0.0178 - mae: 0.1047

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.0163 - mae: 0.1003 
Epoch 187/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - loss: 0.0158 - mae: 0.1023

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.0166 - mae: 0.1031 
Epoch 188/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 1s 86ms/step - loss: 0.0146 - mae: 0.0949

11/13 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.0163 - mae: 0.1007 

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 7ms/step - loss: 0.0162 - mae: 0.1003
Epoch 189/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0228 - mae: 0.1229

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0167 - mae: 0.1028 
Epoch 190/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0132 - mae: 0.0922

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0159 - mae: 0.0992 
Epoch 191/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 1s 105ms/step - loss: 0.0135 - mae: 0.0894

10/13 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.0161 - mae: 0.0994  

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.0160 - mae: 0.0994
Epoch 192/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0128 - mae: 0.0871

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0152 - mae: 0.0968 
Epoch 193/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0127 - mae: 0.0890

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0148 - mae: 0.0951 
Epoch 194/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 1s 99ms/step - loss: 0.0129 - mae: 0.0925

 6/13 ━━━━━━━━━━━━━━━━━━━━ 0s 10ms/step - loss: 0.0134 - mae: 0.0931

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 9ms/step - loss: 0.0145 - mae: 0.0960 
Epoch 195/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 21ms/step - loss: 0.0133 - mae: 0.0876

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.0161 - mae: 0.1000 
Epoch 196/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 1s 86ms/step - loss: 0.0176 - mae: 0.1049

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.0166 - mae: 0.1021 
Epoch 197/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - loss: 0.0089 - mae: 0.0749

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.0142 - mae: 0.0944 
Epoch 198/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - loss: 0.0127 - mae: 0.0873

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0147 - mae: 0.0953 
Epoch 199/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 1s 91ms/step - loss: 0.0180 - mae: 0.1046

 7/13 ━━━━━━━━━━━━━━━━━━━━ 0s 9ms/step - loss: 0.0163 - mae: 0.1009 

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 9ms/step - loss: 0.0159 - mae: 0.0995

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 12ms/step - loss: 0.0159 - mae: 0.0995
Epoch 200/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.0207 - mae: 0.1151

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0164 - mae: 0.1020 
phase apprentissage: 20.27 seconds
modele_lstm.summary()
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ lstm (LSTM)                     │ (None, 14)             │           896 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 14)             │           210 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense_1 (Dense)                 │ (None, 7)              │           105 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense_2 (Dense)                 │ (None, 7)              │            56 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 3,803 (14.86 KB)
 Trainable params: 1,267 (4.95 KB)
 Non-trainable params: 0 (0.00 B)
 Optimizer params: 2,536 (9.91 KB)
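The parameter counts in the summary can be checked by hand: a Keras `LSTM` layer with `n` units on inputs of dimension `d` has `4*(d + n + 1)*n` weights (four gates, each with input weights, recurrent weights and a bias), and a `Dense` layer with `n` units on `m` inputs has `(m + 1)*n`. A minimal sketch, assuming the input windows of shape `(nav=14, 1)` and the layer sizes shown above:

```python
def lstm_params(d, n):
    # 4 gates; each gate has d input weights, n recurrent weights and 1 bias per unit
    return 4 * (d + n + 1) * n

def dense_params(m, n):
    # m weights + 1 bias per output unit
    return (m + 1) * n

counts = [
    lstm_params(1, 14),    # lstm    -> 896
    dense_params(14, 14),  # dense   -> 210
    dense_params(14, 7),   # dense_1 -> 105
    dense_params(7, 7),    # dense_2 ->  56
]
print(counts, sum(counts))  # sum = 1267 trainable parameters
```

The sum matches the 1,267 trainable parameters reported by `summary()` (the 2,536 optimizer parameters are Adam's two moment estimates per weight).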
# prediction on the training set and R2 score
from sklearn.metrics import r2_score

ypred = modele_lstm.predict(Xlearn, verbose=True)
print(Xlearn.shape, ypred.shape)
Ylearn = ylearn.reshape(ylearn.shape[0], nap)
print("R2 score {:.2f}".format(r2_score(Ylearn, ypred)))
print("model evaluate loss/mae")
modele_lstm.evaluate(Xlearn, ylearn)
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 10ms/step
(400, 14, 1) (400, 7)
R2 score 0.98
model evaluate loss/mae
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 3ms/step - loss: 0.0162 - mae: 0.1012  
[0.015177113004028797, 0.09750653058290482]
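The R2 score reported above is the coefficient of determination, $R^2 = 1 - \sum_i (y_i - \hat y_i)^2 / \sum_i (y_i - \bar y)^2$. A minimal NumPy sketch for the single-output case (for 2-D arrays such as `Ylearn`, sklearn's `r2_score` averages the per-column scores by default):

```python
import numpy as np

def r2(y_true, y_pred):
    # coefficient of determination: 1 - residual variance / total variance
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

# toy check: a near-perfect prediction gives R2 close to 1
y = np.array([1.0, 2.0, 3.0, 4.0])
print(r2(y, y + 0.01))
```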
# prediction starting from t2
t2 = t0
Xpred = np.array([ys[t2-nav:t2]]).reshape(1, nav, 1)
ypred = modele_lstm.predict(Xpred, verbose=True)
print(Xpred.shape, ypred.shape)
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 24ms/step
(1, 14, 1) (1, 7)
Xpred = Xpred.reshape(1, nav)
ypred = ypred.reshape(nap)
plot_pred()
(figure: output of plot_pred() — LSTM prediction vs. the observed series)
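The model above predicts `nap = 7` steps from the last `nav = 14` observed values. To forecast further ahead, the predictions can be fed back into the input window. A hypothetical, self-contained sketch of this sliding-window loop — `predict` below is a naive stand-in for `modele_lstm.predict`, and `ys` is a toy series, so only the window bookkeeping is meant to carry over:

```python
import numpy as np

nav, nap = 14, 7  # input window length, prediction horizon

def predict(window):
    # stand-in for modele_lstm.predict: naive persistence forecast
    return np.repeat(window[-1], nap)

def forecast(history, n_steps):
    """Iteratively predict n_steps values, feeding predictions back as input."""
    buf = list(history[-nav:])  # last nav observed values
    out = []
    while len(out) < n_steps:
        ypred = predict(np.array(buf[-nav:]))
        out.extend(ypred.tolist())
        buf.extend(ypred.tolist())  # predictions become future inputs
    return np.array(out[:n_steps])

ys = np.sin(np.linspace(0.0, 6.0, 100))
print(forecast(ys, 20).shape)  # → (20,)
```

With the real model, errors accumulate at each iteration, so the quality of such long-range forecasts degrades with the horizon.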

9.7. Bibliography#

9.8. End#