6. Time-series analysis with AI#

Marc Buffat, Mechanics Department, UCB Lyon1

time series

import tensorflow as tf
%matplotlib inline
import numpy as np
import matplotlib.pyplot as plt
# title font
plt.rc('font', family='serif', size='18')
from IPython.display import display, Markdown
# AI
import sklearn as sk
import tensorflow as tf
_uid_ = 12345
def serie_temp(N, a0=1.0, a1=0.5, a2=0.4, a3=0.1):
    # synthetic daily data
    np.random.seed(_uid_)
    # time axis (one sample per day)
    Ts = np.arange(N, dtype=int)
    # two periodic components + linear trend + noise
    ys = a0*np.sin(2*np.pi*Ts/180) + a1*np.cos(2*np.pi*Ts/15) \
       + a2*Ts/360 + a3*np.random.normal(size=N, scale=0.2)
    return Ts, ys

6.1. Objectives#

We study a time-dependent system \(Y(t)\) and want to predict its evolution, i.e. forecast its future values from its past values.

A time series \(Y(t)\) is commonly decomposed into trend, seasonality and noise:

\[Y(t) =T(t)+S(t)+\epsilon(t)\]
  • trend \(T(t)\) = long-term evolution

  • seasonality \(S(t)\) = periodic phenomenon

  • noise \(\epsilon(t)\) = random component
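As an illustration, this decomposition can be estimated naively in NumPy: a centered moving average for the trend, per-phase means of the detrended series for the seasonality, and the residual as noise. A minimal sketch (the `decompose` helper, the period 15 and the toy series are illustrative; statsmodels' STL, imported later in this notebook, is the robust alternative):

```python
import numpy as np

def decompose(ys, period):
    """Naive trend / seasonality / noise split of a 1D series."""
    ys = np.asarray(ys, dtype=float)
    # trend T(t): centered moving average over one period
    trend = np.convolve(ys, np.ones(period)/period, mode='same')
    # seasonality S(t): mean of the detrended series at each phase of the period
    detrended = ys - trend
    season = np.array([detrended[i::period].mean() for i in range(period)])
    seasonal = np.tile(season, len(ys)//period + 1)[:len(ys)]
    # noise eps(t): the residual
    noise = ys - trend - seasonal
    return trend, seasonal, noise

t = np.arange(300)
ys = 0.002*t + 0.5*np.cos(2*np.pi*t/15)      # linear trend + period-15 seasonality
trend, seasonal, noise = decompose(ys, period=15)
```

By construction the three components sum back to the original series; away from the edges (where the moving average is truncated) the residual of this toy series is close to zero.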

6.1.1. Methods#

classical methods (linear time-series models):

  • exponential smoothing,

  • regression models (linear regression, non-parametric models… ),

  • SARIMA models

AI-based methods:

  • random forests,

  • recurrent LSTM neural networks

6.2. Data generation#

  • time series \(Y = Y(t)\)

  • N measurements at a regular interval \(\Delta t\)

    • data array ys

      \[ys[i] = Y(i\Delta t)\]
    • array ts (for the analysis)

      \[ts[i] = i\Delta t\]

test cases

  1. simple periodic series

  2. bi-periodic series (modulation)

  3. with a long-term trend

  4. with noise

# build the time series
# simplest periodic case
Ts,ys = serie_temp(1000,a0=0,a1=0.5,a2=0.0,a3 = 0.)
# bi-periodic case
#Ts,ys = serie_temp(1000,a0=1.0,a1=0.5,a2=0.0,a3=0.0)
# + trend
#Ts,ys = serie_temp(1000,a0=1.0,a1=0.5,a2=0.2,a3=0.0)
# + noise
Ts,ys = serie_temp(1000,a0=1.0,a1=0.5,a2=0.2,a3=0.3)
plt.figure(figsize=(12,8))
plt.subplot(1,2,1)
plt.plot(Ts,ys)
plt.xlabel("day")
plt.title("time series");
plt.subplot(1,2,2)
plt.plot(Ts[:100],ys[:100])
plt.xlabel("day");
../../_images/bf9b80bd688eda840d0243d4f61002379bb6d6331c1d67c910d6515d340fa59d.png

6.3. Data preparation#

windowing of the data:

choice of a window of nav past days to predict nap values (i.e. over the next nap days)

  • nav: size of the history window (before)

  • nap: size of the prediction window (after)

  • N: number of windows

  • t0: start date of the prediction

def dataset(Ts,ys,nav,nap,N,t0):
    # window of nav past days used to predict the next nap values
    # nav: size of the history window (before)
    # nap: size of the prediction window (after)
    # N: number of windows
    # t0: start date of the prediction
    #
    t1 = t0 - N - nav - nap
    print(f"training on {N} windows of {nav}-{nap} days between day {t1} and day {t0}")
    #
    X  = np.zeros((N,nav))
    y  = np.zeros((N,nap))
    t  = np.zeros(N,dtype=int)
    # build the dataset
    for i in range(N):
        X[i,:] = ys[t1+i:t1+i+nav]
        y[i]   = ys[t1+i+nav:t1+i+nav+nap]
        t[i]   = Ts[t1+i+nav]
    return X,y,t
# N windows: 14 days -> 7 days of prediction starting from day t0
nav = 14
nap = 7
#N  = 200
#t0 = 300
N = 400
t0 = 600
X,y,t = dataset(Ts,ys,nav,nap,N,t0)
training on 400 windows of 14-7 days between day 179 and day 600
X.shape, y.shape, t.shape
((400, 14), (400, 7), (400,))
def plot_dataset():
    plt.figure(figsize=(14,6))
    plt.subplot(1,2,1)
    plt.plot(t-nav,X[:,0])
    plt.plot(t,y[:,0])
    plt.xlabel("day")
    plt.ylabel("y")
    plt.title("training data")
    plt.subplot(1,2,2)
    plt.plot(np.arange(t[0]-nav,t[0]+nap),ys[t[0]-nav:t[0]+nap],'--')
    plt.plot(np.arange(t[0]-nav,t[0]),X[0,:],'or')
    plt.plot(np.arange(t[0],t[0]+nap),y[0,:],'xg')
    plt.plot(np.arange(t[-1]-nav,t[-1]+nap),ys[t[-1]-nav:t[-1]+nap],'--')
    plt.plot(np.arange(t[-1]-nav,t[-1]),X[-1,:],'or')
    plt.plot(np.arange(t[-1],t[-1]+nap),y[-1,:],'xg')
    plt.xlabel("day")
    plt.title("first/last window");
    return
plot_dataset()
../../_images/d3ab5bfeb0f4bad6b29fbeee0fbae3990913353745ba27072e90fc94e31e28db.png

6.4. Scikit-Learn RandomForest#

a "random forest" of decision trees

  • predicts one value at a time

random forest
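Since the regressor predicts one value at a time, a multi-day forecast must be built recursively, feeding each prediction back into the input window. A minimal sketch on a toy periodic series (the window size, `n_estimators` and forecast horizon are illustrative):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# toy periodic series and sliding windows: 14 past values -> next value
ys = np.sin(2*np.pi*np.arange(400)/15)
nav = 14
X = np.array([ys[i:i+nav] for i in range(len(ys)-nav)])
y = ys[nav:]

clf = RandomForestRegressor(n_estimators=50, random_state=0)
clf.fit(X[:300], y[:300])

# recursive 7-day forecast: append each prediction to the window
window = list(ys[300:300+nav])
forecast = []
for _ in range(7):
    yp = clf.predict([window])[0]
    forecast.append(yp)
    window = window[1:] + [yp]
```

This is the same scheme used in the implementation section below: the window slides by one day at each step and the missing values are replaced by the previous predictions.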

6.5. Neural networks: LSTM / RNN#

LSTM = Long Short-Term Memory

  • recurrent network (RNN)

  • activation function: prevents the output from blowing up (tanh)

  • numerical gradient method (\(\alpha\) = learning rate)

\[ w_{k+1} = w_k - \alpha \nabla_w F \]

  • EPOCH = number of epochs for the training

The number of epochs is a hyperparameter that defines the number of times the learning algorithm works through the entire training dataset.

  1. Model of an artificial neuron

../../_images/neuroneformel-1.png

the output \(y\) is a nonlinear function of the inputs (f = activation function)

\[ y = f(\sum_i w_i x_i + b) \]

the coefficients \(w_i, b\) are obtained by minimizing an error \(Err = || y_{pred} - \hat{y} ||\) over a training database \(\hat{y}\), using (gradient-based) minimization algorithms
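This minimization can be sketched for a single neuron with plain NumPy gradient descent on the squared error (a minimal sketch; the tanh activation, the learning rate and the number of iterations are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                    # inputs x_i
w_true, b_true = np.array([1.0, -2.0, 0.5]), 0.3
yhat = np.tanh(X @ w_true + b_true)              # training database

w, b, alpha = np.zeros(3), 0.0, 0.1              # alpha = learning rate
for _ in range(5000):
    y = np.tanh(X @ w + b)                       # y = f(sum_i w_i x_i + b)
    err = y - yhat
    grad = (1 - y**2) * err                      # chain rule through tanh
    w -= alpha * X.T @ grad / len(X)             # w_{k+1} = w_k - alpha dErr/dw
    b -= alpha * grad.mean()
```

After training, the fitted coefficients reproduce the targets with a small mean squared error.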

  2. Layered neural network

../../_images/reseau_neuronne.png
  3. Recurrent neural networks (processing of temporal sequences)

../../_images/reseau-RNN.png
\[ y^t = f(\sum_i w_i x^t_i + b + \sum_j r_j y^{t-1}_j) \]

6.5.1. RNN networks#

images/Architecture-RNN.jpg
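The recurrence can be unrolled explicitly: at each time step the previous output is fed back into the weighted sum. A minimal NumPy sketch for a single recurrent unit (the weights are arbitrary):

```python
import numpy as np

w, r, b = 0.8, 0.5, 0.1             # input, recurrent and bias weights
xs = np.sin(np.arange(20) * 0.3)    # input sequence x^t

y = 0.0                             # initial output
outputs = []
for x in xs:
    y = np.tanh(w*x + b + r*y)      # y^t = f(w x^t + b + r y^{t-1})
    outputs.append(y)
```

The tanh activation keeps every output strictly inside (-1, 1), which is precisely its role of preventing the recurrent feedback from blowing up.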

6.5.2. The problem of training a recurrent network#

a classical simple recurrent network, made of a recurrent layer followed by a dense layer:

../../_images/RNNsimple.png

It contains three weight matrices: W, R and V, R being the matrix of recurrent weights. Training the network therefore amounts to learning these three matrices from a base of labeled examples.

However, gradient-based minimization for neural networks relies on an algorithm called gradient backpropagation. This algorithm backpropagates the gradient of the error through the successive weight layers of the network, going back from the last layer to the first.

Unfortunately, in the case of recurrent networks, the presence of the recurrence cycle (matrix R) prevents a direct use of this algorithm.

6.5.3. Solution: backpropagation through time#

The solution to this problem is to use the unfolded version of the network, which eliminates the cycles.

We therefore approximate the recurrent network by a network unfolded K times (K = depth = number of internal hidden layers, typically 10 to 100), as shown in the following figure with K=2:

../../_images/RNNdeplie.png

Warning

  • Since the unfolded network is deeper, the vanishing gradient phenomenon is stronger during training, and the network is harder to train because the error tends to vanish as it approaches the lower layers.

It is therefore important to use every available strategy against this phenomenon: Batch Normalization, dropout, L1 and L2 regularization, etc.

  • Since the weights of the recurrent layer are duplicated, recurrent networks are also subject to another phenomenon called exploding gradient: an error gradient whose norm is greater than 1.

A simple and effective way to avoid this is to test this norm and cap it when it is too large (also known as gradient clipping).
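Gradient clipping itself is a one-line rescaling; a minimal NumPy sketch (in Keras the same effect is obtained with the `clipnorm` or `clipvalue` argument of the optimizers):

```python
import numpy as np

def clip_gradient(grad, max_norm=1.0):
    """Rescale the gradient whenever its norm exceeds max_norm."""
    norm = np.linalg.norm(grad)
    if norm > max_norm:
        grad = grad * (max_norm / norm)
    return grad

g = clip_gradient(np.array([3.0, 4.0]), max_norm=1.0)  # norm 5 -> rescaled to norm 1
```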

6.5.4. The LSTM neuron: Long Short-Term Memory#

To model very long-term dependencies, recurrent neural networks must be given the ability to maintain a state over a long period of time.

This is the purpose of LSTM (Long Short-Term Memory) cells, which have an internal memory called the cell. The cell makes it possible to maintain a state for as long as necessary; it consists of a numerical value that the network can control depending on the situation.

../../_images/RNN_LSTM.png

the memory cell is controlled by three gates, which can be seen as valves:

  • the input gate decides whether the input should modify the content of the cell

  • the forget gate decides whether the content of the cell should be reset to 0

  • the output gate decides whether the content of the cell should influence the neuron's output

The mechanism of the three gates is strictly similar. The opening/closing of the valve is modeled by an activation function f, generally a sigmoid, applied to the weighted sum of the inputs, the outputs and the cell state, with gate-specific weights.

To compute the output \(y^t\), we therefore use the input \(x^t\), the hidden states \(h^{t-1}\) (\(x^{t-1},x^{t-2}\)) (unfolding of the recurrence), which represent the short-term memory, and the memory-cell states \(c^{t-1}\), which represent the long-term memory.
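The three gates can be written out explicitly for a single LSTM cell with scalar state (a minimal NumPy sketch; the weights in `W` are arbitrary, and real implementations vectorize this over whole layers):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W):
    """One LSTM step: each gate is a sigmoid of a weighted sum."""
    i = sigmoid(W['wi']*x + W['ri']*h_prev + W['bi'])  # input gate
    f = sigmoid(W['wf']*x + W['rf']*h_prev + W['bf'])  # forget gate
    o = sigmoid(W['wo']*x + W['ro']*h_prev + W['bo'])  # output gate
    g = np.tanh(W['wg']*x + W['rg']*h_prev + W['bg'])  # candidate value
    c = f*c_prev + i*g                                 # cell state: long-term memory
    h = o*np.tanh(c)                                   # output: short-term memory
    return h, c

W = {k: 0.5 for k in ('wi','ri','bi','wf','rf','bf','wo','ro','bo','wg','rg','bg')}
h, c = 0.0, 0.0
for x in np.sin(np.arange(10)*0.5):
    h, c = lstm_step(x, h, c, W)
```

Note that each of the three gates and the candidate value has its own set of (input, recurrent, bias) weights, which is what makes LSTM layers so much heavier than simple recurrent layers.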

Like any neuron, LSTM neurons are generally used in layers. In that case, the outputs of all the neurons are fed back as inputs to all the neurons.

Given all the connections needed to drive the memory cell, LSTM layers are several times "heavier" than simple recurrent layers (four sets of weights instead of one), which are themselves heavier than classical dense layers.

LSTM layers should therefore be used sparingly!

6.6. Implementation#

6.6.1. RandomForest training#

  • scikit-learn

from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor
from sklearn.neighbors import KNeighborsRegressor
from sklearn.metrics   import r2_score
# choice of the algorithm
clf = RandomForestRegressor()
#clf = KNeighborsRegressor()
#clf = LinearRegression()
Xlearn = X.copy()
ylearn = y[:,0]
clf.fit(Xlearn,ylearn)
RandomForestRegressor()
print("score = {:2d}%".format(int(100*clf.score(Xlearn, ylearn))))
yp = clf.predict(Xlearn)
print("R2 = {:4.2f}".format(r2_score(ylearn,yp)))
score = 99%
R2 = 1.00
def plot_pred():
    plt.figure(figsize=(10,6))
    plt.plot(Ts[t2:t2+nap],ypred,'x')
    plt.plot(Ts[t2-nav:t2],Xpred[0],'--o')
    plt.plot(Ts[t2-nav:t2+nap],ys[t2-nav:t2+nap],'--')
    plt.xlabel("day")
    plt.title(f"prediction over {nap} days starting from day {t2}");
    return
# prediction starting from t2
t2 = t0 
Xpred  = np.array([ys[t2-nav:t2]])
ypred  = np.zeros(nap)
Xp     = Xpred.copy()
ypred[0] = clf.predict(Xp)[0]
# recursive prediction: shift the window and append the previous predictions
for i in range(1,nap):
    Xp[0,:-i] = Xpred[0,i:]
    Xp[0,-i:] = ypred[:i]
    ypred[i] = clf.predict(Xp)[0]
Xpred.shape, ypred.shape
((1, 14), (7,))
plot_pred()
../../_images/79f480d398aa718a733d769ad75cd2cabf127fd10181be7eaea7c6f649cc6515.png

6.6.2. LSTM RNN implementation#

  • TensorFlow Keras RNN library

#Machine learning
from sklearn import preprocessing
import tensorflow as tf
import statsmodels as st
from statsmodels.tsa.seasonal import STL
from sklearn.model_selection  import train_test_split
Xlearn = X.copy()
ylearn = y.copy()
Xlearn = Xlearn.reshape(X.shape[0], nav, 1)
ylearn = ylearn.reshape(y.shape[0], nap, 1)
Xlearn.shape, ylearn.shape
((400, 14, 1), (400, 7, 1))
# number of training epochs (windows of size nav)
#EPOQUE = 300
EPOQUE = 200
#EPOQUE = 50
# neural-network model: 4 layers (nav, nav, nap, nap), the first one an LSTM
# without activation: activation='linear' (identity a(x)=x); otherwise try 'relu'
modele_lstm = tf.keras.models.Sequential([
    tf.keras.layers.LSTM(nav),
    tf.keras.layers.Dense(nav,activation='tanh'),
    tf.keras.layers.Dense(nap,activation='tanh'),
    tf.keras.layers.Dense(nap)
])
# model configuration (least-squares loss, i.e. mse)
modele_lstm.compile(optimizer='adam', metrics=['mae'], loss='mse')
print(EPOQUE)
200
# train the model
import time
time_start = time.time()
modele_lstm.fit(Xlearn, ylearn, epochs=EPOQUE, verbose = True)
print('training phase: {:.2f} seconds'.format(time.time()-time_start))
Epoch 1/200
13/13 ━━━━━━━━━━━━━━━━━━━━ 3s 8ms/step - loss: 0.7411 - mae: 0.7282 
Epoch 2/200
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.6075 - mae: 0.6547 
...
Epoch 121/200
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0177 - mae: 0.1066 
Epoch 122/200
 6/13 ━━━━━━━━━━━━━━━━━━━━ 0s 10ms/step - loss: 0.0158 - mae: 0.0983 

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 8ms/step - loss: 0.0167 - mae: 0.1018 
Epoch 123/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0167 - mae: 0.1008

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0178 - mae: 0.1061 
Epoch 124/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0204 - mae: 0.1121

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0178 - mae: 0.1056 
Epoch 125/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0155 - mae: 0.0993

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0170 - mae: 0.1035 
Epoch 126/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0191 - mae: 0.1112

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0180 - mae: 0.1072 
Epoch 127/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 1s 101ms/step - loss: 0.0149 - mae: 0.0990

 6/13 ━━━━━━━━━━━━━━━━━━━━ 0s 11ms/step - loss: 0.0166 - mae: 0.1042 

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 9ms/step - loss: 0.0171 - mae: 0.1051 
Epoch 128/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0143 - mae: 0.0938

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0169 - mae: 0.1039 
Epoch 129/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0127 - mae: 0.0879

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0163 - mae: 0.1012 
Epoch 130/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0154 - mae: 0.0889

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0177 - mae: 0.1034 
Epoch 131/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0167 - mae: 0.1020

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0181 - mae: 0.1064 
Epoch 132/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0231 - mae: 0.1225

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0191 - mae: 0.1097 
Epoch 133/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0196 - mae: 0.1045

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0175 - mae: 0.1039 
Epoch 134/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0163 - mae: 0.1039

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0165 - mae: 0.1024 
Epoch 135/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0176 - mae: 0.1088

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0169 - mae: 0.1039 
Epoch 136/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0190 - mae: 0.1099

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0172 - mae: 0.1042 
Epoch 137/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0168 - mae: 0.1032

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0166 - mae: 0.1027 
Epoch 138/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0202 - mae: 0.1126

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0182 - mae: 0.1071 
Epoch 139/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.0151 - mae: 0.0988

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0174 - mae: 0.1057 
Epoch 140/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0164 - mae: 0.1010

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0181 - mae: 0.1067 
Epoch 141/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0215 - mae: 0.1195

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0191 - mae: 0.1104 
Epoch 142/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0191 - mae: 0.1100

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0186 - mae: 0.1085 
Epoch 143/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 1s 100ms/step - loss: 0.0179 - mae: 0.1092

 6/13 ━━━━━━━━━━━━━━━━━━━━ 0s 10ms/step - loss: 0.0189 - mae: 0.1112 

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 8ms/step - loss: 0.0192 - mae: 0.1106 
Epoch 144/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0167 - mae: 0.1044

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0189 - mae: 0.1095 
Epoch 145/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0187 - mae: 0.1134

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0177 - mae: 0.1066 
Epoch 146/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0196 - mae: 0.1143

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0190 - mae: 0.1105 
Epoch 147/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0183 - mae: 0.1075

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0171 - mae: 0.1032 
Epoch 148/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0173 - mae: 0.1083

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0175 - mae: 0.1057 
Epoch 149/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0209 - mae: 0.1155

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0186 - mae: 0.1092 
Epoch 150/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0205 - mae: 0.1101

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0181 - mae: 0.1059 
Epoch 151/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0188 - mae: 0.1097

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0169 - mae: 0.1035 
Epoch 152/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0172 - mae: 0.1069

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0170 - mae: 0.1047 
Epoch 153/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 1s 103ms/step - loss: 0.0188 - mae: 0.1145

 6/13 ━━━━━━━━━━━━━━━━━━━━ 0s 10ms/step - loss: 0.0180 - mae: 0.1086 

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 10ms/step - loss: 0.0170 - mae: 0.1048
Epoch 154/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0142 - mae: 0.0969

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0165 - mae: 0.1026 
Epoch 155/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0142 - mae: 0.0925

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0154 - mae: 0.0971 
Epoch 156/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0222 - mae: 0.1184

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0166 - mae: 0.1021 
Epoch 157/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0140 - mae: 0.0955

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0154 - mae: 0.0989 
Epoch 158/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0142 - mae: 0.0971

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0158 - mae: 0.1006 
Epoch 159/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0137 - mae: 0.0909

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0170 - mae: 0.1032 
Epoch 160/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0109 - mae: 0.0820

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0148 - mae: 0.0959 
Epoch 161/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0157 - mae: 0.1031

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0154 - mae: 0.0992 
Epoch 162/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0129 - mae: 0.0932

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0153 - mae: 0.0987 
Epoch 163/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0213 - mae: 0.1136

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0163 - mae: 0.1010 
Epoch 164/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0189 - mae: 0.1095

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0159 - mae: 0.1000 
Epoch 165/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0171 - mae: 0.1026

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0157 - mae: 0.0993 
Epoch 166/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0160 - mae: 0.1006

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0152 - mae: 0.0983 
Epoch 167/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0153 - mae: 0.1019

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0165 - mae: 0.1021 
Epoch 168/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0148 - mae: 0.0991

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0169 - mae: 0.1038 
Epoch 169/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0196 - mae: 0.1130

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0173 - mae: 0.1039 
Epoch 170/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0169 - mae: 0.1060

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0158 - mae: 0.0999 
Epoch 171/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0157 - mae: 0.0989

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0157 - mae: 0.0995 
Epoch 172/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0138 - mae: 0.0903

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0159 - mae: 0.0983 
Epoch 173/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0189 - mae: 0.1032

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0158 - mae: 0.0983 
Epoch 174/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0162 - mae: 0.1002

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0153 - mae: 0.0979 
Epoch 175/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 1s 103ms/step - loss: 0.0143 - mae: 0.0959

 9/13 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.0142 - mae: 0.0947  

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 7ms/step - loss: 0.0146 - mae: 0.0959
Epoch 176/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0125 - mae: 0.0890

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0136 - mae: 0.0924 
Epoch 177/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0193 - mae: 0.1138

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0164 - mae: 0.1012 
Epoch 178/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0123 - mae: 0.0896

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0153 - mae: 0.0971 
Epoch 179/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 1s 99ms/step - loss: 0.0112 - mae: 0.0839

 6/13 ━━━━━━━━━━━━━━━━━━━━ 0s 10ms/step - loss: 0.0132 - mae: 0.0917

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 9ms/step - loss: 0.0146 - mae: 0.0957 
Epoch 180/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 28ms/step - loss: 0.0147 - mae: 0.0987

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0151 - mae: 0.0966 
Epoch 181/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0164 - mae: 0.1024

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0160 - mae: 0.1009 
Epoch 182/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.0145 - mae: 0.0982

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0165 - mae: 0.1017 
Epoch 183/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0144 - mae: 0.0961

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0145 - mae: 0.0964 
Epoch 184/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0134 - mae: 0.0946

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0144 - mae: 0.0956 
Epoch 185/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0178 - mae: 0.1080

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0161 - mae: 0.1013 
Epoch 186/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 1s 101ms/step - loss: 0.0150 - mae: 0.0948

 7/13 ━━━━━━━━━━━━━━━━━━━━ 0s 9ms/step - loss: 0.0145 - mae: 0.0955  

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 9ms/step - loss: 0.0145 - mae: 0.0957
Epoch 187/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.0093 - mae: 0.0780

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0142 - mae: 0.0937 
Epoch 188/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0154 - mae: 0.0903

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0158 - mae: 0.0975 
Epoch 189/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0251 - mae: 0.1291

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0166 - mae: 0.1019 
Epoch 190/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0155 - mae: 0.0992

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0145 - mae: 0.0962 
Epoch 191/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.0141 - mae: 0.0950

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0150 - mae: 0.0977 
Epoch 192/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0123 - mae: 0.0881

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0141 - mae: 0.0939 
Epoch 193/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 1s 99ms/step - loss: 0.0155 - mae: 0.0963

 7/13 ━━━━━━━━━━━━━━━━━━━━ 0s 9ms/step - loss: 0.0141 - mae: 0.0946 

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 8ms/step - loss: 0.0143 - mae: 0.0955
Epoch 194/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0166 - mae: 0.1068

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0150 - mae: 0.0972 
Epoch 195/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0180 - mae: 0.1060

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0158 - mae: 0.0993 
Epoch 196/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0177 - mae: 0.1024

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0151 - mae: 0.0965 
Epoch 197/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0103 - mae: 0.0821

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0131 - mae: 0.0903 
Epoch 198/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.0225 - mae: 0.1215

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0162 - mae: 0.1014 
Epoch 199/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0121 - mae: 0.0866

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0146 - mae: 0.0958 
Epoch 200/200
 1/13 ━━━━━━━━━━━━━━━━━━━ 1s 101ms/step - loss: 0.0106 - mae: 0.0825

 6/13 ━━━━━━━━━━━━━━━━━━━━ 0s 10ms/step - loss: 0.0130 - mae: 0.0911 

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 9ms/step - loss: 0.0139 - mae: 0.0935 

13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 10ms/step - loss: 0.0139 - mae: 0.0936
phase apprentissage: 18.77 seconds
modele_lstm.summary()
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ lstm (LSTM)                     │ (None, 14)             │           896 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 14)             │           210 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense_1 (Dense)                 │ (None, 7)              │           105 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense_2 (Dense)                 │ (None, 7)              │            56 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 3,803 (14.86 KB)
 Trainable params: 1,267 (4.95 KB)
 Non-trainable params: 0 (0.00 B)
 Optimizer params: 2,536 (9.91 KB)
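The parameter counts reported by `summary()` can be checked by hand, assuming the standard Keras LSTM parameterisation (4 gates, each with input weights, recurrent weights and a bias). A minimal sketch in pure Python:

```python
# LSTM layer: 4 gates x (input weights + recurrent weights + bias)
units, n_features = 14, 1
lstm_params = 4 * (units * (n_features + units) + units)   # 4 * (14*15 + 14) = 896

# Dense layer: weight matrix + bias vector
def dense_params(n_in, n_out):
    return n_in * n_out + n_out

total = (lstm_params
         + dense_params(14, 14)   # dense   : 210
         + dense_params(14, 7)    # dense_1 : 105
         + dense_params(7, 7))    # dense_2 :  56

print(lstm_params, total)   # 896 1267 trainable parameters
```

This matches the 1,267 trainable parameters above; the 2,536 optimizer parameters are consistent with Adam keeping two moment estimates per trainable weight.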
ypred = modele_lstm.predict(Xlearn, verbose=True)
print(Xlearn.shape, ypred.shape)
# reshape the targets to (n_samples, nap) to match the prediction shape
Ylearn = ylearn.reshape(ylearn.shape[0], nap)
print("R2 score {:.2f}".format(r2_score(Ylearn, ypred)))
print("model evaluate loss/mae")
modele_lstm.evaluate(Xlearn, ylearn)
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 10ms/step
(400, 14, 1) (400, 7)
R2 score 0.98
model evaluate loss/mae
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 3ms/step - loss: 0.0162 - mae: 0.1017
[0.015241777524352074, 0.09779181331396103]
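The R² score reported above follows the usual definition: one minus the ratio of residual to total variance. A minimal numpy sketch of this formula on hypothetical data (note that `sklearn.metrics.r2_score` averages per-output scores on multi-output targets, so values can differ slightly from this flattened version):

```python
import numpy as np

def r2(y_true, y_pred):
    # 1 - SS_res / SS_tot
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

y_true = np.array([1.0, 2.0, 3.0, 4.0])
y_pred = np.array([1.1, 1.9, 3.2, 3.8])
print(round(r2(y_true, y_pred), 3))   # 0.98
```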
# prediction starting from t2 (here t2 = t0): the last nav values feed the model
t2 = t0
Xpred = np.array([ys[t2-nav:t2]]).reshape(1, nav, 1)
ypred = modele_lstm.predict(Xpred, verbose=True)
print(Xpred.shape,ypred.shape)
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 24ms/step
(1, 14, 1) (1, 7)
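The shapes above come from a sliding window over the series: each sample uses `nav = 14` past values to predict the next `nap = 7`. A minimal sketch of such a windowing step, with a hypothetical `make_windows` helper applied to a toy 1-D series:

```python
import numpy as np

def make_windows(ys, nav, nap):
    """Build (X, y) pairs: nav past values -> nap future values."""
    X, y = [], []
    for t in range(nav, len(ys) - nap + 1):
        X.append(ys[t - nav:t])
        y.append(ys[t:t + nap])
    # the LSTM expects inputs of shape (samples, time steps, features)
    return np.array(X).reshape(-1, nav, 1), np.array(y)

ys_demo = np.arange(30, dtype=float)
X, y = make_windows(ys_demo, nav=14, nap=7)
print(X.shape, y.shape)   # (10, 14, 1) (10, 7)
```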
Xpred = Xpred.reshape(1,nav,)
ypred = ypred.reshape(nap)
plot_pred()
(figure: output of plot_pred — LSTM prediction over the nap-step horizon)

6.7. Bibliography#

6.8. End#