6. Time series analysis with AI#
Marc Buffat, Mechanics Department, UCB Lyon 1
import tensorflow as tf
(TensorFlow start-up messages: CUDA drivers not found, the GPU will not be used; the binary is optimised for the available CPU instructions.)
%matplotlib inline
import numpy as np
import matplotlib.pyplot as plt
# police des titres
plt.rc('font', family='serif', size='18')
from IPython.display import display,Markdown
# AI / machine learning libraries
import sklearn as sk
import tensorflow as tf
_uid_ = 12345
def serie_temp(N,a0=1.0,a1=0.5,a2 = 0.4, a3=0.1):
    # daily data
    np.random.seed(_uid_)
    # time series
    Ts = np.array([x for x in np.arange(N)],dtype=int)
    ys = [ a0*np.sin(2*np.pi*x/180) + a1*np.cos(2*np.pi*x/15) \
          + a2*x/360 for x in range(N)] + \
          a3*np.random.normal(size=N,scale=0.2)
    return Ts,ys
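Written out, the series generated by serie_temp is the sum of a long-period sine, a short-period cosine, a linear trend and a Gaussian noise:

$$y(t) = a_0 \sin\!\left(\frac{2\pi t}{180}\right) + a_1 \cos\!\left(\frac{2\pi t}{15}\right) + a_2\,\frac{t}{360} + a_3\,\varepsilon_t, \qquad \varepsilon_t \sim \mathcal{N}(0,\,0.2^2)$$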
6.1. Objectives#
We study a time-dependent system.
A time series $Y_t$ is commonly decomposed into trend, seasonality and noise (illustrated by the sketch after this list):
trend = long-term evolution
seasonality = periodic phenomenon
noise = random part
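As an illustration, here is a minimal sketch of such a decomposition using the STL routine of statsmodels (imported further down in this notebook); period=180 is an assumption matching the long cycle of serie_temp:

from statsmodels.tsa.seasonal import STL

# additive decomposition Y_t = trend + seasonal + residual of the synthetic series
Ts, ys = serie_temp(1000, a0=1.0, a1=0.5, a2=0.2, a3=0.3)
res = STL(ys, period=180).fit()   # period=180: assumed long cycle of the series
res.plot();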
6.1.1. Methods#
classical methods (modelling of linear time series):
exponential smoothing,
regression models (linear regression, non-parametric models, …),
SARIMA models (a sketch is given after this list)
using AI:
random forest,
recurrent LSTM neural networks
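For comparison, a minimal sketch of one of these classical approaches, a seasonal ARIMA fit with statsmodels; the orders below are arbitrary illustrative choices, not tuned values:

from statsmodels.tsa.statespace.sarimax import SARIMAX

# illustrative SARIMA fit on the synthetic series (orders are arbitrary, not tuned)
Ts, ys = serie_temp(1000, a0=1.0, a1=0.5, a2=0.2, a3=0.3)
sarima = SARIMAX(ys, order=(1, 1, 1), seasonal_order=(1, 0, 1, 15)).fit(disp=False)
ypred_sarima = sarima.forecast(steps=7)   # forecast the next 7 days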
6.2. Data generation#
Time series
N measurements at regular intervals
data array ys
time array Ts (for the analysis)
tests:
simple periodic series
bi-periodic series (modulation)
with a long-term trend
with noise
# build the time series
# simplest periodic case
Ts,ys = serie_temp(1000,a0=0,a1=0.5,a2=0.0,a3 = 0.)
# bi-periodic case
#Ts,ys = serie_temp(1000,a0=1.0,a1=0.5,a2=0.0,a3=0.0)
# + trend
#Ts,ys = serie_temp(1000,a0=1.0,a1=0.5,a2=0.2,a3=0.0)
# + noise
Ts,ys = serie_temp(1000,a0=1.0,a1=0.5,a2=0.2,a3=0.3)
plt.figure(figsize=(12,8))
plt.subplot(1,2,1)
plt.plot(Ts[:],ys)
plt.xlabel("jour")
plt.title("serie temporelle");
plt.subplot(1,2,2)
plt.plot(Ts[:100],ys[:100])
plt.xlabel("jour")
Text(0.5, 0, 'jour')

6.3. Data preparation#
windowing of the data:
choose a window of the nav previous days to predict nap values (i.e. over nap days)
nav: size of the history window (before)
nap: size of the prediction window (after)
N: number of windows
t0: starting date of the prediction
def dataset(Ts,ys,nav,nap,N,t0):
    # choose a window of the nav previous days to predict nap values (i.e. over nap days)
    # nav size of the history window (before)
    # nap size of the prediction window (after)
    # N   number of windows
    # t0  starting date of the prediction
    #
    t1 = t0 - N - nav - nap
    print(f"apprentissage sur {N} fenetres de {nav}-{nap} jours entre le jour {t1} et {t0}")
    #
    X = np.zeros((N,nav))
    y = np.zeros((N,nap))
    t = np.zeros(N,dtype=int)
    # build the dataset
    for i in range(N):
        X[i,:] = ys[t1+i:t1+i+nav]
        y[i]   = ys[t1+i+nav:t1+i+nav+nap]
        t[i]   = Ts[t1+i+nav]
    return X,y,t
# N windows of 14 days -> 7 days of prediction starting from day t0
nav = 14
nap = 7
#N = 200
#t0 = 300
N = 400
t0 = 600
X,y,t = dataset(Ts,ys,nav,nap,N,t0)
apprentissage sur 400 fenetres de 14-7 jours entre le jour 179 et 600
X.shape, y.shape, t.shape
((400, 14), (400, 7), (400,))
def plot_dataset():
    plt.figure(figsize=(14,6))
    plt.subplot(1,2,1)
    plt.plot(t-nav,X[:,0])
    plt.plot(t,y[:,0])
    plt.xlabel("jour")
    plt.ylabel("y")
    plt.title("data apprentissage")
    plt.subplot(1,2,2)
    plt.plot(np.arange(t[0]-nav,t[0]+nap),ys[t[0]-nav:t[0]+nap],'--')
    plt.plot(np.arange(t[0]-nav,t[0]),X[0,:],'or')
    plt.plot(np.arange(t[0],t[0]+nap),y[0,:],'xg')
    plt.plot(np.arange(t[-1]-nav,t[-1]+nap),ys[t[-1]-nav:t[-1]+nap],'--')
    plt.plot(np.arange(t[-1]-nav,t[-1]),X[-1,:],'or')
    plt.plot(np.arange(t[-1],t[-1]+nap),y[-1,:],'xg')
    plt.xlabel("jour")
    plt.title("first/last window");
    return
plot_dataset()

6.4. Scikit Learn RandomForest#
a “random forest” of decision trees
predicts one value at a time (a direct multi-output alternative is sketched below)
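Since scikit-learn's RandomForestRegressor also accepts a 2-D target, a direct multi-output fit is possible instead of the recursive one-value-at-a-time scheme used later; a minimal sketch, assuming the X, y windows built above:

from sklearn.ensemble import RandomForestRegressor

# direct multi-output fit: predict the nap future values in a single call
clf_multi = RandomForestRegressor()
clf_multi.fit(X, y)                  # y has shape (N, nap)
ypred7 = clf_multi.predict(X[:1])    # shape (1, nap): the 7 days at once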
6.5. Neural networks: LSTM / RNN#
LSTM = Long Short-Term Memory
recurrent RNN network
activation function: prevents the output from blowing up (tanh)
numerical gradient-descent method (learning rate $\eta$)
EPOCH = number of epochs for training
The number of epochs is a hyperparameter that defines how many times the learning algorithm works through the entire training data set.
Model of a computational neuron

the output $y = f\left(\sum_i w_i x_i + b\right)$
the coefficients: the weights $w_i$ and the bias $b$
Neural network organised in layers

Recurrent neural network (processing of temporal sequences)

6.5.1. RNN networks#
6.5.2. The problem of training a recurrent network#
a classical simple recurrent network, made of a recurrent layer followed by a dense layer:

It contains three weight matrices: W, R and V, where R is the matrix of recurrent weights. Training the network therefore amounts to learning these three matrices from a base of labelled examples.
The gradient-based minimisation algorithm for neural networks relies on the so-called backpropagation algorithm, which propagates the gradient of the error backwards through the successive weight layers of the network, from the last layer down to the first.
Unfortunately, in the case of recurrent networks, the presence of the recurrence cycle (matrix R) prevents the direct use of this algorithm.
6.5.3. Solution: backpropagation through time#
The solution to this problem is to work with the unrolled version of the network, which eliminates the cycles.
We therefore approximate the recurrent network by a network unrolled K times (K = depth = number of internal hidden copies, typically 10 to 100), as shown on the following figure for K = 2:

Attention
Since the unrolled network is deeper, the vanishing-gradient phenomenon is stronger during training, and the network is harder to train because the error signal tends to vanish as it approaches the lower layers.
It is therefore important to use every available strategy against this phenomenon: Batch Normalization, dropout, L1 and L2 regularisation, etc.
Because the weights of the recurrent layer are duplicated, recurrent networks are also prone to another phenomenon called exploding gradients, i.e. an error gradient whose norm is larger than 1 and keeps growing as it is propagated.
A simple and effective way to avoid this is to monitor this norm and to cap it when it becomes too large (this is known as gradient clipping); see the sketch below.
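A minimal Keras sketch of these two safeguards, using the layer sizes of this notebook (modele_reg is an illustrative variant, not the model trained below; the dropout rate and clipping norm are arbitrary values):

# dropout between layers + gradient clipping in the optimizer (illustrative values)
modele_reg = tf.keras.models.Sequential([
    tf.keras.layers.LSTM(nav),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(nap)
])
modele_reg.compile(optimizer=tf.keras.optimizers.Adam(clipnorm=1.0), loss='mse', metrics=['mae'])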
6.5.4. The LSTM neuron: Long Short Term Memory#
In order to model very long-term dependencies, recurrent neural networks must be given the ability to maintain a state over a long period of time.
This is the purpose of LSTM (Long Short Term Memory) cells, which have an internal memory called the cell. The cell can hold a state for as long as necessary; it consists of a numerical value that the network can drive depending on the situation.

the memory cell is driven by three control gates, which can be seen as valves:
the input gate decides whether the input should modify the content of the cell
the forget gate decides whether the content of the cell should be reset to 0
the output gate decides whether the content of the cell should influence the output of the neuron
The mechanism of the three gates is strictly identical. The opening/closing of the valve is modelled by an activation function f, generally a sigmoid, applied to a weighted sum of the inputs, the outputs and the cell, with gate-specific weights.
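For reference, a standard textbook formulation of these gates (sigmoid gates $\sigma$, tanh activations, $\odot$ the element-wise product; this notation is an assumption, not taken from this notebook) is:

$$\begin{aligned}
i_t &= \sigma(W_i x_t + R_i h_{t-1} + b_i) && \text{input gate}\\
f_t &= \sigma(W_f x_t + R_f h_{t-1} + b_f) && \text{forget gate}\\
o_t &= \sigma(W_o x_t + R_o h_{t-1} + b_o) && \text{output gate}\\
c_t &= f_t \odot c_{t-1} + i_t \odot \tanh(W_c x_t + R_c h_{t-1} + b_c) && \text{cell update}\\
h_t &= o_t \odot \tanh(c_t) && \text{output}
\end{aligned}$$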
To compute the output, the content of the cell is passed through a tanh and scaled by the output gate (last equation above).
Like any other neuron, LSTM neurons are generally used in layers. In that case, the outputs of all the neurons are fed back as inputs to all the neurons.
Given all the connections needed to drive the memory cell, LSTM layers are roughly twice as “heavy” as simple recurrent layers, which are themselves roughly twice as heavy as classical dense layers.
LSTM layers should therefore be used sparingly!
6.6. Implementation#
6.6.1. RandomForest training#
scikit-learn
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor
from sklearn.neighbors import KNeighborsRegressor
from sklearn.metrics import r2_score
# choice of the algorithm
clf = RandomForestRegressor()
#clf = KNeighborsRegressor()
#clf = LinearRegression()
Xlearn = X.copy()
ylearn = y[:,0]
clf.fit(Xlearn,ylearn)
RandomForestRegressor()
print("score = {:2d}%".format(int(100*clf.score(Xlearn, ylearn))))
yp = clf.predict(Xlearn)
print("R2 = {:3.2f}%".format(r2_score(ylearn,yp)))
score = 99%
R2 = 1.00
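Note that this R2 is computed on the very windows used for the fit; a quick, less optimistic estimate is obtained by holding out the last windows (a chronological split). A minimal sketch, assuming the X, y arrays built above:

from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score

# chronological hold-out: the last 25% of the windows serve as a test set
X_tr, X_te, y_tr, y_te = train_test_split(X, y[:,0], test_size=0.25, shuffle=False)
clf_cv = RandomForestRegressor().fit(X_tr, y_tr)
print("R2 (held-out windows) = {:3.2f}".format(r2_score(y_te, clf_cv.predict(X_te))))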
def plot_pred():
    plt.figure(figsize=(10,6))
    plt.plot(Ts[t2:t2+nap],ypred,'x')
    plt.plot(Ts[t2-nav:t2],Xpred[0],'--o')
    plt.plot(Ts[t2-nav:t2+nap],ys[t2-nav:t2+nap],'--')
    plt.xlabel("jour")
    plt.title(f"prediction sur {nap} jours à partir du jour {t2}");
    return
# prediction starting from t2
t2 = t0
Xpred = np.array([ys[t2-nav:t2]])
ypred = np.zeros(nap)
Xp = Xpred.copy()
ypred[0] = clf.predict(Xp)[0]
# recursive prediction: shift the window by one day and re-inject the previous predictions
for i in range(1,nap):
    Xp[0,:-i] = Xpred[0,i:]
    Xp[0,-i:] = ypred[:i]
    ypred[i] = clf.predict(Xp)[0]
Xpred.shape, ypred.shape
((1, 14), (7,))
plot_pred()

6.6.2. LSTM RNN implementation#
TensorFlow Keras RNN library
#Machine learning
from sklearn import preprocessing
import tensorflow as tf
import statsmodels as st
from statsmodels.tsa.seasonal import STL
from sklearn.model_selection import train_test_split
Xlearn = X.copy()
ylearn = y.copy()
Xlearn = Xlearn.reshape(X.shape[0], nav, 1)
ylearn = ylearn.reshape(y.shape[0], nap, 1)
Xlearn.shape, ylearn.shape
((400, 14, 1), (400, 7, 1))
# number of training epochs (windows of size nav)
#EPOQUE = 300
EPOQUE = 200
#EPOQUE = 50
# neural-network model: 4 layers (LSTM(nav), Dense(nav), Dense(nap), Dense(nap)), the first one recurrent (LSTM)
# without an explicit activation a Dense layer uses activation='linear', i.e. a(x)=x; otherwise test with 'relu'
modele_lstm = tf.keras.models.Sequential([
    tf.keras.layers.LSTM(nav),
    tf.keras.layers.Dense(nav,activation='tanh'),
    tf.keras.layers.Dense(nap,activation='tanh'),
    tf.keras.layers.Dense(nap)
])
# model configuration (least-squares loss minimised with the Adam optimizer)
modele_lstm.compile(optimizer='adam', metrics=['mae'], loss='mse')
print(EPOQUE)
200
(TensorFlow warning: some GPU libraries could not be loaded, GPU devices are skipped; training runs on the CPU.)
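Rather than fixing the number of epochs by hand, Keras can stop the training when the loss stops improving; a minimal sketch with the standard EarlyStopping callback (patience=10 is an arbitrary choice), to be passed to fit below if desired:

# optional: stop the training once the loss has not improved for 10 epochs
early_stop = tf.keras.callbacks.EarlyStopping(monitor='loss', patience=10, restore_best_weights=True)
#modele_lstm.fit(Xlearn, ylearn, epochs=EPOQUE, callbacks=[early_stop], verbose=True)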
# run the training of the model
import time
time_start = time.time()
modele_lstm.fit(Xlearn, ylearn, epochs=EPOQUE, verbose = True)
print('phase apprentissage: {:.2f} seconds'.format(time.time()-time_start))
Epoch 1/200
13/13 ━━━━━━━━━━━━━━━━━━━━ 2s 4ms/step - loss: 0.7459 - mae: 0.7292
Epoch 2/200
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.6000 - mae: 0.6464
...
Epoch 100/200
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.0229 - mae: 0.1208
...
Epoch 199/200
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 9ms/step - loss: 0.0155 - mae: 0.0987
Epoch 200/200
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0187 - mae: 0.1079
(training log truncated: over the 200 epochs the loss decreases from about 0.75 to about 0.02 and the mae from about 0.73 to about 0.10)
phase apprentissage: 18.36 seconds
modele_lstm.summary()
Model: "sequential"
Layer (type)                 Output Shape            Param #
lstm (LSTM)                  (None, 14)              896
dense (Dense)                (None, 14)              210
dense_1 (Dense)              (None, 7)               105
dense_2 (Dense)              (None, 7)               56
Total params: 3,803 (14.86 KB)
Trainable params: 1,267 (4.95 KB)
Non-trainable params: 0 (0.00 B)
Optimizer params: 2,536 (9.91 KB)
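These parameter counts can be checked by hand: an LSTM layer with n units and d input features has 4(n(n+d)+n) weights, and a dense layer mapping p inputs to q outputs has (p+1)q:

# sanity check of the parameter counts shown in the summary above
n, d = 14, 1                            # LSTM units and number of input features
print(4*(n*(n+d) + n))                  # 896  (lstm)
print((14+1)*14, (14+1)*7, (7+1)*7)     # 210 105 56 (dense, dense_1, dense_2)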
ypred = modele_lstm.predict(Xlearn, verbose=True)
print(Xlearn.shape,ypred.shape)
Ylearn = ylearn.reshape(ylearn.shape[0],nap,)
print("R2 score {:.2f}".format(r2_score(Ylearn, ypred)))
print("model evaluate loss/mae")
modele_lstm.evaluate(Xlearn,ylearn)
1/13 ━━━━━━━━━━━━━━━━━━━━ 1s 107ms/step
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 9ms/step
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 10ms/step
(400, 14, 1) (400, 7)
R2 score 0.97
model evaluate loss/mae
1/13 ━━━━━━━━━━━━━━━━━━━━ 2s 174ms/step - loss: 0.0207 - mae: 0.1164
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 3ms/step - loss: 0.0171 - mae: 0.1028
[0.01707562804222107, 0.10179268568754196]
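The R2 above is again measured on the training windows; to see how the network behaves on data it has not seen, one can build windows located after the training period with the same dataset function and evaluate on them. A minimal sketch (100 windows ending at day 900 is an arbitrary choice):

# evaluate on windows strictly after the training period (days > t0)
Xtest, ytest, ttest = dataset(Ts, ys, nav, nap, 100, 900)
Xtest = Xtest.reshape(Xtest.shape[0], nav, 1)
modele_lstm.evaluate(Xtest, ytest.reshape(ytest.shape[0], nap, 1))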
# prediction starting from t2
t2 = t0
Xpred = np.array([ys[t2-nav:t2]]).reshape(1,nav,1)
ypred = modele_lstm.predict(Xpred, verbose=True)
print(Xpred.shape,ypred.shape)
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 25ms/step
(1, 14, 1) (1, 7)
Xpred = Xpred.reshape(1,nav,)
ypred = ypred.reshape(nap)
plot_pred()
