9. Time-series analysis with AI#
Marc Buffat, Mechanical Engineering dept., UCB Lyon 1

import tensorflow as tf
%matplotlib inline
import numpy as np
import matplotlib.pyplot as plt
# title font
plt.rc('font', family='serif', size='18')
from IPython.display import display, Markdown
# AI
import sklearn as sk
import tensorflow as tf

_uid_ = 12345
def serie_temp(N, a0=1.0, a1=0.5, a2=0.4, a3=0.1):
    # daily data
    np.random.seed(_uid_)
    # time series
    Ts = np.arange(N, dtype=int)
    ys = [ a0*np.sin(2*np.pi*x/180) + a1*np.cos(2*np.pi*x/15)
           + a2*x/360 for x in range(N)] \
         + a3*np.random.normal(size=N, scale=0.2)
    return Ts, ys
9.1. Objectives#
We study a time-dependent system \(Y(t)\) and want to forecast its evolution, i.e. predict its future realizations from its past values.
A time series \(Y_t\) is commonly decomposed into trend, seasonality and noise:
trend \(T(t)\) = long-term evolution
seasonality \(S(t)\) = periodic phenomenon
noise \(\epsilon(t)\) = random part
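The additive decomposition above can be sketched with NumPy on a synthetic signal (the components below are illustrative, not the course data); a centered moving average over one season gives a crude trend estimate:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(360)                           # one year of daily samples
trend = 0.01 * t                             # T(t): slow linear drift
season = np.sin(2 * np.pi * t / 30)          # S(t): 30-day periodic component
noise = rng.normal(scale=0.1, size=t.size)   # eps(t): random part
y = trend + season + noise                   # additive model Y(t) = T(t) + S(t) + eps(t)

# crude trend estimate: moving average over one full period removes S(t)
kernel = np.ones(30) / 30
trend_est = np.convolve(y, kernel, mode="same")
```

Away from the edges, `trend_est` tracks `trend` closely because averaging over a full period cancels the seasonal term and shrinks the noise.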
9.1.1. Methods#
classical methods (linear time-series modeling):
exponential smoothing,
regression models (linear regression, non-parametric models, …),
SARIMA models
AI-based methods:
random forests,
LSTM recurrent neural networks
9.2. Scikit-Learn RandomForest#
a "random forest" of decision trees
predicts one value at a time

9.3. Neural networks: LSTM / RNN#
LSTM = Long Short-Term Memory
recurrent network (RNN)
activation function: keeps the output from blowing up (tanh)
numerical gradient method (\(\alpha\) = learning rate) \(w_{k+1} = w_k - \alpha \nabla_w F\)
EPOCH = number of training epochs
The number of epochs is a hyperparameter that defines how many times the learning algorithm runs through the whole training data set.
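The gradient update rule above can be illustrated on a toy quadratic loss (a minimal sketch, not the network's actual optimizer):

```python
import numpy as np

def grad_step(w, grad, alpha):
    """One gradient-descent update: w_{k+1} = w_k - alpha * grad F(w_k)."""
    return w - alpha * grad

# minimize F(w) = (w - 3)^2, whose gradient is grad F(w) = 2*(w - 3)
w = 0.0
alpha = 0.1          # learning rate
for _ in range(100): # 100 "epochs" over this one-sample problem
    w = grad_step(w, 2 * (w - 3.0), alpha)
# w converges to the minimizer w* = 3
```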
Model of a computational neuron
the output \(y\) is a nonlinear function of the inputs (f = activation function)
the coefficients \(w_i, b\) are obtained by minimizing an error \(Err = || y_{pred} - \hat{y} ||\) over a training data set \(\hat{y}\), using minimization (gradient) algorithms
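The neuron model just described (weighted sum of the inputs passed through a nonlinear activation) can be written directly; the weights below are arbitrary illustrative values:

```python
import numpy as np

def neuron(x, w, b, f=np.tanh):
    """Output y = f(w . x + b) of a single artificial neuron."""
    return f(np.dot(w, x) + b)

x = np.array([0.5, -1.0, 2.0])   # inputs
w = np.array([0.2, 0.4, 0.1])    # weights w_i (arbitrary values)
b = 0.05                         # bias
y = neuron(x, w, b)              # w.x + b = -0.05, so y = tanh(-0.05)
```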
Layered neural networks
Recurrent neural networks (for processing temporal sequences)
9.3.1. RNN networks#

9.3.2. Why training a recurrent network is difficult#
a classical simple recurrent network, made of a recurrent layer followed by a dense layer:
It involves three weight matrices: W, R and V, where R is the matrix of recurrent weights. Training the network therefore amounts to learning these three matrices from a base of labeled examples.
The gradient-based minimization algorithm for neural networks relies on error backpropagation: the gradient of the error is propagated backwards through the successive weight layers of the network, from the last layer up to the first.
Unfortunately, in recurrent networks, the recurrence cycle (matrix R) prevents a direct use of this algorithm.
9.3.3. Solution: backpropagation through time#
The solution is to work with the unrolled version of the network, which eliminates the cycles.
We therefore approximate the recurrent network by a network unrolled K times (K = depth = number of internal hidden layers, typically 10 to 100), as shown in the following figure for K=2:
Warning
Since the unrolled network is deeper, the vanishing-gradient problem is more severe during training, and the network is harder to train because the error tends to die out as it approaches the lower layers.
It is therefore important to use every available strategy against this phenomenon: batch normalization, dropout, L1 and L2 regularization, etc.
Because the weights of the recurrent layer are duplicated, recurrent networks are also prone to another phenomenon called exploding gradients: an error gradient whose norm grows above 1.
A simple and effective way to avoid this is to test this norm and cap it when it becomes too large (gradient clipping).
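Gradient clipping as described can be sketched in a few lines (a NumPy sketch; Keras exposes the same idea through the `clipnorm` argument of its optimizers):

```python
import numpy as np

def clip_by_norm(grad, max_norm=1.0):
    """Rescale the gradient whenever its norm exceeds max_norm (gradient clipping)."""
    norm = np.linalg.norm(grad)
    if norm > max_norm:
        return grad * (max_norm / norm)
    return grad

g = np.array([3.0, 4.0])      # norm 5 > 1, so it gets rescaled to norm 1
g_clipped = clip_by_norm(g)
```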
9.3.4. The LSTM neuron: Long Short-Term Memory#
To model very long-term dependencies, recurrent neural networks must be given the ability to maintain a state over a long period of time.
This is the purpose of LSTM (Long Short-Term Memory) cells, which have an internal memory called the cell. The cell can hold a state for as long as necessary; it consists of a numerical value that the network can control depending on the situation.
the memory cell is controlled by three gates, which can be seen as valves:
the input gate decides whether the input should modify the contents of the cell
the forget gate decides whether the contents of the cell should be reset to 0
the output gate decides whether the contents of the cell should influence the neuron's output
The mechanism is strictly the same for the three gates. The opening/closing of the valve is modeled by an activation function f, generally a sigmoid, applied to a weighted sum of the inputs, the outputs and the cell, with gate-specific weights.
To compute the output \(y^t\), we therefore use the input \(x^t\), the hidden states \(h^{t-1}\) (\(x^{t-1},x^{t-2}\)) (unrolling of the recurrence), which represent the short-term memory, and the memory-cell states \(c^{t-1}\), which represent the long-term memory.
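One time step of an LSTM cell, with its three sigmoid gates acting on the memory c, can be sketched as follows (scalar toy weights W, U, b, purely illustrative):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step with input (i), forget (f) and output (o) gates."""
    i = sigmoid(W["i"] * x + U["i"] * h_prev + b["i"])   # input gate
    f = sigmoid(W["f"] * x + U["f"] * h_prev + b["f"])   # forget gate
    o = sigmoid(W["o"] * x + U["o"] * h_prev + b["o"])   # output gate
    g = np.tanh(W["g"] * x + U["g"] * h_prev + b["g"])   # candidate value
    c = f * c_prev + i * g      # long-term memory (cell) update
    h = o * np.tanh(c)          # short-term memory / output
    return h, c

W = dict(i=0.5, f=0.5, o=0.5, g=0.5)
U = dict(i=0.1, f=0.1, o=0.1, g=0.1)
b = dict(i=0.0, f=1.0, o=0.0, g=0.0)   # forget bias near 1 keeps the memory by default
h, c = lstm_step(x=1.0, h_prev=0.0, c_prev=0.0, W=W, U=U, b=b)
```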
Like any other neurons, LSTM neurons are generally used in layers; in that case, the outputs of all the neurons are fed back as inputs to all the neurons.
Given all the connections needed to control the memory cell, LSTM layers are twice as "heavy" as simple recurrent layers, which are themselves twice as heavy as classical dense layers.
LSTM layers should therefore be used sparingly!
9.4. Application: analysis of a time series#
Time series \(Y = Y(t)\)
N measurements at a regular interval \(\Delta t\)
data array ys
\[ys[i] = Y(i\Delta t)\]
array ts (for the analysis)
\[ts[i] = i\Delta t\]
Test data sets:
a simple periodic series
a bi-periodic series (modulation)
with a long-term trend
with noise
# build the time series
# simplest periodic case
Ts,ys = serie_temp(1000,a0=0,a1=0.5,a2=0.0,a3=0.)
# bi-periodic case
#Ts,ys = serie_temp(1000,a0=1.0,a1=0.5,a2=0.0,a3=0.0)
# + trend
#Ts,ys = serie_temp(1000,a0=1.0,a1=0.5,a2=0.2,a3=0.0)
# + noise
Ts,ys = serie_temp(1000,a0=1.0,a1=0.5,a2=0.2,a3=0.3)
plt.figure(figsize=(12,8))
plt.subplot(1,2,1)
plt.plot(Ts[:],ys)
plt.xlabel("day")
plt.title("time series");
plt.subplot(1,2,2)
plt.plot(Ts[:100],ys[:100])
plt.xlabel("day");
9.4.1. Objectives#
daily data base: the time series
data preparation: "before : after" windowing
data X (before):
we take the data over the 14 previous days
target y (after):
we want to predict the data over the 7 following days
CAUTION: this differs from the classical y=F(X) approach, because the prediction is recursive!
AI-based methods:
random forests,
LSTM recurrent neural networks
9.5. Student notebook version#
hands-on
source/Cours3_Serie_temp/NotesCours_serie_temp.ipynb
9.6. Solution notebook#
9.6.1. Data preparation#
data windowing:
choose a window of the nav previous days to predict nap values (i.e. over nap days)
nav: size of the history window (before)
nap: size of the prediction window (after)
N: number of windows
t0: starting day of the prediction
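The windowing described above can also be obtained directly with NumPy's `sliding_window_view` (an alternative sketch on a toy series, not the course's `dataset` function):

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

series = np.arange(30, dtype=float)                # toy series of 30 days
nav, nap = 14, 7
windows = sliding_window_view(series, nav + nap)   # every (nav+nap)-day window
X = windows[:, :nav]    # history part (before)
y = windows[:, nav:]    # prediction part (after)
# 30 - (14+7) + 1 = 10 windows, each split into 14 inputs and 7 targets
```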
def dataset(Ts,ys,nav,nap,N,t0):
    # choose a window of the nav previous days to predict nap values (i.e. over nap days)
    # nav  size of the history window (before)
    # nap  size of the prediction window (after)
    # N    number of windows
    # t0   starting day of the prediction
    #
    t1 = t0 - N - nav - nap
    print(f"training on {N} windows of {nav}-{nap} days between day {t1} and day {t0}")
    #
    X = np.zeros((N,nav))
    y = np.zeros((N,nap))
    t = np.zeros(N,dtype=int)
    # build the data set
    for i in range(N):
        X[i,:] = ys[t1+i:t1+i+nav]
        y[i] = ys[t1+i+nav:t1+i+nav+nap]
        t[i] = Ts[t1+i+nav]
    return X,y,t
# N windows: 14 days -> 7 days, for a prediction starting at day t0
nav = 14
nap = 7
#N = 200
#t0 = 300
N = 400
t0 = 600
X,y,t = dataset(Ts,ys,nav,nap,N,t0)
training on 400 windows of 14-7 days between day 179 and day 600
X.shape, y.shape, t.shape
((400, 14), (400, 7), (400,))
def plot_dataset():
    plt.figure(figsize=(14,6))
    plt.subplot(1,2,1)
    plt.plot(t-nav,X[:,0])
    plt.plot(t,y[:,0])
    plt.xlabel("day")
    plt.ylabel("y")
    plt.title("training data")
    plt.subplot(1,2,2)
    plt.plot(np.arange(t[0]-nav,t[0]+nap),ys[t[0]-nav:t[0]+nap],'--')
    plt.plot(np.arange(t[0]-nav,t[0]),X[0,:],'or')
    plt.plot(np.arange(t[0],t[0]+nap),y[0,:],'xg')
    plt.plot(np.arange(t[-1]-nav,t[-1]+nap),ys[t[-1]-nav:t[-1]+nap],'--')
    plt.plot(np.arange(t[-1]-nav,t[-1]),X[-1,:],'or')
    plt.plot(np.arange(t[-1],t[-1]+nap),y[-1,:],'xg')
    plt.xlabel("day")
    plt.title("first/last window");
    return
plot_dataset()
9.6.2. Implementation: training a RandomForest#
scikit-learn
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor
from sklearn.neighbors import KNeighborsRegressor
from sklearn.metrics import r2_score
# choice of the algorithm
clf = RandomForestRegressor()
#clf = KNeighborsRegressor()
#clf = LinearRegression()
Xlearn = X.copy()
ylearn = y[:,0]
clf.fit(Xlearn,ylearn)
RandomForestRegressor()
print("score = {:2d}%".format(int(100*clf.score(Xlearn, ylearn))))
yp = clf.predict(Xlearn)
print("R2 = {:4.2f}".format(r2_score(ylearn,yp)))
score = 99%
R2 = 1.00
def plot_pred():
    plt.figure(figsize=(10,6))
    plt.plot(Ts[t2:t2+nap],ypred,'x')
    plt.plot(Ts[t2-nav:t2],Xpred[0],'--o')
    plt.plot(Ts[t2-nav:t2+nap],ys[t2-nav:t2+nap],'--')
    plt.xlabel("day")
    plt.title(f"prediction over {nap} days from day {t2}");
    return
# recursive prediction starting at t2
t2 = t0
Xpred = np.array([ys[t2-nav:t2]])
ypred = np.zeros(nap)
Xp = Xpred.copy()
ypred[0] = clf.predict(Xp)[0]
for i in range(1,nap):
    # slide the window: keep the last nav-i observations, append the i predictions
    Xp[0,:-i] = Xpred[0,i:]
    Xp[0,-i:] = ypred[:i]
    ypred[i] = clf.predict(Xp)[0]
Xpred.shape, ypred.shape
((1, 14), (7,))
plot_pred()
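The recursive loop above, which feeds each new prediction back into the input window, can be isolated as a generic helper; here it is exercised with a dummy one-step model that returns the mean of its window (a hypothetical `model`, not the fitted forest):

```python
import numpy as np

def recursive_forecast(model, history, nap):
    """Predict nap steps ahead by feeding each prediction back into the window."""
    window = np.array(history, dtype=float)
    preds = []
    for _ in range(nap):
        y_next = model(window)
        preds.append(y_next)
        window = np.append(window[1:], y_next)   # slide the window forward
    return np.array(preds)

mean_model = lambda w: w.mean()                  # dummy one-step predictor
preds = recursive_forecast(mean_model, [1.0, 2.0, 3.0], nap=4)
```

This is why multi-step errors accumulate: from the second step on, the model is fed its own outputs instead of true observations.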
9.6.3. Implementation: LSTM RNN#
TensorFlow Keras RNN library
#Machine learning
from sklearn import preprocessing
import tensorflow as tf
import statsmodels as st
from statsmodels.tsa.seasonal import STL
from sklearn.model_selection import train_test_split
Xlearn = X.copy()
ylearn = y.copy()
Xlearn = Xlearn.reshape(X.shape[0], nav, 1)
ylearn = ylearn.reshape(y.shape[0], nap, 1)
Xlearn.shape, ylearn.shape
((400, 14, 1), (400, 7, 1))
# number of training epochs (windows of size nav)
#EPOQUE = 300
EPOQUE = 200
#EPOQUE = 50
# neural-network model: 4 layers (LSTM(nav), Dense(nav), Dense(nap), Dense(nap)), the first one an LSTM
# without an activation argument the layer is linear (activation='linear', a(x)=x); otherwise try 'relu'
modele_lstm = tf.keras.models.Sequential([
    tf.keras.layers.LSTM(nav),
    tf.keras.layers.Dense(nav,activation='tanh'),
    tf.keras.layers.Dense(nap,activation='tanh'),
    tf.keras.layers.Dense(nap)
])
# model configuration (least-squares minimization: 'mse' loss)
modele_lstm.compile(optimizer='adam', metrics=['mae'], loss='mse')
print(EPOQUE)
200
# train the model
import time
time_start = time.time()
modele_lstm.fit(Xlearn, ylearn, epochs=EPOQUE, verbose = True)
print('training phase: {:.2f} seconds'.format(time.time()-time_start))
Epoch 1/200
13/13 ━━━━━━━━━━━━━━━━━━━━ 3s 5ms/step - loss: 0.7460 - mae: 0.7309
Epoch 2/200
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.6922 - mae: 0.7009
...
Epoch 125/200
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0199 - mae: 0.1127
Epoch 126/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0144 - mae: 0.0935
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0179 - mae: 0.1059
Epoch 127/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0157 - mae: 0.0989
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0191 - mae: 0.1095
Epoch 128/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0152 - mae: 0.1006
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0196 - mae: 0.1114
Epoch 129/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0204 - mae: 0.1179
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0207 - mae: 0.1162
Epoch 130/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0139 - mae: 0.0924
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0179 - mae: 0.1067
Epoch 131/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 1s 98ms/step - loss: 0.0199 - mae: 0.1165
6/13 ━━━━━━━━━━━━━━━━━━━━ 0s 11ms/step - loss: 0.0189 - mae: 0.1095
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 9ms/step - loss: 0.0184 - mae: 0.1075
Epoch 132/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0217 - mae: 0.1179
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0181 - mae: 0.1070
Epoch 133/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0118 - mae: 0.0877
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0177 - mae: 0.1061
Epoch 134/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0259 - mae: 0.1286
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0188 - mae: 0.1084
Epoch 135/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0179 - mae: 0.1079
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0179 - mae: 0.1072
Epoch 136/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0205 - mae: 0.1141
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0190 - mae: 0.1100
Epoch 137/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0179 - mae: 0.1035
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0182 - mae: 0.1066
Epoch 138/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0171 - mae: 0.1038
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0187 - mae: 0.1079
Epoch 139/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0189 - mae: 0.1108
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0180 - mae: 0.1072
Epoch 140/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0195 - mae: 0.1101
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0189 - mae: 0.1107
Epoch 141/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0277 - mae: 0.1343
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0191 - mae: 0.1097
Epoch 142/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0152 - mae: 0.1009
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0185 - mae: 0.1087
Epoch 143/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0160 - mae: 0.1011
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0192 - mae: 0.1092
Epoch 144/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0128 - mae: 0.0916
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0167 - mae: 0.1019
Epoch 145/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0143 - mae: 0.0984
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0160 - mae: 0.1017
Epoch 146/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0207 - mae: 0.1119
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0191 - mae: 0.1084
Epoch 147/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0157 - mae: 0.1013
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0166 - mae: 0.1023
Epoch 148/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0136 - mae: 0.0932
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0163 - mae: 0.1014
Epoch 149/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0172 - mae: 0.1052
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0172 - mae: 0.1036
Epoch 150/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0231 - mae: 0.1144
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0179 - mae: 0.1043
Epoch 151/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0179 - mae: 0.1056
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0189 - mae: 0.1085
Epoch 152/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0144 - mae: 0.0983
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0168 - mae: 0.1040
Epoch 153/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0213 - mae: 0.1133
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0173 - mae: 0.1039
Epoch 154/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0183 - mae: 0.1066
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0179 - mae: 0.1063
Epoch 155/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0176 - mae: 0.1058
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0170 - mae: 0.1029
Epoch 156/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0150 - mae: 0.0987
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0176 - mae: 0.1056
Epoch 157/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0196 - mae: 0.1097
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0172 - mae: 0.1038
Epoch 158/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - loss: 0.0208 - mae: 0.1203
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0173 - mae: 0.1061
Epoch 159/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 1s 100ms/step - loss: 0.0143 - mae: 0.0867
6/13 ━━━━━━━━━━━━━━━━━━━━ 0s 10ms/step - loss: 0.0144 - mae: 0.0928
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 8ms/step - loss: 0.0154 - mae: 0.0970
Epoch 160/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0133 - mae: 0.0865
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0164 - mae: 0.1011
Epoch 161/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0203 - mae: 0.1121
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0174 - mae: 0.1051
Epoch 162/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0123 - mae: 0.0838
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0156 - mae: 0.0973
Epoch 163/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0153 - mae: 0.1005
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0154 - mae: 0.0985
Epoch 164/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0119 - mae: 0.0845
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0158 - mae: 0.0989
Epoch 165/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0154 - mae: 0.0991
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0155 - mae: 0.0992
Epoch 166/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 1s 101ms/step - loss: 0.0128 - mae: 0.0879
6/13 ━━━━━━━━━━━━━━━━━━━━ 0s 10ms/step - loss: 0.0153 - mae: 0.0979
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 8ms/step - loss: 0.0157 - mae: 0.0992
Epoch 167/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0137 - mae: 0.0935
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0154 - mae: 0.0985
Epoch 168/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0150 - mae: 0.1003
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0183 - mae: 0.1075
Epoch 169/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0222 - mae: 0.1152
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0201 - mae: 0.1123
Epoch 170/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0164 - mae: 0.1010
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0167 - mae: 0.1015
Epoch 171/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0213 - mae: 0.1162
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0172 - mae: 0.1037
Epoch 172/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0142 - mae: 0.0923
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0163 - mae: 0.1014
Epoch 173/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0193 - mae: 0.1103
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0170 - mae: 0.1035
Epoch 174/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 1s 101ms/step - loss: 0.0129 - mae: 0.0907
7/13 ━━━━━━━━━━━━━━━━━━━━ 0s 9ms/step - loss: 0.0149 - mae: 0.0970
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 8ms/step - loss: 0.0156 - mae: 0.0990
Epoch 175/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0114 - mae: 0.0872
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0154 - mae: 0.0992
Epoch 176/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0203 - mae: 0.1138
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0166 - mae: 0.1024
Epoch 177/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0163 - mae: 0.0977
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0158 - mae: 0.0990
Epoch 178/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 1s 99ms/step - loss: 0.0136 - mae: 0.0941
7/13 ━━━━━━━━━━━━━━━━━━━━ 0s 9ms/step - loss: 0.0148 - mae: 0.0973
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 8ms/step - loss: 0.0150 - mae: 0.0979
Epoch 179/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0141 - mae: 0.0935
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0147 - mae: 0.0963
Epoch 180/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0218 - mae: 0.1191
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0174 - mae: 0.1043
Epoch 181/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0113 - mae: 0.0853
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0150 - mae: 0.0975
Epoch 182/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0144 - mae: 0.0931
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0152 - mae: 0.0968
Epoch 183/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0133 - mae: 0.0907
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0141 - mae: 0.0937
Epoch 184/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0137 - mae: 0.0949
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0160 - mae: 0.1015
Epoch 185/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0132 - mae: 0.0909
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0162 - mae: 0.1013
Epoch 186/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0196 - mae: 0.1120
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0170 - mae: 0.1033
Epoch 187/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0193 - mae: 0.1177
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0171 - mae: 0.1053
Epoch 188/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0169 - mae: 0.1033
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0153 - mae: 0.0979
Epoch 189/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0169 - mae: 0.1003
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0165 - mae: 0.1012
Epoch 190/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0147 - mae: 0.0978
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0148 - mae: 0.0970
Epoch 191/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0117 - mae: 0.0851
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0143 - mae: 0.0945
Epoch 192/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0131 - mae: 0.0897
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0151 - mae: 0.0969
Epoch 193/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 1s 99ms/step - loss: 0.0130 - mae: 0.0898
10/13 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.0148 - mae: 0.0967
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.0149 - mae: 0.0971
Epoch 194/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0198 - mae: 0.1083
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0168 - mae: 0.1019
Epoch 195/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0126 - mae: 0.0903
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0142 - mae: 0.0939
Epoch 196/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0134 - mae: 0.0909
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0152 - mae: 0.0975
Epoch 197/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0185 - mae: 0.1102
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0161 - mae: 0.1012
Epoch 198/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0162 - mae: 0.1023
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0159 - mae: 0.1002
Epoch 199/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 1s 101ms/step - loss: 0.0162 - mae: 0.1027
6/13 ━━━━━━━━━━━━━━━━━━━━ 0s 11ms/step - loss: 0.0162 - mae: 0.1020
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 9ms/step - loss: 0.0157 - mae: 0.0999
Epoch 200/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0169 - mae: 0.1021
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0154 - mae: 0.0982
phase apprentissage: 18.87 seconds
modele_lstm.summary()
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ lstm (LSTM)                     │ (None, 14)             │           896 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 14)             │           210 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense_1 (Dense)                 │ (None, 7)              │           105 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense_2 (Dense)                 │ (None, 7)              │            56 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
Total params: 3,803 (14.86 KB)
Trainable params: 1,267 (4.95 KB)
Non-trainable params: 0 (0.00 B)
Optimizer params: 2,536 (9.91 KB)
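The parameter counts in this summary can be verified by hand. A Keras LSTM layer with u units on inputs of dimension d has 4(du + u² + u) parameters (four gates, each with an input kernel, a recurrent kernel and a bias), and a Dense layer with n_in inputs and n_out outputs has n_in·n_out + n_out parameters. A quick check in pure Python (the helper names are illustrative):

```python
def lstm_params(d, u):
    # 4 gates (input, forget, cell, output), each with an input
    # kernel (d*u), a recurrent kernel (u*u) and a bias (u)
    return 4 * (d * u + u * u + u)

def dense_params(n_in, n_out):
    # kernel (n_in * n_out) plus bias (n_out)
    return n_in * n_out + n_out

# architecture from the summary: LSTM(14) on (14, 1) windows,
# then Dense(14), Dense(7), Dense(7)
counts = [lstm_params(1, 14), dense_params(14, 14),
          dense_params(14, 7), dense_params(7, 7)]
print(counts, sum(counts))  # [896, 210, 105, 56] 1267
```

The total, 1267, matches the trainable parameters reported above; the 2,536 optimizer parameters are the two Adam moment estimates kept per trainable weight.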
# prediction on the training set and R2 score
ypred = modele_lstm.predict(Xlearn, verbose=True)
print(Xlearn.shape, ypred.shape)
Ylearn = ylearn.reshape(ylearn.shape[0], nap)
print("R2 score {:.2f}".format(r2_score(Ylearn, ypred)))
print("model evaluate loss/mae")
modele_lstm.evaluate(Xlearn, ylearn)
1/13 ━━━━━━━━━━━━━━━━━━━━ 1s 102ms/step
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 9ms/step
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 10ms/step
(400, 14, 1) (400, 7)
R2 score 0.98
model evaluate loss/mae
1/13 ━━━━━━━━━━━━━━━━━━━━ 2s 168ms/step - loss: 0.0152 - mae: 0.1020
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 3ms/step - loss: 0.0150 - mae: 0.0981
[0.0145084448158741, 0.09593187272548676]
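The shapes (400, 14, 1) and (400, 7) printed above come from the sliding-window construction: each input sample holds nav = 14 past values of the series and each target holds the nap = 7 following values, with the extra trailing axis because Keras recurrent layers expect (samples, timesteps, features). A minimal NumPy sketch of that construction (make_windows and ys_demo are illustrative names, not the notebook's own helpers):

```python
import numpy as np

def make_windows(series, nav, nap):
    """Build (X, y): X[i] holds nav past values, y[i] the nap next ones."""
    n = len(series) - nav - nap + 1
    X = np.array([series[i:i + nav] for i in range(n)])
    y = np.array([series[i + nav:i + nav + nap] for i in range(n)])
    # Keras recurrent layers expect (samples, timesteps, features)
    return X.reshape(n, nav, 1), y

ys_demo = np.sin(np.linspace(0, 20, 421))  # toy series
X, y = make_windows(ys_demo, nav=14, nap=7)
print(X.shape, y.shape)  # (401, 14, 1) (401, 7)
```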
# prediction starting from t2
t2 = t0
Xpred = np.array([ys[t2-nav:t2]]).reshape(1,nav,1)
ypred = modele_lstm.predict(Xpred, verbose=True)
print(Xpred.shape,ypred.shape)
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 12ms/step
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 23ms/step
(1, 14, 1) (1, 7)
# reshape back to flat windows for plotting
Xpred = Xpred.reshape(1, nav)
ypred = ypred.reshape(nap)
plot_pred()
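To forecast further than nap = 7 steps ahead, a common approach is to feed each block of predictions back into the input window and call the model again. A hedged sketch of that recursive scheme (rolling_forecast and predict_fn are illustrative names; the notebook itself only predicts a single block here, and errors accumulate from block to block):

```python
import numpy as np

def rolling_forecast(predict_fn, history, nav, nap, n_blocks):
    """Chain nap-step predictions, feeding each forecast back as input.
    predict_fn maps a (1, nav, 1) window to a (1, nap) forecast."""
    buf = list(history[-nav:])
    out = []
    for _ in range(n_blocks):
        window = np.array(buf[-nav:]).reshape(1, nav, 1)
        yhat = predict_fn(window).reshape(nap)
        out.extend(yhat)
        buf.extend(yhat)
    return np.array(out)

# toy stand-in model: "predict" the window mean, repeated nap times
fake_model = lambda w: np.full((1, 7), w.mean())
yfut = rolling_forecast(fake_model, np.arange(30.0), nav=14, nap=7, n_blocks=3)
print(yfut.shape)  # (21,)
```

With the trained network, `fake_model` would be replaced by `lambda w: modele_lstm.predict(w, verbose=False)`.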