6. Time-series analysis with AI#
Marc Buffat, Mechanical Engineering Dept., UCB Lyon1
import tensorflow as tf
%matplotlib inline
import numpy as np
import matplotlib.pyplot as plt
# title font
plt.rc('font', family='serif', size='18')
from IPython.display import display,Markdown
# AI
import sklearn as sk
import tensorflow as tf
_uid_ = 12345
def serie_temp(N, a0=1.0, a1=0.5, a2=0.4, a3=0.1):
    # data / days
    np.random.seed(_uid_)
    # time series
    Ts = np.arange(N, dtype=int)
    ys = np.array([a0*np.sin(2*np.pi*x/180) + a1*np.cos(2*np.pi*x/15)
                   + a2*x/360 for x in range(N)]) \
         + a3*np.random.normal(size=N, scale=0.2)
    return Ts, ys
6.1. Objectives#
We study a time-dependent system \(Y(t)\) and want to predict its evolution, i.e. forecast its future values from its past ones.
A time series \(Y_t\) is commonly decomposed into trend, seasonality and noise:
trend \(T(t)\) = long-term evolution
seasonality \(S(t)\) = periodic phenomenon
noise \(\epsilon(t)\) = random part
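This additive model can be illustrated on a synthetic signal (a numpy-only sketch with illustrative coefficients; the moving-average trend estimate is a common first approximation, not the method used later in this notebook):

```python
import numpy as np

t = np.arange(720)
trend = 0.002 * t                       # T(t): slow linear drift
season = np.sin(2 * np.pi * t / 30)     # S(t): 30-day periodic component
rng = np.random.default_rng(0)
noise = 0.1 * rng.normal(size=t.size)   # eps(t): random part
y = trend + season + noise              # additive model Y = T + S + eps

# crude trend estimate: moving average over exactly one period (30 days),
# which cancels the seasonal component and smooths out the noise
kernel = np.ones(30) / 30
trend_est = np.convolve(y, kernel, mode='same')
# away from the edges, the estimate stays close to the true trend
err = np.abs(trend_est[60:-60] - trend[60:-60]).mean()
print(err < 0.1)
```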
6.1.1. Methods#
classical methods (linear time-series modeling):
exponential smoothing,
regression models (linear regression, non-parametric models, …),
SARIMA models
AI-based methods:
random forests,
recurrent LSTM neural networks
6.2. Data generation#
Time series \(Y = Y(t)\)
N measurements at a regular interval \(\Delta t\)
data array ys
\[ys[i] = Y(i\Delta t)\]
array ts (for the analysis)
\[ts[i] = i\Delta t\]
test cases:
simple periodic series
bi-periodic series (modulation)
with a long-term trend
with noise
# build the time series
# simplest periodic case
Ts,ys = serie_temp(1000,a0=0,a1=0.5,a2=0.0,a3 = 0.)
# bi-periodic case
#Ts,ys = serie_temp(1000,a0=1.0,a1=0.5,a2=0.0,a3=0.0)
# + trend
#Ts,ys = serie_temp(1000,a0=1.0,a1=0.5,a2=0.2,a3=0.0)
# + noise
Ts,ys = serie_temp(1000,a0=1.0,a1=0.5,a2=0.2,a3=0.3)
plt.figure(figsize=(12,8))
plt.subplot(1,2,1)
plt.plot(Ts[:],ys)
plt.xlabel("day")
plt.title("time series");
plt.subplot(1,2,2)
plt.plot(Ts[:100],ys[:100])
plt.xlabel("day")

6.3. Data preparation#
data windowing:
use a window of the nav previous days to predict nap values (i.e. over the next nap days)
nav: size of the history window (before)
nap: size of the prediction window (after)
N: number of windows
t0: start date of the prediction
def dataset(Ts,ys,nav,nap,N,t0):
    # use a window of the nav previous days to predict nap values (i.e. over nap days)
    # nav: size of the history window (before)
    # nap: size of the prediction window (after)
    # N:   number of windows
    # t0:  start date of the prediction
    #
    t1 = t0 - N - nav - nap
    print(f"training on {N} windows of {nav}-{nap} days between day {t1} and day {t0}")
    #
    X = np.zeros((N,nav))
    y = np.zeros((N,nap))
    t = np.zeros(N,dtype=int)
    # build the dataset
    for i in range(N):
        X[i,:] = ys[t1+i:t1+i+nav]
        y[i]   = ys[t1+i+nav:t1+i+nav+nap]
        t[i]   = Ts[t1+i+nav]
    return X,y,t
# N windows: 14 days -> 7 days of prediction starting from day t0
nav = 14
nap = 7
#N = 200
#t0 = 300
N = 400
t0 = 600
X,y,t = dataset(Ts,ys,nav,nap,N,t0)
training on 400 windows of 14-7 days between day 179 and day 600
X.shape, y.shape, t.shape
((400, 14), (400, 7), (400,))
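As a cross-check, the same `X` matrix can be rebuilt without an explicit loop using `numpy.lib.stride_tricks.sliding_window_view` (a sketch on a stand-in series, since the principle only depends on the window size):

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

ys = np.sin(2 * np.pi * np.arange(1000) / 15)  # stand-in series
nav, nap, N, t0 = 14, 7, 400, 600
t1 = t0 - N - nav - nap

# explicit loop, as in dataset()
X = np.zeros((N, nav))
for i in range(N):
    X[i, :] = ys[t1 + i:t1 + i + nav]

# vectorized equivalent: every length-nav window starting at t1 ... t1+N-1
X2 = sliding_window_view(ys[t1:t1 + N + nav - 1], nav)[:N]
print(np.allclose(X, X2))  # True
```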
def plot_dataset():
    plt.figure(figsize=(14,6))
    plt.subplot(1,2,1)
    plt.plot(t-nav,X[:,0])
    plt.plot(t,y[:,0])
    plt.xlabel("day")
    plt.ylabel("y")
    plt.title("training data")
    plt.subplot(1,2,2)
    plt.plot(np.arange(t[0]-nav,t[0]+nap),ys[t[0]-nav:t[0]+nap],'--')
    plt.plot(np.arange(t[0]-nav,t[0]),X[0,:],'or')
    plt.plot(np.arange(t[0],t[0]+nap),y[0,:],'xg')
    plt.plot(np.arange(t[-1]-nav,t[-1]+nap),ys[t[-1]-nav:t[-1]+nap],'--')
    plt.plot(np.arange(t[-1]-nav,t[-1]),X[-1,:],'or')
    plt.plot(np.arange(t[-1],t[-1]+nap),y[-1,:],'xg')
    plt.xlabel("day")
    plt.title("first/last window");
    return
plot_dataset()

6.4. Scikit-learn RandomForest#
a “random forest” of decision trees
predicts one value at a time
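The recursive scheme implied by one-value-at-a-time prediction can be sketched with a stand-in predictor (here an exact two-term recurrence for a cosine; the `predict_one` helper is hypothetical and stands in for `clf.predict`):

```python
import numpy as np

def predict_one(window):
    """Stand-in for a one-step predictor: the next value of a pure cosine
    of period 15 satisfies x[n+1] = 2*cos(w)*x[n] - x[n-1]."""
    w = 2 * np.pi / 15
    return 2 * np.cos(w) * window[-1] - window[-2]

ys = np.cos(2 * np.pi * np.arange(100) / 15)
nav, nap = 14, 7
window = list(ys[50 - nav:50])   # history before day 50
preds = []
for _ in range(nap):             # recursive multi-step forecast
    p = predict_one(np.array(window))
    preds.append(p)
    window = window[1:] + [p]    # slide: drop oldest, append the prediction

print(np.allclose(preds, ys[50:50 + nap], atol=1e-6))  # True
```

Each predicted value is fed back into the window, exactly as done for the RandomForest prediction loop later in this notebook.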
6.5. Neural networks: LSTM / RNN#
LSTM = Long Short-Term Memory
recurrent network (RNN)
activation function: prevents the output from blowing up (tanh)
numerical gradient method (\(\alpha\) = learning rate):
\[ w_{k+1} = w_k - \alpha \frac{\partial F}{\partial w} \]
EPOCH = number of training epochs
The number of epochs is a hyperparameter that defines how many times the learning algorithm works through the entire training dataset
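The gradient update above can be sketched on a one-dimensional quadratic loss (an illustrative toy loss, not the network's):

```python
def F(w):
    return (w - 3.0) ** 2          # toy loss, minimum at w = 3

def grad_F(w):
    return 2.0 * (w - 3.0)         # dF/dw

alpha = 0.1                        # learning rate
w = 0.0
for _ in range(100):               # 100 update steps
    w = w - alpha * grad_F(w)      # w_{k+1} = w_k - alpha * dF/dw

print(round(w, 4))                 # converges towards 3.0
```

Each step multiplies the distance to the minimum by (1 - 2*alpha), so too large a learning rate diverges, too small a one converges slowly.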
Model of an artificial neuron

the output \(y\) is a nonlinear function of the inputs (f = activation function)
the coefficients \(w_i, b\) are obtained by minimizing an error \(Err = || y_{pred} - \hat{y} ||\) over a training dataset \(\hat{y}\), using minimization (gradient) algorithms
Layered neural network

Recurrent neural network (processing of temporal sequences)

6.5.1. RNN networks#
6.5.2. The difficulty of training a recurrent network#
a classical simple recurrent network, made of a recurrent layer followed by a dense layer:

It involves three weight matrices: W, R and V, where R is the matrix of recurrent weights. Training the network therefore means learning these three matrices from a base of labeled examples.
The gradient-based minimization algorithm for neural networks relies on backpropagation: the gradient of the error is propagated backwards through the successive weight layers, from the last layer to the first.
Unfortunately, in recurrent networks, the recurrence cycle (matrix R) prevents the direct use of this algorithm.
6.5.3. Solution: backpropagation through time#
The solution is to work with the unrolled version of the network, which eliminates the cycles.
We therefore approximate the recurrent network by a network unrolled K times (K = depth = number of hidden internal layers, typically 10 to 100), as shown in the following figure for K=2:

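A numpy sketch of this unrolled forward pass, with weight matrices `W`, `R`, `V` as in the figure (sizes and values are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
n_in, n_hid, n_out, K = 1, 8, 1, 2   # K = unrolling depth

W = rng.normal(scale=0.1, size=(n_hid, n_in))   # input weights
R = rng.normal(scale=0.1, size=(n_hid, n_hid))  # recurrent weights
V = rng.normal(scale=0.1, size=(n_out, n_hid))  # output weights

def rnn_unrolled(xs):
    """Forward pass of the unrolled recurrent layer + dense output.
    The same W and R are reused (shared weights) at every time step."""
    h = np.zeros(n_hid)
    for x in xs:                       # one 'copy' of the layer per step
        h = np.tanh(W @ np.atleast_1d(x) + R @ h)
    return V @ h                       # dense read-out on the last state

y = rnn_unrolled([0.5, -0.2])          # K = 2 time steps
print(y.shape)
```

Because `W` and `R` are shared between the unrolled copies, their gradients accumulate over the K steps during backpropagation through time.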
Warning
Since the unrolled network is deeper, the vanishing-gradient problem becomes more severe during training: the network is harder to train because the error tends to die out as it approaches the lower layers.
It is therefore important to use every available strategy against this phenomenon: batch normalization, dropout, L1 and L2 regularization, etc.
Because the weights of the recurrent layer are duplicated, recurrent networks are also prone to another phenomenon called gradient explosion, i.e. an error gradient whose norm is greater than 1.
A simple and effective remedy is to test this norm and cap it when it becomes too large (known as gradient clipping).
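A minimal sketch of gradient clipping by norm (the threshold `max_norm=1.0` is an illustrative choice):

```python
import numpy as np

def clip_by_norm(grad, max_norm=1.0):
    """Rescale grad so its L2 norm never exceeds max_norm,
    preserving its direction."""
    norm = np.linalg.norm(grad)
    if norm > max_norm:
        grad = grad * (max_norm / norm)
    return grad

g = np.array([3.0, 4.0])          # norm 5 -> "exploding" gradient
g_clipped = clip_by_norm(g)
print(np.linalg.norm(g_clipped))  # 1.0
```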
6.5.4. The LSTM neuron: Long Short-Term Memory#
To model very long-term dependencies, recurrent neural networks must be given the ability to maintain a state over a long period of time.
This is the purpose of LSTM (Long Short-Term Memory) cells, which carry an internal memory called the cell. The cell can hold a state for as long as needed; it consists of a numerical value that the network can steer depending on the situation.

the memory cell is controlled by three gates, which can be seen as valves:
the input gate decides whether the input should modify the cell content
the forget gate decides whether the cell content should be reset to 0
the output gate decides whether the cell content should influence the neuron's output
The mechanism of the three gates is strictly similar. The opening/closing of the valve is modeled by an activation function f, usually a sigmoid, applied to a weighted sum of the inputs, the outputs and the cell, with dedicated weights.
To compute the output \(y^t\), we therefore use the input \(x^t\), the hidden states \(h^{t-1}\) (\(x^{t-1}, x^{t-2}\), from unrolling the recurrence), which represent the short-term memory, and the cell states \(c^{t-1}\), which represent the long-term memory.
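One step of an LSTM cell can be written out with these standard equations (a numpy sketch with random placeholder weights; in a real layer the weights are learned, and biases are omitted here for brevity):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, params):
    """One LSTM cell step: three sigmoid gates plus a tanh candidate."""
    Wi, Wf, Wo, Wg = params            # each maps [x, h_prev] -> n_hid
    z = np.concatenate([np.atleast_1d(x), h_prev])
    i = sigmoid(Wi @ z)                # input gate
    f = sigmoid(Wf @ z)                # forget gate
    o = sigmoid(Wo @ z)                # output gate
    g = np.tanh(Wg @ z)                # candidate cell update
    c = f * c_prev + i * g             # long-term memory (the cell)
    h = o * np.tanh(c)                 # short-term memory / output
    return h, c

rng = np.random.default_rng(2)
n_hid = 4
params = [rng.normal(scale=0.1, size=(n_hid, n_hid + 1)) for _ in range(4)]
h, c = np.zeros(n_hid), np.zeros(n_hid)
for x in [0.1, 0.5, -0.3]:             # feed a short sequence
    h, c = lstm_step(x, h, c, params)
print(h.shape, c.shape)
```

Note how `c` is only rescaled (by the forget gate) and incremented (by the gated candidate), which is what lets the cell hold information over many steps.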
Like any other neuron, LSTM neurons are usually used in layers; in that case, the outputs of all neurons are fed back as inputs to all neurons.
Given all the connections needed to drive the memory cell, LSTM layers are several times “heavier” than simple recurrent layers, which are themselves heavier than ordinary dense layers.
LSTM layers should therefore be used sparingly!
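Parameter counting makes the relative cost concrete: an LSTM layer has four weight blocks (the three gates plus the candidate update) where a simple recurrent layer has one (a sketch; `n` units, `m` input features, biases included):

```python
def lstm_params(n, m):
    # 4 blocks (input/forget/output gates + candidate),
    # each with n*(m+n) weights and n biases
    return 4 * (n * (m + n) + n)

def simple_rnn_params(n, m):
    return n * (m + n) + n

def dense_params(n, m):
    return n * m + n

n, m = 14, 1   # 14 units, 1 input feature, as in the model built below
print(lstm_params(n, m), simple_rnn_params(n, m), dense_params(n, m))
```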
6.6. Implementation#
6.6.1. RandomForest training#
scikit-learn
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor
from sklearn.neighbors import KNeighborsRegressor
from sklearn.metrics import r2_score
# choice of algorithm
clf = RandomForestRegressor()
#clf = KNeighborsRegressor()
#clf = LinearRegression()
Xlearn = X.copy()
ylearn = y[:,0]
clf.fit(Xlearn,ylearn)
RandomForestRegressor()
print("score = {:2d}%".format(int(100*clf.score(Xlearn, ylearn))))
yp = clf.predict(Xlearn)
print("R2 = {:4.2f}".format(r2_score(ylearn,yp)))
score = 99%
R2 = 1.00
def plot_pred():
    plt.figure(figsize=(10,6))
    plt.plot(Ts[t2:t2+nap],ypred,'x')
    plt.plot(Ts[t2-nav:t2],Xpred[0],'--o')
    plt.plot(Ts[t2-nav:t2+nap],ys[t2-nav:t2+nap],'--')
    plt.xlabel("day")
    plt.title(f"prediction over {nap} days from day {t2}");
    return
# prediction starting from day t2
t2 = t0
Xpred = np.array([ys[t2-nav:t2]])
ypred = np.zeros(nap)
Xp = Xpred.copy()
ypred[0] = clf.predict(Xp)[0]
# recursive prediction: slide the window and feed the predictions back in
for i in range(1,nap):
    Xp[0,:-i] = Xpred[0,i:]
    Xp[0,-i:] = ypred[:i]
    ypred[i] = clf.predict(Xp)[0]
Xpred.shape, ypred.shape
((1, 14), (7,))
plot_pred()

6.6.2. LSTM RNN implementation#
TensorFlow Keras RNN library
#Machine learning
from sklearn import preprocessing
import tensorflow as tf
import statsmodels as st
from statsmodels.tsa.seasonal import STL
from sklearn.model_selection import train_test_split
Xlearn = X.copy()
ylearn = y.copy()
Xlearn = Xlearn.reshape(X.shape[0], nav, 1)
ylearn = ylearn.reshape(y.shape[0], nap, 1)
Xlearn.shape, ylearn.shape
((400, 14, 1), (400, 7, 1))
# number of training epochs (windows of size nav)
#EPOQUE = 300
EPOQUE = 200
#EPOQUE = 50
# neural-network model: 4 layers, the first one an LSTM
# without an activation argument the layer is linear (activation='linear', a(x)=x); otherwise try 'relu'
modele_lstm = tf.keras.models.Sequential([
    tf.keras.layers.LSTM(nav),
    tf.keras.layers.Dense(nav,activation='tanh'),
    tf.keras.layers.Dense(nap,activation='tanh'),
    tf.keras.layers.Dense(nap)
])
# model configuration (the least-squares error is minimized)
modele_lstm.compile(optimizer='adam', metrics=['mae'], loss='mse')
print(EPOQUE)
200
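The loss and the metric passed to `compile` can be written out by hand: the mean squared error is what is minimized, the mean absolute error is only reported:

```python
import numpy as np

y_true = np.array([1.0, 2.0, 3.0])
y_pred = np.array([1.5, 2.0, 2.0])
mse = np.mean((y_true - y_pred) ** 2)   # 'mse' loss: what Adam minimizes
mae = np.mean(np.abs(y_true - y_pred))  # 'mae' metric: only monitored
print(mse, mae)
```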
# start training the model
import time
time_start = time.time()
modele_lstm.fit(Xlearn, ylearn, epochs=EPOQUE, verbose = True)
print('training phase: {:.2f} seconds'.format(time.time()-time_start))
Epoch 1/200
13/13 ━━━━━━━━━━━━━━━━━━━━ 3s 6ms/step - loss: 0.7100 - mae: 0.7139
...
Epoch 136/200
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0163 - mae: 0.1006
(verbose training log abridged: the loss decreases steadily from about 0.71 to about 0.016 and the mae from about 0.71 to about 0.10; the output is truncated at epoch 137)
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0171 - mae: 0.1029
Epoch 138/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0166 - mae: 0.0969
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0171 - mae: 0.1032
Epoch 139/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0125 - mae: 0.0908
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0156 - mae: 0.0995
Epoch 140/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0188 - mae: 0.1139
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0181 - mae: 0.1084
Epoch 141/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.0189 - mae: 0.1093
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0176 - mae: 0.1047
Epoch 142/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0179 - mae: 0.1080
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0183 - mae: 0.1059
Epoch 143/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0176 - mae: 0.1056
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0174 - mae: 0.1051
Epoch 144/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0146 - mae: 0.0930
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0175 - mae: 0.1037
Epoch 145/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 1s 88ms/step - loss: 0.0098 - mae: 0.0793
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0161 - mae: 0.0996
Epoch 146/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 1s 86ms/step - loss: 0.0185 - mae: 0.1075
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0166 - mae: 0.1019
Epoch 147/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 1s 89ms/step - loss: 0.0140 - mae: 0.0937
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.0165 - mae: 0.1018
Epoch 148/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 80ms/step - loss: 0.0190 - mae: 0.1053
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0170 - mae: 0.1017
Epoch 149/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 1s 88ms/step - loss: 0.0142 - mae: 0.0987
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.0155 - mae: 0.0993
Epoch 150/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 24ms/step - loss: 0.0164 - mae: 0.1041
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.0157 - mae: 0.1003
Epoch 151/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - loss: 0.0144 - mae: 0.0944
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.0160 - mae: 0.1000
Epoch 152/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.0222 - mae: 0.1239
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0168 - mae: 0.1035
Epoch 153/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 1s 100ms/step - loss: 0.0145 - mae: 0.0901
6/13 ━━━━━━━━━━━━━━━━━━━━ 0s 10ms/step - loss: 0.0152 - mae: 0.0966
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 8ms/step - loss: 0.0154 - mae: 0.0980
Epoch 154/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0147 - mae: 0.0941
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0161 - mae: 0.0989
Epoch 155/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0140 - mae: 0.0946
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0155 - mae: 0.0985
Epoch 156/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0132 - mae: 0.0923
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0161 - mae: 0.1003
Epoch 157/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 1s 99ms/step - loss: 0.0157 - mae: 0.0999
6/13 ━━━━━━━━━━━━━━━━━━━━ 0s 10ms/step - loss: 0.0158 - mae: 0.1007
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 8ms/step - loss: 0.0162 - mae: 0.1017
Epoch 158/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0164 - mae: 0.1042
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0160 - mae: 0.1008
Epoch 159/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 1s 98ms/step - loss: 0.0182 - mae: 0.1105
6/13 ━━━━━━━━━━━━━━━━━━━━ 0s 10ms/step - loss: 0.0162 - mae: 0.1019
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 7ms/step - loss: 0.0160 - mae: 0.1004
Epoch 160/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0201 - mae: 0.1167
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0168 - mae: 0.1041
Epoch 161/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 1s 98ms/step - loss: 0.0195 - mae: 0.1083
8/13 ━━━━━━━━━━━━━━━━━━━━ 0s 7ms/step - loss: 0.0181 - mae: 0.1053
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 7ms/step - loss: 0.0182 - mae: 0.1054
Epoch 162/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0192 - mae: 0.1087
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0175 - mae: 0.1044
Epoch 163/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0118 - mae: 0.0858
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0154 - mae: 0.0974
Epoch 164/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0147 - mae: 0.0962
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0168 - mae: 0.1030
Epoch 165/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0193 - mae: 0.1143
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0164 - mae: 0.1012
Epoch 166/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0144 - mae: 0.0949
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0152 - mae: 0.0983
Epoch 167/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0232 - mae: 0.1185
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0180 - mae: 0.1048
Epoch 168/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0128 - mae: 0.0902
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0153 - mae: 0.0980
Epoch 169/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0146 - mae: 0.0957
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0163 - mae: 0.1008
Epoch 170/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0189 - mae: 0.1102
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0161 - mae: 0.1008
Epoch 171/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0143 - mae: 0.0977
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0163 - mae: 0.1013
Epoch 172/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 1s 98ms/step - loss: 0.0201 - mae: 0.1136
6/13 ━━━━━━━━━━━━━━━━━━━━ 0s 11ms/step - loss: 0.0172 - mae: 0.1051
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 9ms/step - loss: 0.0165 - mae: 0.1026
Epoch 173/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0156 - mae: 0.0985
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0162 - mae: 0.1003
Epoch 174/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0142 - mae: 0.0942
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0152 - mae: 0.0980
Epoch 175/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0132 - mae: 0.0905
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0172 - mae: 0.1032
Epoch 176/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0179 - mae: 0.1062
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0164 - mae: 0.1013
Epoch 177/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0193 - mae: 0.1115
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0176 - mae: 0.1061
Epoch 178/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0183 - mae: 0.1114
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0172 - mae: 0.1042
Epoch 179/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0151 - mae: 0.0995
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0153 - mae: 0.0988
Epoch 180/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0118 - mae: 0.0873
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0150 - mae: 0.0971
Epoch 181/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0165 - mae: 0.1045
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0157 - mae: 0.1000
Epoch 182/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0164 - mae: 0.1008
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0147 - mae: 0.0957
Epoch 183/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0105 - mae: 0.0826
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0157 - mae: 0.0986
Epoch 184/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0200 - mae: 0.1158
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0170 - mae: 0.1037
Epoch 185/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0184 - mae: 0.1040
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0160 - mae: 0.0987
Epoch 186/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0133 - mae: 0.0841
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0154 - mae: 0.0961
Epoch 187/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0160 - mae: 0.1004
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0151 - mae: 0.0977
Epoch 188/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0140 - mae: 0.0938
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0157 - mae: 0.0984
Epoch 189/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0160 - mae: 0.0961
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0155 - mae: 0.0984
Epoch 190/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0136 - mae: 0.0908
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0141 - mae: 0.0934
Epoch 191/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.0173 - mae: 0.1085
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0158 - mae: 0.1008
Epoch 192/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0154 - mae: 0.0976
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0163 - mae: 0.1021
Epoch 193/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0131 - mae: 0.0892
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0156 - mae: 0.0978
Epoch 194/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0144 - mae: 0.0929
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0159 - mae: 0.0998
Epoch 195/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0173 - mae: 0.1072
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0164 - mae: 0.1018
Epoch 196/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0116 - mae: 0.0847
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0151 - mae: 0.0976
Epoch 197/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0098 - mae: 0.0772
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0153 - mae: 0.0970
Epoch 198/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0221 - mae: 0.1156
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0163 - mae: 0.0999
Epoch 199/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 1s 101ms/step - loss: 0.0170 - mae: 0.1070
6/13 ━━━━━━━━━━━━━━━━━━━━ 0s 11ms/step - loss: 0.0149 - mae: 0.0984
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 9ms/step - loss: 0.0148 - mae: 0.0970
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 10ms/step - loss: 0.0148 - mae: 0.0969
Epoch 200/200
1/13 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0153 - mae: 0.1001
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0150 - mae: 0.0978
training phase: 18.67 seconds
modele_lstm.summary()
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ lstm (LSTM)                     │ (None, 14)             │           896 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 14)             │           210 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense_1 (Dense)                 │ (None, 7)              │           105 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense_2 (Dense)                 │ (None, 7)              │            56 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
Total params: 3,803 (14.86 KB)
Trainable params: 1,267 (4.95 KB)
Non-trainable params: 0 (0.00 B)
Optimizer params: 2,536 (9.91 KB)
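The parameter counts in the summary can be checked by hand. An LSTM layer has 4 gates, each with an input kernel, a recurrent kernel and a bias; a Dense layer has one weight per input/output pair plus a bias per output. A quick sketch (shapes taken from the summary above: window of 14 values with 1 feature, LSTM with 14 units, then Dense layers 14 → 14 → 7 → 7):

```python
# Hand-check of the parameter counts reported by modele_lstm.summary().
n_features = 1
units = 14

# LSTM: 4 gates, each with kernel + recurrent kernel + bias
lstm_params = 4 * (n_features * units + units * units + units)

# Dense: n_in * n_out weights + n_out biases
def dense_params(n_in, n_out):
    return n_in * n_out + n_out

dense0 = dense_params(14, 14)   # 210
dense1 = dense_params(14, 7)    # 105
dense2 = dense_params(7, 7)     # 56

total = lstm_params + dense0 + dense1 + dense2
print(lstm_params, dense0, dense1, dense2, total)  # → 896 210 105 56 1267
```

The total of 1,267 matches the trainable parameters in the summary; the extra 2,536 "optimizer params" are the Adam moment estimates (two per trainable weight).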
ypred = modele_lstm.predict(Xlearn, verbose=True)
print(Xlearn.shape,ypred.shape)
Ylearn = ylearn.reshape(ylearn.shape[0],nap,)
print("R2 score {:.2f}".format(r2_score(Ylearn, ypred)))
print("model evaluate loss/mae")
modele_lstm.evaluate(Xlearn,ylearn)
1/13 ━━━━━━━━━━━━━━━━━━━━ 1s 107ms/step
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 9ms/step
(400, 14, 1) (400, 7)
R2 score 0.98
model evaluate loss/mae
1/13 ━━━━━━━━━━━━━━━━━━━━ 2s 171ms/step - loss: 0.0217 - mae: 0.1171
13/13 ━━━━━━━━━━━━━━━━━━━━ 0s 3ms/step - loss: 0.0161 - mae: 0.0992
[0.01563955284655094, 0.09787154942750931]
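The R2 score reported above comes from sklearn's `r2_score`; it measures the fraction of variance explained, R2 = 1 − SS_res/SS_tot. A minimal hand computation on toy values (illustrative data, not the notebook's `Ylearn`/`ypred`):

```python
import numpy as np

# R2 = 1 - SS_res / SS_tot, computed by hand on toy data
y_true = np.array([1.0, 2.0, 3.0, 4.0])
y_pred = np.array([1.1, 1.9, 3.2, 3.8])

ss_res = np.sum((y_true - y_pred) ** 2)          # residual sum of squares
ss_tot = np.sum((y_true - y_true.mean()) ** 2)   # total variance
r2 = 1.0 - ss_res / ss_tot
print("R2 = {:.3f}".format(r2))  # → R2 = 0.980
```

An R2 close to 1 means the model reproduces the learning data almost perfectly; note that this is evaluated on the training set, so it says nothing yet about generalization.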
# prediction starting from t2
t2 = t0
Xpred = np.array([ys[t2-nav:t2]]).reshape(1,nav,1)
ypred = modele_lstm.predict(Xpred, verbose=True)
print(Xpred.shape,ypred.shape)
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step
(1, 14, 1) (1, 7)
Xpred = Xpred.reshape(1,nav,)
ypred = ypred.reshape(nap)
plot_pred()
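The model predicts only `nap = 7` steps from a window of `nav = 14` values. To go further ahead, a standard technique is a rolling forecast: predict one block, append it to the window, and predict again. A minimal sketch, where `predict_block` is a hypothetical stand-in for `modele_lstm.predict` (here a naive persistence forecast, so the example runs without the trained model):

```python
import numpy as np

nav, nap = 14, 7   # window length / prediction horizon, as in the notebook

def predict_block(window):
    """Stand-in for the trained model: persistence forecast that
    simply repeats the last observed value nap times."""
    return np.full(nap, window[-1])

def rolling_forecast(history, n_steps):
    """Predict n_steps values by feeding each predicted block back
    into the sliding input window of length nav."""
    window = list(history[-nav:])
    preds = []
    while len(preds) < n_steps:
        block = predict_block(np.array(window))
        preds.extend(block)
        window = (window + list(block))[-nav:]   # slide the window
    return np.array(preds[:n_steps])

ys_demo = np.sin(np.arange(100) / 10.0)          # toy series
print(rolling_forecast(ys_demo, 21).shape)       # → (21,)
```

With the real LSTM, `predict_block` would reshape the window to `(1, nav, 1)` and call `modele_lstm.predict`; errors accumulate at each step, so the forecast quality degrades with the horizon.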
