Introduction

In this notebook, I create a TensorFlow-based time series forecasting model using CNN and LSTM layers.

Acknowledgments:

This notebook is inspired by course 4 of the TensorFlow in Practice Specialization, Sequences, Time Series and Prediction, by Laurence Moroney.

I used this course to prepare for the TensorFlow certification examination, and I am using the methods and code that Mr. Moroney used in the course.

Sections:

  1. Introduction
  2. Importing and exploring the dataset
  3. Imputing missing values
  4. Naive forecast
  5. Moving average forecast
  6. Preparing a prefetched TensorFlow dataset
  7. Creating a CNN-LSTM based model
  8. Model metrics
  9. Conclusion
  10. References

If you find this notebook helpful, please upvote!

I always like to import libraries in alphabetical order so that they are easy to review when needed.

In [1]:
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd
pd.set_option('display.max_rows', 20)

import tensorflow as tf

import warnings
warnings.filterwarnings('ignore')

Importing and exploring the dataset

In [2]:
data = pd.read_csv("/kaggle/input/daily-temperature-of-major-cities/city_temperature.csv")
data.head()
Out[2]:
Region Country State City Month Day Year AvgTemperature
0 Africa Algeria NaN Algiers 1 1 1995 64.2
1 Africa Algeria NaN Algiers 1 2 1995 49.4
2 Africa Algeria NaN Algiers 1 3 1995 48.8
3 Africa Algeria NaN Algiers 1 4 1995 46.4
4 Africa Algeria NaN Algiers 1 5 1995 47.9

Checking whether all the cities have data for the full date range

In [3]:
data['City'].value_counts()
Out[3]:
Washington DC    18530
Springfield      18530
Columbus         18530
Portland         18530
Washington       18530
                 ...  
Frankfurt         4136
Flagstaff         3574
Pristina          3427
Yerevan           3226
Bonn              3133
Name: City, Length: 321, dtype: int64

I want to develop a time series model for a single city. For this purpose, I am taking Chennai (previously known as Madras) in Tamil Nadu, India, the city where I reside.

Chennai generally has only two seasons: it is hot almost throughout the year, and it rains in November and December.

In [4]:
chennai = data[data["City"] == "Chennai (Madras)"].copy()  # .copy() avoids SettingWithCopyWarning on later column assignments
chennai.head()
Out[4]:
Region Country State City Month Day Year AvgTemperature
331055 Asia India NaN Chennai (Madras) 1 1 1995 72.4
331056 Asia India NaN Chennai (Madras) 1 2 1995 73.5
331057 Asia India NaN Chennai (Madras) 1 3 1995 72.6
331058 Asia India NaN Chennai (Madras) 1 4 1995 75.2
331059 Asia India NaN Chennai (Madras) 1 5 1995 74.8

Checking whether every year has complete records

In [5]:
chennai["Year"].value_counts()
Out[5]:
2015    366
2012    366
2008    366
2016    366
1996    366
       ... 
2006    365
1998    365
2013    365
2019    365
2020    134
Name: Year, Length: 26, dtype: int64

Imputing missing values

The dataset records missing values as the number -99. The Chennai subset has 29 such missing records.

I will use the forward-fill method to impute the missing values. That is, we take the last non-missing value and fill it in place of the missing value.
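
As a quick illustration of forward fill, a toy sketch with made-up values (pandas and NumPy are imported above):

In [ ]:
# Forward fill copies the last valid value into each following NaN
s = pd.Series([72.4, np.nan, np.nan, 75.2])
print(s.ffill().tolist())  # [72.4, 72.4, 72.4, 75.2]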

First, replacing -99 with np.nan

In [6]:
"""-99 is put in place of missing values. 
We will have to forward fill with the last non missing value before -99
"""
chennai["AvgTemperature"] = np.where(chennai["AvgTemperature"] == -99, np.nan, chennai["AvgTemperature"])
chennai.isnull().sum()
Out[6]:
Region               0
Country              0
State             9266
City                 0
Month                0
Day                  0
Year                 0
AvgTemperature      29
dtype: int64

Now using the ffill() method to fill the np.nan values that we created

In [7]:
chennai["AvgTemperature"] = chennai["AvgTemperature"].ffill()
chennai.isnull().sum()
Out[7]:
Region               0
Country              0
State             9266
City                 0
Month                0
Day                  0
Year                 0
AvgTemperature       0
dtype: int64

Since there is no single column that contains the date, creating a new column called Time_steps by combining the Year, Month and Day fields

In [8]:
chennai.dtypes
chennai["Time_steps"] = pd.to_datetime((chennai.Year*10000 + chennai.Month*100 + chennai.Day).apply(str),format='%Y%m%d')
chennai.head()
Out[8]:
Region Country State City Month Day Year AvgTemperature Time_steps
331055 Asia India NaN Chennai (Madras) 1 1 1995 72.4 1995-01-01
331056 Asia India NaN Chennai (Madras) 1 2 1995 73.5 1995-01-02
331057 Asia India NaN Chennai (Madras) 1 3 1995 72.6 1995-01-03
331058 Asia India NaN Chennai (Madras) 1 4 1995 75.2 1995-01-04
331059 Asia India NaN Chennai (Madras) 1 5 1995 74.8 1995-01-05
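
As a side note, pandas can also assemble the datetime directly from the component columns; a minimal sketch of an equivalent approach:

In [ ]:
# to_datetime() can assemble dates from Year/Month/Day columns
# (column-name matching is case-insensitive)
chennai["Time_steps"] = pd.to_datetime(chennai[["Year", "Month", "Day"]])
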
In [9]:
def plot_series(time, series, format="-", start=0, end=None):
    """to plot the series"""
    plt.plot(time[start:end], series[start:end], format)
    plt.xlabel("Year")
    plt.ylabel("Temprature")
    plt.grid(True)

Plotting the time series for the entire duration

In [10]:
time_step = chennai["Time_steps"].tolist()
temperature = chennai["AvgTemperature"].tolist()

series = np.array(temperature)
time = np.array(time_step)
plt.figure(figsize=(10, 6))
plot_series(time, series)

Plotting only the most recent year

In [11]:
plt.figure(figsize=(10, 6))
plot_series(time[-365:], series[-365:])

There are 9,266 records in the dataset in total. We will keep the first 8,000 records for training (about 86%) and the remaining records for validation.

In [12]:
split_time = 8000
time_train = time[:split_time]
x_train = series[:split_time]
time_valid = time[split_time:]
x_valid = series[split_time:]

Naive forecast

In a naive forecast, we take the value at time t-1 (the previous day) and assume it carries forward as the prediction for time t.

In [13]:
naive_forecast = series[split_time - 1:-1]
In [14]:
plt.figure(figsize=(10, 6))
plot_series(time_valid, x_valid)
plot_series(time_valid, naive_forecast)

Since the plot above is so crowded, we will zoom in on a small section of the dataset and visualize it.

In [15]:
# Zoom in and see only a few points
plt.figure(figsize=(10, 6))
plot_series(time_valid, x_valid, start=0, end=150)
plot_series(time_valid, naive_forecast, start=1, end=151)
In [16]:
print(tf.keras.metrics.mean_squared_error(x_valid, naive_forecast).numpy())
print(tf.keras.metrics.mean_absolute_error(x_valid, naive_forecast).numpy())
2.4476619273301727
1.0900473933649286

Moving average forecast

In a moving average forecast, we take the average of the values over the previous window (here, 30 days) and use it as the prediction for the next period.

In [17]:
def moving_average_forecast(series, window_size):
    """Forecasts the mean of the last few values.
     If window_size=1, then this is equivalent to naive forecast"""
    forecast = []
    for time in range(len(series) - window_size):
        forecast.append(series[time:time + window_size].mean())
    return np.array(forecast)
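
As a cross-check, the same forecast can be computed with pandas' rolling mean; a small sketch (aligned to the 30-day window used below):

In [ ]:
# rolling(30).mean() at index j averages series[j-29:j+1]; slicing
# [29:-1] aligns each mean with the observation it predicts
moving_avg_pd = pd.Series(series).rolling(30).mean().to_numpy()[29:-1]
print(np.allclose(moving_avg_pd, moving_average_forecast(series, 30)))  # True
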
In [18]:
moving_avg = moving_average_forecast(series, 30)[split_time - 30:]

plt.figure(figsize=(10, 6))
plot_series(time_valid, x_valid)
plot_series(time_valid, moving_avg)
In [19]:
print(tf.keras.metrics.mean_squared_error(x_valid, moving_avg).numpy())
print(tf.keras.metrics.mean_absolute_error(x_valid, moving_avg).numpy())
5.458766745655611
1.841850974196946

Differencing

We will use a technique called differencing to remove the trend and seasonality from the data. Here we difference each value against the value 365 days earlier (one year back): diff_series[t] = series[t] - series[t - 365]. The differencing lag should always match the seasonal period.

In [20]:
diff_series = (series[365:] - series[:-365])
diff_time = time[365:]

plt.figure(figsize=(10, 6))
plot_series(diff_time, diff_series)
plt.show()
In [21]:
diff_moving_avg = moving_average_forecast(diff_series, 50)[split_time - 365 - 50:]

plt.figure(figsize=(10, 6))
plot_series(time_valid, diff_series[split_time - 365:])
plot_series(time_valid, diff_moving_avg)
plt.show()

Restoring trend and seasonality

But these are just forecasts of the differenced time series. To get values for the original time series, we have to add back the value at t-365.

In [22]:
diff_moving_avg_plus_past = series[split_time - 365:-365] + diff_moving_avg

plt.figure(figsize=(10, 6))
plot_series(time_valid, x_valid)
plot_series(time_valid, diff_moving_avg_plus_past)
plt.show()
In [23]:
print(tf.keras.metrics.mean_squared_error(x_valid, diff_moving_avg_plus_past).numpy())
print(tf.keras.metrics.mean_absolute_error(x_valid, diff_moving_avg_plus_past).numpy())
7.969171658767771
2.1771153238546606

Smoothing with moving average again

The above plot has a lot of noise. To smooth it, we also take a moving average over the past values we add back (here, a 10-day window centred on t-365).

In [24]:
diff_moving_avg_plus_smooth_past = moving_average_forecast(series[split_time - 370:-360], 10) + diff_moving_avg

plt.figure(figsize=(10, 6))
plot_series(time_valid, x_valid)
plot_series(time_valid, diff_moving_avg_plus_smooth_past)
plt.show()
In [25]:
print(tf.keras.metrics.mean_squared_error(x_valid, diff_moving_avg_plus_smooth_past).numpy())
print(tf.keras.metrics.mean_absolute_error(x_valid, diff_moving_avg_plus_smooth_past).numpy())
5.630983364928909
1.8089115323854659

How to prepare a windowed dataset?

A windowed dataset is used to prepare the data for TensorFlow models. It yields a prefetched dataset with the x and y variables as tensors.

Step 1: Converting the NumPy array into a tf.data.Dataset using from_tensor_slices

In [26]:
series1 = tf.expand_dims(series, axis=-1)
ds = tf.data.Dataset.from_tensor_slices(series1[:20])
for val in ds:
    print(val.numpy())
[72.4]
[73.5]
[72.6]
[75.2]
[74.8]
[76.4]
[78.4]
[78.6]
[78.1]
[79.3]
[77.9]
[79.]
[73.4]
[76.7]
[73.7]
[77.]
[71.1]
[72.6]
[76.1]
[75.7]

Step 2: The window() method groups 5 observations (the window size) into a single window, shifting by one observation each time

But the last few windows, for which there are not enough observations left to form a full group of 5, are kept as shorter remainders, as seen in the output of this cell

In [27]:
dataset = ds.window(5, shift=1)
for window_dataset in dataset:
    for val in window_dataset:
        print(val.numpy(), end=" ")
    print()
[72.4] [73.5] [72.6] [75.2] [74.8] 
[73.5] [72.6] [75.2] [74.8] [76.4] 
[72.6] [75.2] [74.8] [76.4] [78.4] 
[75.2] [74.8] [76.4] [78.4] [78.6] 
[74.8] [76.4] [78.4] [78.6] [78.1] 
[76.4] [78.4] [78.6] [78.1] [79.3] 
[78.4] [78.6] [78.1] [79.3] [77.9] 
[78.6] [78.1] [79.3] [77.9] [79.] 
[78.1] [79.3] [77.9] [79.] [73.4] 
[79.3] [77.9] [79.] [73.4] [76.7] 
[77.9] [79.] [73.4] [76.7] [73.7] 
[79.] [73.4] [76.7] [73.7] [77.] 
[73.4] [76.7] [73.7] [77.] [71.1] 
[76.7] [73.7] [77.] [71.1] [72.6] 
[73.7] [77.] [71.1] [72.6] [76.1] 
[77.] [71.1] [72.6] [76.1] [75.7] 
[71.1] [72.6] [76.1] [75.7] 
[72.6] [76.1] [75.7] 
[76.1] [75.7] 
[75.7] 

Step 3: Setting drop_remainder=True will drop the windows that do not contain a full group of 5 observations

In [28]:
dataset = ds.window(5, shift=1, drop_remainder=True)
for window_dataset in dataset:
    for val in window_dataset:
        print(val.numpy(), end=" ")
    print()
[72.4] [73.5] [72.6] [75.2] [74.8] 
[73.5] [72.6] [75.2] [74.8] [76.4] 
[72.6] [75.2] [74.8] [76.4] [78.4] 
[75.2] [74.8] [76.4] [78.4] [78.6] 
[74.8] [76.4] [78.4] [78.6] [78.1] 
[76.4] [78.4] [78.6] [78.1] [79.3] 
[78.4] [78.6] [78.1] [79.3] [77.9] 
[78.6] [78.1] [79.3] [77.9] [79.] 
[78.1] [79.3] [77.9] [79.] [73.4] 
[79.3] [77.9] [79.] [73.4] [76.7] 
[77.9] [79.] [73.4] [76.7] [73.7] 
[79.] [73.4] [76.7] [73.7] [77.] 
[73.4] [76.7] [73.7] [77.] [71.1] 
[76.7] [73.7] [77.] [71.1] [72.6] 
[73.7] [77.] [71.1] [72.6] [76.1] 
[77.] [71.1] [72.6] [76.1] [75.7] 

Step 4: flat_map() will batch the 5 observations of each window into a single tensor

In [29]:
dataset = ds.window(5, shift=1, drop_remainder=True)
dataset = dataset.flat_map(lambda window: window.batch(5))
for window in dataset:
    print(window.numpy())
[[72.4]
 [73.5]
 [72.6]
 [75.2]
 [74.8]]
[[73.5]
 [72.6]
 [75.2]
 [74.8]
 [76.4]]
[[72.6]
 [75.2]
 [74.8]
 [76.4]
 [78.4]]
[[75.2]
 [74.8]
 [76.4]
 [78.4]
 [78.6]]
[[74.8]
 [76.4]
 [78.4]
 [78.6]
 [78.1]]
[[76.4]
 [78.4]
 [78.6]
 [78.1]
 [79.3]]
[[78.4]
 [78.6]
 [78.1]
 [79.3]
 [77.9]]
[[78.6]
 [78.1]
 [79.3]
 [77.9]
 [79. ]]
[[78.1]
 [79.3]
 [77.9]
 [79. ]
 [73.4]]
[[79.3]
 [77.9]
 [79. ]
 [73.4]
 [76.7]]
[[77.9]
 [79. ]
 [73.4]
 [76.7]
 [73.7]]
[[79. ]
 [73.4]
 [76.7]
 [73.7]
 [77. ]]
[[73.4]
 [76.7]
 [73.7]
 [77. ]
 [71.1]]
[[76.7]
 [73.7]
 [77. ]
 [71.1]
 [72.6]]
[[73.7]
 [77. ]
 [71.1]
 [72.6]
 [76.1]]
[[77. ]
 [71.1]
 [72.6]
 [76.1]
 [75.7]]

Step 5: map() will split each window into X (the first 4 values) and y (the last value)

In [30]:
dataset = ds.window(5, shift=1, drop_remainder=True)
dataset = dataset.flat_map(lambda window: window.batch(5))
dataset = dataset.map(lambda window: (window[:-1], window[-1:]))
for x,y in dataset:
    print(x.numpy(), y.numpy())
[[72.4]
 [73.5]
 [72.6]
 [75.2]] [[74.8]]
[[73.5]
 [72.6]
 [75.2]
 [74.8]] [[76.4]]
[[72.6]
 [75.2]
 [74.8]
 [76.4]] [[78.4]]
[[75.2]
 [74.8]
 [76.4]
 [78.4]] [[78.6]]
[[74.8]
 [76.4]
 [78.4]
 [78.6]] [[78.1]]
[[76.4]
 [78.4]
 [78.6]
 [78.1]] [[79.3]]
[[78.4]
 [78.6]
 [78.1]
 [79.3]] [[77.9]]
[[78.6]
 [78.1]
 [79.3]
 [77.9]] [[79.]]
[[78.1]
 [79.3]
 [77.9]
 [79. ]] [[73.4]]
[[79.3]
 [77.9]
 [79. ]
 [73.4]] [[76.7]]
[[77.9]
 [79. ]
 [73.4]
 [76.7]] [[73.7]]
[[79. ]
 [73.4]
 [76.7]
 [73.7]] [[77.]]
[[73.4]
 [76.7]
 [73.7]
 [77. ]] [[71.1]]
[[76.7]
 [73.7]
 [77. ]
 [71.1]] [[72.6]]
[[73.7]
 [77. ]
 [71.1]
 [72.6]] [[76.1]]
[[77. ]
 [71.1]
 [72.6]
 [76.1]] [[75.7]]

Step 6: shuffle() will put the windows in random order.

Up to the previous step, the observations were in chronological order; shuffling ensures the windows are randomly mixed before training.

In [31]:
dataset = ds.window(5, shift=1, drop_remainder=True)
dataset = dataset.flat_map(lambda window: window.batch(5))
dataset = dataset.map(lambda window: (window[:-1], window[-1:]))
dataset = dataset.shuffle(buffer_size=10)
for x,y in dataset:
    print(x.numpy(), y.numpy())
[[79.3]
 [77.9]
 [79. ]
 [73.4]] [[76.7]]
[[78.4]
 [78.6]
 [78.1]
 [79.3]] [[77.9]]
[[78.6]
 [78.1]
 [79.3]
 [77.9]] [[79.]]
[[75.2]
 [74.8]
 [76.4]
 [78.4]] [[78.6]]
[[76.4]
 [78.4]
 [78.6]
 [78.1]] [[79.3]]
[[72.6]
 [75.2]
 [74.8]
 [76.4]] [[78.4]]
[[73.7]
 [77. ]
 [71.1]
 [72.6]] [[76.1]]
[[76.7]
 [73.7]
 [77. ]
 [71.1]] [[72.6]]
[[73.5]
 [72.6]
 [75.2]
 [74.8]] [[76.4]]
[[74.8]
 [76.4]
 [78.4]
 [78.6]] [[78.1]]
[[78.1]
 [79.3]
 [77.9]
 [79. ]] [[73.4]]
[[77.9]
 [79. ]
 [73.4]
 [76.7]] [[73.7]]
[[79. ]
 [73.4]
 [76.7]
 [73.7]] [[77.]]
[[77. ]
 [71.1]
 [72.6]
 [76.1]] [[75.7]]
[[72.4]
 [73.5]
 [72.6]
 [75.2]] [[74.8]]
[[73.4]
 [76.7]
 [73.7]
 [77. ]] [[71.1]]

Step 7: batch() will group the windows into mini-batches suitable for training, batching X and y together; prefetch(1) keeps the next batch ready while the current one is being consumed.

In [32]:
dataset = ds.window(5, shift=1, drop_remainder=True)
dataset = dataset.flat_map(lambda window: window.batch(5))
dataset = dataset.map(lambda window: (window[:-1], window[-1:]))
dataset = dataset.shuffle(buffer_size=10)
dataset = dataset.batch(2).prefetch(1)
for x,y in dataset:
    print("x = ", x.numpy())
    print("y = ", y.numpy())
    print("*"*25)
x =  [[[74.8]
  [76.4]
  [78.4]
  [78.6]]

 [[72.4]
  [73.5]
  [72.6]
  [75.2]]]
y =  [[[78.1]]

 [[74.8]]]
*************************
x =  [[[77.9]
  [79. ]
  [73.4]
  [76.7]]

 [[76.4]
  [78.4]
  [78.6]
  [78.1]]]
y =  [[[73.7]]

 [[79.3]]]
*************************
x =  [[[76.7]
  [73.7]
  [77. ]
  [71.1]]

 [[73.5]
  [72.6]
  [75.2]
  [74.8]]]
y =  [[[72.6]]

 [[76.4]]]
*************************
x =  [[[75.2]
  [74.8]
  [76.4]
  [78.4]]

 [[72.6]
  [75.2]
  [74.8]
  [76.4]]]
y =  [[[78.6]]

 [[78.4]]]
*************************
x =  [[[73.4]
  [76.7]
  [73.7]
  [77. ]]

 [[79. ]
  [73.4]
  [76.7]
  [73.7]]]
y =  [[[71.1]]

 [[77. ]]]
*************************
x =  [[[73.7]
  [77. ]
  [71.1]
  [72.6]]

 [[77. ]
  [71.1]
  [72.6]
  [76.1]]]
y =  [[[76.1]]

 [[75.7]]]
*************************
x =  [[[79.3]
  [77.9]
  [79. ]
  [73.4]]

 [[78.4]
  [78.6]
  [78.1]
  [79.3]]]
y =  [[[76.7]]

 [[77.9]]]
*************************
x =  [[[78.1]
  [79.3]
  [77.9]
  [79. ]]

 [[78.6]
  [78.1]
  [79.3]
  [77.9]]]
y =  [[[73.4]]

 [[79. ]]]
*************************

The window size is how many past observations we look at before making a prediction. The batch size is the usual mini-batch size used while training the neural network.

In [33]:
window_size = 60
batch_size = 32
shuffle_buffer_size = 1000
In [34]:
def windowed_dataset(series, window_size, batch_size, shuffle_buffer):
    """
    Create a windowed dataset from a NumPy array.

    Returns: A prefetched TensorFlow dataset
    """
    series = tf.expand_dims(series, axis=-1)  # add a feature dimension
    ds = tf.data.Dataset.from_tensor_slices(series)
    ds = ds.window(window_size + 1, shift=1, drop_remainder=True)
    ds = ds.flat_map(lambda w: w.batch(window_size + 1))
    ds = ds.shuffle(shuffle_buffer)
    # Unlike the step-by-step demo above, the label here is the whole
    # window shifted one step (sequence-to-sequence targets), matching
    # the LSTM layers below that return full sequences
    ds = ds.map(lambda w: (w[:-1], w[1:]))
    return ds.batch(batch_size).prefetch(1)
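
To see what this pipeline yields, a minimal inspection sketch (the small window_size, batch_size and shuffle_buffer values are illustrative only):

In [ ]:
# x and y both have shape (batch_size, window_size, 1);
# y is x shifted one step ahead (sequence-to-sequence targets)
for x, y in windowed_dataset(series, window_size=4, batch_size=2, shuffle_buffer=10).take(1):
    print(x.shape, y.shape)  # (2, 4, 1) (2, 4, 1)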

Finding the correct learning rate

Using a LearningRateScheduler() callback. At each epoch it increases the learning rate slightly, following lr = 1e-8 * 10**(epoch / 20), so that over the 100 epochs the learning rate sweeps from 1e-8 up to about 1e-3.

A new loss function, Huber(), is also introduced; it is less sensitive to outliers than squared-error loss.
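
For reference, a minimal NumPy sketch of the Huber loss (with delta=1.0, the Keras default; an illustration, not the Keras implementation):

In [ ]:
# Huber loss: quadratic for small errors, linear for large ones,
# so outliers contribute less than with squared error
def huber(y_true, y_pred, delta=1.0):
    err = np.abs(y_true - y_pred)
    return np.where(err <= delta,
                    0.5 * err ** 2,
                    delta * (err - 0.5 * delta))

print(huber(np.array([70.0, 70.0]), np.array([70.5, 80.0])))  # [0.125 9.5]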

In [35]:
tf.keras.backend.clear_session()
tf.random.set_seed(51)
np.random.seed(51)
window_size = 64
batch_size = 256
train_set = windowed_dataset(x_train, window_size, batch_size, shuffle_buffer_size)
print(train_set)
print(x_train.shape)

model = tf.keras.models.Sequential([
  tf.keras.layers.Conv1D(filters=32, kernel_size=5,
                      strides=1, padding="causal",
                      activation="relu",
                      input_shape=[None, 1]),
  tf.keras.layers.LSTM(64, return_sequences=True),
  tf.keras.layers.LSTM(64, return_sequences=True),
  tf.keras.layers.Dense(30, activation="relu"),
  tf.keras.layers.Dense(10, activation="relu"),
  tf.keras.layers.Dense(1),
  tf.keras.layers.Lambda(lambda x: x * 400)  # scale the network's small outputs up toward the data's range
])

lr_schedule = tf.keras.callbacks.LearningRateScheduler(lambda epoch: 1e-8 * 10**(epoch / 20))

optimizer = tf.keras.optimizers.SGD(learning_rate=1e-8, momentum=0.9)
model.compile(loss=tf.keras.losses.Huber(),
              optimizer=optimizer,
              metrics=["mae"])
history = model.fit(train_set, epochs=100, callbacks=[lr_schedule])
<PrefetchDataset shapes: ((None, None, 1), (None, None, 1)), types: (tf.float64, tf.float64)>
(8000,)
Epoch 1/100
31/31 [==============================] - 2s 55ms/step - loss: 60.4235 - mae: 60.9235 - lr: 1.0000e-08
Epoch 2/100
31/31 [==============================] - 1s 48ms/step - loss: 49.6979 - mae: 50.1979 - lr: 1.1220e-08
Epoch 3/100
31/31 [==============================] - 2s 55ms/step - loss: 37.1379 - mae: 37.6379 - lr: 1.2589e-08
Epoch 4/100
31/31 [==============================] - 2s 54ms/step - loss: 23.2685 - mae: 23.7685 - lr: 1.4125e-08
Epoch 5/100
31/31 [==============================] - 2s 49ms/step - loss: 9.2907 - mae: 9.7763 - lr: 1.5849e-08
Epoch 6/100
31/31 [==============================] - 1s 47ms/step - loss: 6.2288 - mae: 6.7014 - lr: 1.7783e-08
Epoch 7/100
31/31 [==============================] - 1s 47ms/step - loss: 6.0637 - mae: 6.5368 - lr: 1.9953e-08
Epoch 8/100
31/31 [==============================] - 1s 48ms/step - loss: 6.0161 - mae: 6.4889 - lr: 2.2387e-08
Epoch 9/100
31/31 [==============================] - 2s 60ms/step - loss: 5.9614 - mae: 6.4342 - lr: 2.5119e-08
Epoch 10/100
31/31 [==============================] - 1s 47ms/step - loss: 5.9036 - mae: 6.3761 - lr: 2.8184e-08
Epoch 11/100
31/31 [==============================] - 2s 53ms/step - loss: 5.8392 - mae: 6.3116 - lr: 3.1623e-08
Epoch 12/100
31/31 [==============================] - 2s 52ms/step - loss: 5.7670 - mae: 6.2391 - lr: 3.5481e-08
Epoch 13/100
31/31 [==============================] - 2s 54ms/step - loss: 5.6828 - mae: 6.1546 - lr: 3.9811e-08
Epoch 14/100
31/31 [==============================] - 2s 51ms/step - loss: 5.5730 - mae: 6.0443 - lr: 4.4668e-08
Epoch 15/100
31/31 [==============================] - 2s 57ms/step - loss: 5.3217 - mae: 5.7900 - lr: 5.0119e-08
Epoch 16/100
31/31 [==============================] - 2s 54ms/step - loss: 5.2447 - mae: 5.7123 - lr: 5.6234e-08
Epoch 17/100
31/31 [==============================] - 1s 47ms/step - loss: 5.3148 - mae: 5.7857 - lr: 6.3096e-08
Epoch 18/100
31/31 [==============================] - 1s 48ms/step - loss: 5.1991 - mae: 5.6695 - lr: 7.0795e-08
Epoch 19/100
31/31 [==============================] - 1s 46ms/step - loss: 5.0584 - mae: 5.5280 - lr: 7.9433e-08
Epoch 20/100
31/31 [==============================] - 2s 49ms/step - loss: 4.9006 - mae: 5.3692 - lr: 8.9125e-08
Epoch 21/100
31/31 [==============================] - 2s 53ms/step - loss: 4.7325 - mae: 5.2001 - lr: 1.0000e-07
Epoch 22/100
31/31 [==============================] - 1s 47ms/step - loss: 4.5254 - mae: 4.9913 - lr: 1.1220e-07
Epoch 23/100
31/31 [==============================] - 1s 47ms/step - loss: 4.2810 - mae: 4.7445 - lr: 1.2589e-07
Epoch 24/100
31/31 [==============================] - 1s 46ms/step - loss: 4.2077 - mae: 4.6715 - lr: 1.4125e-07
Epoch 25/100
31/31 [==============================] - 1s 48ms/step - loss: 4.0050 - mae: 4.4672 - lr: 1.5849e-07
Epoch 26/100
31/31 [==============================] - 1s 47ms/step - loss: 3.7234 - mae: 4.1811 - lr: 1.7783e-07
Epoch 27/100
31/31 [==============================] - 2s 52ms/step - loss: 3.4845 - mae: 3.9380 - lr: 1.9953e-07
Epoch 28/100
31/31 [==============================] - 2s 58ms/step - loss: 3.3732 - mae: 3.8262 - lr: 2.2387e-07
Epoch 29/100
31/31 [==============================] - 2s 50ms/step - loss: 3.3760 - mae: 3.8305 - lr: 2.5119e-07
Epoch 30/100
31/31 [==============================] - 1s 45ms/step - loss: 3.2844 - mae: 3.7383 - lr: 2.8184e-07
Epoch 31/100
31/31 [==============================] - 1s 45ms/step - loss: 3.1385 - mae: 3.5905 - lr: 3.1623e-07
Epoch 32/100
31/31 [==============================] - 2s 59ms/step - loss: 2.9926 - mae: 3.4427 - lr: 3.5481e-07
Epoch 33/100
31/31 [==============================] - 1s 46ms/step - loss: 2.8412 - mae: 3.2893 - lr: 3.9811e-07
Epoch 34/100
31/31 [==============================] - 2s 56ms/step - loss: 2.6500 - mae: 3.0935 - lr: 4.4668e-07
Epoch 35/100
31/31 [==============================] - 2s 56ms/step - loss: 2.5463 - mae: 2.9884 - lr: 5.0119e-07
Epoch 36/100
31/31 [==============================] - 2s 50ms/step - loss: 2.4693 - mae: 2.9109 - lr: 5.6234e-07
Epoch 37/100
31/31 [==============================] - 1s 48ms/step - loss: 2.3968 - mae: 2.8378 - lr: 6.3096e-07
Epoch 38/100
31/31 [==============================] - 2s 49ms/step - loss: 2.3335 - mae: 2.7742 - lr: 7.0795e-07
Epoch 39/100
31/31 [==============================] - 1s 48ms/step - loss: 2.2486 - mae: 2.6882 - lr: 7.9433e-07
Epoch 40/100
31/31 [==============================] - 2s 54ms/step - loss: 2.1839 - mae: 2.6230 - lr: 8.9125e-07
Epoch 41/100
31/31 [==============================] - 1s 46ms/step - loss: 2.1402 - mae: 2.5792 - lr: 1.0000e-06
Epoch 42/100
31/31 [==============================] - 1s 46ms/step - loss: 2.0955 - mae: 2.5342 - lr: 1.1220e-06
Epoch 43/100
31/31 [==============================] - 2s 48ms/step - loss: 2.1408 - mae: 2.5843 - lr: 1.2589e-06
Epoch 44/100
31/31 [==============================] - 2s 49ms/step - loss: 2.2656 - mae: 2.7170 - lr: 1.4125e-06
Epoch 45/100
31/31 [==============================] - 2s 52ms/step - loss: 2.1883 - mae: 2.6404 - lr: 1.5849e-06
Epoch 46/100
31/31 [==============================] - 1s 48ms/step - loss: 2.1757 - mae: 2.6292 - lr: 1.7783e-06
Epoch 47/100
31/31 [==============================] - 2s 50ms/step - loss: 2.1896 - mae: 2.6445 - lr: 1.9953e-06
Epoch 48/100
31/31 [==============================] - 2s 51ms/step - loss: 2.1901 - mae: 2.6466 - lr: 2.2387e-06
Epoch 49/100
31/31 [==============================] - 1s 47ms/step - loss: 2.3056 - mae: 2.7672 - lr: 2.5119e-06
Epoch 50/100
31/31 [==============================] - 1s 46ms/step - loss: 2.2991 - mae: 2.7599 - lr: 2.8184e-06
Epoch 51/100
31/31 [==============================] - 1s 47ms/step - loss: 2.3306 - mae: 2.7928 - lr: 3.1623e-06
Epoch 52/100
31/31 [==============================] - 2s 51ms/step - loss: 2.4801 - mae: 2.9442 - lr: 3.5481e-06
Epoch 53/100
31/31 [==============================] - 2s 53ms/step - loss: 2.6399 - mae: 3.1102 - lr: 3.9811e-06
Epoch 54/100
31/31 [==============================] - 1s 45ms/step - loss: 2.7478 - mae: 3.2204 - lr: 4.4668e-06
Epoch 55/100
31/31 [==============================] - 1s 48ms/step - loss: 2.8296 - mae: 3.3040 - lr: 5.0119e-06
Epoch 56/100
31/31 [==============================] - 1s 46ms/step - loss: 2.9673 - mae: 3.4442 - lr: 5.6234e-06
Epoch 57/100
31/31 [==============================] - 1s 47ms/step - loss: 2.9896 - mae: 3.4667 - lr: 6.3096e-06
Epoch 58/100
31/31 [==============================] - 1s 46ms/step - loss: 3.0614 - mae: 3.5388 - lr: 7.0795e-06
Epoch 59/100
31/31 [==============================] - 1s 47ms/step - loss: 3.2151 - mae: 3.6874 - lr: 7.9433e-06
Epoch 60/100
31/31 [==============================] - 2s 53ms/step - loss: 6.8204 - mae: 7.3129 - lr: 8.9125e-06
Epoch 61/100
31/31 [==============================] - 2s 49ms/step - loss: 5.8319 - mae: 6.3224 - lr: 1.0000e-05
Epoch 62/100
31/31 [==============================] - 1s 47ms/step - loss: 5.5526 - mae: 6.0418 - lr: 1.1220e-05
Epoch 63/100
31/31 [==============================] - 2s 49ms/step - loss: 5.1654 - mae: 5.6512 - lr: 1.2589e-05
Epoch 64/100
31/31 [==============================] - 2s 54ms/step - loss: 4.9511 - mae: 5.4355 - lr: 1.4125e-05
Epoch 65/100
31/31 [==============================] - 1s 45ms/step - loss: 4.9616 - mae: 5.4454 - lr: 1.5849e-05
Epoch 66/100
31/31 [==============================] - 2s 53ms/step - loss: 4.1241 - mae: 4.6022 - lr: 1.7783e-05
Epoch 67/100
31/31 [==============================] - 2s 59ms/step - loss: 3.5807 - mae: 4.0547 - lr: 1.9953e-05
Epoch 68/100
31/31 [==============================] - 2s 50ms/step - loss: 2.9691 - mae: 3.4399 - lr: 2.2387e-05
Epoch 69/100
31/31 [==============================] - 2s 57ms/step - loss: 3.0349 - mae: 3.5057 - lr: 2.5119e-05
Epoch 70/100
31/31 [==============================] - 2s 49ms/step - loss: 3.3395 - mae: 3.8136 - lr: 2.8184e-05
Epoch 71/100
31/31 [==============================] - 1s 48ms/step - loss: 6.1146 - mae: 6.5983 - lr: 3.1623e-05
Epoch 72/100
31/31 [==============================] - 2s 56ms/step - loss: 2.8603 - mae: 3.3298 - lr: 3.5481e-05
Epoch 73/100
31/31 [==============================] - 1s 48ms/step - loss: 3.2208 - mae: 3.6933 - lr: 3.9811e-05
Epoch 74/100
31/31 [==============================] - 1s 47ms/step - loss: 3.6229 - mae: 4.0997 - lr: 4.4668e-05
Epoch 75/100
31/31 [==============================] - 2s 50ms/step - loss: 3.7352 - mae: 4.2117 - lr: 5.0119e-05
Epoch 76/100
31/31 [==============================] - 1s 46ms/step - loss: 3.7175 - mae: 4.1927 - lr: 5.6234e-05
Epoch 77/100
31/31 [==============================] - 1s 48ms/step - loss: 3.5841 - mae: 4.0592 - lr: 6.3096e-05
Epoch 78/100
31/31 [==============================] - 1s 46ms/step - loss: 4.1044 - mae: 4.5814 - lr: 7.0795e-05
Epoch 79/100
31/31 [==============================] - 2s 51ms/step - loss: 12.6840 - mae: 13.1750 - lr: 7.9433e-05
Epoch 80/100
31/31 [==============================] - 2s 52ms/step - loss: 5.6454 - mae: 6.1286 - lr: 8.9125e-05
Epoch 81/100
31/31 [==============================] - 2s 52ms/step - loss: 5.2818 - mae: 5.7642 - lr: 1.0000e-04
Epoch 82/100
31/31 [==============================] - 1s 47ms/step - loss: 5.0689 - mae: 5.5509 - lr: 1.1220e-04
Epoch 83/100
31/31 [==============================] - 1s 45ms/step - loss: 5.3940 - mae: 5.8764 - lr: 1.2589e-04
Epoch 84/100
31/31 [==============================] - 2s 48ms/step - loss: 3.9872 - mae: 4.4642 - lr: 1.4125e-04
Epoch 85/100
31/31 [==============================] - 2s 54ms/step - loss: 4.1928 - mae: 4.6716 - lr: 1.5849e-04
Epoch 86/100
31/31 [==============================] - 1s 46ms/step - loss: 5.7677 - mae: 6.2517 - lr: 1.7783e-04
Epoch 87/100
31/31 [==============================] - 1s 46ms/step - loss: 6.6116 - mae: 7.0985 - lr: 1.9953e-04
Epoch 88/100
31/31 [==============================] - 1s 45ms/step - loss: 20.6073 - mae: 21.1017 - lr: 2.2387e-04
Epoch 89/100
31/31 [==============================] - 1s 46ms/step - loss: 48.8828 - mae: 49.3814 - lr: 2.5119e-04
Epoch 90/100
31/31 [==============================] - 1s 47ms/step - loss: 14.8319 - mae: 15.3293 - lr: 2.8184e-04
Epoch 91/100
31/31 [==============================] - 2s 49ms/step - loss: 12.7321 - mae: 13.2310 - lr: 3.1623e-04
Epoch 92/100
31/31 [==============================] - 2s 52ms/step - loss: 14.3075 - mae: 14.8069 - lr: 3.5481e-04
Epoch 93/100
31/31 [==============================] - 1s 46ms/step - loss: 37.2279 - mae: 37.7253 - lr: 3.9811e-04
Epoch 94/100
31/31 [==============================] - 1s 47ms/step - loss: 71.4719 - mae: 71.9696 - lr: 4.4668e-04
Epoch 95/100
31/31 [==============================] - 1s 46ms/step - loss: 122.4658 - mae: 122.9656 - lr: 5.0119e-04
Epoch 96/100
31/31 [==============================] - 2s 50ms/step - loss: 114.3986 - mae: 114.8986 - lr: 5.6234e-04
Epoch 97/100
31/31 [==============================] - 1s 46ms/step - loss: 72.3573 - mae: 72.8565 - lr: 6.3096e-04
Epoch 98/100
31/31 [==============================] - 2s 53ms/step - loss: 108.4629 - mae: 108.9629 - lr: 7.0795e-04
Epoch 99/100
31/31 [==============================] - 1s 47ms/step - loss: 121.6685 - mae: 122.1685 - lr: 7.9433e-04
Epoch 100/100
31/31 [==============================] - 2s 54ms/step - loss: 136.1241 - mae: 136.6241 - lr: 8.9125e-04

We plot the loss against the learning rate on a semilog axis

In [36]:
plt.semilogx(history.history["lr"], history.history["loss"])
plt.axis([1e-8, 1e-4, 0, 60])
Out[36]:
(1e-08, 0.0001, 0.0, 60.0)

We pick the learning rate from the region where the loss drops the steepest (while still decreasing smoothly) to train our neural network.
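
To read this off programmatically rather than eyeballing the plot, a rough sketch using the learning rates recorded in the training history:

In [ ]:
# Learning rate at the epoch with the lowest recorded loss,
# a reasonable starting point for the final training run
best = np.argmin(history.history["loss"])
print(history.history["lr"][best], history.history["loss"][best])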

In [37]:
tf.keras.backend.clear_session()
tf.random.set_seed(51)
np.random.seed(51)
train_set = windowed_dataset(x_train, window_size=60, batch_size=100, shuffle_buffer=shuffle_buffer_size)
model = tf.keras.models.Sequential([
  tf.keras.layers.Conv1D(filters=60, kernel_size=5,
                      strides=1, padding="causal",
                      activation="relu",
                      input_shape=[None, 1]),
  tf.keras.layers.LSTM(60, return_sequences=True),
  tf.keras.layers.LSTM(60, return_sequences=True),
  tf.keras.layers.Dense(30, activation="relu"),
  tf.keras.layers.Dense(10, activation="relu"),
  tf.keras.layers.Dense(1),
  tf.keras.layers.Lambda(lambda x: x * 400)
])


optimizer = tf.keras.optimizers.SGD(learning_rate=1e-7, momentum=0.9)
model.compile(loss=tf.keras.losses.Huber(),
              optimizer=optimizer,
              metrics=["mae"])
history = model.fit(train_set, epochs=500)
Epoch 1/500
80/80 [==============================] - 2s 28ms/step - loss: 21.9290 - mae: 22.4160
Epoch 2/500
80/80 [==============================] - 2s 24ms/step - loss: 6.4863 - mae: 6.9641
Epoch 3/500
80/80 [==============================] - 2s 23ms/step - loss: 5.9894 - mae: 6.4660
Epoch 4/500
80/80 [==============================] - 2s 23ms/step - loss: 5.5643 - mae: 6.0396
Epoch 5/500
80/80 [==============================] - 2s 23ms/step - loss: 5.1854 - mae: 5.6594
Epoch 6/500
80/80 [==============================] - 2s 24ms/step - loss: 4.8469 - mae: 5.3194
Epoch 7/500
80/80 [==============================] - 2s 22ms/step - loss: 4.5384 - mae: 5.0094
Epoch 8/500
80/80 [==============================] - 2s 21ms/step - loss: 4.2549 - mae: 4.7241
Epoch 9/500
80/80 [==============================] - 2s 21ms/step - loss: 4.0187 - mae: 4.4864
Epoch 10/500
80/80 [==============================] - 2s 21ms/step - loss: 3.8447 - mae: 4.3112
Epoch 11/500
80/80 [==============================] - 2s 21ms/step - loss: 3.6908 - mae: 4.1559
Epoch 12/500
80/80 [==============================] - 2s 27ms/step - loss: 3.5478 - mae: 4.0117
Epoch 13/500
80/80 [==============================] - 2s 21ms/step - loss: 3.4187 - mae: 3.8813
Epoch 14/500
80/80 [==============================] - 2s 21ms/step - loss: 3.3066 - mae: 3.7680
Epoch 15/500
80/80 [==============================] - 2s 21ms/step - loss: 3.1983 - mae: 3.6579
Epoch 16/500
80/80 [==============================] - 2s 22ms/step - loss: 3.1001 - mae: 3.5584
Epoch 17/500
80/80 [==============================] - 2s 22ms/step - loss: 3.0085 - mae: 3.4654
Epoch 18/500
80/80 [==============================] - 2s 23ms/step - loss: 2.9266 - mae: 3.3825
Epoch 19/500
80/80 [==============================] - 2s 21ms/step - loss: 2.8494 - mae: 3.3040
Epoch 20/500
80/80 [==============================] - 2s 21ms/step - loss: 2.7790 - mae: 3.2328
Epoch 21/500
80/80 [==============================] - 2s 23ms/step - loss: 2.7086 - mae: 3.1615
Epoch 22/500
80/80 [==============================] - 2s 21ms/step - loss: 2.6473 - mae: 3.0994
Epoch 23/500
80/80 [==============================] - 2s 22ms/step - loss: 2.5799 - mae: 3.0310
Epoch 24/500
80/80 [==============================] - 2s 22ms/step - loss: 2.5220 - mae: 2.9722
Epoch 25/500
80/80 [==============================] - 2s 21ms/step - loss: 2.4670 - mae: 2.9163
Epoch 26/500
80/80 [==============================] - 2s 22ms/step - loss: 2.4127 - mae: 2.8610
Epoch 27/500
80/80 [==============================] - 2s 21ms/step - loss: 2.3657 - mae: 2.8131
Epoch 28/500
80/80 [==============================] - 2s 24ms/step - loss: 2.3146 - mae: 2.7611
Epoch 29/500
80/80 [==============================] - 2s 26ms/step - loss: 2.2708 - mae: 2.7163
Epoch 30/500
80/80 [==============================] - 2s 23ms/step - loss: 2.2272 - mae: 2.6718
Epoch 31/500
80/80 [==============================] - 2s 22ms/step - loss: 2.1796 - mae: 2.6232
Epoch 32/500
80/80 [==============================] - 2s 26ms/step - loss: 2.1377 - mae: 2.5804
Epoch 33/500
80/80 [==============================] - 2s 26ms/step - loss: 2.0965 - mae: 2.5383
Epoch 34/500
80/80 [==============================] - 2s 27ms/step - loss: 2.0575 - mae: 2.4983
Epoch 35/500
80/80 [==============================] - 2s 24ms/step - loss: 2.0196 - mae: 2.4596
Epoch 36/500
80/80 [==============================] - 2s 24ms/step - loss: 1.9832 - mae: 2.4222
Epoch 37/500
80/80 [==============================] - 2s 24ms/step - loss: 1.9498 - mae: 2.3880
Epoch 38/500
80/80 [==============================] - 2s 21ms/step - loss: 1.9177 - mae: 2.3549
Epoch 39/500
80/80 [==============================] - 2s 24ms/step - loss: 1.8888 - mae: 2.3252
Epoch 40/500
80/80 [==============================] - 2s 21ms/step - loss: 1.8609 - mae: 2.2964
Epoch 41/500
80/80 [==============================] - 2s 21ms/step - loss: 1.8370 - mae: 2.2719
Epoch 42/500
80/80 [==============================] - 2s 21ms/step - loss: 1.8124 - mae: 2.2465
Epoch 43/500
80/80 [==============================] - 2s 22ms/step - loss: 1.7917 - mae: 2.2251
Epoch 44/500
80/80 [==============================] - 2s 25ms/step - loss: 1.7710 - mae: 2.2040
Epoch 45/500
80/80 [==============================] - 2s 24ms/step - loss: 1.7521 - mae: 2.1845
Epoch 46/500
80/80 [==============================] - 2s 22ms/step - loss: 1.7354 - mae: 2.1674
Epoch 47/500
80/80 [==============================] - 2s 21ms/step - loss: 1.7193 - mae: 2.1508
Epoch 48/500
80/80 [==============================] - 2s 21ms/step - loss: 1.7038 - mae: 2.1349
Epoch 49/500
80/80 [==============================] - 2s 22ms/step - loss: 1.6894 - mae: 2.1203
Epoch 50/500
80/80 [==============================] - 2s 22ms/step - loss: 1.6754 - mae: 2.1059
Epoch 51/500
80/80 [==============================] - 2s 21ms/step - loss: 1.6623 - mae: 2.0925
Epoch 52/500
80/80 [==============================] - 2s 21ms/step - loss: 1.6491 - mae: 2.0792
Epoch 53/500
80/80 [==============================] - 2s 23ms/step - loss: 1.6366 - mae: 2.0664
Epoch 54/500
80/80 [==============================] - 2s 21ms/step - loss: 1.6235 - mae: 2.0531
Epoch 55/500
80/80 [==============================] - 2s 21ms/step - loss: 1.6122 - mae: 2.0417
Epoch 56/500
80/80 [==============================] - 2s 24ms/step - loss: 1.5993 - mae: 2.0285
Epoch 57/500
80/80 [==============================] - 2s 22ms/step - loss: 1.5874 - mae: 2.0164
Epoch 58/500
80/80 [==============================] - 2s 22ms/step - loss: 1.5760 - mae: 2.0048
Epoch 59/500
80/80 [==============================] - 2s 22ms/step - loss: 1.5649 - mae: 1.9937
Epoch 60/500
80/80 [==============================] - 2s 30ms/step - loss: 1.5541 - mae: 1.9826
Epoch 61/500
80/80 [==============================] - 2s 24ms/step - loss: 1.5441 - mae: 1.9726
Epoch 62/500
80/80 [==============================] - 2s 27ms/step - loss: 1.5328 - mae: 1.9612
Epoch 63/500
80/80 [==============================] - 2s 23ms/step - loss: 1.5221 - mae: 1.9503
Epoch 64/500
80/80 [==============================] - 2s 23ms/step - loss: 1.5136 - mae: 1.9417
Epoch 65/500
80/80 [==============================] - 2s 22ms/step - loss: 1.5028 - mae: 1.9309
Epoch 66/500
80/80 [==============================] - 2s 22ms/step - loss: 1.4923 - mae: 1.9201
Epoch 67/500
80/80 [==============================] - 2s 24ms/step - loss: 1.4822 - mae: 1.9099
Epoch 68/500
80/80 [==============================] - 2s 21ms/step - loss: 1.4732 - mae: 1.9009
Epoch 69/500
80/80 [==============================] - 2s 23ms/step - loss: 1.4653 - mae: 1.8929
Epoch 70/500
80/80 [==============================] - 2s 21ms/step - loss: 1.4558 - mae: 1.8832
Epoch 71/500
80/80 [==============================] - 2s 21ms/step - loss: 1.4490 - mae: 1.8764
Epoch 72/500
80/80 [==============================] - 2s 21ms/step - loss: 1.4409 - mae: 1.8681
Epoch 73/500
80/80 [==============================] - 2s 24ms/step - loss: 1.4326 - mae: 1.8596
Epoch 74/500
80/80 [==============================] - 2s 22ms/step - loss: 1.4248 - mae: 1.8517
Epoch 75/500
80/80 [==============================] - 2s 24ms/step - loss: 1.4178 - mae: 1.8445
Epoch 76/500
80/80 [==============================] - 2s 24ms/step - loss: 1.4102 - mae: 1.8369
Epoch 77/500
80/80 [==============================] - 2s 21ms/step - loss: 1.4031 - mae: 1.8296
Epoch 78/500
80/80 [==============================] - 2s 23ms/step - loss: 1.3961 - mae: 1.8225
Epoch 79/500
80/80 [==============================] - 2s 21ms/step - loss: 1.3894 - mae: 1.8157
Epoch 80/500
80/80 [==============================] - 2s 22ms/step - loss: 1.3830 - mae: 1.8092
Epoch 81/500
80/80 [==============================] - 2s 23ms/step - loss: 1.3761 - mae: 1.8022
Epoch 82/500
80/80 [==============================] - 2s 22ms/step - loss: 1.3693 - mae: 1.7952
Epoch 83/500
80/80 [==============================] - 2s 21ms/step - loss: 1.3628 - mae: 1.7886
Epoch 84/500
80/80 [==============================] - 2s 24ms/step - loss: 1.3569 - mae: 1.7827
Epoch 85/500
80/80 [==============================] - 2s 23ms/step - loss: 1.3501 - mae: 1.7757
Epoch 86/500
80/80 [==============================] - 2s 22ms/step - loss: 1.3436 - mae: 1.7691
Epoch 87/500
80/80 [==============================] - 2s 22ms/step - loss: 1.3377 - mae: 1.7633
Epoch 88/500
80/80 [==============================] - 2s 22ms/step - loss: 1.3317 - mae: 1.7570
Epoch 89/500
80/80 [==============================] - 2s 22ms/step - loss: 1.3263 - mae: 1.7515
Epoch 90/500
80/80 [==============================] - 2s 28ms/step - loss: 1.3214 - mae: 1.7466
Epoch 91/500
80/80 [==============================] - 2s 24ms/step - loss: 1.3156 - mae: 1.7407
Epoch 92/500
80/80 [==============================] - 2s 24ms/step - loss: 1.3112 - mae: 1.7362
Epoch 93/500
80/80 [==============================] - 2s 23ms/step - loss: 1.3054 - mae: 1.7303
Epoch 94/500
80/80 [==============================] - 2s 22ms/step - loss: 1.3006 - mae: 1.7255
Epoch 95/500
80/80 [==============================] - 2s 25ms/step - loss: 1.2962 - mae: 1.7209
Epoch 96/500
80/80 [==============================] - 2s 22ms/step - loss: 1.2917 - mae: 1.7164
Epoch 97/500
80/80 [==============================] - 2s 23ms/step - loss: 1.2869 - mae: 1.7115
Epoch 98/500
80/80 [==============================] - 2s 22ms/step - loss: 1.2822 - mae: 1.7067
Epoch 99/500
80/80 [==============================] - 2s 21ms/step - loss: 1.2778 - mae: 1.7022
Epoch 100/500
80/80 [==============================] - 2s 22ms/step - loss: 1.2730 - mae: 1.6973
Epoch 101/500
80/80 [==============================] - 2s 27ms/step - loss: 1.2687 - mae: 1.6930
Epoch 102/500
80/80 [==============================] - 2s 22ms/step - loss: 1.2642 - mae: 1.6883
Epoch 103/500
80/80 [==============================] - 2s 22ms/step - loss: 1.2600 - mae: 1.6842
Epoch 104/500
80/80 [==============================] - 2s 21ms/step - loss: 1.2555 - mae: 1.6796
Epoch 105/500
80/80 [==============================] - 2s 21ms/step - loss: 1.2510 - mae: 1.6749
Epoch 106/500
80/80 [==============================] - 2s 24ms/step - loss: 1.2469 - mae: 1.6707
Epoch 107/500
80/80 [==============================] - 2s 23ms/step - loss: 1.2422 - mae: 1.6661
Epoch 108/500
80/80 [==============================] - 2s 23ms/step - loss: 1.2377 - mae: 1.6614
Epoch 109/500
80/80 [==============================] - 2s 22ms/step - loss: 1.2334 - mae: 1.6570
Epoch 110/500
80/80 [==============================] - 2s 21ms/step - loss: 1.2294 - mae: 1.6530
Epoch 111/500
80/80 [==============================] - 2s 22ms/step - loss: 1.2257 - mae: 1.6493
Epoch 112/500
80/80 [==============================] - 2s 23ms/step - loss: 1.2215 - mae: 1.6450
Epoch 113/500
80/80 [==============================] - 2s 22ms/step - loss: 1.2175 - mae: 1.6408
Epoch 114/500
80/80 [==============================] - 2s 21ms/step - loss: 1.2131 - mae: 1.6364
Epoch 115/500
80/80 [==============================] - 2s 21ms/step - loss: 1.2088 - mae: 1.6320
Epoch 116/500
80/80 [==============================] - 2s 21ms/step - loss: 1.2047 - mae: 1.6279
Epoch 117/500
80/80 [==============================] - 2s 23ms/step - loss: 1.2008 - mae: 1.6239
Epoch 118/500
80/80 [==============================] - 2s 24ms/step - loss: 1.1973 - mae: 1.6203
Epoch 119/500
80/80 [==============================] - 2s 22ms/step - loss: 1.1933 - mae: 1.6163
Epoch 120/500
80/80 [==============================] - 2s 22ms/step - loss: 1.1893 - mae: 1.6121
Epoch 121/500
80/80 [==============================] - 2s 27ms/step - loss: 1.1859 - mae: 1.6087
Epoch 122/500
80/80 [==============================] - 2s 23ms/step - loss: 1.1810 - mae: 1.6037
Epoch 123/500
80/80 [==============================] - 2s 29ms/step - loss: 1.1782 - mae: 1.6009
Epoch 124/500
80/80 [==============================] - 2s 23ms/step - loss: 1.1732 - mae: 1.5958
Epoch 125/500
80/80 [==============================] - 2s 23ms/step - loss: 1.1695 - mae: 1.5919
Epoch 126/500
80/80 [==============================] - 2s 23ms/step - loss: 1.1656 - mae: 1.5880
Epoch 127/500
80/80 [==============================] - 2s 22ms/step - loss: 1.1619 - mae: 1.5842
Epoch 128/500
80/80 [==============================] - 2s 23ms/step - loss: 1.1576 - mae: 1.5798
Epoch 129/500
80/80 [==============================] - 2s 23ms/step - loss: 1.1546 - mae: 1.5767
Epoch 130/500
80/80 [==============================] - 2s 21ms/step - loss: 1.1499 - mae: 1.5719
Epoch 131/500
80/80 [==============================] - 2s 21ms/step - loss: 1.1465 - mae: 1.5683
Epoch 132/500
80/80 [==============================] - 2s 21ms/step - loss: 1.1428 - mae: 1.5644
Epoch 133/500
80/80 [==============================] - 2s 23ms/step - loss: 1.1398 - mae: 1.5615
Epoch 134/500
80/80 [==============================] - 2s 23ms/step - loss: 1.1375 - mae: 1.5593
Epoch 135/500
80/80 [==============================] - 2s 21ms/step - loss: 1.1334 - mae: 1.5549
Epoch 136/500
80/80 [==============================] - 2s 21ms/step - loss: 1.1306 - mae: 1.5520
Epoch 137/500
80/80 [==============================] - 2s 22ms/step - loss: 1.1272 - mae: 1.5486
Epoch 138/500
80/80 [==============================] - 2s 23ms/step - loss: 1.1249 - mae: 1.5462
Epoch 139/500
80/80 [==============================] - 2s 21ms/step - loss: 1.1229 - mae: 1.5442
Epoch 140/500
80/80 [==============================] - 2s 25ms/step - loss: 1.1203 - mae: 1.5415
Epoch 141/500
80/80 [==============================] - 2s 21ms/step - loss: 1.1177 - mae: 1.5387
Epoch 142/500
80/80 [==============================] - 2s 22ms/step - loss: 1.1155 - mae: 1.5365
Epoch 143/500
80/80 [==============================] - 2s 22ms/step - loss: 1.1135 - mae: 1.5344
Epoch 144/500
80/80 [==============================] - 2s 21ms/step - loss: 1.1113 - mae: 1.5321
Epoch 145/500
80/80 [==============================] - 2s 22ms/step - loss: 1.1104 - mae: 1.5314
Epoch 146/500
80/80 [==============================] - 2s 23ms/step - loss: 1.1076 - mae: 1.5283
Epoch 147/500
80/80 [==============================] - 2s 21ms/step - loss: 1.1056 - mae: 1.5261
Epoch 148/500
80/80 [==============================] - 2s 21ms/step - loss: 1.1051 - mae: 1.5258
Epoch 149/500
80/80 [==============================] - 2s 23ms/step - loss: 1.1022 - mae: 1.5228
Epoch 150/500
80/80 [==============================] - 2s 21ms/step - loss: 1.1013 - mae: 1.5219
Epoch 151/500
80/80 [==============================] - 2s 29ms/step - loss: 1.0985 - mae: 1.5188
Epoch 152/500
80/80 [==============================] - 2s 22ms/step - loss: 1.0969 - mae: 1.5173
Epoch 153/500
80/80 [==============================] - 2s 22ms/step - loss: 1.0952 - mae: 1.5155
Epoch 154/500
80/80 [==============================] - 2s 26ms/step - loss: 1.0936 - mae: 1.5140
Epoch 155/500
80/80 [==============================] - 2s 22ms/step - loss: 1.0918 - mae: 1.5121
Epoch 156/500
80/80 [==============================] - 2s 24ms/step - loss: 1.0898 - mae: 1.5100
Epoch 157/500
80/80 [==============================] - 2s 24ms/step - loss: 1.0884 - mae: 1.5085
Epoch 158/500
80/80 [==============================] - 2s 22ms/step - loss: 1.0869 - mae: 1.5069
Epoch 159/500
80/80 [==============================] - 2s 22ms/step - loss: 1.0853 - mae: 1.5052
Epoch 160/500
80/80 [==============================] - 2s 21ms/step - loss: 1.0840 - mae: 1.5038
Epoch 161/500
80/80 [==============================] - 2s 22ms/step - loss: 1.0824 - mae: 1.5023
Epoch 162/500
80/80 [==============================] - 2s 23ms/step - loss: 1.0810 - mae: 1.5007
Epoch 163/500
80/80 [==============================] - 2s 21ms/step - loss: 1.0794 - mae: 1.4991
Epoch 164/500
80/80 [==============================] - 2s 21ms/step - loss: 1.0778 - mae: 1.4976
Epoch 165/500
80/80 [==============================] - 2s 23ms/step - loss: 1.0773 - mae: 1.4970
Epoch 166/500
80/80 [==============================] - 2s 21ms/step - loss: 1.0754 - mae: 1.4951
Epoch 167/500
80/80 [==============================] - 2s 22ms/step - loss: 1.0731 - mae: 1.4927
Epoch 168/500
80/80 [==============================] - 2s 24ms/step - loss: 1.0721 - mae: 1.4916
Epoch 169/500
80/80 [==============================] - 2s 21ms/step - loss: 1.0713 - mae: 1.4908
Epoch 170/500
80/80 [==============================] - 2s 23ms/step - loss: 1.0691 - mae: 1.4885
Epoch 171/500
80/80 [==============================] - 2s 21ms/step - loss: 1.0673 - mae: 1.4866
Epoch 172/500
80/80 [==============================] - 2s 22ms/step - loss: 1.0658 - mae: 1.4851
Epoch 173/500
80/80 [==============================] - 2s 22ms/step - loss: 1.0643 - mae: 1.4836
Epoch 174/500
80/80 [==============================] - 2s 24ms/step - loss: 1.0633 - mae: 1.4825
Epoch 175/500
80/80 [==============================] - 2s 22ms/step - loss: 1.0612 - mae: 1.4803
Epoch 176/500
80/80 [==============================] - 2s 21ms/step - loss: 1.0596 - mae: 1.4787
Epoch 177/500
80/80 [==============================] - 2s 22ms/step - loss: 1.0585 - mae: 1.4776
Epoch 178/500
80/80 [==============================] - 2s 22ms/step - loss: 1.0575 - mae: 1.4765
Epoch 179/500
80/80 [==============================] - 2s 22ms/step - loss: 1.0558 - mae: 1.4747
Epoch 180/500
80/80 [==============================] - 2s 22ms/step - loss: 1.0543 - mae: 1.4732
Epoch 181/500
80/80 [==============================] - 2s 23ms/step - loss: 1.0530 - mae: 1.4719
Epoch 182/500
80/80 [==============================] - 2s 26ms/step - loss: 1.0516 - mae: 1.4703
Epoch 183/500
80/80 [==============================] - 2s 22ms/step - loss: 1.0503 - mae: 1.4690
Epoch 184/500
80/80 [==============================] - 2s 23ms/step - loss: 1.0493 - mae: 1.4681
Epoch 185/500
80/80 [==============================] - 2s 23ms/step - loss: 1.0480 - mae: 1.4666
Epoch 186/500
80/80 [==============================] - 2s 24ms/step - loss: 1.0478 - mae: 1.4665
Epoch 187/500
80/80 [==============================] - 2s 22ms/step - loss: 1.0461 - mae: 1.4647
Epoch 188/500
80/80 [==============================] - 2s 23ms/step - loss: 1.0449 - mae: 1.4634
Epoch 189/500
80/80 [==============================] - 2s 23ms/step - loss: 1.0442 - mae: 1.4627
Epoch 190/500
80/80 [==============================] - 2s 25ms/step - loss: 1.0428 - mae: 1.4612
Epoch 191/500
80/80 [==============================] - 2s 24ms/step - loss: 1.0416 - mae: 1.4600
Epoch 192/500
80/80 [==============================] - 2s 24ms/step - loss: 1.0408 - mae: 1.4592
Epoch 193/500
80/80 [==============================] - 2s 25ms/step - loss: 1.0395 - mae: 1.4579
Epoch 194/500
80/80 [==============================] - 2s 22ms/step - loss: 1.0384 - mae: 1.4566
Epoch 195/500
80/80 [==============================] - 2s 21ms/step - loss: 1.0381 - mae: 1.4564
Epoch 196/500
80/80 [==============================] - 2s 24ms/step - loss: 1.0369 - mae: 1.4549
Epoch 197/500
80/80 [==============================] - 2s 23ms/step - loss: 1.0356 - mae: 1.4536
Epoch 198/500
80/80 [==============================] - 2s 22ms/step - loss: 1.0347 - mae: 1.4527
Epoch 199/500
80/80 [==============================] - 2s 22ms/step - loss: 1.0344 - mae: 1.4524
Epoch 200/500
80/80 [==============================] - 2s 21ms/step - loss: 1.0334 - mae: 1.4513
Epoch 201/500
80/80 [==============================] - 2s 28ms/step - loss: 1.0323 - mae: 1.4502
Epoch 202/500
80/80 [==============================] - 2s 22ms/step - loss: 1.0324 - mae: 1.4503
Epoch 203/500
80/80 [==============================] - 2s 22ms/step - loss: 1.0304 - mae: 1.4482
Epoch 204/500
80/80 [==============================] - 2s 23ms/step - loss: 1.0302 - mae: 1.4481
Epoch 205/500
80/80 [==============================] - 2s 21ms/step - loss: 1.0299 - mae: 1.4476
Epoch 206/500
80/80 [==============================] - 2s 22ms/step - loss: 1.0284 - mae: 1.4460
Epoch 207/500
80/80 [==============================] - 2s 25ms/step - loss: 1.0274 - mae: 1.4450
Epoch 208/500
80/80 [==============================] - 2s 22ms/step - loss: 1.0268 - mae: 1.4443
Epoch 209/500
80/80 [==============================] - 2s 23ms/step - loss: 1.0258 - mae: 1.4434
Epoch 210/500
80/80 [==============================] - 2s 22ms/step - loss: 1.0248 - mae: 1.4423
Epoch 211/500
80/80 [==============================] - 2s 21ms/step - loss: 1.0240 - mae: 1.4415
Epoch 212/500
80/80 [==============================] - 2s 27ms/step - loss: 1.0233 - mae: 1.4408
Epoch 213/500
80/80 [==============================] - 2s 25ms/step - loss: 1.0231 - mae: 1.4405
Epoch 214/500
80/80 [==============================] - 2s 23ms/step - loss: 1.0219 - mae: 1.4393
Epoch 215/500
80/80 [==============================] - 2s 22ms/step - loss: 1.0213 - mae: 1.4386
Epoch 216/500
80/80 [==============================] - 2s 22ms/step - loss: 1.0208 - mae: 1.4381
Epoch 217/500
80/80 [==============================] - 2s 24ms/step - loss: 1.0196 - mae: 1.4369
Epoch 218/500
80/80 [==============================] - 2s 25ms/step - loss: 1.0198 - mae: 1.4370
Epoch 219/500
80/80 [==============================] - 2s 22ms/step - loss: 1.0182 - mae: 1.4353
Epoch 220/500
80/80 [==============================] - 2s 23ms/step - loss: 1.0175 - mae: 1.4345
Epoch 221/500
80/80 [==============================] - 2s 21ms/step - loss: 1.0167 - mae: 1.4337
Epoch 222/500
80/80 [==============================] - 2s 21ms/step - loss: 1.0164 - mae: 1.4334
Epoch 223/500
80/80 [==============================] - 2s 21ms/step - loss: 1.0155 - mae: 1.4324
Epoch 224/500
80/80 [==============================] - 2s 22ms/step - loss: 1.0146 - mae: 1.4315
Epoch 225/500
80/80 [==============================] - 2s 22ms/step - loss: 1.0139 - mae: 1.4308
Epoch 226/500
80/80 [==============================] - 2s 21ms/step - loss: 1.0136 - mae: 1.4304
Epoch 227/500
80/80 [==============================] - 2s 21ms/step - loss: 1.0141 - mae: 1.4310
Epoch 228/500
80/80 [==============================] - 2s 22ms/step - loss: 1.0133 - mae: 1.4301
Epoch 229/500
80/80 [==============================] - 2s 27ms/step - loss: 1.0114 - mae: 1.4282
Epoch 230/500
80/80 [==============================] - 2s 23ms/step - loss: 1.0106 - mae: 1.4273
Epoch 231/500
80/80 [==============================] - 2s 22ms/step - loss: 1.0101 - mae: 1.4268
Epoch 232/500
80/80 [==============================] - 2s 21ms/step - loss: 1.0097 - mae: 1.4265
Epoch 233/500
80/80 [==============================] - 2s 24ms/step - loss: 1.0092 - mae: 1.4258
Epoch 234/500
80/80 [==============================] - 2s 21ms/step - loss: 1.0080 - mae: 1.4246
Epoch 235/500
80/80 [==============================] - 2s 24ms/step - loss: 1.0077 - mae: 1.4242

[... per-epoch output for epochs 236-365 elided: the loss decreases steadily from 1.0068 to 0.9425 and the MAE from 1.4232 to 1.3548 ...]

Epoch 366/500
80/80 [==============================] - 2s 25ms/step - loss: 0.9420 - mae: 1.3544
Epoch 367/500
 8/80 [==>...........................] - ETA: 1s - loss: 1.0191 - mae: 1.4353

(the displayed log was cut off mid-epoch here; the run was configured for 500 epochs)
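The loss is still creeping down slowly over the last couple of hundred epochs. As an optional refinement (my suggestion, not part of the original training call), an EarlyStopping callback could halt the run once the loss plateaus. This is only a minimal sketch: `model` and `train_set` are assumed to be the CNN-LSTM model and the windowed tf.data dataset prepared earlier in this notebook.

early_stop = tf.keras.callbacks.EarlyStopping(
    monitor="loss",             # fit() receives no validation data, so watch the training loss
    patience=20,                # stop after 20 epochs without improvement
    restore_best_weights=True,  # roll back to the best weights seen
)
history = model.fit(train_set, epochs=500, callbacks=[early_stop])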
In [38]:
def model_forecast(model, series, window_size):
    """
    Given a trained model and a series, return the model's predictions
    for every window of length `window_size` in the series.
    """
    ds = tf.data.Dataset.from_tensor_slices(series)
    # Slide a window of length `window_size` forward one step at a time;
    # drop_remainder=True discards the shorter windows at the end
    ds = ds.window(window_size, shift=1, drop_remainder=True)
    # Flatten each nested window dataset into a single tensor per window
    ds = ds.flat_map(lambda w: w.batch(window_size))
    # Batch the windows and prefetch so prediction overlaps data preparation
    ds = ds.batch(32).prefetch(1)
    forecast = model.predict(ds)
    return forecast
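To make the shapes concrete (an illustrative aside of mine, not from the course code): with shift=1 and drop_remainder=True, a series of length N yields N - window_size + 1 windows, so model.predict returns one row per window. A toy check:

# Toy example: a series of length 10 with window_size 3 gives 10 - 3 + 1 = 8 windows
toy = tf.data.Dataset.from_tensor_slices(np.arange(10.0))
toy = toy.window(3, shift=1, drop_remainder=True).flat_map(lambda w: w.batch(3))
print(len(list(toy)))  # prints 8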
In [39]:
# Predict over the full series, then keep only the windows whose last-step
# output falls in the validation period. A window starting at index i predicts
# time step i + window_size, so the forecast for split_time comes from the
# window starting at split_time - window_size; the :-1 drops the final window,
# which would predict one step beyond the end of the series.
rnn_forecast = model_forecast(model, series[..., np.newaxis], window_size)
rnn_forecast = rnn_forecast[split_time - window_size:-1, -1, 0]
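As a quick sanity check (my addition), the sliced forecast should now line up one-to-one with the validation targets:

# Each kept window's last-step output corresponds to exactly one validation point
assert rnn_forecast.shape == x_valid.shape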
In [40]:
# Plot the actual validation series against the CNN-LSTM forecast
plt.figure(figsize=(10, 6))
plot_series(time_valid, x_valid)
plot_series(time_valid, rnn_forecast)
In [41]:
# MAE between the validation series and the forecast
tf.keras.metrics.mean_absolute_error(x_valid, rnn_forecast).numpy()
Out[41]:
1.0756484
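A complementary error measure (my addition, not in the original notebook) is the root-mean-squared error, which penalizes large misses more heavily than the MAE:

# RMSE over the validation period, computed with plain TensorFlow ops
rmse = tf.sqrt(tf.reduce_mean(tf.square(x_valid - rnn_forecast))).numpy()
print(rmse)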
In [42]:
loss = history.history['loss']

epochs = range(len(loss))  # one entry per training epoch


#------------------------------------------------
# Plot the training loss per epoch
#------------------------------------------------
plt.plot(epochs, loss, 'r')
plt.title('Training loss')
plt.xlabel("Epochs")
plt.ylabel("Loss")
plt.legend(["Loss"])

plt.figure()

# Zoom in on the epochs after 200, where the curve flattens out
zoomed_loss = loss[200:]
zoomed_epochs = range(200, len(loss))


#------------------------------------------------
# Plot the zoomed-in training loss per epoch
#------------------------------------------------
plt.plot(zoomed_epochs, zoomed_loss, 'r')
plt.title('Training loss (epochs 200 onward)')
plt.xlabel("Epochs")
plt.ylabel("Loss")
plt.legend(["Loss"])

plt.show()
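Since the model tracks MAE as a metric (it appears in the training log above), the same history object also records it per epoch, and it can be plotted the same way; a small optional addition:

# Optional: plot the training MAE alongside the loss curves above
mae = history.history['mae']
plt.plot(epochs, mae, 'b')
plt.title('Training MAE')
plt.xlabel("Epochs")
plt.ylabel("MAE")
plt.show()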

Conclusion

In this notebook, I have demonstrated the naive forecast, the moving average forecast, and how to build a combined CNN-LSTM model on top of a pre-fetched TensorFlow dataset. The final model reaches a validation MAE of roughly 1.08, as computed above.

If you found this notebook helpful, please upvote!