Recurrent Neural Network Models

In this notebook, we show an example of how RNNs can be used with darts. If you are new to darts, we recommend you first follow the quick start notebook.

[1]:
# fix python path if working locally
from utils import fix_pythonpath_if_working_locally

fix_pythonpath_if_working_locally()
[2]:
%load_ext autoreload
%autoreload 2
%matplotlib inline
[3]:
import pandas as pd
import matplotlib.pyplot as plt

from darts import TimeSeries
from darts.dataprocessing.transformers import Scaler
from darts.models import RNNModel, ExponentialSmoothing, BlockRNNModel
from darts.metrics import mape
from darts.utils.statistics import check_seasonality, plot_acf
from darts.datasets import AirPassengersDataset, SunspotsDataset
from darts.utils.timeseries_generation import datetime_attribute_timeseries

import warnings

warnings.filterwarnings("ignore")
import logging

logging.disable(logging.CRITICAL)

Recurrent Models

Darts includes two recurrent forecasting model classes: RNNModel and BlockRNNModel.

RNNModel is fully recurrent in the sense that, at prediction time, each output is computed from these inputs:

  • the previous target value, which is set to the last known target value for the first prediction, and to the previous prediction for every subsequent step

  • the previous hidden state

  • the current covariates (if the model was trained with covariates)

A prediction with forecast horizon n is thus built up over n successive RNNModel steps, and requires the future covariates to be known n time steps in advance. This model is suited for forecasting problems where the target series depends strongly on covariates that are known ahead of time. The prediction loop is sketched below.
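To make this loop concrete, here is a minimal PyTorch sketch of such an autoregressive prediction loop. It is illustrative only, not the actual darts implementation; all sizes and names (cell, head, future_covs, ...) are made up for the example:

import torch
import torch.nn as nn

# Illustrative sketch of RNNModel's prediction loop (not darts internals).
torch.manual_seed(42)
n = 6                                      # forecast horizon
cov_dim = 13                               # e.g. scaled year + 12 one-hot months
cell = nn.LSTMCell(input_size=1 + cov_dim, hidden_size=20)
head = nn.Linear(20, 1)                    # maps hidden state to a forecast

y = torch.zeros(1, 1)                      # seeded with the last known target value
h = c = torch.zeros(1, 20)                 # hidden state after the input chunk
future_covs = torch.zeros(n, 1, cov_dim)   # covariates must be known n steps ahead

preds = []
for t in range(n):
    x = torch.cat([y, future_covs[t]], dim=1)  # previous value + current covariates
    h, c = cell(x, (h, c))                     # one recurrent step
    y = head(h)                                # next prediction, fed back in at t + 1
    preds.append(y)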

BlockRNNModel has a recurrent encoder stage, which encodes its input, and a fully-connected decoder stage, which produces a forecast of length output_chunk_length from the encoder’s last hidden state. Consequently, this model produces ‘blocks’ of forecasts, and can only look at covariates that share the time index of the input target series. A sketch of this encoder-decoder structure follows.
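The encoder-decoder idea can again be sketched in a few lines of PyTorch. As before, this is a simplified illustration under assumed sizes, not the darts implementation:

import torch
import torch.nn as nn

# Illustrative sketch of the BlockRNNModel structure (not darts internals).
torch.manual_seed(42)
input_chunk_length, output_chunk_length = 24, 12
encoder = nn.GRU(input_size=1, hidden_size=10, batch_first=True)
decoder = nn.Linear(10, output_chunk_length)  # last hidden state -> forecast block

x = torch.zeros(1, input_chunk_length, 1)     # window of target (and past covariates)
_, h_last = encoder(x)                        # h_last has shape (1, 1, 10)
block_forecast = decoder(h_last[-1])          # whole block of 12 forecasts at once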

Air Passenger Example

This dataset depends heavily on covariates: knowing the month tells us a lot about the seasonal component, while the year determines the effect of the trend component. Both of these covariates are known in the future, which makes the RNNModel class the preferred choice for this problem.

[4]:
# Read data:
series = AirPassengersDataset().load()

# Create training and validation sets:
train, val = series.split_after(pd.Timestamp("19590101"))

# Normalize the time series (note: we avoid fitting the transformer on the validation set)
transformer = Scaler()
train_transformed = transformer.fit_transform(train)
val_transformed = transformer.transform(val)
series_transformed = transformer.transform(series)

# create month and year covariate series
year_series = datetime_attribute_timeseries(
    pd.date_range(start=series.start_time(), freq=series.freq_str, periods=1000),
    attribute="year",
    one_hot=False,
)
year_series = Scaler().fit_transform(year_series)
month_series = datetime_attribute_timeseries(
    year_series, attribute="month", one_hot=True
)
covariates = year_series.stack(month_series)
cov_train, cov_val = covariates.split_after(pd.Timestamp("19590101"))
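
Note that the covariates are generated over 1000 monthly periods, i.e., well beyond the end of the target series. This lets us pass the same covariates object both at training time and at prediction time; darts slices out the parts it needs in each case.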

Let’s train an LSTM neural net. To use a vanilla RNN or a GRU instead, replace 'LSTM' with 'RNN' or 'GRU', respectively.

[5]:
my_model = RNNModel(
    model="LSTM",
    hidden_dim=20,
    dropout=0,
    batch_size=16,
    n_epochs=300,
    optimizer_kwargs={"lr": 1e-3},
    model_name="Air_RNN",
    log_tensorboard=True,
    random_state=42,
    training_length=20,  # length of the training sequences; should exceed input_chunk_length
    input_chunk_length=14,  # number of past time steps fed to the model
    force_reset=True,
    save_checkpoints=True,
)

In what follows, we can simply provide the whole covariates series as the future_covariates argument to the model; the model will slice these covariates and use only the parts it needs in order to train on forecasting the target train_transformed:

[6]:
my_model.fit(
    train_transformed,
    future_covariates=covariates,
    val_series=val_transformed,
    val_future_covariates=covariates,
    verbose=True,
)
[6]:
<darts.models.forecasting.rnn_model.RNNModel at 0x7f800691edc0>

Look at predictions on the validation set

Use the “current” model, i.e., the model as it stands at the end of the training procedure:

[7]:
def eval_model(model):
    pred_series = model.predict(n=26, future_covariates=covariates)
    plt.figure(figsize=(8, 5))
    series_transformed.plot(label="actual")
    pred_series.plot(label="forecast")
    plt.title("MAPE: {:.2f}%".format(mape(pred_series, val_transformed)))
    plt.legend()


eval_model(my_model)
../_images/examples_04-RNN-examples_12_1.png

Use the best model obtained over training, according to validation loss (this is possible because we set save_checkpoints=True and a model_name when building the model):

[8]:
best_model = RNNModel.load_from_checkpoint(model_name="Air_RNN", best=True)
eval_model(best_model)
../_images/examples_04-RNN-examples_14_1.png

Backtesting

Let’s backtest our RNN model to see how it performs at a forecast horizon of 6 months:

[9]:
backtest_series = my_model.historical_forecasts(
    series_transformed,
    future_covariates=covariates,
    start=pd.Timestamp("19590101"),
    forecast_horizon=6,
    retrain=False,
    verbose=True,
)
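
Here retrain=False means the model trained above is reused as-is at every forecasting point, rather than being refit on the growing history before each forecast.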
[10]:
plt.figure(figsize=(8, 5))
series_transformed.plot(label="actual")
backtest_series.plot(label="backtest")
plt.legend()
plt.title("Backtest, starting Jan 1959, 6-months horizon")
print(
    "MAPE: {:.2f}%".format(
        mape(
            transformer.inverse_transform(series_transformed),
            transformer.inverse_transform(backtest_series),
        )
    )
)
MAPE: 2.71%
../_images/examples_04-RNN-examples_17_1.png

Monthly sunspots

Let’s now try a more challenging time series: the monthly number of sunspots since 1749. First, we build the time series from the dataset and check its periodicity.

[11]:
series_sunspot = SunspotsDataset().load()

series_sunspot.plot()
check_seasonality(series_sunspot, max_lag=240)
[11]:
(True, 125)
../_images/examples_04-RNN-examples_19_1.png
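check_seasonality confirms a significant seasonality with a period of 125 months, just over 10 years, which matches the well-known roughly 11-year solar cycle.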
[12]:
plot_acf(series_sunspot, 125, max_lag=240)  # ~11 years seasonality
../_images/examples_04-RNN-examples_20_0.png
[13]:
train_sp, val_sp = series_sunspot.split_after(pd.Timestamp("19401001"))

transformer_sunspot = Scaler()
train_sp_transformed = transformer_sunspot.fit_transform(train_sp)
val_sp_transformed = transformer_sunspot.transform(val_sp)
series_sp_transformed = transformer_sunspot.transform(series_sunspot)
[14]:
my_model_sun = BlockRNNModel(
    model="GRU",
    input_chunk_length=125,
    output_chunk_length=36,
    hidden_dim=10,
    n_rnn_layers=1,
    batch_size=32,
    n_epochs=100,
    dropout=0.1,
    model_name="sun_GRU",
    nr_epochs_val_period=1,
    optimizer_kwargs={"lr": 1e-3},
    log_tensorboard=True,
    random_state=42,
    force_reset=True,
)

my_model_sun.fit(train_sp_transformed, val_series=val_sp_transformed, verbose=True)
[14]:
<darts.models.forecasting.block_rnn_model.BlockRNNModel at 0x7f80095963d0>

To evaluate our model, we will simulate historical forecasting with a forecast horizon of 3 years (36 months) across the validation set. To speed things up, we compute only every 10th forecast (stride=10) and keep only the last point of each forecast (last_points_only=True). For the sake of comparison, let’s also fit an exponential smoothing model.

[15]:
# Compute the backtest predictions with the two models
pred_series = my_model_sun.historical_forecasts(
    series_sp_transformed,
    start=pd.Timestamp("19401001"),
    forecast_horizon=36,
    stride=10,
    retrain=False,
    last_points_only=True,
    verbose=True,
)

pred_series_ets = ExponentialSmoothing(seasonal_periods=120).historical_forecasts(
    series_sp_transformed,
    start=pd.Timestamp("19401001"),
    forecast_horizon=36,
    stride=10,
    retrain=True,
    last_points_only=True,
    verbose=True,
)
[16]:
val_sp_transformed.plot(label="actual")
pred_series.plot(label="our RNN")
pred_series_ets.plot(label="ETS")
plt.legend()
print("RNN MAPE:", mape(pred_series, val_sp_transformed))
print("ETS MAPE:", mape(pred_series_ets, val_sp_transformed))
RNN MAPE: 73.19010018398568
ETS MAPE: 116.63584309419007
../_images/examples_04-RNN-examples_25_1.png
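
Both MAPE values look large in absolute terms. This is partly an artifact of the metric: the scaled sunspots series contains values close to zero around solar minima, and percentage errors blow up on near-zero actuals. The comparison is still meaningful, though, and the GRU model clearly outperforms the ETS baseline on this series.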