Implement a Transformer-Based Time Series Predictor

Bob Chesebrough, senior AI solution architect, and Jack Erickson, principal developer marketing manager, Intel Corporation

Several of our clients have recently focused on time series analysis, which was a great excuse to explore both current and older methods of time series prediction and clustering. We divided this experimentation into two articles: this one on time series prediction, and another in this issue of The Parallel Universe magazine on a time series clustering trick that combines principal component analysis (PCA) with density-based spatial clustering of applications with noise (DBSCAN).

A Google* search of the Hugging Face* community turns up a model called Chronos for this task. Chronos is a family of time series forecasting models that applies the transformer architecture language models use to predict the next token. Chronos models have been pretrained on a large corpus of public time series data, and you have the option to fine-tune them on your own data. We use the chronos-bolt-small model through the AutoGluon library, whose website describes it concisely:

“Via a simple fit() call, AutoGluon can train and tune simple forecasting models (e.g., ARIMA, ETS, Theta), powerful deep learning models (e.g., DeepAR, Temporal Fusion Transformer), tree-based models (e.g., LightGBM), and an ensemble that combines predictions of other models to produce multistep ahead probabilistic forecasts for univariate time series data.”

Intel® Tiber™ AI Cloud

This example uses a free account on Intel® Tiber™ AI Cloud, which provides a sandbox for trying out the latest AI hardware and software from Intel, and runs on an Intel® Xeon® Scalable processor. The following steps get an initial project working. If you do not already have an Intel Tiber AI Cloud account, you can enroll for free.

 

Set Up the Initial Project

After creating your account, perform the following steps:

  1. Get started with a free learning account by going to Get Started.
  2. Select Connect Now, and then choose any device type to launch JupyterLab.
    Note This example only uses the CPU.
  3. The JupyterLab interface opens in your browser.
  4. Create a folder named Unsupervised, and open it before the next step.
  5. The code for the project can be found in the ChronosTimeSeriesPredictor GitHub* repository.
    Launch a terminal session: select the plus button at the top left, and then select Terminal.
    This opens a browser tab with a Bash shell terminal.
  6. Clone the ChronosTimeSeriesPredictor repository into the Unsupervised directory, as shown in the example command after this list.
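
A minimal sketch of the clone step, assuming the terminal was opened in the Unsupervised folder; the repository URL below is a placeholder, so substitute the actual URL from the GitHub link above:

# Placeholder URL: replace <owner> with the actual GitHub account hosting the repository
git clone https://github.com/<owner>/ChronosTimeSeriesPredictor.git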


     

Create a Virtual Environment and Jupyter* Kernel

Run the following commands in the terminal to create a dedicated Python* virtual environment, install the necessary libraries into it, and create a Jupyter* kernel associated with the virtual environment:

python -m venv venv_timeseries
source venv_timeseries/bin/activate
pip install -q torch torchvision --index-url https://download.pytorch.org/whl/cpu
pip install -q autogluon ipywidgets ipykernel
python -m ipykernel install --user --name timeseries

It takes several minutes to install these packages. When you open the notebook, use the timeseries kernel to run the code.

Open the Notebook for the Analysis

Open the ChronosTimeSeriesPredictor folder, and then open the TransformerTimeSeriesUsingChronos.ipynb notebook.
 

Once the notebook is open, change to the newly created kernel by selecting Kernel > Change Kernel, and then selecting timeseries in the drop-down menu.

Import the Libraries

Run the first code cell to import the needed modules from the Python libraries that were installed.

from autogluon.timeseries import TimeSeriesDataFrame, TimeSeriesPredictor
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd
import torch
from tqdm.auto import tqdm

Retrieve and Explore the Data

The sample uses time series data that was generated for another project, Toronto Hydro Transformer Monitors, which is based on electrical grid data from Toronto Hydro.

Notice the regular daily periodicity and the somewhat irregular electric grid values, which stay bounded within a statistical envelope for each day. The data was generated by tracking a mean and standard deviation for every 10-minute interval of the day across multiple days. Each time slice was modeled as a Gaussian distribution with its own mean and standard deviation, which was then sampled to generate an arbitrary amount of synthetic data:

import pandas as pd

df = pd.read_csv("anomalySeries.csv")
plt.plot(df.target[-500:])
plt.grid()
plt.show()
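
As an aside, here is a minimal sketch of how a Gaussian-envelope series like this could be generated. It is an illustrative reconstruction, not the original generation script, and the daily profile and spread values below are made up:

# Illustrative only: sample one value per 10-minute slot from a per-slot Gaussian envelope
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
slots_per_day = 24 * 6  # number of 10-minute intervals in a day
slot_means = 50 + 10 * np.sin(np.linspace(0, 2 * np.pi, slots_per_day))  # hypothetical daily profile
slot_stds = np.full(slots_per_day, 2.0)  # hypothetical per-slot spread
num_days = 30
synthetic = rng.normal(loc=np.tile(slot_means, num_days), scale=np.tile(slot_stds, num_days))
plt.plot(synthetic[-500:])
plt.show()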

Load the .csv file in time series format:

# Put into TimeSeries format
data = TimeSeriesDataFrame("anomalySeries.csv")
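
TimeSeriesDataFrame expects long-format data; by convention the file contains item_id, timestamp, and target columns (the plotting code later in the notebook assumes the item_id value is "Series"). If a CSV uses different column names, a minimal sketch of the explicit construction, with hypothetical column names:

# Hypothetical alternative: build the TimeSeriesDataFrame from a pandas DataFrame
# with explicitly named ID and timestamp columns (column names are assumptions)
df = pd.read_csv("anomalySeries.csv")
data = TimeSeriesDataFrame.from_data_frame(
    df,
    id_column="item_id",
    timestamp_column="timestamp",
)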

Create a Zero-Shot Forecaster

We set up the forecast to run in zero-shot mode, using the model as-is with our data. The model has already been trained on a large amount of time series data, and we don't have enough training data to meaningfully customize it with fine-tuning. Instead, we use the training subset of the data as input to the predictor, which has a forecasting horizon of 24 time steps.

Note AutoGluon has the ability to perform model selection, but we direct it to use the bolt_small preset.

prediction_length = 24
train_data, test_data = data.train_test_split(prediction_length)
predictor = TimeSeriesPredictor(prediction_length=prediction_length).fit(
    train_data,
    presets="bolt_small",
)

 

Forecast the Test Data

Finally, use the test subset to measure how well the forecaster generalizes to unseen data. 

predictions = predictor.predict(test_data)

# Turn off matplotlib interactive mode
plt.ioff()

history_length = data.shape[0]
predictor.plot(
    data=data,
    predictions=predictions,
    item_ids=["Series"],
    max_history_length=history_length,
)

The plot output shows that the trained model does a good job of predicting the periodicity.
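
To complement the visual check with a quantitative score, you can also ask AutoGluon to evaluate the forecaster on the held-out window; a minimal sketch (AutoGluon typically reports error metrics such as WQL in a negated, higher-is-better form):

# Score the zero-shot forecaster on the final prediction_length steps of the series
scores = predictor.evaluate(test_data)
print(scores)

# List the fitted models (here, only the Chronos-Bolt model) with their scores
print(predictor.leaderboard(test_data))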

Summary

You can see how streamlined it is to obtain a free account on the Intel Tiber AI Cloud and use the Chronos model to forecast time series data. We encourage you to give it a try.