
Introduction to Time Series


by Maria Jesus Alonso  · Jul. 13, 17 · AI Zone · Tutorial


A time series is a sequentially indexed representation of your historical data that can be used to
solve classification and segmentation problems, in addition to forecasting future values of
numerical properties, for example, air pollution level in Madrid for the last two days. This is a
very versatile method often used for predicting stock prices, sales forecasting, website traffic,
production and inventory analysis, or weather forecasting, among many other use cases. Soon,
BigML will have time series as a new resource.

Following our mission of democratizing machine learning and making it easy for everyone, we
will provide new learning material for you to start with time series from scratch and become a
power user over time. We start by publishing a series of six blog posts that will progressively
dive deeper into the technical and practical aspects of time series with an emphasis on time
series models for forecasting. Today’s post sets the tone by explaining the basic time series
concepts. We will follow with an example use case. Then there will be several posts focused on
how to use and interpret time series through the BigML Dashboard, API, WhizzML, and
Bindings to make forecasts for new time horizons. Finally, we will complete this series with a
technical view of how time series models work behind the scenes.

Let’s get started!

There are times when historical data can inform behavior in the near or more distant future.
However, unlike general classification or regression problems, time series require your data to be
organized as a sequence of snapshots of your input fields at various points in time. For
example, the chart below depicts the variation of sales during a given month. Can we forecast
the sales for future days or even months based on this data?

The answer is a resounding yes, since BigML has implemented the exponential smoothing
algorithm to train time series models. In this type of model, the data is modeled as a
combination of exponentially weighted moving averages. Exponential smoothing is not new. It
was proposed in the late 1950s. Some of the most relevant work was Robert Goodell Brown's
in 1956, and later on, the field of study was expanded by Charles C. Holt in 1957, as well as
Peter Winters in 1960. Contrary to other methods, in forecasts produced using exponential
smoothing, past instances are not equally weighted: recent instances are given more weight
than older ones. In other words, the more recent the observation, the higher the associated
weight.
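
To see what this weighting scheme looks like in practice, here is a tiny Python sketch (with an arbitrary α and window size of our own choosing, not BigML's defaults) that prints the weight each past observation receives under exponential smoothing versus an equally weighted moving average:

```python
# Illustration only: how exponential smoothing weights past observations,
# compared with an equally weighted moving average of order m.
alpha, m, n_past = 0.5, 5, 8  # arbitrary values for the example

exp_weights = [alpha * (1 - alpha) ** k for k in range(n_past)]  # most recent first
ma_weights = [1 / m if k < m else 0.0 for k in range(n_past)]    # equal weights, then zero

for k, (w_exp, w_ma) in enumerate(zip(exp_weights, ma_weights)):
    print(f"{k} steps back: exponential={w_exp:.3f}  moving average={w_ma:.2f}")
```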

From Zero to Time Series Forecasts

In BigML, a regular time series workflow is composed of training your data, evaluating it, and
forecasting what will happen in the future. In that way, it is very much like other modeling
methods available in BigML. But what makes time series different?

The Training Data Structure


The instances in the training data need to be sequentially arranged — that is, the first instance
in your dataset will be the first data point in time and the last one will be the most recent one.
In addition, the interval between consecutive instances must be constant.
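
As a rough illustration of these two requirements, the snippet below uses pandas to enforce chronological order and a constant interval; the file name and column names are hypothetical, and resampling to a daily frequency is just one way to regularize the spacing:

```python
import pandas as pd

# Hypothetical file and column names, for illustration only.
df = pd.read_csv("sales.csv", parse_dates=["date"])
df = df.sort_values("date").set_index("date")   # enforce chronological order

# infer_freq returns None when the spacing between instances is not constant.
if pd.infer_freq(df.index) is None:
    # One way to regularize: resample to a daily interval (numeric columns assumed).
    df = df.resample("D").mean()

df.to_csv("sales_regular.csv")  # ready to be used as training data
```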

The Objective Fields


Time series models can only be applied to numeric fields, but a single time series model can
produce forecasts for all the numeric fields in the dataset at once (as opposed to classification
or regression models, which only allow one objective field per model). In other words, a time
series model accepts multiple objective fields, and in fact, you can use all numeric fields in the
input dataset as objectives at once.

The Time Series Models


BigML automatically trains multiple models for you behind the scenes and lists them according
to different criteria, which is a big boost in productivity as compared to hand-tuning all the
different combinations of the underlying configuration options. BigML's exponential smoothing
methodology models time series data as a combination of different components: level, trend,
seasonality, and error (see the Understanding Time Series Models section for more details).

When creating a time series, we have several options regarding whether to model each
component additively or multiplicatively, or whether to include a component at all. To alleviate
this burden, BigML computes in parallel a model for each applicable combination, allowing you
to explore how your time series data fits within the entire family of exponential smoothing
models. Naturally, we need some way to compare the individual models. BigML computes
several different performance measures for each model, allowing you to select the model that
best fits your data and gives the most accurate forecasts. Their parameters and corresponding
formulas are described in depth in the Dashboard and API documentation.
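
To get a feel for what fitting "a model for each applicable combination" involves, here is a hedged sketch using the statsmodels library (a stand-in, not BigML's actual implementation) that fits several trend/seasonality variants on an invented monthly series and ranks them by AIC, one possible comparison criterion:

```python
import numpy as np
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# Invented monthly series with an upward trend and yearly seasonality.
rng = np.random.default_rng(0)
t = np.arange(48)
y = 70 + 0.8 * t + 10 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 2, 48)

scores = {}
for trend in (None, "add", "mul"):
    for seasonal in (None, "add", "mul"):
        kwargs = {"seasonal_periods": 12} if seasonal else {}
        fit = ExponentialSmoothing(y, trend=trend, seasonal=seasonal, **kwargs).fit()
        # AIC: one possible way to trade off goodness of fit against complexity.
        scores[f"trend={trend}, seasonal={seasonal}"] = fit.aic

for name, aic in sorted(scores.items(), key=lambda kv: kv[1]):
    print(f"{name:30s} AIC={aic:8.1f}")
```
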
Forecast
BigML lets you forecast the future values of multiple objective fields in short or long time
horizons with a time series model. You will be able to separately train a different time series
model for each objective field in just a few clicks. Your time series forecasts come with forecast
intervals: a range of values within which the forecast is contained with a 95% probability.
Generally, these intervals grow as the forecast period increases, since there is more uncertainty
when the predicted time horizon is further away.
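
As an illustration of why intervals widen with the horizon, the sketch below computes approximate 95% intervals for plain simple exponential smoothing under an additive-error assumption (a textbook formula, not necessarily the exact computation BigML performs); the data and α are made up:

```python
import math

def ses_forecast_intervals(values, alpha=0.5, horizon=10):
    """Point forecasts and approximate 95% intervals for simple exponential
    smoothing with additive errors; the interval widens as the horizon grows."""
    level = values[0]
    errors = []
    for x in values[1:]:
        errors.append(x - level)               # one-step-ahead forecast error
        level = alpha * x + (1 - alpha) * level
    sigma2 = sum(e * e for e in errors) / len(errors)

    out = []
    for h in range(1, horizon + 1):
        # Textbook forecast variance for additive-error SES: sigma^2 * (1 + (h-1) * alpha^2)
        half_width = 1.96 * math.sqrt(sigma2 * (1 + (h - 1) * alpha ** 2))
        out.append((level, level - half_width, level + half_width))
    return out

series = [30, 32, 35, 31, 36, 38, 37, 40, 42, 41]  # made-up data
for h, (point, low, high) in enumerate(ses_forecast_intervals(series), start=1):
    print(f"h={h:2d}  forecast={point:.1f}  95% interval=({low:.1f}, {high:.1f})")
```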

Evaluation
Evaluations for time series models differ from supervised learning evaluations in that the
training-test splits must be sequential. For an 80-20 split, the test set is the final 20% of rows in
the dataset. Forecasts are then generated from the time series model with a horizon equal to the
length of the test set. BigML computes the error between the test dataset values and forecast
values and represents the evaluation performance in the form of several metrics: Mean
Absolute Error (MAE), Mean Squared Error (MSE), R Squared, Symmetric Mean Absolute
Percentage Error (sMAPE), Mean Absolute Scaled Error (MASE), and Mean Directional
Accuracy (MDA). These metrics are fully covered in the Dashboard documentation (available
on July 20, 2017). Every exponential smoothing model type contained by a BigML time series
model is automatically evaluated in parallel, so the end result is a comprehensive overview of
all models’ performance.
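
The sketch below mimics this evaluation scheme on a plain Python list: a sequential 80-20 split, a placeholder forecast over the test horizon, and two of the metrics listed above (MAE and sMAPE). It illustrates the procedure only; BigML computes these metrics for you.

```python
def evaluate_sequential(series, train_fraction=0.8):
    """Sequential train/test split with MAE and sMAPE on the held-out tail."""
    split = int(len(series) * train_fraction)
    train, test = series[:split], series[split:]

    # Placeholder model: repeat the last training value over the whole horizon.
    # A real time series model would generate one forecast per test point.
    forecasts = [train[-1]] * len(test)

    mae = sum(abs(a - f) for a, f in zip(test, forecasts)) / len(test)
    smape = 100 * sum(
        2 * abs(a - f) / (abs(a) + abs(f)) for a, f in zip(test, forecasts)
    ) / len(test)
    return mae, smape

monthly_sales = [112, 118, 132, 129, 121, 135, 148, 148, 136, 119, 104, 118,
                 115, 126, 141, 135, 125, 149, 170, 170, 158, 133, 114, 140]  # made up
mae, smape = evaluate_sequential(monthly_sales)
print(f"MAE={mae:.1f}  sMAPE={smape:.1f}%")
```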

Understanding Time Series Models


As mentioned in our description above, time series models are characterized by these four
components:

1. Level
Exponential smoothing, as the name suggests, reduces noise variation in a time series' value,
resulting in a gentler, smoother curve. To understand how it works, let us first consider a
simple moving average filter: for a given order m, the filtered value at time t is simply the
arithmetic mean of the m preceding values, or in other words, an equally-weighted sum of the
past m values. In exponential smoothing, the filtered value is a weighted sum of all the
preceding values, with the weights being highest for the most recent values, and decreasing
exponentially as we move towards the past. The rate of this decrease is controlled by a single
smoothing coefficient α, where higher values mean the filter is more responsive to localized
changes. In the following figure, we compare the results of filtering stock price data with
moving average and exponential smoothing.

Both smoothing techniques attenuate the sharp peaks found in the underlying time series data.
The filtered values from exponential smoothing are what we call the level component of the time
series model. Because the behavior of exponential smoothing is governed by a continuous
parameter α, rather than an integer m, the number of possible solutions is infinitely greater than
for the moving average filter, and it is possible to achieve a superior fit to the data by performing
parameter optimization. Moreover, the exponential smoothing procedure for the level
component may be analogously applied to the remaining time series components: trend and
seasonality.
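
To make the contrast concrete, this short sketch filters the same made-up price series with both an order-m moving average and exponential smoothing, mirroring the comparison described above:

```python
def moving_average(values, m=3):
    """Equally weighted mean of the m most recent values (order-m filter)."""
    filtered = []
    for i in range(len(values)):
        window = values[max(0, i - m + 1): i + 1]
        filtered.append(sum(window) / len(window))
    return filtered

def exponential_filter(values, alpha=0.4):
    """Weighted sum of all past values, with exponentially decaying weights."""
    level = [values[0]]
    for x in values[1:]:
        level.append(alpha * x + (1 - alpha) * level[-1])
    return level

prices = [10, 12, 11, 15, 14, 13, 18, 17, 19, 16]  # made-up stock prices
print("moving average:", [round(v, 2) for v in moving_average(prices)])
print("exp. smoothing:", [round(v, 2) for v in exponential_filter(prices)])
```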

2. Trend
While the level component represents the localized average value of a time series, the
trend component of a time series represents the long-term trajectory of its value. We represent
trend as either the difference between consecutive level values (additive trend, linear
trajectory), or the ratio between them (multiplicative trend, exponential trajectory). As with the
level component, the sequence of local trend values is considered to be a noisy series which we
can again smooth in an exponential fashion. The following series exhibits a pronounced
additive (linear) trend.
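
A minimal sketch of additive (Holt's linear) trend smoothing, with invented data and arbitrary smoothing coefficients, shows how the level and trend estimates are updated together and then combined to forecast:

```python
def holt_linear(values, alpha=0.5, beta=0.3, horizon=5):
    """Exponential smoothing with an additive trend (Holt's linear method)."""
    level, trend = values[0], values[1] - values[0]   # simple initialization
    for x in values[1:]:
        prev_level = level
        level = alpha * x + (1 - alpha) * (prev_level + trend)    # smooth the level
        trend = beta * (level - prev_level) + (1 - beta) * trend  # smooth the trend
    # h-step forecast: last level plus h times the last trend estimate
    return [level + h * trend for h in range(1, horizon + 1)]

upward_series = [3, 5, 6, 9, 10, 13, 14, 17, 19, 20]  # made-up, roughly linear growth
print(holt_linear(upward_series))
```
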
3. Seasonality
The seasonality component of a time series represents any variation in its value which follows a
consistent pattern over consecutive, fixed-length intervals. For example, sales of alcohol may be
consistently higher during summer months and lower during winter months year after year.
This variation may be modeled as a relatively constant amount independent of the time series’
level (additive seasonality), or as a relatively constant proportion of the level (multiplicative
seasonality). The following series is an example of multiplicative seasonality on a yearly cycle.
Note that the magnitude of variation is larger when the level of the series is higher.
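
The snippet below builds two synthetic series from the same rising level and yearly pattern, one with a fixed additive seasonal swing and one with a multiplicative swing that grows with the level; the numbers are invented purely for illustration:

```python
import math

months = range(36)  # three invented years of monthly data
level = [100 + 5 * t for t in months]                              # steadily rising level
pattern = [math.sin(2 * math.pi * (t % 12) / 12) for t in months]  # yearly cycle

additive = [l + 20 * p for l, p in zip(level, pattern)]               # fixed-size swings
multiplicative = [l * (1 + 0.2 * p) for l, p in zip(level, pattern)]  # swings scale with level

# Peak seasonal deviation from the level in the first and third year
for name, series in (("additive", additive), ("multiplicative", multiplicative)):
    deviation = [abs(s - l) for s, l in zip(series, level)]
    print(f"{name:14s} max swing, year 1: {max(deviation[:12]):.1f}   year 3: {max(deviation[24:]):.1f}")
```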

4. Error
After accounting for the level, trend, and seasonality components, there remains some
variation not yet accounted for by the model. Like seasonality, error may be modeled as an
additive process (independent of the series level), or a multiplicative process (proportional to
the series level). Parameterizing the error component is important for computing confidence
bounds for time series forecasts.

In Summary
To wrap up, BigML’s time series models:

Help solve use cases such as predicting stock prices, sales forecasting, website traffic,
production and inventory analysis, as well as weather forecasting, among many other use
cases.
Are used to characterize the properties of ordered sequences, and to forecast their
future behavior.
Implement exponential smoothing, where the data is modeled as a combination of
exponentially-weighted moving averages. That is, the recent instances are given more
weight than older instances.
Train the data with a different split compared to other modeling methods. The split needs
to be sequential rather than random, so you can test your model against the latter
period in your dataset.
Are trained with a level component and can include other components such as trend,
damped trend, seasonality, or error.
Let you forecast one or multiple objective fields. For multiple objectives, you can train
a separate set of time series models for each objective field.

You can create, interpret, and evaluate time series models, as well as forecast short and
longer horizons with them, via the BigML Dashboard, our API, WhizzML, and bindings (to
automate your time series workflows). All of these options will be showcased in future blog
posts.


Topics: AI , TIME SERIES , TUTORIAL , MACHINE LEARNING , DATA ANALYTICS

Published at DZone with permission of Maria Jesus Alonso , DZone MVB. See the original article here. 
Opinions expressed by DZone contributors are their own.
