
Nexosis @ Work & Play

Forecasting With Exponential Smoothing, ARIMA and Regression

April 11, 2017 @ 4:35 PM | Technical

Use these methods for demand forecasting

Demand forecasting helps you plan for the future. There are many forecasting methods out there, but below Ryan focuses on a few suited to time series data, because time series problems are what we specialize in.


Alright, let’s look at a few forecasting models. Don’t worry, we’re not going to derive proofs for these models because this is a blog post, not a textbook! Instead of building models from scratch, let’s gain some intuition into how they function.

Triple exponential smoothing

This method assigns exponentially decreasing weights to observations as they get older, so the most recent observations carry the most influence over the forecast. We can break a triple exponential smoothing model into three components: level, trend, and seasonal. The most intuitive variant is the additive method, where the three components literally add up to form your forecast:

$$y_{t+h|t} = l_t + hb_t + s_{t-m}$$

Your h-step-ahead forecast is the sum of the three components: \(l_t\) is the level, \(hb_t\) is the trend \(b_t\) multiplied by how far ahead you want to forecast, and \(s_{t-m}\) is the seasonal effect taken from the matching season in the last observed cycle (where \(m\) is the length of a season).
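To make the components concrete, here is a minimal sketch of additive triple exponential smoothing (Holt-Winters) in plain Python. The smoothing constants, the initialization scheme, and the toy series in the usage note are illustrative assumptions, not anything prescribed by the post.

```python
# Minimal additive Holt-Winters (triple exponential smoothing) sketch.
# alpha, beta, gamma are assumed smoothing constants, chosen for illustration.

def holt_winters_additive(y, m, alpha=0.5, beta=0.5, gamma=0.5, h=4):
    """Fit level/trend/seasonal components and return an h-step forecast."""
    n = len(y)
    # Initialize: level = mean of the first season, trend = per-step change
    # between the first two seasonal means, seasonals = deviations from level.
    l = sum(y[:m]) / m
    b = (sum(y[m:2 * m]) / m - l) / m
    s = [y[i] - l for i in range(m)]  # s[t] = seasonal estimate at time t

    for t in range(m, n):
        l_prev, b_prev = l, b
        l = alpha * (y[t] - s[t - m]) + (1 - alpha) * (l_prev + b_prev)
        b = beta * (l - l_prev) + (1 - beta) * b_prev
        s.append(gamma * (y[t] - l_prev - b_prev) + (1 - gamma) * s[t - m])

    # Forecast: level + h * trend + the seasonal from the matching season
    # of the last observed cycle (the s_{t-m} term in the formula).
    return [l + k * b + s[n + k - 1 - m * ((k + m - 1) // m)]
            for k in range(1, h + 1)]
```

For example, on a series with a linear trend plus a repeating length-4 seasonal pattern, the forecast tracks both the upward drift and the seasonal swing.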

Autoregressive integrated moving average (ARIMA)

This is an autoregressive model with optional moving average terms and differencing components. For a simple example, let’s just look at an ARIMA model with p previous terms but without moving average or differencing components:

$$y_t = c + \phi_1y_{t-1} + \phi_2y_{t-2} + ... + \phi_py_{t-p} + \epsilon_t$$

Here your forecast variable depends on the previous \(p\) observations, each weighted by a coefficient \(\phi_i\); the coefficients are subject to constraints that keep the process stationary.

ARIMA models can be a good choice for non-stationary time series since they allow for differencing components. However, the model coefficients are constrained in a manner that causes forecasts to converge to the original time series mean as the forecast horizon grows. That's not necessarily a bad outcome, but long-term forecasts likely won't be accurate.
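To see both the model form and the mean-reversion behavior, here is a rough sketch of the AR piece of an ARIMA model (no differencing, no moving-average terms), fit by ordinary least squares on a synthetic series. The coefficient values and noise level are illustrative assumptions.

```python
# AR(2) sketch: y_t = c + phi1*y_{t-1} + phi2*y_{t-2} + eps_t,
# fit by least squares on simulated data (assumed parameters below).
import numpy as np

rng = np.random.default_rng(42)
c, phi1, phi2 = 5.0, 0.6, 0.2

# Simulate a stationary AR(2) series, starting near its long-run mean.
y = np.zeros(600)
y[:2] = c / (1 - phi1 - phi2)
for t in range(2, len(y)):
    y[t] = c + phi1 * y[t - 1] + phi2 * y[t - 2] + rng.normal(0, 1.0)

# Lagged design matrix: columns are [1, y_{t-1}, y_{t-2}].
X = np.column_stack([np.ones(len(y) - 2), y[1:-1], y[:-2]])
coef, *_ = np.linalg.lstsq(X, y[2:], rcond=None)

# Iterating the fitted equation forward with zero future noise shows the
# long-horizon forecast reverting to the mean c / (1 - phi1 - phi2).
hist = list(y[-2:])
for _ in range(200):
    hist.append(coef[0] + coef[1] * hist[-1] + coef[2] * hist[-2])
```

The last lines illustrate the convergence noted above: as the horizon grows, the recursive forecast settles at the series mean rather than continuing any recent trend.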

Multiple linear regression

This method is aimed at finding relationships between different variables. Generally, a problem suited to regression analysis has a set of variables (often called predictors), each with a specific relationship to a target variable.

Essentially, this is every forecaster's go-to method:

$$y_i = \beta_0 + \beta_1x_{1,i} + \beta_2x_{2,i} + ... + \beta_kx_{k,i} + \epsilon_i$$

Your target variable depends on a linear combination of predictor variables, which makes it easy to see how much each predictor contributes to the forecast. However, linear regression performs poorly on data with strong autocorrelation, or when the predictor variables are too correlated with one another (multicollinearity).
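The interpretability point above can be sketched with a least-squares fit: each fitted coefficient is that predictor's contribution to the forecast, holding the others fixed. The predictor names and true coefficients below are made up for illustration.

```python
# Minimal multiple linear regression sketch via least squares.
# The two predictors and their coefficients are assumed, for illustration.
import numpy as np

rng = np.random.default_rng(0)
n = 400

# Two hypothetical predictors (e.g. price and advertising spend).
x1 = rng.uniform(0, 10, n)
x2 = rng.uniform(0, 5, n)

# Target built from the regression equation plus noise:
# y = beta0 + beta1*x1 + beta2*x2 + eps, with assumed betas.
y = 3.0 + 1.5 * x1 - 2.0 * x2 + rng.normal(0, 0.5, n)

# Solve for [beta0, beta1, beta2]; each coefficient shows how much its
# predictor adds to (or subtracts from) the forecast per unit change.
X = np.column_stack([np.ones(n), x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
```

Reading off the fitted `beta` values recovers the per-predictor effects, which is exactly the interpretability that makes regression a go-to method.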

Like I said, these are just a few forecasting methods. Stay tuned for a second blog post that discusses more complex methods. Our API uses forecasting algorithms so you can solve business problems with time series data. Discover all that our API can do.


Ryan West

Ryan is one of our machine learning engineers. In addition to being the unofficial face of Nexosis, he spends his days building and testing models, fine-tuning algorithms, and generally being a nice guy.