### Demand forecasting helps you plan for the future. There are a ton of forecasting methods, but below Ryan focuses on a few that can be used on time series data because time series problems are what we specialize in.

**Introduction**

Alright, let’s look at a few forecasting models. Don’t worry, we’re not going to derive proofs for these models because this is a blog post, not a textbook! Instead of building models from scratch, let’s gain some intuition into how they function.

**Triple exponential smoothing**

This method assigns exponentially decreasing weights to observations as they get older, so the most recent observations carry the most weight when forming a forecast. We can break a triple exponential smoothing model into 3 components: level, trend, and seasonal. The most intuitive variant is the additive method: each of your components literally “adds up” to form your forecast:

$$y_{t+h|t} = l_t + hb_t + s_{t-m}$$

Your h-step-ahead forecast is the sum of the 3 components: \(l_t\) is your level component, \(b_t\) is your trend component, multiplied by how many steps ahead you want to forecast, and \(s_{t-m}\) is the seasonal effect, where m is the length of the seasonal period (e.g., 12 for monthly data with a yearly pattern).
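To build some intuition, here's a minimal pure-Python sketch of the additive (Holt-Winters) recursions behind that forecast equation. The function name, the initialization scheme, and the smoothing parameters alpha, beta, and gamma are illustrative choices on our part, not a production implementation:

```python
def holt_winters_additive(y, m, alpha=0.3, beta=0.1, gamma=0.2, h=1):
    """Additive triple exponential smoothing; returns the h-step-ahead forecast.

    y: observed series, m: seasonal period length.
    Initialization and parameter defaults are illustrative assumptions.
    """
    # initial level: mean of the first season
    level = sum(y[:m]) / m
    # initial trend: average per-step change between the first two seasons
    trend = (sum(y[m:2 * m]) - sum(y[:m])) / (m * m)
    # initial seasonal effects: deviation of each point from the first-season mean
    season = [y[i] - level for i in range(m)]

    for t in range(m, len(y)):
        last_level, last_trend = level, trend
        # level: deseasonalized observation blended with the previous level + trend
        level = alpha * (y[t] - season[t - m]) + (1 - alpha) * (last_level + last_trend)
        # trend: latest level change blended with the previous trend
        trend = beta * (level - last_level) + (1 - beta) * last_trend
        # seasonal: detrended observation blended with last season's effect
        season.append(gamma * (y[t] - last_level - last_trend) + (1 - gamma) * season[t - m])

    # forecast = level + h * trend + the seasonal effect matching the target step
    return level + h * trend + season[len(y) - m + (h - 1) % m]
```

Each new observation nudges the level, trend, and seasonal estimates toward the latest data, and the smoothing parameters control how quickly older observations are forgotten.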

**Autoregressive integrated moving average (ARIMA)**

This is an autoregressive model with optional moving average and differencing components. For a simple example, let’s look at a model with p autoregressive terms but no moving average or differencing components, i.e., a pure AR(p) model:

$$y_t = c + \phi_1y_{t-1} + \phi_2y_{t-2} + ... + \phi_py_{t-p} + \epsilon_t$$

Here your forecast variable is dependent on lagged observations weighted by some coefficients that are subject to constraints.

ARIMA models can be a good choice for time series that are non-stationary since they allow for differencing components. However, the model coefficients are constrained in a manner that causes forecasts to converge to the mean of the original time series as the forecast horizon grows. That's not necessarily a bad outcome, but it means long-term forecasts likely won’t be accurate.
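You can see that mean reversion with a small sketch. Here's a hypothetical AR(2) model with assumed coefficients; iterating the forecast equation on its own predictions pulls the forecasts toward the long-run mean \(c / (1 - \phi_1 - \phi_2)\):

```python
def ar2_forecast(history, c, phi1, phi2, steps):
    """Iterate an AR(2) recursion forward, feeding forecasts back in as inputs.

    The coefficients and starting history are illustrative, not fitted values.
    """
    preds = []
    y1, y2 = history[-1], history[-2]  # two most recent observations
    for _ in range(steps):
        nxt = c + phi1 * y1 + phi2 * y2  # the AR(2) forecast equation
        preds.append(nxt)
        y1, y2 = nxt, y1  # shift: the forecast becomes the newest "observation"
    return preds
```

With c = 2, phi1 = 0.5, and phi2 = 0.3, the long-run mean is 2 / (1 - 0.5 - 0.3) = 10: no matter where the series currently sits, multi-step forecasts decay toward 10.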

**Multiple linear regression**

This method is aimed at finding relationships between different variables. Generally, a problem looking to use regression analysis has a set of variables (often called predictors) that each have a specific relationship to a target variable.

Essentially, this is every forecaster's go-to method:

$$y_i = \beta_0 + \beta_1x_{1,i} + \beta_2x_{2,i} + ... + \beta_kx_{k,i} + \epsilon_i$$

Your target variable is a linear combination of predictor variables, which makes it easy to identify how much each predictor adds to the forecast. However, linear regressions don’t perform well on datasets that are highly autocorrelated or when multiple predictor variables are strongly correlated with each other (multicollinearity).
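For intuition, here's a small pure-Python sketch that fits the coefficients \(\beta\) by ordinary least squares via the normal equations \((X^TX)\beta = X^Ty\). The function name and the tiny built-in solver are illustrative; in practice you'd reach for a statistics library:

```python
def fit_linear_regression(X, y):
    """Ordinary least squares: solve (X'X) beta = X'y by Gaussian elimination.

    X: list of predictor tuples, y: list of targets.
    Returns [intercept, beta_1, ..., beta_k]. Illustrative sketch only.
    """
    rows = [[1.0] + list(x) for x in X]  # prepend an intercept column of ones
    k = len(rows[0])
    # build the normal-equation system X'X and X'y
    xtx = [[sum(r[i] * r[j] for r in rows) for j in range(k)] for i in range(k)]
    xty = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(k)]

    # forward elimination with partial pivoting
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(xtx[r][col]))
        xtx[col], xtx[piv] = xtx[piv], xtx[col]
        xty[col], xty[piv] = xty[piv], xty[col]
        for r in range(col + 1, k):
            f = xtx[r][col] / xtx[col][col]
            for c2 in range(col, k):
                xtx[r][c2] -= f * xtx[col][c2]
            xty[r] -= f * xty[col]

    # back substitution
    beta = [0.0] * k
    for i in range(k - 1, -1, -1):
        beta[i] = (xty[i] - sum(xtx[i][j] * beta[j] for j in range(i + 1, k))) / xtx[i][i]
    return beta
```

Because each fitted coefficient multiplies exactly one predictor, reading off a predictor's contribution to the forecast is as simple as looking at its \(\beta\).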