3. Stationary Increments
Logarithms of many economic time series that appear to display stochastic growth can be modeled as having stationary increments. Multivariate versions of these models possess stochastic process versions of balanced growth paths. Applied econometricians seek permanent shocks that contribute to such growth. Furthermore, we shall see that it is convenient to pose central limit theory in terms of processes with stationary increments. The mathematical formulation in this chapter opens the door to studying these topics.
3.1. Basic setup
We adopt assumptions from Inventing an Infinite Past that allow an infinite past: again let \( {\mathfrak A}\) be a subsigma algebra of \({\mathfrak F}\), and construct the sigma algebras \(\{ {\mathfrak A}_t \}\) from \({\mathfrak A}\) as in that chapter.
Let \(X\) be a scalar measurement function. Assume that \(Y_0\) is \({\mathfrak A}_0\) measurable and consider a scalar process \(\{Y_t : t=0,1,... \}\) with stationary increments \(\{X_t\}\):
\[
Y_{t+1} = Y_t + X_{t+1} \tag{3.1}
\]
for \(t=0,1, \ldots\). Let
\[
U_{t+1} = X_{t+1} - E\left(X_{t+1} \vert {\mathfrak A}_t \right)
\]
and
\[
\nu = E\left(X_{t+1} \vert {\mathfrak I} \right) .
\]
We can interpret the above equations as isolating two contributions to the \(\{Y_{t}: t \ge 0\}\) process. Component \(U_{t+1}\) is unpredictable and represents new information about \(Y_{t+1}\) that arrives at date \(t+1\). Component \(\nu\) is the trend rate of growth or decay in \(\{Y_{t} : t \ge 0\}\) conditioned on the invariant events. In the following sections, we present a full decomposition of a stationary increment process that is useful both for distinguishing permanent from transitory shocks and for establishing central limit theorems.
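For concreteness, here is a minimal simulation sketch of a stationary increment process. It assumes, purely for illustration, that the increment is a Gaussian AR(1) plus a constant drift, so that \(U_{t+1}\) is the AR(1) innovation and \(\nu\) is the drift; the parameter values rho, sigma, and nu are illustrative and not from the text.

```python
import numpy as np

# Minimal sketch (illustrative parameters): a stationary increments process
#   Y_{t+1} = Y_t + X_{t+1},  X_{t+1} = nu + Z_{t+1},  Z_{t+1} = rho * Z_t + U_{t+1},
# so U_{t+1} is the unpredictable part of X_{t+1} and nu is the trend rate.
rng = np.random.default_rng(0)
T, rho, sigma, nu = 500, 0.8, 1.0, 0.1

U = sigma * rng.standard_normal(T + 1)   # new information arriving each period
Z = np.zeros(T + 1)
for t in range(T):
    Z[t + 1] = rho * Z[t] + U[t + 1]

X = nu + Z[1:]                                   # stationary increments X_1, ..., X_T
Y = np.concatenate(([0.0], np.cumsum(X)))        # Y_0 = 0 and Y_{t+1} = Y_t + X_{t+1}
print(Y[-1], T * nu)                             # Y_T grows at roughly the trend rate nu
```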
3.2. A martingale decomposition
A special class of stationary increment processes called additive martingales interests us.
The process \(\{Y_t^m : t=0,1,... \}\) is said to be an additive martingale relative to \(\{ {\mathfrak A}_{t} : t=0,1,... \}\) if for \(t=0,1,... \)
\(Y_t^m\) is \({\mathfrak A}_{t}\) measurable, and
\(E\left(Y_{t+1}^m \vert {\mathfrak A}_t \right) = Y_t^m\).
Notice that by the Law of Iterated Expectations, for a martingale \(\{Y_{t}^m : t \ge 0\}\), best forecasts satisfy
\[
E\left(Y_{t+j}^m \vert {\mathfrak A}_t \right) = Y_t^m
\]
for \(j \ge 1\). Under suitable additional restrictions on the increment process \(\{X_t : t \ge 0 \}\), we can deploy a construction of Gordin [1969] to extract a martingale component \(\{Y_t^m : t=0,1, ... \}\) from the \(\{Y_t : t \ge 0\}\) process.[1] Let \({\mathcal H}\) denote the set of all scalar random variables \(X\) such that \(E(X^2) < \infty\) and such that[2]
is well defined as a mean-square convergent series. Convergence of the infinite sum on the right side limits the temporal dependence of the process \(\{ X_t \}\). For example, it rules out so-called long-memory processes.[3]
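To get a feel for membership in \({\mathcal H}\), take the illustrative AR(1) increment from the sketch above with the trend removed, so that \(E(X_{t+j} \vert {\mathfrak A}_t) = \rho^j Z_t\). The forecasts decline geometrically, the infinite sum converges whenever \(|\rho| < 1\), and it equals \(Z_t/(1-\rho)\); the check below simply verifies this closed form for an illustrative value of \(Z_t\).

```python
import numpy as np

# For the illustrative demeaned AR(1) increment, E(X_{t+j} | A_t) = rho**j * Z_t,
# so the sum of forecasts converges to Z_t / (1 - rho) when |rho| < 1.
rho, z_t = 0.8, 1.3                                   # illustrative values
partial_sums = np.cumsum([rho**j * z_t for j in range(200)])
print(partial_sums[-1], z_t / (1 - rho))              # truncated sum vs. closed form
```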
Construct the one-period ahead forecast of \(H_{t+1}\):
Notice that
where
Since \(G_t\) is a forecast error,
Assembling these parts, we have
Let
Since \(Y_t^m\) is \({\mathfrak A}_{t}\) measurable, the equality
implies that the process \(\{Y_t^m : t \ge 0 \}\) is an additive martingale.
For a given stationary increment process, \(\{Y_t : t \ge 0\}\), express the martingale increment as
So the increment to the martingale component of \(\{Y_t : t \ge 0 \}\) is new information about the limiting optimal forecast of \(Y_{t+j}\) as \(j \rightarrow + \infty\).
By accumulating equation (3.3) forward, we arrive at:
Proposition 3.1
If \(X\) is in \({\mathcal H}\), the stationary increments process \(\{Y_t : t=0,1,...\}\) satisfies the additive decomposition
\[
Y_t = t \nu + Y_t^m - H_t^+ + \left( H_0^+ + Y_0 \right) .
\]
The stationary increment process \(\{Y_{t}^m : t\ge 0 \}\) is the martingale component with \(Y_0^m = 0\). The component \(\{H_{t}^+\}\) is stationary. The other components are constant over time.
Proposition 3.1 decomposes a stationary-increment process into a linear time trend, a martingale, and a transitory component. A permanent shock is the increment to the martingale. The martingale and transitory contributions are typically correlated.
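The following numerical sketch checks this decomposition for the illustrative AR(1) increment used earlier. For that increment, \(H_t^+ = \rho Z_t/(1-\rho)\) and the martingale increment is \(U_{t+1}/(1-\rho)\); these closed forms are features of the AR(1) illustration, not of the general proposition.

```python
import numpy as np

# Check Y_t = t*nu + Y_t^m - H_t^+ + (H_0^+ + Y_0) for the AR(1) illustration
# X_{t+1} = nu + Z_{t+1}, Z_{t+1} = rho*Z_t + U_{t+1}, with Z_0 = 0.
rng = np.random.default_rng(1)
T, rho, nu, Y0 = 200, 0.8, 0.1, 2.0

U = rng.standard_normal(T + 1)
Z = np.zeros(T + 1)
for t in range(T):
    Z[t + 1] = rho * Z[t] + U[t + 1]

X = nu + Z[1:]                                       # increments X_1, ..., X_T
Y = Y0 + np.concatenate(([0.0], np.cumsum(X)))       # Y_0, Y_1, ..., Y_T

Ym = np.concatenate(([0.0], np.cumsum(U[1:]) / (1 - rho)))   # martingale component, Y_0^m = 0
H_plus = rho * Z / (1 - rho)                                  # H_t^+ for t = 0, ..., T
reconstructed = nu * np.arange(T + 1) + Ym - H_plus + (H_plus[0] + Y0)

print(np.max(np.abs(Y - reconstructed)))             # numerically zero: the decomposition holds
```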
Example 3.1 (Moving-average increment process)
Consider again the Example 1.7 moving-average process:
Use this \(\{X_t\}\) process as the increment for \(\{ Y_t : t \ge 0 \}\) in formula (3.1). New information about the unpredictable component of \(X_{t+j}\) for \(j \ge 0\) that arrives at date \(t\) is
Summing these terms over \(j\) gives
where
provided that the coefficient sequence \(\{ \alpha_j : j\ge 0\}\) is summable, a condition that restricts the temporal dependence of the increment process \(\{X_t\}\). Indeed, it is possible for \(\alpha(1)\) to be infinite or not well defined while
\[
\sum_{j=0}^\infty (\alpha_j)^2 < \infty , \tag{3.6}
\]
ensuring that \(X_t\) is well defined. This possibility opened the door to the literature on long-memory processes that allow \(\alpha(1)\) to be infinite, as discussed in Granger and Joyeux [1980] and elsewhere.
In what follows, we presume that \(\alpha(1)\) is finite. This sum of the coefficients \(\{\alpha_j: j\ge 0 \}\) in moving-average representation (3.5) for the first difference \(Y_{t+1} - Y_t = X_{t+1}\) of \(\{ Y_t : t=0,1,\ldots \}\) measures the permanent effect of \(W_{t+1}\) on current and future values of the level of \(Y\), i.e., its effect on \(\lim_{j\rightarrow + \infty} Y_{t+j}\). Models of Blanchard and Quah [1989] and Shapiro and Watson [1988] build on this property.
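A quick numerical illustration of this point: for an illustrative square-summable coefficient sequence with \(\alpha_j = (0.5)^j\), the impulse response of the level of \(Y\) to a unit shock \(W_{t+1}\) accumulates to \(\alpha(1) = 2\).

```python
import numpy as np

# The response of Y_{t+j} to a unit impulse in W_{t+1} accumulates the moving-average
# coefficients, so it converges to alpha(1).  The coefficients here are illustrative.
alpha = 0.5 ** np.arange(30)          # alpha_j = (0.5)**j, hence alpha(1) = 2
irf_level = np.cumsum(alpha)          # effect of W_{t+1} on Y_{t+1}, Y_{t+2}, ...
print(irf_level[-1], 1 / (1 - 0.5))   # long-run (permanent) effect approaches alpha(1)
```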
The variance of the random variable \(\alpha(1) \cdot W_{t+1}\) conditioned on the invariant events in \({\mathfrak I}\) is \(|\alpha(1)|^2\). The overall variance of \(X_{t}\) is given by
To form a permanent-transitory shock decomposition, construct the permanent shock as:
where we introduce an additional scaling so the permanent shock has variance one. Form
which by construction will be uncorrelated with \(W_{t+1}^p\). Since the covariance matrix of \(W_{t+1}^{tr}\) will be singular, the components of \(W_{t+1}^{tr}\) can be expressed as linear combinations of a vector of transitory shocks with unit variances.
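Here is a minimal sketch of the permanent-transitory construction for a bivariate shock \(W_{t+1}\) with identity conditional covariance matrix; the vector playing the role of \(\alpha(1)\) is an illustrative choice.

```python
import numpy as np

# Permanent shock: W^p = alpha(1) . W / |alpha(1)|, scaled to unit variance.
# Transitory shock vector: the residual from projecting W on W^p; its covariance
# matrix is singular.  alpha1 is an illustrative stand-in for alpha(1).
rng = np.random.default_rng(2)
alpha1 = np.array([0.6, -0.3])
a_hat = alpha1 / np.linalg.norm(alpha1)

W = rng.standard_normal((100_000, 2))              # draws of W_{t+1}
Wp = W @ a_hat                                     # permanent shock (unit variance)
Wtr = W - np.outer(Wp, a_hat)                      # transitory shock vector

print(round(Wp.var(), 3))                          # approximately 1
print(np.round(Wtr.T @ Wp / len(Wp), 3))           # approximately zero: uncorrelated with W^p
print(np.linalg.matrix_rank(np.cov(Wtr.T)))        # rank 1: singular covariance matrix
```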
Next consider a process in which \(W_t\) has transient but no permanent effects on future \(Y\)’s. Let \(\alpha_0 = 1\) and \(\alpha_j = (\lambda -1) \lambda^{j-1}\) for \(j \geq 1\) and \(-1 < \lambda < 1\). Construct the power series
\[
\alpha(\zeta) = \sum_{j=0}^\infty \alpha_j \zeta^j .
\]
Evidently, \(\alpha(1) = 1 + (\lambda - 1) \sum_{j=1}^\infty \lambda^{j-1} = 1 + \frac{\lambda - 1}{1 - \lambda} = 0\). Define
and note that since (3.6) is satisfied
The process \(\{ Y_t : t = 0,1,... \}\) is stationary provided that \(Y_0 = - H_0\), ensuring that \(Y_t = - H_t\) for all \(t \ge 0\).
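The next sketch simulates this example with an illustrative value of \(\lambda\) and a truncated moving average. The coefficient sum is numerically zero, and the accumulated increments stay in a bounded band rather than wandering like a random walk.

```python
import numpy as np

# With alpha_0 = 1 and alpha_j = (lam - 1) * lam**(j-1), the coefficients sum to zero,
# so shocks have no permanent effect on the level Y.  lam, T, J are illustrative.
rng = np.random.default_rng(3)
lam, T, J = 0.7, 5_000, 200

alpha = np.concatenate(([1.0], (lam - 1) * lam ** np.arange(J)))
print(alpha.sum())                                   # approximately 0: alpha(1) vanishes

W = rng.standard_normal(T + J)
X = np.convolve(W, alpha, mode="valid")              # truncated moving-average increments
Y = np.cumsum(X)                                     # accumulated increments

print(Y.std(), np.cumsum(W[J:]).std())               # bounded fluctuations vs. random-walk spread
```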
3.3. Central limit approximation
Example 3.1 starts from a moving average of martingale differences that is used as an increment \(\{X_t \}\) to a \(\{Y_t: t \ge 0\}\) process, after which it constructs a process of innovations to the martingale component of the \(\{Y_t: t \ge 0 \}\) process. That analysis illustrates the workings of an operator \(\mathbb{D}\) that maps an admissible increment process in \(\mathcal{H}\) into the innovation in a martingale component. To construct \(\mathbb{D}\), let \(\mathcal{G}\) be the set of all random variables \(G\) with finite second moments that satisfy the conditions that \(G\) is \(\mathfrak{A}\) measurable and that \(E(G_1 \vert \mathfrak{A}) = 0\) where \(G_1 = G \circ \mathbb{S}\). Define \(\mathbb{D}: \mathcal{H} \rightarrow \mathcal{G}\) via
Both \(\mathcal{G}\) and \(\mathcal{H}\) are linear spaces of random variables and \(\mathbb{D}\) is a linear transformation. The operator \(\mathbb{D}\) plays a prominent role in a central limit approximation.
To form a central limit approximation, construct the following scaled partial sum that nets out trend growth
where
From Billingsley’s [1961] central limit theorem for martingales
where \(\Rightarrow\) denotes weak convergence, meaning convergence in distribution. Clearly, \(\{(1/ {\sqrt t}) H_{t}^+\}\) and \(\{(1/{\sqrt{t}}) (H_0^+ + Y_0) \}\) both converge in mean square to zero.
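A Monte Carlo sketch of this approximation for the illustrative AR(1) increment: after removing the trend, \((1/\sqrt{t})(Y_t - t\nu - Y_0)\) should be approximately normal with standard deviation \(\sigma/(1-\rho)\), the standard deviation of the martingale increment in that example.

```python
import numpy as np

# Monte Carlo check: for the AR(1) illustration, (Y_T - T*nu - Y_0)/sqrt(T) has a
# standard deviation close to sigma/(1 - rho), the standard deviation of the
# martingale increment.  Parameter values are illustrative.
rng = np.random.default_rng(4)
n_paths, T, rho, sigma = 10_000, 1_000, 0.8, 1.0

Z = np.zeros(n_paths)
S = np.zeros(n_paths)                  # running sum of the demeaned increments
for t in range(T):
    Z = rho * Z + sigma * rng.standard_normal(n_paths)
    S += Z

scaled = S / np.sqrt(T)                # (Y_T - T*nu - Y_0) / sqrt(T)
print(scaled.std(), sigma / (1 - rho))
```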
For all stationary increment processes \(\{Y_t : t=0,1,2, \ldots\}\) represented by \(X\) in \(\mathcal{H}\)
Furthermore,
3.4. Cointegration
Linear combinations of stationary increment processes \(Y_t^1\) and \(Y_t^2\) have stationary increments. For real-valued scalars \(r_1\) and \(r_2\), form
where
The increment in \(\{Y_t : t=0, 1, \ldots \}\) is
and
The Proposition 3.1 martingale component of \(\{ Y_t : t \ge 0 \}\) is the corresponding linear combination of the martingale components of \(\{ Y_t^1 : t =0,1,...\}\) and \(\{ Y_t^2 : t =0,1,...\}\). The Proposition 3.1 trend component of \(\{ Y_t : t =0,1, \ldots \}\) is the corresponding linear combination of the trend components of \(\{ Y_t^1 : t =0,1, \ldots \}\) and \(\{ Y_t^2 : t =0,1, \ldots \}\).
Proposition 3.1 sheds light on the cointegration concept of Engle and Granger [1987] that is associated with linear combinations of stationary increment processes whose trend and martingale components are both zero. Call two processes cointegrated if there exists a linear combination of them that is stationary.[4] That situation prevails when there exist real-valued scalars \(r_1\) and \(r_2\) such that
where the \(\nu\)’s correspond to the trend components in Proposition 3.1. These two zero restrictions imply that the time trend and the martingale component for the linear combination \(Y_t\) are both zero.[5] When \(r_1 = 1\) and \(r_2 = - 1\), the stationary increment processes \(Y_{t}^1\) and \(Y_{t}^2\) share a common growth component.
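The following sketch illustrates cointegration with \(r_1 = 1\) and \(r_2 = -1\): two simulated processes share the same trend and the same martingale (random walk) component but have different stationary components, so their difference wanders much less than either series. All parameter values are illustrative.

```python
import numpy as np

# Two processes with a common trend and a common martingale component are cointegrated:
# the difference Y1 - Y2 has zero trend and zero martingale component, hence is stationary.
rng = np.random.default_rng(5)
T, nu, rho = 5_000, 0.05, 0.9

def stationary_ar1(rho, T, rng):
    z = np.zeros(T)
    for t in range(1, T):
        z[t] = rho * z[t - 1] + rng.standard_normal()
    return z

common_trend = nu * np.arange(T)
common_martingale = np.cumsum(rng.standard_normal(T))

Y1 = common_trend + common_martingale + stationary_ar1(rho, T, rng)
Y2 = common_trend + common_martingale + stationary_ar1(rho, T, rng)

print(np.std(Y1), np.std(Y1 - Y2))    # Y1 wanders widely; the linear combination stays bounded
```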
This notion of cointegration provides one way to formalize balanced growth paths in stochastic environments by finding a linear combination of growing time series from which stochastic growth is absent.