
As I study Markov processes I struggle to understand how to extend the one-dimensional Markov property \begin{align*} \mathbb{P}\left[X_{t} \in A | \mathcal{F}_{s}\right]=\mathbb{P}\left[X_{t} \in A | X_{s}\right] \end{align*} to $n$ dimensions (or at least to two). So far I believe it is not enough to impose the property on each coordinate separately, because there are processes which are non-Markovian on their own but become Markovian when embedded in an $n$-dimensional process. Secondly, I want to show that the following process is Markovian.

Example: Let $(X_t)_{t\geq 0}$ be a standard Brownian motion and define $ Y_{t}=\int_{0}^{t} X_{s} \,\mathrm{d} s$. Show that $\left(X_{t}, Y_{t}\right)_{t \geq 0}$ is a Markov process with respect to its natural filtration. In addition, provide a simple intuitive argument for why the combined process is Markovian.

This example leads back to my initial question, because I don't understand how a non-Markovian process like $Y_t$ can become Markovian as a component of the vector $(X_t,Y_t)$ when $X_t$ is Markovian.

Leviathan
  • Your question is reasonable, but perhaps it is first worth asking what makes you think the definition you provided is valid only for one dimension? Nothing is stopping you from taking $A \subset \mathbb{R}^d$, $X_t : \Omega \rightarrow \mathbb{R}^d$. – snar Jun 09 '19 at 13:38
  • I can imagine that the condition holds for more than one dimension, but I have no intuition for how to write down and read the condition so that it makes sense to me. So can we write the Markov condition as $\mathbb P[ (X_t,Y_t) \in A | \mathcal F_s] = \mathbb P[ (X_t,Y_t) \in A | X_s]$ and if so, what is the meaning of it? – Leviathan Jun 09 '19 at 18:55
  • Why not? Just take a (measurable) set $A \subseteq \mathbb{R}^2$. What makes you believe that there is something special about dimension $n=1$? – saz Jun 09 '19 at 20:06

1 Answer


The (simple) Markov property

$$\mathbb{P}(X_t \in A \mid \mathcal{F}_s) = \mathbb{P}(X_t \in A \mid X_s) \tag{1}$$

makes perfect sense in any dimension $n \geq 1$. If, say, $(X_t)_{t \geq 0}$ is a continuous stochastic process taking values in $\mathbb{R}^n$, then $(1)$ is well-defined for any Borel set $A \in \mathcal{B}(\mathbb{R}^n)$. The interpretation of $(1)$ is the same in every dimension $n \geq 1$: the future evolution of the process depends only on the present state and not on the past. A Markov process is memoryless in the sense that it remembers only the present, not the past.

Even if some process $(X_t)_{t \geq 0}$ is not Markovian, the pair $Z_t := (X_t,Y_t)$, $t \geq 0$, may still be Markovian. Why? Adding another component $(Y_t)_{t \geq 0}$ enlarges the present state: $Z_t = (X_t,Y_t)$ clearly carries more information than the value of $X_t$ alone.

Let's consider a Brownian motion $(B_t)_{t \geq 0}$ and set $X_t := \int_0^t B_r \, dr$. Let's first try to get some intuition for why $(X_t)_{t \geq 0}$ is not Markovian. Fix $s \leq t$, so that $s$ is the "present" and $t$ the "future". Clearly,

$$X_t = X_s + \int_s^t B_r \, dr.$$

This tells us the following: the future evolution of the process depends on the present state $X_s$ and on $\int_s^t B_r \, dr$. For $(X_t)_{t \geq 0}$ to be Markovian, we would need $\int_s^t B_r \, dr$ to depend only on the present state $X_s$ and not on the past. That is impossible: the integral $\int_s^t B_r \, dr$ depends strongly on $B_s$ (for instance, if $B_s$ is very large, then $\int_s^t B_r \, dr$ will be large, at least for $t$ close to $s$), and the present $X_s$ does not give us any information about $B_s$. However, this suggests that we might have a chance to prove that $(X_t,B_t)_{t \geq 0}$ is Markovian.
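This dependence is easy to see numerically. The following sketch (my own illustration, not part of the original argument; the step size, horizon, and path count are arbitrary) simulates many Brownian paths and estimates the correlation between the present value $B_s$ and the future increment $\int_s^t B_r \, dr$:

```python
import numpy as np

rng = np.random.default_rng(0)

n_paths, dt = 20_000, 0.01     # illustrative values
s, h = 1.0, 0.5                # "present" time s, look-ahead t - s = h
n_s, n_h = int(s / dt), int(h / dt)

# Brownian paths built from independent N(0, dt) increments.
incr = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_s + n_h))
B = np.cumsum(incr, axis=1)               # B at times dt, 2*dt, ...
B_s = B[:, n_s - 1]                       # present value B_s
future = B[:, n_s:].sum(axis=1) * dt      # Riemann sum for int_s^t B_r dr

print(np.corrcoef(B_s, future)[0, 1])     # strongly positive, about 0.9 here
```

The large correlation confirms that knowing only $X_s$ (which says nothing about $B_s$) cannot determine the law of $\int_s^t B_r \, dr$.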

To prove that $(X_t,B_t)_{t \geq 0}$ is Markovian, we note that

$$X_t = X_s + \int_s^t B_r \, dr = X_s + B_s (t-s) + \int_s^t (B_r-B_s) \, dr$$

and

$$B_t = B_s + (B_t-B_s).$$

Combining both equations we find that

$$\begin{pmatrix} X_t \\ B_t \end{pmatrix} = \begin{pmatrix} 1 & t-s \\ 0 & 1 \end{pmatrix} \begin{pmatrix} X_s \\ B_s \end{pmatrix} + Z$$

where the matrix is deterministic and $Z := \begin{pmatrix} \int_s^t (B_r-B_s) \, dr \\ B_t - B_s \end{pmatrix}$ is a random vector which is independent of past and present $(X_r,B_r)_{r \leq s}$ (due to the independence of the increments of Brownian motion). Now our interpretation of $(1)$ tells us that $(X_t,B_t)_{t \geq 0}$ is Markovian: the evolution of $(X_t,B_t)$ in the future does not depend on the past but only on the present $(X_s,B_s)$.
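In code, this decomposition says the next state $(X_t,B_t)$ is obtained from the current state $(X_s,B_s)$ by a deterministic linear map plus an independent Gaussian vector. A minimal sketch of the resulting exact transition kernel (my own illustration; it assumes the standard Brownian moment identities $\operatorname{Var}\int_0^\Delta W_r\,dr = \Delta^3/3$ and $\operatorname{Cov}(\int_0^\Delta W_r\,dr, W_\Delta) = \Delta^2/2$):

```python
import numpy as np

def step(x, b, delta, rng):
    """One exact transition of the pair (X, B) over a time step delta.

    The kernel reads only the current state (x, b) -- the Markov property
    made literal.  (z1, z2) plays the role of the independent noise Z.
    """
    cov = np.array([[delta**3 / 3, delta**2 / 2],
                    [delta**2 / 2, delta       ]])
    z1, z2 = rng.multivariate_normal([0.0, 0.0], cov)
    return x + delta * b + z1, b + z2

rng = np.random.default_rng(1)
x, b = 0.0, 0.0
for _ in range(10):          # simulate (X_1, B_1) in ten exact steps
    x, b = step(x, b, 0.1, rng)
print(x, b)
```

Because each step uses only $(x,b)$, concatenating steps samples the pair exactly at the grid points; for example, the theoretical values $\operatorname{Var}(B_1)=1$ and $\operatorname{Var}(X_1)=\tfrac13$ can be recovered empirically from many runs.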

Formally, the proof goes as follows: Denote by $\mathcal{F}_s = \sigma(B_r;r \leq s)$ the canonical filtration of the Brownian motion. Take a bounded Borel measurable function $u:\mathbb{R}^2 \to \mathbb{R}$, then

\begin{align*} \mathbb{E} \left( u(X_t,B_t) \mid \mathcal{F}_s \right) &= \mathbb{E} \left( u(X_s+(t-s)B_s + \int_s^t (B_r-B_s) \, dr, B_s + (B_t-B_s)) \mid \mathcal{F}_s \right). \end{align*}

Since $(B_r-B_s)_{r \geq s}$ is independent from $\mathcal{F}_s$ and $(X_s,B_s)$ is $\mathcal{F}_s$-measurable, it follows that

\begin{align*} \mathbb{E} \left( u(X_t,B_t) \mid \mathcal{F}_s \right) &= g(X_s,B_s) \tag{2} \end{align*}

where

$$g(x,y) := \mathbb{E} \left( u(x + (t-s)y+ \int_s^t (B_r-B_s) \, dr, y + (B_t-B_s)) \right).$$
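To make $g$ fully explicit (a side computation not spelled out in the answer, using only standard Brownian moments), note that the random vector appearing inside $u$ is centered Gaussian: with $\Delta := t-s$,

$$\begin{pmatrix} \int_s^t (B_r-B_s) \, dr \\ B_t - B_s \end{pmatrix} \sim N\left( \begin{pmatrix} 0 \\ 0 \end{pmatrix}, \begin{pmatrix} \Delta^3/3 & \Delta^2/2 \\ \Delta^2/2 & \Delta \end{pmatrix} \right),$$

since $\operatorname{Var}\left(\int_s^t (B_r-B_s) \, dr\right) = \int_s^t \int_s^t \min(r-s,r'-s) \, dr \, dr' = \Delta^3/3$ and $\operatorname{Cov}\left(\int_s^t (B_r-B_s) \, dr, \, B_t-B_s\right) = \int_s^t (r-s) \, dr = \Delta^2/2$. Hence $g(x,y)$ is simply the integral of $u$, shifted by $(x+(t-s)y, \, y)$, against this bivariate normal density.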

By the tower property of conditional expectation, $(2)$ implies

\begin{align*} \mathbb{E}(u(X_t,B_t) \mid (X_s,B_s)) &= \mathbb{E} \bigg[ \mathbb{E}(u(X_t,B_t) \mid \mathcal{F}_s) \mid (X_s,B_s) \bigg] \\ &\stackrel{(2)}{=} g(X_s,B_s). \tag{3} \end{align*}

Combining $(2)$ and $(3)$ we get

\begin{align*} \mathbb{E} \left( u(X_t,B_t) \mid \mathcal{F}_s \right) &= g(X_s,B_s) \\ &= \mathbb{E}(u(X_t,B_t) \mid (X_s,B_s)) \end{align*}

which proves that $(X_t,B_t)_{t \geq 0}$ is Markovian (with respect to $(\mathcal{F}_t)_{t \geq 0}$).

saz