# Limit Theory (MATH407): Autoregression of Order One


## Autoregression of Order One

In this and the subsequent chapter we present concrete applications of the previous stable limit theorems. Here we consider an autoregressive process of order one $X=\left(X_{n}\right)_{n \geq 0}$ generated recursively by
$$X_{n}=\vartheta X_{n-1}+Z_{n}, \quad n \geq 1,$$
where $\vartheta \in \mathbb{R}$, $\left(Z_{n}\right)_{n \geq 1}$ is an independent and identically distributed sequence of real random variables and $X_{0}$ is a real random variable independent of $\left(Z_{n}\right)_{n \geq 1}$. We assume that $P^{Z_{1}}$ is continuous. Then $X_{n}^{2}>0$ almost surely for all $n \geq 1$ since, by independence of $X_{n-1}$ and $Z_{n}$, $P^{X_{n}}$ is continuous for $n \geq 1$. The usual least squares estimator for the parameter $\vartheta$ on the basis of the observations $X_{0}, \ldots, X_{n}$ is given by
$$\widehat{\vartheta}_{n}:=\frac{\sum_{j=1}^{n} X_{j} X_{j-1}}{\sum_{j=1}^{n} X_{j-1}^{2}}, \quad n \geq 2,$$
provided $Z_{1} \in \mathcal{L}^{1}(P)$ and $E Z_{1}=0$. In the explosive case $|\vartheta|>1$, the effect of the mean of $Z_{1}$ disappears asymptotically, so that $\widehat{\vartheta}_{n}$ is also reasonable in that case if $E Z_{1} \neq 0$. We prove stable limit theorems for $\widehat{\vartheta}_{n}$ under deterministic and random norming.
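As a quick numerical illustration of the estimator, the following sketch simulates an AR(1) path and computes $\widehat{\vartheta}_{n}$; the standard normal innovations and the parameter values are illustrative choices, not taken from the text.

```python
import numpy as np

# Sketch: simulate an AR(1) path X_n = theta * X_{n-1} + Z_n with i.i.d.
# standard normal innovations (a continuous distribution with E Z_1 = 0)
# and compute the least squares estimator. Parameter values are illustrative.
rng = np.random.default_rng(0)
theta, n = 0.6, 10_000
Z = rng.standard_normal(n)
X = np.empty(n + 1)
X[0] = 0.0
for j in range(1, n + 1):
    X[j] = theta * X[j - 1] + Z[j - 1]

# theta_hat_n = (sum_{j=1}^n X_j X_{j-1}) / (sum_{j=1}^n X_{j-1}^2)
theta_hat = np.dot(X[1:], X[:-1]) / np.dot(X[:-1], X[:-1])
```

For large $n$ in the stationary case $|\vartheta| < 1$, `theta_hat` is close to the true parameter.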

Let $\mathcal{F}_{n}:=\sigma\left(X_{0}, X_{1}, \ldots, X_{n}\right)=\sigma\left(X_{0}, Z_{1}, \ldots, Z_{n}\right)$ for all $n \geq 0$ and $\mathbb{F}:=\left(\mathcal{F}_{n}\right)_{n \geq 0}$. Define $\mathbb{F}$-adapted processes by
$$A_{n}:=\sum_{j=1}^{n} X_{j-1}^{2} \text { with } A_{0}=0$$
and

$$B_{n}:=\sum_{j=1}^{n} X_{j-1} Z_{j} \text { with } B_{0}=0$$
Since $\sum_{j=1}^{n} X_{j} X_{j-1}=\sum_{j=1}^{n}\left(\vartheta X_{j-1}+Z_{j}\right) X_{j-1}=\vartheta A_{n}+B_{n}$, we obtain
$$\widehat{\vartheta}_{n}-\vartheta=B_{n} / A_{n} \text { for all } n \geq 2 .$$
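This decomposition is pure algebra and can be checked numerically; the following sketch verifies it on a simulated path, with all names and parameter values illustrative.

```python
import numpy as np

# Sketch: verify the identity theta_hat_n - theta = B_n / A_n on a
# simulated AR(1) path; names and parameter values are illustrative.
rng = np.random.default_rng(1)
theta, n = 0.6, 500
Z = rng.standard_normal(n)
X = np.empty(n + 1)
X[0] = 0.0
for j in range(1, n + 1):
    X[j] = theta * X[j - 1] + Z[j - 1]

A_n = np.sum(X[:-1] ** 2)        # A_n = sum_{j=1}^n X_{j-1}^2
B_n = np.sum(X[:-1] * Z)         # B_n = sum_{j=1}^n X_{j-1} Z_j
theta_hat = np.dot(X[1:], X[:-1]) / A_n
assert np.isclose(theta_hat - theta, B_n / A_n)
```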

## Galton-Watson Branching Processes

Let $\left(Y_{n j}\right)_{n, j \in \mathbb{N}}$ be independent and identically distributed random variables with values in $\mathbb{N}_{0}$, and let $X_{0}$ be some random variable with values in $\mathbb{N}$ which is independent of $\left(Y_{n j}\right)_{n, j \in \mathbb{N}}$, where all these random variables are defined on the same probability space $(\Omega, \mathcal{F}, P)$. For every $n \in \mathbb{N}$ we set
$$X_{n}:=\sum_{j=1}^{X_{n-1}} Y_{n j} .$$
The process $X=\left(X_{n}\right)_{n \geq 0}$ is the Galton-Watson branching process. The process $X$ can be interpreted as follows: In a population of particles (which may represent people, cells, neutrons, etc., depending on the field of application) each particle $j$ of the $(n-1)$-th generation produces a random number $Y_{n j}$ (which may be 0) of identical particles in the $n$-th generation, called the offspring of $j$, and it does so independently of all other particles from the $(n-1)$-th and all earlier generations. The offspring distribution, i.e. the distribution of $Y_{n j}$, is the same for all particles in all generations. Then $X_{n}$ is the total number of particles in the $n$-th generation, with $X_{0}$ being the (random) number of particles in the 0-th generation. Note that excluding the value 0 of $X_{0}$ is not an essential restriction because by definition of $X_{n}$ we would have $X_{n}=0$ for all $n \in \mathbb{N}$ on the event $\left\{X_{0}=0\right\}$ so that $\left(X_{n}\right)_{n \geq 0}$ would be trivial on $\left\{X_{0}=0\right\}$.
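The recursion above is straightforward to simulate. The following sketch generates one trajectory of generation sizes; the Poisson offspring distribution is an illustrative choice, not prescribed by the text.

```python
import numpy as np

# Sketch: simulate a Galton-Watson branching process. The Poisson
# offspring distribution is an illustrative choice, not from the text.
def galton_watson(n_generations, x0, mean_offspring, rng):
    """Return the generation sizes X_0, ..., X_{n_generations}."""
    sizes = [x0]
    for _ in range(n_generations):
        # X_n = sum_{j=1}^{X_{n-1}} Y_{nj}; an empty sum gives X_n = 0,
        # so the process stays at 0 once the population dies out.
        sizes.append(int(rng.poisson(mean_offspring, size=sizes[-1]).sum()))
    return sizes

rng = np.random.default_rng(2)
path = galton_watson(10, x0=1, mean_offspring=2.0, rng=rng)
```

Note how absorption at 0 falls out of the empty sum, mirroring the remark that $X$ is trivial on $\{X_0 = 0\}$.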
