# Stochastic Calculus Assignment Help | CONVERGENCE OF RANDOM VARIABLES

my-assignmentexpert™ offers stochastic calculus assignment writing: submit your requirements for free, pay only once you are satisfied, and receive a full refund if the grade falls below 80%, so there is no risk. Our professional team of master's- and PhD-level writers delivers every order reliably and on time, with a 100% originality guarantee. my-assignmentexpert™ provides the highest-quality stochastic calculus assignment help, serving North America, Europe, Australia, and other regions. As for pricing, we take students' budgets into account and offer the most reasonable rates while maintaining quality. Because stochastic calculus assignments vary widely in type and difficulty, and most have no fixed word count, the price of stochastic calculus assignment writing is not fixed; a quote is usually given after a subject expert has reviewed the assignment requirements. Difficulty and deadline also have a large effect on the price.

my-assignmentexpert™ safeguards your study-abroad journey. We have established a solid reputation in assignment writing, guaranteeing reliable, high-quality, and original calculus writing services. Our experts have extensive experience in stochastic calculus, so any stochastic calculus assignment poses no difficulty.

• Stochastic partial differential equations
• Stochastic control
• Itô integral
• Black–Scholes–Merton option pricing formula
• Fokker–Planck equation
• Brownian motion

## Calculus Assignment Help | a. Forms of convergence

1. Let $X_{n}, X$, $n \geq 1$, be random variables on the probability space $(\Omega, \mathcal{F}, P)$ and let $1 \leq p<\infty$. We need several notions of convergence $X_{n} \rightarrow X$:
(i) $X_{n} \rightarrow X$ in $L^{p}$, if $\left|X_{n}-X\right|_{p}^{p}=E\left(\left|X_{n}-X\right|^{p}\right) \rightarrow 0$, as $n \uparrow \infty$.
(ii) $X_{n} \rightarrow X$, $P$-almost surely ($P$-as.), if $X_{n}(\omega) \rightarrow X(\omega)$ in $\bar{R}$, for all points $\omega$ in the complement of some $P$-null set.
(iii) $X_{n} \rightarrow X$ in probability on the set $A \in \mathcal{F}$, if $P\left(\left[\left|X_{n}-X\right|>\epsilon\right] \cap A\right) \rightarrow 0$, as $n \uparrow \infty$, for all $\epsilon>0$. Convergence $X_{n} \rightarrow X$ in probability is defined as convergence in probability on all of $\Omega$; equivalently, $P\left(\left|X_{n}-X\right|>\epsilon\right) \rightarrow 0$, as $n \uparrow \infty$, for all $\epsilon>0$.
Here the differences $X_{n}-X$ are evaluated according to the rule $(+\infty)-(+\infty)=$ $(-\infty)-(-\infty)=0$ and $|Z|_{p}$ is allowed to assume the value $+\infty$. Recall that the finiteness of the probability measure $P$ implies that $|Z|_{p}$ increases with $p \geq 1$. Thus $X_{n} \rightarrow X$ in $L^{p}$ implies that $X_{n} \rightarrow X$ in $L^{r}$, for all $1 \leq r \leq p$.
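The monotonicity of $|Z|_{p}$ in $p$ can be verified in one line with Jensen's inequality (a short derivation added here for completeness): for $1 \leq r \leq p$, the map $t \mapsto t^{p/r}$ is convex, so applying Jensen's inequality to $Y=|Z|^{r}$ on the probability space $(\Omega, \mathcal{F}, P)$ gives

$$
|Z|_{r}^{p}=E\left(|Z|^{r}\right)^{p / r} \leq E\left(\left(|Z|^{r}\right)^{p / r}\right)=E\left(|Z|^{p}\right)=|Z|_{p}^{p},
$$

and taking $p$-th roots yields $|Z|_{r} \leq |Z|_{p}$. The finiteness of $P$ (indeed $P(\Omega)=1$) is exactly what makes Jensen's inequality applicable here.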

Convergence in $L^{1}$ will simply be called convergence in norm. Thus $X_{n} \rightarrow X$ in norm if and only if $\left|X_{n}-X\right|_{1}=E\left(\left|X_{n}-X\right|\right) \rightarrow 0$, as $n \uparrow \infty$. Many of the results below make essential use of the finiteness of the measure $P$.
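A standard example separating these modes of convergence (an added illustration, not part of the text): on $\Omega=[0,1)$ with Lebesgue measure, $X_{n}=n\,1_{[0,1/n)}$ satisfies $X_{n} \rightarrow 0$ $P$-as. and in probability, yet $\left|X_{n}-0\right|_{1}=1$ for all $n$, so $X_{n} \nrightarrow 0$ in norm. A minimal Monte Carlo sketch in numpy:

```python
import numpy as np

rng = np.random.default_rng(0)
m = 200_000
u = rng.uniform(size=m)              # m independent draws of omega ~ Uniform[0,1)

def stats(n, eps=0.5):
    """Monte Carlo estimates of P(|X_n| > eps) and E|X_n| for X_n = n*1_[0,1/n)."""
    x_n = n * (u < 1.0 / n)
    return np.mean(np.abs(x_n) > eps), np.mean(np.abs(x_n))

for n in (10, 100, 1000):
    p, l1 = stats(n)
    print(f"n={n:4d}:  P(|X_n|>0.5) ~ {p:.4f}   E|X_n| ~ {l1:.3f}")
```

The first column decays like $1/n$ (convergence in probability), while the second stays near $1$ (no convergence in norm), matching the discussion above.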
1.a.0. (a) Convergence $P$-as. implies convergence in probability.
(b) Convergence in norm implies convergence in probability.
Proof. (a) Assume that $X_{n} \nrightarrow X$ in probability. We will show that $X_{n} \nrightarrow X$ on a set of positive measure. Choose $\epsilon>0$ such that $P\left(\left[\left|X_{n}-X\right| \geq \epsilon\right]\right) \nrightarrow 0$, as $n \uparrow \infty$. Then there exists a strictly increasing sequence $\left(k_{n}\right)$ of natural numbers and a number $\delta>0$ such that $P\left(\left|X_{k_{n}}-X\right| \geq \epsilon\right) \geq \delta$, for all $n \geq 1$.

Set $A_{n}=\left[\left|X_{k_{n}}-X\right| \geq \epsilon\right]$ and $A=\left[A_{n} \text { i.o. }\right]$. As $P\left(A_{n}\right) \geq \delta$, for all $n \geq 1$, and $P\left(\limsup_{n} A_{n}\right) \geq \limsup_{n} P\left(A_{n}\right)$, it follows that $P(A) \geq \delta>0$. However, if $\omega \in A$, then $X_{k_{n}}(\omega) \nrightarrow X(\omega)$ and so $X_{n}(\omega) \nrightarrow X(\omega)$.
(b) Note that $P\left(\left|X_{n}-X\right| \geq \epsilon\right) \leq \epsilon^{-1}\left|X_{n}-X\right|_{1}$. $\mathbf{I}$
1.a.1. Convergence in probability implies almost sure convergence of a subsequence.
Proof. Assume that $X_{n} \rightarrow X$ in probability and choose inductively a sequence of integers $0<n_{1}<n_{2}<\ldots$ such that $P\left(\left|X_{n_{k}}-X\right| \geq 1 / k\right) \leq 2^{-k}$. Then $\sum_{k} P\left(\left|X_{n_{k}}-X\right| \geq 1 / k\right)<\infty$ and so, by the Borel-Cantelli lemma, the event $A=\left[\left|X_{n_{k}}-X\right| \geq \frac{1}{k} \text { i.o. }\right]$ is a null set. However, if $\omega \in A^{c}$, then $X_{n_{k}}(\omega) \rightarrow X(\omega)$. Thus $X_{n_{k}} \rightarrow X$, $P$-as. $\mathbf{I}$
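Statement 1.a.1 is well illustrated by the classical "typewriter" sequence (an added illustration, not part of the text): on $\Omega=[0,1)$ with Lebesgue measure let $X_{n}=1_{[j 2^{-k},(j+1) 2^{-k})}$, where $n=2^{k}+j$ and $0 \leq j<2^{k}$. Then $P\left(X_{n} \neq 0\right)=2^{-k} \rightarrow 0$, so $X_{n} \rightarrow 0$ in probability, but for every $\omega$ we have $X_{n}(\omega)=1$ for infinitely many $n$, so $X_{n} \nrightarrow 0$ $P$-as. The subsequence $X_{2^{k}}=1_{[0,2^{-k})}$ does converge to $0$ $P$-as. A small Python sketch:

```python
def typewriter(n, omega):
    """X_n(omega) for the typewriter sequence: indicator of the n-th dyadic block."""
    k = n.bit_length() - 1          # level k with 2^k <= n < 2^(k+1)
    j = n - 2**k                    # position of the block within level k
    return 1.0 if j / 2**k <= omega < (j + 1) / 2**k else 0.0

omega = 0.3
hits = sum(typewriter(n, omega) for n in range(1, 1024))   # full sequence, levels 0..9
sub = [typewriter(2**k, omega) for k in range(1, 11)]      # subsequence n_k = 2^k
print("hits of 1 along n = 1..1023:", hits)   # one hit per dyadic level: no as-convergence
print("subsequence values:", sub)             # eventually 0: as-convergence
```

Each dyadic level contributes exactly one $n$ with $X_{n}(\omega)=1$, so the full sequence keeps returning to $1$, while the subsequence is eventually $0$.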

Remark. Thus convergence in norm implies almost sure convergence of a subsequence. It follows that convergence in $L^{p}$ implies almost sure convergence of a subsequence. Let $L^{0}(P)$ denote the space of all (real valued) random variables on $(\Omega, \mathcal{F}, P)$. As usual we identify random variables which are equal $P$-as. Consequently $L^{0}(P)$ is a space of equivalence classes of random variables.

It is interesting to note that convergence in probability is metrizable, that is, there is a metric $d$ on $L^{0}(P)$ such that $X_{n} \rightarrow X$ in probability if and only if $d\left(X_{n}, X\right) \rightarrow 0$, as $n \uparrow \infty$, for all $X_{n}, X \in L^{0}(P)$. To see this let $\rho(t)=1 \wedge t$, $t \geq 0$, and note that $\rho$ is nondecreasing and satisfies $\rho(a+b) \leq \rho(a)+\rho(b), a, b \geq 0$. From this it follows that $d(X, Y)=E(\rho(|X-Y|))=E(1 \wedge|X-Y|)$ defines a metric on $L^{0}(P)$. It is not hard to show that $P(|X-Y| \geq \epsilon) \leq \epsilon^{-1} d(X, Y)$ and $d(X, Y) \leq P(|X-Y| \geq \epsilon)+\epsilon$, for all $0<\epsilon<1$. This implies that $X_{n} \rightarrow X$ in probability if and only if $d\left(X_{n}, X\right) \rightarrow 0$. The metric $d$ is translation invariant $(d(X+Z, Y+Z)=d(X, Y))$ and thus makes $L^{0}(P)$ into a metric linear space. In contrast it can be shown that convergence $P$-as. cannot be induced by any topology.
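The two inequalities relating $d$ and probabilities can be checked numerically; the following is a small numpy sketch (the choice of distributions and sample size are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
m = 100_000
x = rng.normal(size=m)
y = x + rng.normal(scale=0.2, size=m)    # Y = X plus a small perturbation

diff = np.abs(x - y)
d = np.mean(np.minimum(1.0, diff))       # d(X,Y) = E(1 ^ |X - Y|), Monte Carlo estimate

eps = 0.5
p = np.mean(diff >= eps)                 # P(|X - Y| >= eps)
print(f"d(X,Y) ~ {d:.4f},  P(|X-Y|>=eps) ~ {p:.4f}")
print("P <= d/eps:", p <= d / eps)       # P(|X-Y| >= eps) <= eps^{-1} d(X,Y)
print("d <= P + eps:", d <= p + eps)     # d(X,Y) <= P(|X-Y| >= eps) + eps
```

Since $\epsilon 1_{[t \geq \epsilon]} \leq \rho(t) \leq 1_{[t \geq \epsilon]}+\epsilon$ pointwise for $0<\epsilon<1$, both inequalities hold exactly for the empirical averages as well, so both checks print `True` for any sample.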

## Calculus Assignment Help | b. Norm convergence and uniform integrability

1.b.0. $X$ is integrable if and only if $\lim_{c \uparrow \infty} E(|X| ;[|X| \geq c])=0$. In this case $X$ satisfies $\lim_{P(A) \rightarrow 0} E\left(|X| 1_{A}\right)=0$.

Proof. Assume that $X$ is integrable. Then $|X| 1_{[|X|<c]} \uparrow|X|$ $P$-as., as $c \uparrow \infty$, and so $E(|X| ;[|X|<c]) \uparrow E(|X|)<\infty$, by the Monotone Convergence Theorem. Consequently $E(|X| ;[|X| \geq c])=E(|X|)-E(|X| ;[|X|<c]) \rightarrow 0$, as $c \uparrow \infty$. Now let $\epsilon>0$ be arbitrary and choose $c$ such that $E(|X| ;[|X| \geq c])<\epsilon$. If $A \in \mathcal{F}$ is any set with $P(A)<\epsilon / c$, we have
$$\begin{aligned} E\left(|X| 1_{A}\right) &=E(|X| ; A \cap[|X|<c])+E(|X| ; A \cap[|X| \geq c]) \\ & \leq c P(A)+E(|X| ;[|X| \geq c])<\epsilon+\epsilon=2 \epsilon. \end{aligned}$$
Thus $\lim_{P(A) \rightarrow 0} E\left(|X| 1_{A}\right)=0$. Conversely, if $\lim_{c \uparrow \infty} E(|X| ;[|X| \geq c])=0$, we can choose $c$ such that $E(|X| ;[|X| \geq c]) \leq 1$. Then $E(|X|) \leq c+1<\infty$. Thus $X$ is integrable. $\mathbf{I}$

This leads to the following definition: a family $F=\left\{X_{i} \mid i \in I\right\}$ of random variables is called uniformly integrable if it satisfies
$$\lim_{c \uparrow \infty} \sup_{i \in I} E\left(\left|X_{i}\right| ;\left[\left|X_{i}\right| \geq c\right]\right)=0,$$

that is, $\lim_{c \uparrow \infty} E\left(\left|X_{i}\right| ;\left[\left|X_{i}\right| \geq c\right]\right)=0$, uniformly in $i \in I$. The family $F$ is called uniformly $P$-continuous if it satisfies
$$\lim_{P(A) \rightarrow 0} \sup_{i \in I} E\left(1_{A}\left|X_{i}\right|\right)=0,$$
that is, $\lim_{P(A) \rightarrow 0} E\left(1_{A}\left|X_{i}\right|\right)=0$, uniformly in $i \in I$. The family $F$ is called $L^{1}$-bounded iff $\sup_{i \in I}\left|X_{i}\right|_{1}<+\infty$, that is, $F \subseteq L^{1}(P)$ is a bounded subset.
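A sufficient condition worth noting: any $L^{2}$-bounded family is uniformly integrable, since on $\left[\left|X_{i}\right| \geq c\right]$ we have $\left|X_{i}\right| \leq X_{i}^{2} / c$, hence $E\left(\left|X_{i}\right| ;\left[\left|X_{i}\right| \geq c\right]\right) \leq c^{-1} \sup_{i} E\left(X_{i}^{2}\right) \rightarrow 0$, as $c \uparrow \infty$. A small Monte Carlo sketch (the family below is an illustrative assumption, not from the text):

```python
import numpy as np

rng = np.random.default_rng(2)
m = 100_000
z = rng.normal(size=m)

# An illustrative L^2-bounded family: X_i = sqrt(1 + 1/i) * Z, Z ~ N(0,1),
# so sup_i E(X_i^2) = 2 < infinity.
family = [np.sqrt(1.0 + 1.0 / i) * z for i in range(1, 6)]

def ui_sup(c):
    """sup_i E(|X_i| ; [|X_i| >= c]), estimated on the simulated sample."""
    return max(np.mean(np.abs(x) * (np.abs(x) >= c)) for x in family)

for c in (1.0, 2.0, 4.0, 8.0):
    print(f"c = {c}:  sup_i E(|X_i|; |X_i| >= c) ~ {ui_sup(c):.5f}")
```

The printed supremum decreases toward $0$ as $c$ grows, which is exactly the uniform integrability condition displayed above.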
