# Stochastic Calculus Homework Help stochastic calculus | Ito's Formula

my-assignmentexpert™ offers stochastic calculus homework help: submit your assignment requirements for free, pay only after you are satisfied, and receive a full refund for any grade below 80%, so there is no risk to you. Our professional team of Master's and PhD writers delivers every order reliably and on time, with a 100% originality guarantee. my-assignmentexpert™ provides the highest-quality stochastic calculus homework help, serving students in North America, Europe, Australia, and other countries. As for pricing, we take students' budgets into account and, while guaranteeing quality, offer the most reasonable rates. Because stochastic calculus assignments vary widely in type and difficulty, and most have no fixed word count, there is no flat price; a quote is usually given after a subject expert has reviewed the assignment requirements. Difficulty and deadline also strongly affect the price.

my-assignmentexpert™ safeguards your studies abroad. We have built a solid reputation for economics homework help, providing reliable, high-quality, and original calculus writing services. Our experts have extensive experience with stochastic calculus, so assignments of every stochastic-calculus-related kind go without saying.

• Stochastic partial differential equations
• Stochastic control
• Ito integral
• Black–Scholes–Merton option pricing formula
• Fokker–Planck equation
• Brownian motion

## Calculus Homework Help | Ito's formula

3.a Ito’s formula. Let $X=\left(X^{1}, \ldots, X^{d}\right)$ be an $R^{d}$-valued process with continuously differentiable paths and consider the process $Y_{t}=f\left(X_{t}\right)$, where $f \in C^{2}\left(R^{d}\right)$. Let us write
$$D_{j} f=\frac{\partial f}{\partial x_{j}} \quad \text { and } \quad D_{i j} f=\frac{\partial^{2} f}{\partial x_{i} \partial x_{j}}$$
The process $Y$ has continuously differentiable paths with
$$\frac{d}{d t} f\left(X_{t}(\omega)\right)=\sum_{j=1}^{d} D_{j} f\left(X_{t}(\omega)\right) \frac{d}{d t} X_{t}^{j}(\omega) .$$
Fixing $\omega \in \Omega$ and integrating yields
$$f\left(X_{t}(\omega)\right)-f\left(X_{0}(\omega)\right)=\sum_{j=1}^{d} \int_{0}^{t} D_{j} f\left(X_{s}(\omega)\right) \frac{d}{d s} X_{s}^{j}(\omega) d s$$
where this integral is to be interpreted pathwise. Written as
$$f\left(X_{t}\right)-f\left(X_{0}\right)=\sum_{j=1}^{d} \int_{0}^{t} D_{j} f\left(X_{s}\right) d X_{s}^{j}$$
this equation remains true if $X$ is a continuous, bounded variation process. The situation becomes more complicated if the process $X$ is a continuous semimartingale and hence no longer has paths which are of bounded variation on finite intervals in general. Then a new term appears on the right hand side of (0) (Ito’s formula). We will give a very explicit derivation which shows clearly where the new term comes from.
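The origin of the new term can already be seen numerically: for a path of bounded variation the sum of squared increments over a partition vanishes as the mesh goes to zero, while for a Brownian path it converges to $t$. A minimal sketch (not from the text; the smooth path $\sin s$ and the step count are illustrative choices):

```python
import numpy as np

# Compare the sum of squared increments over a fine partition of [0, 1]
# for a C^1 (bounded-variation) path versus a Brownian path. For the
# smooth path it vanishes as the mesh shrinks; for Brownian motion it
# converges to t, which is exactly the source of Ito's extra term.
rng = np.random.default_rng(0)
t, n = 1.0, 200_000
dt = t / n
s = np.linspace(0.0, t, n + 1)

X_smooth = np.sin(s)                                           # C^1 path
W = np.concatenate([[0.0], np.cumsum(rng.normal(0.0, np.sqrt(dt), n))])

qv_smooth = np.sum(np.diff(X_smooth) ** 2)   # -> 0 as |Delta_n| -> 0
qv_bm = np.sum(np.diff(W) ** 2)              # -> t = 1 (quadratic variation)

print(qv_smooth, qv_bm)
```

The second sum stabilizes near $t$ no matter how fine the partition, so the second-order Taylor terms cannot be discarded for Brownian motion.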

(a) Assume first that $F, K$ are compact sets such that $F \subseteq K^{o} \subseteq K \subseteq G$ and the range of $X$ is contained in $F$. Fix $t \geq 0$ and let $\left(\Delta_{n}\right)$ be a sequence of partitions of the interval $[0, t]$ such that $\left|\Delta_{n}\right| \rightarrow 0$, as $n \uparrow \infty$. For $n \geq 1$ write $\Delta_{n}=\left\{0=t_{0}^{n}<t_{1}^{n}<\cdots<t_{k_{n}}^{n}=t\right\}$. Fix $\epsilon>0$ and set
$$\Omega_{m}=\left\{\omega \in \Omega \mid\left|X_{t_{k}^{n}}(\omega)-X_{t_{k-1}^{n}}(\omega)\right|<\epsilon, \ \forall n \geq m, 1 \leq k \leq k_{n}\right\} .$$
If $\omega \in \Omega$, then the path $s \in[0, t] \rightarrow X_{s}(\omega)$ is uniformly continuous and so $\omega \in \Omega_{m}$, for some $m \geq 1$. Thus $\Omega_{m} \uparrow \Omega$, as $m \uparrow \infty$. It will thus suffice to show that (1) holds $P$-a.s. on the set $\Omega_{m}$, for each $m \geq 1$.

Fix $m \geq 1$. If $\omega \in \Omega_{m}$, then $X_{t_{k}^{n}}(\omega) \in B_{\epsilon}\left(X_{t_{k-1}^{n}}(\omega)\right)$ and hence the line segment from $X_{t_{k-1}^{n}}(\omega)$ to $X_{t_{k}^{n}}(\omega)$ is contained in the ball $B_{\epsilon}\left(X_{t_{k-1}^{n}}(\omega)\right) \subseteq K$, for all $n \geq m$ and all $1 \leq k \leq k_{n}$. Let $n \geq m$ and write
$$f\left(X_{t}\right)-f\left(X_{0}\right)=\sum_{k=1}^{k_{n}}\left[f\left(X_{t_{k}^{n}}\right)-f\left(X_{t_{k-1}^{n}}\right)\right] .$$
Consider $k \in\left\{1, \ldots, k_{n}\right\}$ and $\omega \in \Omega_{m}$. A second-order Taylor expansion of $f(x)$ centered at $x=X_{t_{k-1}^{n}}(\omega)$ yields
$$\begin{aligned} f\left(X_{t_{k}^{n}}\right)-f\left(X_{t_{k-1}^{n}}\right)={} & \sum_{j=1}^{d} D_{j} f\left(X_{t_{k-1}^{n}}\right)\left(X_{t_{k}^{n}}^{j}-X_{t_{k-1}^{n}}^{j}\right) \\ &+\frac{1}{2} \sum_{i, j=1}^{d} D_{i j} f\left(\xi_{n k}\right)\left(X_{t_{k}^{n}}^{i}-X_{t_{k-1}^{n}}^{i}\right)\left(X_{t_{k}^{n}}^{j}-X_{t_{k-1}^{n}}^{j}\right) \end{aligned}$$
where the point $\xi_{n k}=\xi_{n k}(\omega)$ lies on the line segment from $X_{t_{k}^{n}}(\omega)$ to $X_{t_{k-1}^{n}}(\omega)$. Note that this line segment is contained in $K$ and that $D_{i j} f$ is uniformly continuous on $K$. Substituting the above expansion into (3) and interchanging the order of summation, we can write
$f\left(X_{t}\right)-f\left(X_{0}\right)=\sum_{j=1}^{d} A_{j}^{n}+\frac{1}{2} \sum_{i, j=1}^{d} B_{i j}^{n}$,
where $\quad A_{j}^{n}=\sum_{k=1}^{k_{n}} D_{j} f\left(X_{t_{k-1}^{n}}\right)\left(X_{t_{k}^{n}}^{j}-X_{t_{k-1}^{n}}^{j}\right)$
and $\quad B_{i j}^{n}=\sum_{k=1}^{k_{n}} D_{i j} f\left(\xi_{n k}\right)\left(X_{t_{k}^{n}}^{i}-X_{t_{k-1}^{n}}^{i}\right)\left(X_{t_{k}^{n}}^{j}-X_{t_{k-1}^{n}}^{j}\right)$,
at all points $\omega \in \Omega_{m}$. According to $2 . e .1$ we have $A_{j}^{n} \rightarrow \int_{0}^{t} D_{j} f\left(X_{s}\right) d X_{s}^{j}$ in probability, as $n \uparrow \infty$. Since limits in probability are uniquely determined $P$-a.s., it will now suffice to show that $B_{i j}^{n} \rightarrow \int_{0}^{t} D_{i j} f\left(X_{s}\right) d\left\langle X^{i}, X^{j}\right\rangle_{s}$ in probability on the set $\Omega_{m}$, as $n \uparrow \infty$. To see this we will compare $B_{i j}^{n}$ to the similar term
$$\tilde{B}_{i j}^{n}=\sum_{k=1}^{k_{n}} D_{i j} f\left(X_{t_{k-1}^{n}}\right)\left(X_{t_{k}^{n}}^{i}-X_{t_{k-1}^{n}}^{i}\right)\left(X_{t_{k}^{n}}^{j}-X_{t_{k-1}^{n}}^{j}\right),$$
which is known to converge to $\int_{0}^{t} D_{i j} f\left(X_{s}\right) d\left\langle X^{i}, X^{j}\right\rangle_{s}$ in probability (2.e.5).
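For a concrete one-dimensional check of this derivation, take $X=W$ a Brownian motion and $f(x)=x^{2}$, so that $D_{1} f(x)=2 x$ and $D_{11} f(x)=2$; the second-order Taylor expansion is then exact, so $A^{n}+\frac{1}{2} \tilde{B}^{n}$ telescopes to $f\left(W_{t}\right)-f\left(W_{0}\right)$ on every partition, while $\tilde{B}^{n} \approx 2 t$. A sketch with illustrative parameters (not from the text):

```python
import numpy as np

# One-dimensional check: X = W Brownian motion, f(x) = x^2, so
# D_1 f(x) = 2x and D_11 f(x) = 2. On a fine partition, A^n approximates
# the stochastic integral and B~^n approximates 2<W>_t = 2t. Because the
# Taylor expansion of x^2 is exact, A^n + B~^n/2 telescopes exactly to
# f(W_t) - f(W_0) on every partition.
rng = np.random.default_rng(1)
t, n = 1.0, 100_000
dt = t / n
W = np.concatenate([[0.0], np.cumsum(rng.normal(0.0, np.sqrt(dt), n))])
dW = np.diff(W)

A_n = np.sum(2.0 * W[:-1] * dW)   # A^n with D_1 f(x) = 2x
B_n = np.sum(2.0 * dW ** 2)       # B~^n with D_11 f = 2; approx 2t

lhs = W[-1] ** 2 - W[0] ** 2      # f(W_t) - f(W_0)
rhs = A_n + 0.5 * B_n             # matches lhs exactly for f(x) = x^2
print(lhs, rhs, B_n)
```

In the limit this is Ito's formula $W_{t}^{2}=2 \int_{0}^{t} W_{s}\, d W_{s}+t$.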

## Calculus Homework Help | Differential notation

3.b Differential notation. Let us introduce some purely symbolic but nonetheless useful notation. If $X \in \mathcal{S}$ we write $d Z_{t}=H_{t} d X_{t}$ or more briefly $d Z=H d X$ if and only if $H \in L(X)$ and $Z_{t}=Z_{0}+\int_{0}^{t} H_{s} d X_{s}$, for all $t \geq 0$, equivalently iff $H \in L(X)$ and $Z=Z_{0}+H \cdot X$.

The equality $d Z=0$ is to be interpreted as $d Z=0 d X$, for some $X \in \mathcal{S}$. Clearly then $d Z=0$ if and only if $Z_{t}=Z_{0}, t \geq 0$, that is, if $Z$ is a stochastic constant. By the associative law 2.d.2
$$d Z=H d X \text { and } d X=K d Y \quad \Rightarrow \quad d Z=H K d Y .$$
According to 2.d.1.(f), $H \in L(X), K \in L(Y), Z=H \bullet X$ and $W=K \bullet Y$ imply that $H K \in L_{l o c}^{1}(\langle X, Y\rangle)$ and $\langle H \bullet X, K \bullet Y\rangle_{t}=\int_{0}^{t} H_{s} K_{s} d\langle X, Y\rangle_{s}, t \geq 0$. In differential notation this can be written as
$$d Z=H d X, \text { and } d W=K d Y \quad \Rightarrow \quad d\langle Z, W\rangle=H K d\langle X, Y\rangle .$$
If we define the product $d Z d W$ of the stochastic differentials $d Z$ and $d W$ as
$$d Z d W=d\langle Z, W\rangle$$
then (1) assumes the form $d Z=H d X, d W=K d Y \Rightarrow d Z d W=H K d X d Y$. In particular $d Z=H d X \Rightarrow d\langle Z\rangle=(d Z)^{2}=H^{2}(d X)^{2}=H^{2} d\langle X\rangle$. There is no analogue for the differential products $d X d Y$ in classical integration theory on the line: If $X$ and $Y$ are locally of bounded variation then $\langle X, Y\rangle=0$.
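This "multiplication table" for differentials can be illustrated numerically: with two independent Brownian motions, $(d W)^{2}$ accumulates to $t$, the cross term $d W\, d W^{\prime}$ accumulates to $\left\langle W, W^{\prime}\right\rangle=0$, and a product with a bounded-variation differential vanishes. A sketch under these assumptions (independent Brownian motions and $A_{t}=t$ are illustrative choices, not from the text):

```python
import numpy as np

# Multiplication table for stochastic differentials, approximated on a
# fine partition: sum dW*dW ~ <W>_t = t, sum dW*dW' ~ <W,W'>_t = 0 for
# independent Brownian motions, and sum dW*dA ~ 0 when A_t = t has
# bounded variation.
rng = np.random.default_rng(2)
t, n = 1.0, 200_000
dt = t / n
dW1 = rng.normal(0.0, np.sqrt(dt), n)
dW2 = rng.normal(0.0, np.sqrt(dt), n)
dA = np.full(n, dt)                # increments of A_t = t

s11 = np.sum(dW1 * dW1)            # ~ <W1>_t = t = 1
s12 = np.sum(dW1 * dW2)            # ~ <W1, W2>_t = 0
s1A = np.sum(dW1 * dA)             # ~ 0 (bounded-variation factor)
print(s11, s12, s1A)
```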

The above can be generalized to vector-valued integrators $X$. If $X \in \mathcal{S}^{d}$, then we write $d Z=H \cdot d X$, iff $H \in L(X)$ and $Z=Z_{0}+H \cdot X$, that is, $Z_{t}=Z_{0}+\sum_{j=1}^{d} \int_{0}^{t} H_{s}^{j} d X_{s}^{j}$, for all $t \geq 0$. Note that then $Z$ is a scalar semimartingale. The associative law (0) now assumes the form
$$d Y=K d Z \text { and } d Z=H \cdot d X \Rightarrow d Y=(K H) \cdot d X,$$
whenever $X \in \mathcal{S}^{d}, H \in L(X), K \in L(Z)=L(H \bullet X)$ (2.d.2). Here $X$ and $H$ are $R^{d}$-valued processes while $Z$ and $K$ are scalar processes. Thus $K H$ is an $R^{d}$-valued process also. Likewise 2.d.1 in differential notation yields:
