                                      STOCHASTIC CALCULUS ON BROWNIAN MOTION AND
                                                          STOCHASTIC INTEGRATION
                                                                          LINGYUE YU
Abstract. In this paper, I will first introduce the basics of measure-theoretic probability and give a proof of the Central Limit Theorem using moment generating functions. This section will allow us to explore stochastic processes and Brownian motion in a more rigorous way. Finally, building upon Brownian motion, we can formally explain Itô's Integral and Itô's Formula in stochastic calculus.
                                                                          Contents
1.  Introduction
2.  Probability Measure, Random Variable, and Expectation
3.  Stochastic Processes
4.  Brownian Motion
5.  Itô's Formula
Acknowledgments
References
                                                                      1. Introduction

Calculus is the study of continuous change and is typically characterized by differentiable functions. However, a stochastic process is a collection of random variables, and its derivative is not easily defined, which renders ordinary calculus ineffective. To define a stochastic integral, we need to deal with the randomness in stochastic processes. Essential definitions and theorems in probability, introduced in the first section, allow our further discussions. In the remaining sections, we will delve into stochastic integration on Brownian motion, a stochastic process modeling continuous random motion, and examine the construction of stochastic integrals under the framework of Itô's Integral and Itô's Formula.
                                        2. Probability Measure, Random Variable, and Expectation
                                 Definition 2.1 (Sample Space). A sample space Ω is a non-empty set of outcomes.
Definition 2.2 (Algebra of Sets). Let X be a set (in probability we usually work within the context of a sample space Ω). An algebra is a collection A of subsets of X such that
     (1) ∅ ∈ A and X ∈ A.
     (2) If A ∈ A, then A^c := X \ A ∈ A.
     (3) If A_1, ..., A_n ∈ A, then ⋃_{k=1}^n A_k ∈ A and ⋂_{k=1}^n A_k ∈ A.

   Date: DEADLINES: Draft AUGUST 15 and Final version AUGUST 29, 2020.
Definition 2.3 (σ-Algebra of Sets). If, in addition to the conditions for an algebra of sets, we have
     (4) If A_1, A_2, ... ∈ A, then ⋃_{k=1}^∞ A_k ∈ A and ⋂_{k=1}^∞ A_k ∈ A,
then we call A a σ-algebra of sets.
   In (4), we only allow countable unions and intersections. Since ⋂_{i=1}^∞ A_i = (⋃_{i=1}^∞ A_i^c)^c, the requirement that ⋂_{k=1}^∞ A_k ∈ A is in fact redundant. The pair (X, A) is called a measurable space. A set A is measurable, or A-measurable, if A ∈ A.
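For a finite set X, the closure axioms above can be checked mechanically. The following sketch (an illustrative example not taken from the paper; the helper names are my own) verifies that the power set of a small X is an algebra, while a collection missing a complement is not:

```python
from itertools import chain, combinations

def power_set(xs):
    """All subsets of xs, as frozensets."""
    return {frozenset(c) for c in chain.from_iterable(
        combinations(xs, r) for r in range(len(xs) + 1))}

def is_algebra(X, A):
    """Check axioms (1)-(3) for a finite collection A of subsets of X."""
    X = frozenset(X)
    if frozenset() not in A or X not in A:        # (1): ∅ and X belong to A
        return False
    if any(X - S not in A for S in A):            # (2): closed under complement
        return False
    for S in A:
        for T in A:
            if S | T not in A or S & T not in A:  # (3): closed under union/intersection
                return False
    return True

X = {1, 2, 3}
assert is_algebra(X, power_set(X))                          # the largest algebra on X
assert is_algebra(X, {frozenset(), frozenset(X)})           # the trivial algebra {∅, X}
assert not is_algebra(X, {frozenset(), frozenset({1}), frozenset(X)})  # {2,3} missing
```

On a finite X every algebra is automatically a σ-algebra, since condition (4) adds nothing when only finitely many distinct sets exist.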
Definition 2.4 (Probability Measure). Let Ω be a sample space and let F be a σ-Algebra on Ω. A probability measure is a function P : F → [0,1] such that
     (1) P(∅) = 0 and P(Ω) = 1.
     (2) If events E_1, E_2, ... are pairwise disjoint, then
            ∑_{i=1}^∞ P(E_i) = P(⋃_{i=1}^∞ E_i).

Definition 2.5 (Probability Space). Given a set Ω and a σ-Algebra F on Ω, a probability space is the triple (Ω, F, P), where P is a probability measure on (Ω, F).
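A concrete finite example may help fix ideas. The sketch below (my own illustration, not from the paper) models a fair six-sided die as a probability space with F the power set of Ω, and checks the two axioms on a pair of disjoint events:

```python
from fractions import Fraction

# A fair die: Ω = {1, ..., 6}, F = power set, each outcome has mass 1/6.
omega = frozenset(range(1, 7))

def P(event):
    """Probability measure on the power set of omega."""
    assert event <= omega, "event must be a subset of the sample space"
    return Fraction(len(event), len(omega))

evens, odds = frozenset({2, 4, 6}), frozenset({1, 3, 5})
assert P(frozenset()) == 0 and P(omega) == 1      # axiom (1)
assert P(evens | odds) == P(evens) + P(odds)      # additivity for disjoint events
```

Exact rational arithmetic via `Fraction` avoids any floating-point slack in the additivity check.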
Definition 2.6 (Borel σ-Algebra). If we have an arbitrary collection C of subsets of X, define
            σ(C) := ⋂ { A : A is a σ-algebra and C ⊂ A }.
We call σ(C) the σ-algebra generated by C. If G is the collection of all open sets on X, then we define B = σ(G) to be the Borel σ-algebra on X.

Definition 2.7 (Borel σ-Algebra on R). The Borel σ-algebra B(R) on R is the σ-algebra generated by the open subsets of R.
Proposition 2.8 (Lebesgue Measure). Let X = [0,1] and let B be the Borel σ-algebra on X. There is a unique measure λ on (X, B) such that for any interval J ⊂ X, we have λ(J) = length(J). We call this measure the Lebesgue measure.

Proof. The construction of the Lebesgue measure uses the Carathéodory Extension Theorem applied to the algebra of finite unions and intersections of intervals. The proposition also holds with X = R because the space is σ-finite. For the full construction of the Lebesgue measure, see Chapter 4 of [7], pages 24-41. □
                             Definition 2.9 (Random Variable). Let (Ω,F,P) be a probability space and let
                             (R,B,λ) be the real line endowed with the Borel σ-Algebra and the Lebesgue mea-
                             sure. A real random variable is a measurable function X : Ω → R.
Definition 2.10 (Expectation). The expectation of a discrete random variable X is
            E[X] = ∑_i x_i P(X = x_i).
For a continuous random variable, the expectation is defined as the integral
            E[X] = ∫_Ω X dP.
Definition 2.11 (Variance). The variance of a random variable X is defined as
            Var[X] := E[(X − E[X])^2] = E[X^2] − E[X]^2.
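As a small worked check (an assumed example, not from the paper), both formulas for the variance agree exactly on a fair die, where E[X] = 7/2 and Var[X] = 35/12:

```python
from fractions import Fraction

# Fair die: outcomes 1..6, each with probability 1/6.
outcomes = range(1, 7)
p = Fraction(1, 6)

E  = sum(x * p for x in outcomes)                     # E[X] = ∑ x_i P(X = x_i)
E2 = sum(x * x * p for x in outcomes)                 # E[X^2]
var_direct = sum((x - E) ** 2 * p for x in outcomes)  # E[(X − E[X])^2]

assert E == Fraction(7, 2)
assert var_direct == E2 - E ** 2 == Fraction(35, 12)  # the two variance formulas agree
```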
Definition 2.12 (Conditional Expectation). Let X be a random variable on (Ω, F, P) with E[|X|] < ∞, and let G ⊂ F be a sub-σ-algebra. Then there exists a unique G-measurable random variable E(X|G) such that, for every bounded G-measurable random variable Y, we have
            E(XY) = E(E(X|G) Y).
The unique random variable E(X|G) is called the conditional expectation of X given G.
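When G is generated by a finite partition, E(X|G) is simply the average of X over each cell. The sketch below (my own illustration; the helper names are assumptions) builds this for the die with the partition {odds, evens} and verifies the defining identity E(XY) = E(E(X|G)Y) for one bounded G-measurable Y:

```python
from fractions import Fraction

# Fair die; G is generated by the partition {odds, evens}.
omega = list(range(1, 7))
p = Fraction(1, 6)
evens = {2, 4, 6}

def cond_exp(w):
    """E(X|G): constant on each partition cell, equal to the cell's average."""
    cell = evens if w in evens else set(omega) - evens
    return Fraction(sum(cell), len(cell))

def E(f):
    """Expectation of a function of the outcome."""
    return sum(f(w) * p for w in omega)

# Y = indicator of the evens: bounded and G-measurable.
Y = lambda w: 1 if w in evens else 0
assert E(lambda w: w * Y(w)) == E(lambda w: cond_exp(w) * Y(w))  # E(XY) = E(E(X|G)Y)
assert cond_exp(2) == 4 and cond_exp(1) == 3  # averages of {2,4,6} and {1,3,5}
```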
Definition 2.13 (Distribution Function). The distribution function F_X : R → [0,1] of a random variable X on Ω is defined by
            F_X(x) := P_X((−∞, x]) = P(X ≤ x).
F_X is an increasing function whose corresponding Lebesgue-Stieltjes measure is P_X.
Definition 2.14 (Density and Distribution). Let X be a continuous random variable. A function F is the distribution function of X if
            F(x) = ∫_{−∞}^x f(y) dy.
The function f is the density of X if
            P_X(A) = ∫_A f(x) dx.
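The relation F(x) = ∫_{−∞}^x f(y) dy can be checked numerically. A sketch (assumed example, not from the paper): for the exponential density f(x) = e^{−x} on [0, ∞), the distribution function is F(x) = 1 − e^{−x}, and a midpoint Riemann sum recovers it:

```python
import math

f = lambda x: math.exp(-x)   # exponential density, supported on [0, ∞)

def F_approx(x, steps=100_000):
    """Approximate F(x) = ∫_0^x f(y) dy by the midpoint rule."""
    dy = x / steps
    return sum(f((k + 0.5) * dy) * dy for k in range(steps))

for x in (0.5, 1.0, 3.0):
    assert abs(F_approx(x) - (1 - math.exp(-x))) < 1e-6  # matches F(x) = 1 − e^{−x}
```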
Definition 2.15 (Standard Normal Distribution). The standard normal distribution function is defined to be
            Φ(b) = ∫_{−∞}^b (1/√(2π)) e^{−x²/2} dx.
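For numerical work, Φ has no elementary closed form, but it can be expressed through the error function as Φ(b) = (1 + erf(b/√2))/2. A minimal sketch (the helper name `Phi` is my own):

```python
import math

def Phi(b):
    """Standard normal distribution function via the error function."""
    return 0.5 * (1.0 + math.erf(b / math.sqrt(2.0)))

assert abs(Phi(0.0) - 0.5) < 1e-12               # by symmetry, half the mass lies below 0
assert abs(Phi(1.96) - 0.975) < 1e-3             # the familiar 95% two-sided quantile
assert abs(Phi(-1.0) + Phi(1.0) - 1.0) < 1e-12   # Φ(−b) = 1 − Φ(b)
```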
Definition 2.16 (Independence). Two events E_1, E_2 are independent if
            P(E_1 ∩ E_2) = P(E_1) P(E_2).
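A quick check on the die (an assumed example): the events "even" and "at most 2" are independent, while "even" and "at most 3" are not.

```python
from fractions import Fraction

omega = set(range(1, 7))
P = lambda E: Fraction(len(E & omega), 6)   # fair-die probability of an event

even, low2, low3 = {2, 4, 6}, {1, 2}, {1, 2, 3}
assert P(even & low2) == P(even) * P(low2)  # 1/6 == (1/2)(1/3): independent
assert P(even & low3) != P(even) * P(low3)  # 1/6 != (1/2)(1/2): not independent
```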
Lemma 2.17 (Borel-Cantelli). Let {E_n} be a sequence of events in the probability space (Ω, F, P). If the sum of the probabilities of the E_n is finite,
            ∑_{n=1}^∞ P(E_n) < ∞,
then the probability that infinitely many of them occur is 0, that is,
            P(⋂_{n=1}^∞ ⋃_{m=n}^∞ E_m) = 0.
Proof. By definition, for each n,
            P(⋂_{n=1}^∞ ⋃_{m=n}^∞ E_m) ≤ P(⋃_{m=n}^∞ E_m) ≤ ∑_{m=n}^∞ P(E_m).
Since ∑_{n=1}^∞ P(E_n) < ∞, the tail sums satisfy ∑_{m=n}^∞ P(E_m) → 0 as n → ∞. Therefore, we have proved the claim. □
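The tail-sum argument in the proof can be seen numerically (an assumed illustration): taking P(E_n) = 1/n², the series converges, and the tails ∑_{m≥n} P(E_m) that bound P(⋃_{m≥n} E_m) shrink toward 0.

```python
# P(E_n) = 1/n^2: a convergent series, truncated at N terms for the illustration.
N = 200_000
probs = [1.0 / n**2 for n in range(1, N + 1)]

def tail(n):
    """The tail sum ∑_{m=n}^{N} P(E_m) bounding P(⋃_{m≥n} E_m)."""
    return sum(probs[n - 1:])

tails = [tail(n) for n in (1, 10, 100, 1000)]
assert all(a > b for a, b in zip(tails, tails[1:]))  # tails strictly decrease
assert tails[-1] < 1.1e-3                            # tail from n=1000 ≈ 1/999
```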
Definition 2.18 (Moment Generating Function). The moment-generating function of a random variable X is given by
            M(t) := E[e^{tX}], t ∈ R.
If two random variables have the same moment-generating function, they are said to be identically distributed.
Lemma 2.19. Let Z_1, Z_2, ... be a sequence of random variables having distribution functions F_{Z_n} and moment generating functions M_{Z_n}, n ≥ 1. Furthermore, let Z be a random variable having distribution function F_Z and moment generating function M_Z. If M_{Z_n}(t) → M_Z(t) for all t, then we have F_{Z_n}(t) → F_Z(t) for all t at which F_Z is continuous.
                                   This lemma is integral to the proof of the central limit theorem. As it is an
                                advanced and technical proof, we will not prove this lemma in the paper. However,
                                the whole proof can be seen in Probability and Random Processes [5].
Theorem 2.20 (Central Limit Theorem). Let X_1, X_2, ..., X_n be independent, identically distributed random variables with E[X_i] = µ and Var[X_i] = σ² < ∞. Let
            Z_n = ((X_1 + X_2 + ··· + X_n) − nµ) / (σ√n).
Then as n → ∞, the distribution of Z_n approaches the standard normal distribution. Precisely, this means that if a < b,
            lim_{n→∞} P(a ≤ Z_n ≤ b) = Φ(b) − Φ(a).
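Before the proof, a Monte Carlo sketch (an assumed illustration, not part of the paper): standardized sums of uniform [0,1] variables (with µ = 1/2, σ² = 1/12) should fall in [−1, 1] with probability close to Φ(1) − Φ(−1) ≈ 0.6827.

```python
import math
import random

random.seed(0)                         # fixed seed for reproducibility
n, trials = 100, 10_000
mu, sigma = 0.5, math.sqrt(1.0 / 12.0)  # mean and std. dev. of Uniform[0,1]

def z_n():
    """One sample of Z_n = (S_n - n*mu) / (sigma * sqrt(n))."""
    s = sum(random.random() for _ in range(n))
    return (s - n * mu) / (sigma * math.sqrt(n))

hits = sum(1 for _ in range(trials) if -1.0 <= z_n() <= 1.0)
target = math.erf(1.0 / math.sqrt(2.0))   # Φ(1) − Φ(−1), via the error function
assert abs(hits / trials - target) < 0.02  # within Monte Carlo error of ≈ 0.68
```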
Proof. We begin the proof with the assumption that µ = 0, σ² = 1, and that the moment generating function M(t) of X_i exists and is finite. If not, we consider the standardized random variable X_i* = (X_i − µ)/σ instead. By definition, the moment generating function of X_i/√n is given by
            M(t/√n) = E[e^{tX_i/√n}].
Thus, the moment generating function of ∑_{i=1}^n X_i/√n is [M(t/√n)]^n, since
            E[exp(t ∑_{i=1}^n X_i/√n)] = E[exp(∑_{i=1}^n tX_i/√n)]
                                       = ∏_{i=1}^n E[exp(tX_i/√n)]   (by independence)
                                       = [M(t/√n)]^n.
                                Then, we define
                                                                     L(t) = logM(t),
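The convergence that the proof is driving toward can be checked numerically in a special case (an assumed illustration): for the standardized coin flip X ∈ {−1, +1}, the moment generating function is M(t) = cosh(t), and the product [M(t/√n)]^n from the proof converges to e^{t²/2}, the moment generating function of the standard normal.

```python
import math

def mgf_sum(t, n):
    """[M(t/√n)]^n for the coin flip X ∈ {−1, +1}, where M(t) = cosh(t)."""
    return math.cosh(t / math.sqrt(n)) ** n

t = 1.3
limit = math.exp(t * t / 2.0)                          # MGF of N(0, 1) at t
errs = [abs(mgf_sum(t, n) - limit) for n in (10, 100, 10_000)]
assert errs[0] > errs[1] > errs[2]                     # error shrinks as n grows
assert errs[-1] < 1e-3 * limit                         # already very close at n = 10^4
```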