Assume an experiment with a set of possible outcomes \(w_1, w_2, \dots\). A sample space, \(\Omega\), is defined as the set of all possible outcomes, \(\Omega=\{w_1, w_2, \dots\}\).
Measurable space: A σ-algebra \( \mathcal{F} \), on a sample space \( \Omega \), is a collection of subsets of \( \Omega \) called events such that it satisfies:
- \(\mathcal{F}\) contains the trivial sets \( \emptyset \) and \( \Omega \).
- If event \(A \in \mathcal{F} \), then the complement of \(A, A^c \in \mathcal{F} \).
- If events \(A_1, A_2, \dots \in \mathcal{F} \), then \( \bigcup_{i=1}^{\infty} A_i \in \mathcal{F} \).
The pair \((\Omega, \mathcal{F}) \) is called a measurable space. The smallest σ-algebra that contains all open sets of a topological space is called the Borel σ-algebra and is denoted by \( \mathcal{B} \).
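On a finite sample space the three axioms can be checked by brute force, since countable unions reduce to finite ones. A minimal sketch (the helper `is_sigma_algebra` and the toy collections below are illustrative, not from the notes):

```python
def is_sigma_algebra(omega, F):
    """Check the sigma-algebra axioms for a collection F of events
    (frozensets) on a finite sample space omega (a frozenset)."""
    if frozenset() not in F or omega not in F:
        return False                  # must contain the empty set and Omega
    for A in F:
        if omega - A not in F:        # closed under complement
            return False
    for A in F:
        for B in F:
            if A | B not in F:        # closed under (finite) union
                return False
    return True

omega = frozenset({1, 2, 3, 4})
F_good = {frozenset(), frozenset({1, 2}), frozenset({3, 4}), omega}
F_bad = {frozenset(), frozenset({1}), omega}   # missing the complement {2, 3, 4}

print(is_sigma_algebra(omega, F_good))  # True
print(is_sigma_algebra(omega, F_bad))   # False
```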
Partition: A set of events \( \{H_i\} \) forms a partition of the sample space \( \Omega \) if the following conditions hold
- \( \bigcup_i H_i = \Omega \), and
- \( H_i \cap H_j = \emptyset, \; \forall i \neq j \) (mutually exclusive events).
Independence: Two events \( A,B \in \mathcal{F} \) are called independent if and only if
$$ \mathrm{P}(A\cap B) = \mathrm{P}(A) \times \mathrm{P}(B). $$
Mutually Exclusive: If events A and B are mutually exclusive, then
$$ \mathrm{P}(A \cup B)=\mathrm{P}(A) + \mathrm{P}(B).$$
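Both identities can be verified exactly on a fair six-sided die; the particular events below are illustrative choices:

```python
from fractions import Fraction

# Fair six-sided die: each outcome has probability 1/6.
omega = {1, 2, 3, 4, 5, 6}

def P(event):
    return Fraction(len(event), len(omega))

A, B = {2, 4, 6}, {1, 2}          # "even" and "at most 2"
assert P(A & B) == P(A) * P(B)    # independent: 1/6 == 1/2 * 1/3

C, D = {1, 2}, {5, 6}             # disjoint events
assert P(C | D) == P(C) + P(D)    # mutually exclusive: 2/3 == 1/3 + 1/3
```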
Example:
The following \( H_1 \) and \( H_2 \) form a partition of the sample space of all integers, \( \mathbb{Z} \), with
- \( H_1 \): the set of all even integers.
- \( H_2 \): the set of all odd integers.
$$ 1=\mathrm{P}(n \in \mathbb{Z})= \mathrm{P}(n \in H_1) + \mathrm{P}(n \in H_2)= 0.5+0.5 .$$
Probability space:
A probability, \(\mathrm{P}\), on a measurable space \( (\Omega,\mathcal{F}) \) is a function, \(\mathrm{P}: \mathcal{F} \longrightarrow [0,1] \), such that:
- \( \mathrm{P}(\emptyset)=0 \) and \( \mathrm{P}(\Omega)=1\).
- \( 0 \leq \mathrm{P}(A) \leq 1\) for any event \(A \in \mathcal{F}\).
- If \(A_1, A_2, \dots \) are mutually exclusive events, then \(\displaystyle{\mathrm{P}(\bigcup_{i=1}^{\infty} A_i)}=\displaystyle{\sum_{i=1}^{\infty} \mathrm{P}(A_i)}\).
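A sketch of such a measure on a three-point space, assuming (for illustration) outcome weights chosen to sum to 1; on a finite space, countable additivity reduces to finite additivity:

```python
from fractions import Fraction

# A (possibly non-uniform) probability on a finite measurable space:
# assign a weight to every outcome, and sum weights over an event.
weights = {"a": Fraction(1, 2), "b": Fraction(1, 3), "c": Fraction(1, 6)}

def P(event):
    return sum((weights[w] for w in event), Fraction(0))

omega = set(weights)
assert P(set()) == 0 and P(omega) == 1
# Additivity over the mutually exclusive events {"a"} and {"b", "c"}:
assert P({"a"} | {"b", "c"}) == P({"a"}) + P({"b", "c"})
```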
Random variable: A function, \( \mathbf{X} : \Omega \longrightarrow \mathfrak{R} \), is called a random variable (r.v.), with respect to the probability space \( (\Omega,\mathcal{F},\mathrm{P}) \), if \( \forall \text{ open set } u \subseteq \mathfrak{R} \):
$$\mathbf{X}^{-1}(u)=\{w \in \Omega; \mathbf{X}(w) \in u\} \in \mathcal{F}. $$
We can then write \(\forall x \in \mathfrak{R}\):
$$
\mathrm{P}(\mathbf{X}\leq x)=\mathrm{P}(w \in \Omega; \mathbf{X}(w)\leq x).
$$
Every r.v. \( \mathbf{X} \) has a probability measure \( \mathrm{P}_{\mathbf{X}} \) on \( \mathfrak{R} \), defined by
$$
\mathrm{P}_{\mathbf{X}}(\mathcal{B}) =\mathrm{P}(\mathbf{X}^{-1}(\mathcal{B})),
$$
where \( \mathcal{B} \) is any Borel set (an element of the Borel σ-algebra). \( \mathrm{P}_{\mathbf{X}}\) is called the distribution (or law) of \( \mathbf{X} \).
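As a concrete (illustrative) case, take \( \Omega \) to be two fair coin tosses and \( \mathbf{X} \) the number of heads; the pushforward \( \mathrm{P}_{\mathbf{X}} \) can then be computed directly from preimages:

```python
from fractions import Fraction
from itertools import product

# Omega: outcomes of two fair coin tosses, each with probability 1/4.
omega = list(product("HT", repeat=2))

def P(event):
    return Fraction(len(event), len(omega))

def X(w):
    return w.count("H")               # random variable: number of heads

def P_X(B):
    """Pushforward measure: P_X(B) = P(X^{-1}(B))."""
    preimage = {w for w in omega if X(w) in B}
    return P(preimage)

assert P_X({0}) == Fraction(1, 4)     # only TT maps to 0
assert P_X({1}) == Fraction(1, 2)     # HT and TH map to 1
assert P_X({0, 1, 2}) == 1            # the whole range has probability 1
```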
Lemma: If two random variables \( \mathbf{X}, \mathbf{Y}: \Omega \longrightarrow \mathfrak{R} \) are independent, with \( \mathrm{E}[|\mathbf{X}|], \mathrm{E} [|\mathbf{Y}|] < \infty \), then $$ \mathrm{E} [\mathbf{X} \mathbf{Y}]= \mathrm{E}[\mathbf{X}] \mathrm{E}[\mathbf{Y}]. $$
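The lemma can be checked exactly on a finite product space, where independence holds by construction (two fair dice, an assumed example):

```python
from fractions import Fraction
from itertools import product

# Product sample space: X reads the first die, Y the second, so X and Y
# are independent by construction.
omega = list(product(range(1, 7), repeat=2))
p = Fraction(1, len(omega))                       # uniform probability 1/36

def E(f):
    """Expectation of f with respect to the uniform measure on omega."""
    return sum(p * f(w) for w in omega)

X = lambda w: w[0]
Y = lambda w: w[1]

assert E(lambda w: X(w) * Y(w)) == E(X) * E(Y)    # (7/2)**2 = 49/4
```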
A stochastic process is a collection of random variables, \( \{\mathbf{X}_t\}_{t \in T} \), defined on a probability space \( (\Omega,\mathcal{F},\mathrm{P}) \) and taking values in \( \mathfrak{R} \). For every r.v. \( \mathbf{X} \), its distribution function is defined as
$$ P_{\mathbf{X}}(x):=\mathrm{P}(\mathbf{X} \leq x). $$
If there is a function \(f\) such that
$$ P_{\mathbf{X}}(x)=\int_{-\infty}^{x}f(u)\, \mathrm{d} u, \quad \forall x \in \mathfrak{R}, $$
then \( \mathbf{X}\) is said to be a continuous random variable with density function \(f\).
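As an illustrative check, take \( f \) to be the Exponential(2) density (an assumed example); a midpoint-rule Riemann sum for the integral recovers the known closed-form CDF \( 1-e^{-\lambda x} \):

```python
import math

# Density of an Exponential(lam) random variable; it vanishes for u < 0.
lam = 2.0
def f(u):
    return lam * math.exp(-lam * u) if u >= 0 else 0.0

def cdf(x, n=100_000):
    """P_X(x) = integral of f from -inf to x, via a midpoint Riemann sum
    (the density is zero below 0, so we integrate over [0, x])."""
    if x <= 0:
        return 0.0
    h = x / n
    return sum(f((k + 0.5) * h) for k in range(n)) * h

x = 1.0
assert abs(cdf(x) - (1 - math.exp(-lam * x))) < 1e-6
```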
Measurable function: Given two measurable spaces \((\Omega_1, \mathcal{F}_1)\) and \((\Omega_2, \mathcal{F}_2)\), \(g:\Omega_1 \longrightarrow \Omega_2\) is called a measurable function if
$$ \forall A: A \in \mathcal{F}_2 \Longrightarrow g^{-1}(A) \in \mathcal{F}_1. $$
Expectation of measurable functions: Let \(\mathbf{X}\) be a continuous r.v. with density function \(f\), with respect to the probability space \((\Omega,\mathcal{F},\mathrm{P})\). If \(g\) is a measurable function with \(\mathrm{E} \left[|g(\mathbf{X})| \right]<\infty\), then
$$ \mathrm{E} \left[g(\mathbf{X}) \right]= \int_{-\infty}^{\infty}g(x)f(x) \mathrm{d} x. $$
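A numerical sketch of this formula for the (assumed) example \( \mathbf{X} \sim \mathrm{Uniform}(0,1) \) and \( g(x)=x^2 \), where the exact value is \( 1/3 \):

```python
def expectation(g, f, a, b, n=100_000):
    """E[g(X)] = integral of g(x) f(x) dx over [a, b], approximated with
    a midpoint-rule quadrature (assumes f vanishes outside [a, b])."""
    h = (b - a) / n
    return sum(g(a + (k + 0.5) * h) * f(a + (k + 0.5) * h)
               for k in range(n)) * h

# X ~ Uniform(0, 1): density f(x) = 1 on [0, 1]; g(x) = x**2.
E = expectation(lambda x: x**2, lambda x: 1.0, 0.0, 1.0)
assert abs(E - 1/3) < 1e-9
```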
A stochastic process is called a martingale if the following conditions hold
- \( \mathbf{X}_t \) is \( \mathcal{F}_t \)-measurable (the process is adapted to the filtration \( \{\mathcal{F}_t\} \)), \( \forall t \).
- \( \mathrm{E} \left[|\mathbf{X}_t|\right]< \infty, \forall t \).
- \( \mathrm{E} \left[\mathbf{X}_t \mid \mathcal{F}_s\right]=\mathbf{X}_s, \forall t \geq s \).
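The simple symmetric random walk is the standard example; the martingale property can be verified exactly by enumerating all step sequences (the horizon and conditioning time below are illustrative choices):

```python
from fractions import Fraction
from itertools import product

# Simple symmetric random walk X_t = sum of t fair +/-1 steps, a classic
# discrete-time martingale. All 2**T step sequences are equally likely.
T = 4
paths = list(product([-1, 1], repeat=T))

def X(path, t):
    return sum(path[:t])

t = 2
for prefix in product([-1, 1], repeat=t):
    # Conditioning on F_t = the first t steps: keep matching continuations.
    cont = [p for p in paths if p[:t] == prefix]
    cond_exp = Fraction(sum(X(p, t + 1) for p in cont), len(cont))
    assert cond_exp == X(prefix, t)          # E[X_{t+1} | F_t] = X_t
```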
Example: A card is dealt face down at random from a complete deck of 52 playing cards. Tom has a portfolio of £3; he bets £1 on the card being a ``Queen'' and the remaining £2 on the card being a red card, with pay-off odds of 12:1 and 1:1 respectively.
- What is the sample space \(\Omega\)?
- Is the following \(\mathcal{F}\) a σ-algebra? and why?
- \( \mathcal{F}\) ={ \(\emptyset \), The drawn card is a ``Queen'', is not a ``Queen'', is red, is black, is a ``Queen'' and red, is a ``Queen'' and black, is a ``Queen'' or red, is a ``Queen'' or black, is red or is not a ``Queen'', is black or is not a ``Queen'', is not a ``Queen'' but black, is not a ``Queen'' but red, is either black or red }.
- Find the probability measure \(\mathrm{P}:\mathcal{F} \rightarrow [0,1]\) on the above measure space \( (\Omega,\mathcal{F}) \).
- Model the portfolio value as a random variable \(\mathbf{X}\) with \(\mathbf{X}:\Omega \rightarrow \mathfrak{R}\).
- Is the stochastic process \(\mathbf{X_t}\) a martingale?
- Let event \(A=\) {The drawn card is a ``Queen''} and event \(B=\) {The drawn card is red}. Are events \(A\) and \(B\) independent?
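A sketch of the last two computations (independence, and the fairness of the bets), under the usual convention that a winning bet at odds \( k{:}1 \) returns the stake plus \( k \) times the stake:

```python
from fractions import Fraction
from itertools import product

# Enumerate the deck; a card is (rank, suit). Hearts and diamonds are red.
ranks = ["A", "2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K"]
suits = ["hearts", "diamonds", "clubs", "spades"]
deck = list(product(ranks, suits))

def P(event):
    return Fraction(len(event), len(deck))

queen = {c for c in deck if c[0] == "Q"}
red = {c for c in deck if c[1] in ("hearts", "diamonds")}

# Independence: P(A n B) = P(A) P(B), i.e. 2/52 = (4/52)(26/52) = 1/26.
assert P(queen & red) == P(queen) * P(red)

# Portfolio value after the draw: a winning 1-pound bet at 12:1 returns
# 13 pounds (stake plus winnings); a winning 2-pound bet at 1:1 returns 4.
def X(card):
    return 13 * (card in queen) + 4 * (card in red)

E_X = sum(Fraction(1, len(deck)) * X(c) for c in deck)
assert E_X == 3   # both bets are fair: expected value equals the initial 3 pounds
```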
References
- Goldie, C.M., Probability and Statistics, Lecture notes, University of Sussex, 2011-2012.
- John E. Freund's Mathematical Statistics with Applications, 7th ed., 2004.
- Oksendal, B., Stochastic Differential Equations: An Introduction with Applications, 5th ed., Springer, 2000.