A random variable is a quantity whose outcome is uncertain.
Mutually exclusive events mean one and only one event can occur at any time.
Exhaustive events mean one of the events must occur, i.e. the listed events cover all possible outcomes.
b) The two defining properties of probability;
i. The probability of any event E is a number between 0 and 1: 0 ≤ P(E) ≤ 1.
ii. Sum of the probabilities of any list of mutually exclusive and exhaustive events equals 1.
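A quick sketch of both properties with made-up scenario probabilities (the event names and values below are assumptions for illustration):

```python
# Hypothetical mutually exclusive and exhaustive events and their probabilities.
probs = {"up": 0.45, "flat": 0.25, "down": 0.30}

# Property i: each probability lies between 0 and 1.
assert all(0 <= p <= 1 for p in probs.values())

# Property ii: mutually exclusive and exhaustive probabilities sum to 1.
assert abs(sum(probs.values()) - 1.0) < 1e-12
```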
c) Empirical, a priori, and subjective probabilities;
Empirical probability is when the probability of an event occurring is estimated from data, usually in the form of a relative frequency.
A priori probability is when probability of an event is deduced by reasoning about the structure of the problem itself.
These first two approaches to probability are sometimes referred to as objective probabilities because they should not vary from person to person.
Subjective probability is when the probability of an event is based on a personal assessment without reference to any particular data.
Inconsistent probabilities create profit opportunities because investors can buy and sell assets at the resulting inconsistent prices in ways that allow them to earn profits on average. These buying and selling decisions should eliminate the inconsistent prices, and hence the inconsistent probabilities, in the market.
e) Unconditional and conditional probabilities;
Unconditional or marginal probability, P(A), is the probability of event A occurring without reference to any other event.
Conditional probability, P(A|B), is the probability of event A occurring given that event B is known to have occurred. P(A|B) = P(AB)/P(B) if P(B) ≠ 0. Conditional probabilities are important in tests of market efficiency, where event B is some piece of public or private information that becomes available to the market at some point in time.
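A quick numerical sketch of the definition, using made-up probabilities (both inputs below are assumptions for illustration):

```python
# Hypothetical inputs: P(B) = 0.40, joint probability P(AB) = 0.10.
p_b = 0.40
p_ab = 0.10

# Conditional probability: P(A|B) = P(AB) / P(B), valid since P(B) != 0.
p_a_given_b = p_ab / p_b  # = 0.25
```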
f) Joint probability;
Joint probability, P(AB), is the probability of both event A and event B occurring together.
g) Multiplication rule and the joint probability of two events;
Multiplication Rule for probabilities - Joint probability, P(AB), is
P(AB) = P(A|B) P(B) = P(B|A) P(A)
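The multiplication rule just reverses the conditional-probability definition. A sketch with made-up inputs (the values are assumptions for illustration):

```python
# Hypothetical inputs: P(A|B) = 0.80, P(B) = 0.30.
p_a_given_b = 0.80
p_b = 0.30

# Multiplication rule: joint probability P(AB) = P(A|B) * P(B), approx. 0.24.
p_ab = p_a_given_b * p_b
```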
h) The probability that at least one of two events will occur;
Addition Rule for probabilities – Given events A and B, the probability that A or B occurs is equal to:
P(A or B) = P(A ∪ B) = P(A) + P(B) − P(AB)
If this result is not obvious, construct a Venn diagram of events A and B that share some overlap. The sum P(A) + P(B) counts P(AB) twice, so it must be subtracted once.
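The addition rule with made-up probabilities (all three inputs below are assumptions for illustration):

```python
# Hypothetical inputs: P(A) = 0.5, P(B) = 0.4, overlap P(AB) = 0.2.
p_a, p_b, p_ab = 0.5, 0.4, 0.2

# Addition rule: subtract the overlap, which P(A) + P(B) counts twice.
p_a_or_b = p_a + p_b - p_ab  # approx. 0.7
```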
i) Dependent and independent events;
Definition of Independent Events – Two events A and B are independent if and only if:
P(A|B) = P(A) or equivalently P(B|A) = P(B)
If two events are dependent, then the occurrence of one of the events is related to the probability of the occurrence of the other event.
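The definition can be verified exactly for a classic example, two fair dice, by enumerating outcomes (the dice setup is an illustrative assumption, not from the notes):

```python
from fractions import Fraction

# Enumerate the 36 equally likely outcomes of rolling two fair dice.
outcomes = [(i, j) for i in range(1, 7) for j in range(1, 7)]

A = {o for o in outcomes if o[0] == 6}  # event A: first die shows 6
B = {o for o in outcomes if o[1] == 6}  # event B: second die shows 6

p_a = Fraction(len(A), 36)
p_b = Fraction(len(B), 36)
p_ab = Fraction(len(A & B), 36)

# Independence: P(A|B) = P(A), equivalently P(AB) = P(A)P(B).
p_a_given_b = p_ab / p_b
print(p_a_given_b == p_a)  # prints True
```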
j) Joint probability of any number of independent events;
Multiplication Rule for Independent Events - Joint probability of independent events A1, A2, … Am is:
P(A1A2…Am) = P(A1)P(A2)… P(Am-1)P(Am)
Think about calculating the probability of getting 10 heads on ten coin flips.
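The coin-flip example works out as follows (assuming a fair coin, so each flip is an independent event with probability 0.5):

```python
# Ten independent flips of a fair coin: multiply the per-flip
# probabilities, P(H)^10, per the multiplication rule.
p_head = 0.5
p_ten_heads = p_head ** 10  # = 1/1024, about 0.001
```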
k) The total probability rule
Total Probability Rule - Probability of event A is:
i. P(A) = P(A|S)P(S) + P(A|Sᶜ)P(Sᶜ), where Sᶜ is the complement of S
ii. P(A) = P(A|S1)P(S1) + P(A|S2)P(S2) + … + P(A|Sm)P(Sm), where S1, S2, …, Sm are mutually exclusive and exhaustive events.
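A sketch of the total probability rule with two made-up scenarios (the scenario names and all probabilities below are assumptions for illustration):

```python
# Hypothetical mutually exclusive, exhaustive scenarios with P(S_i):
scenarios = {"expansion": 0.60, "recession": 0.40}
# Hypothetical conditional probabilities P(A | S_i):
p_a_given = {"expansion": 0.25, "recession": 0.70}

# Total probability rule: P(A) = sum of P(A|S_i) * P(S_i), approx. 0.43.
p_a = sum(p_a_given[s] * p_s for s, p_s in scenarios.items())
```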
l) Expected value, variance and standard deviation;
Expected Value of a random variable is the probability-weighted average of the possible outcomes of the random variable. Expected Value of random variable X is calculated as:
E[X] = ΣP(xi)xi
Variance of a random variable is the expected value of squared deviations from the random variable’s expected value.
σ² = E[(X – E[X])²] = ΣP(xi)(xi – E[X])²
Standard deviation is the square root of the variance.
It is a measure of risk: it shows the dispersion of possible outcomes around their expected value.
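All three statistics computed for one small made-up distribution (the outcomes and probabilities below are assumptions for illustration):

```python
import math

# Hypothetical distribution of a random variable X.
outcomes = [5.0, 10.0, 15.0]
probs = [0.2, 0.5, 0.3]

# Expected value: probability-weighted average, E[X] = sum of P(xi) * xi.
mean = sum(p * x for p, x in zip(probs, outcomes))

# Variance: probability-weighted average of squared deviations from E[X].
variance = sum(p * (x - mean) ** 2 for p, x in zip(probs, outcomes))

# Standard deviation: square root of the variance, in the units of X.
std_dev = math.sqrt(variance)
```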