Thursday, September 13, 2012

Quantitative Methods - Probability Concepts

1:30 PM

This overview of probabilities will also likely be a lot of review and (hopefully) less calculation intensive than the statistics portion.  There are some more advanced concepts that I've not done in detail (mainly Bayes' formula) that should be interesting to get a better look at.

Introduction

  • Random variable - quantity whose outcomes (possible values) are uncertain
  • Event - a specified set of outcomes
  • Probabilities
    • Range from 0 to 1
    • Sum of probabilities of any set of mutually exclusive and collectively exhaustive events is equal to 1
  • How do we estimate probabilities?
    • Empirics - based on historical data - assumes relationships stable over time.  Same for all people.
    • Subjective - personal judgment.  Can vary person to person.
    • A priori - based on logical analysis rather than empirics or personal judgment.  Same for all people.  
      • "Reason out the problem."
Probability Stated as Odds
  • Odds for event E = P(E) / [1 - P(E)]
    • Given the odds for E of "a to b," P(E) = a/(a+b)
    • Odds of 1 to 7 would be 1/(1+7) = 1/8 = .125
  • Odds against event E = [1 - P(E)] / P(E)
    • Reciprocal of the above
    • Given the odds against E of "a to b," P(E) = b/(a+b)
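Since the odds/probability conversions are easy to mix up, here's a quick sanity check as a minimal Python sketch (the function names are just mine, not anything from the curriculum):

    def odds_for(p):
        # Odds for event E = P(E) / (1 - P(E))
        return p / (1 - p)

    def prob_from_odds_for(a, b):
        # Given the odds for E of "a to b", P(E) = a / (a + b)
        return a / (a + b)

    print(prob_from_odds_for(1, 7))   # odds for of 1 to 7 -> 1/8 = 0.125
    print(odds_for(0.125))            # 0.125 / 0.875 = 1/7, i.e. odds for of 1 to 7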
Conditional/Unconditional
  • Unconditional does not depend on anything else.  Also known as marginal probability
  • Conditional - the probability of A given that B has occurred
    • P(A|B) = P(AB)/P(B), given that P(B) is not zero
      • P(AB) is the Joint probability
        • The probability that both A and B occur (the outcomes the two events have in common)
        • I.e. if you have 100 outcomes and two independent events A and B with probabilities of 40% and 20%, only 8% of the outcomes will see both A and B occur at the same time (0.40 * 0.20 = 0.08)
    • A conditional probability can be higher than, lower than, or equal to the unconditional probability
    • You can rearrange using Multiplication Rule
      • P(AB) = P(A|B)*P(B)
    • If events are independent:
      • P(AB) = P(A)*P(B)
Addition Rule for Probabilities
  • P(A or B) = P(A) + P(B) - P(AB)
    • Subtracts joint probability to avoid double counting
    • Note - if events are mutually exclusive, then you can just add because P(AB)=0
Total Probability Rule - probability of any event can be expressed as the weighted average of the probabilities of the event given a set of mutually exclusive and exhaustive scenarios; the weights are the probabilities of each scenario
  • P(A) = P(A|S1)*P(S1) + P(A|S2)*P(S2) + ... + P(A|Sn)*P(Sn)
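To make the last few rules concrete (conditional probability, the multiplication and addition rules, and the total probability rule), here's a minimal Python sketch - the numbers are the 40%/20% independent events from above plus a made-up two-scenario example:

    # Independent events with P(A) = 0.40 and P(B) = 0.20 (same numbers as the joint probability example)
    p_a, p_b = 0.40, 0.20
    p_ab = p_a * p_b                # multiplication rule for independent events -> 0.08

    # Conditional probability: P(A|B) = P(AB) / P(B)
    p_a_given_b = p_ab / p_b        # 0.08 / 0.20 = 0.40 = P(A), which confirms independence

    # Addition rule: P(A or B) = P(A) + P(B) - P(AB), subtracting the joint to avoid double counting
    p_a_or_b = p_a + p_b - p_ab     # 0.40 + 0.20 - 0.08 = 0.52

    # Total probability rule with two made-up mutually exclusive, exhaustive scenarios S1 and S2
    p_s1, p_s2 = 0.70, 0.30                     # scenario probabilities (sum to 1)
    p_a_given_s1, p_a_given_s2 = 0.50, 0.10     # made-up conditional probabilities of A
    p_a_total = p_a_given_s1 * p_s1 + p_a_given_s2 * p_s2    # 0.35 + 0.03 = 0.38

    print(p_ab, p_a_given_b, p_a_or_b, p_a_total)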

I have to say at this point that I am extraordinarily frustrated with the way they are presenting probabilities - the authors tend to write things in very inaccessible language and jump around topics rather than present them clearly.  I'll try to keep going but this is painful.  For example take this excerpt on Expected Value, a concept I used to think I understood:
"Expected value (for example, expected stock return) looks either to the future, as a forecast, or to the “true” value of the mean (the population mean, discussed in the reading on statistical concepts and market returns). We should distinguish expected value from the concepts of historical or sample mean. The sample mean also summarizes in a single number a central value. However, the sample mean presents a central value for a particular set of observations as an equally weighted average of those observations. To summarize, the contrast is forecast versus historical, or population versus sample. (Institute 450)"
Institute, CFA. Level I 2012 Volume 1 Ethical and Professional Standards and Quantitative Methods, 7th Edition. Pearson Learning Solutions. <vbk:9781256112754#page(450)>.

Wtf.  Anyways, moving on.

Independent Events - defined by P(A|B)=P(A)

Expected Value - the probability weighted average of possible outcomes - denoted E(X)
  • E(X) = P(X1)X1 + P(X2)X2 + ... + P(Xn)Xn
Variance in context of Expected Value - probability weighted average of squared deviations from the expected value
  • Standard deviation is then the square root of this number
  • Approach for calculation is always expected value, then calculate variance, then calculate standard deviation
      • Can also find an expected value conditional on some event or scenario S - just replace the probabilities with probabilities conditional on S
  • Total Probability Rule for Expected Value
    • E(X) = E(X|S1)*P(S1) + E(X|S2)*P(S2) + ... + E(X|Sn)*P(Sn)
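A quick sketch of the expected value -> variance -> standard deviation sequence in Python, using a made-up three-outcome return distribution (and a made-up two-scenario example for the total probability rule for EV):

    from math import sqrt

    # Made-up discrete distribution: possible returns and their probabilities
    outcomes      = [0.05, 0.10, 0.15]
    probabilities = [0.25, 0.50, 0.25]    # must sum to 1

    # Expected value: probability-weighted average of the outcomes
    ev = sum(p * x for p, x in zip(probabilities, outcomes))

    # Variance: probability-weighted average of squared deviations from the expected value
    var = sum(p * (x - ev) ** 2 for p, x in zip(probabilities, outcomes))

    # Standard deviation: square root of the variance
    sd = sqrt(var)

    print(ev, var, sd)    # 0.10, 0.00125, ~0.0354

    # Total probability rule for EV: E(X) = E(X|S1)*P(S1) + E(X|S2)*P(S2), with made-up numbers
    ev_total = 0.12 * 0.60 + 0.07 * 0.40    # -> 0.10
    print(ev_total)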
Portfolio Expected Return and Variance of Return
  • Expected return = reward, variance of return = risk
    • Portfolio variance is more complex, because portfolio return is determined by return on individual holdings
  • Portfolio expected return = weighted average of expected returns on component securities
  • Covariance
    • Cov(Ri, Rj) = E[(Ri - ERi)*(Rj - ERj)]
    • Probability weighted average of the cross products of each variable's deviation from its own EV 
    • Individual variances constitute part, but not all, of portfolio variance
      • As the number of holdings increases, covariance becomes increasingly important, c.p.
  • Covariance impact on portfolio - 3 cases
    • Covariance negative - inverse relationship between returns
    • Covariance of 0 - no linear relationship between the returns
    • Covariance positive - positive relationship between returns
  • Covariance of a random variable with itself is its OWN variance
  • Covariance matrix - arrange all the pairwise covariances in a square; the entries on the diagonal are special - these are just the individual security variances.  The off-diagonal entries mirror each other across the diagonal, since Cov(Ri, Rj) = Cov(Rj, Ri).  (See the sketch below.)
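To pin down the covariance definition and the matrix layout, here's a minimal sketch with a made-up two-asset, three-scenario example:

    # Made-up joint scenarios for two assets' returns, with scenario probabilities
    probs = [0.25, 0.50, 0.25]
    r_i   = [0.02, 0.08, 0.12]    # asset i's return in each scenario
    r_j   = [0.10, 0.06, 0.04]    # asset j's return in each scenario

    def expected(p, r):
        # Probability-weighted average
        return sum(pi * ri for pi, ri in zip(p, r))

    def covariance(p, x, y):
        # Probability-weighted average of the cross products of deviations from each EV
        ex, ey = expected(p, x), expected(p, y)
        return sum(pi * (xi - ex) * (yi - ey) for pi, xi, yi in zip(p, x, y))

    # Covariance matrix: diagonal entries are each asset's own variance
    # (covariance of a variable with itself); off-diagonal entries mirror each other
    cov_matrix = [[covariance(probs, a, b) for b in (r_i, r_j)] for a in (r_i, r_j)]
    print(cov_matrix)    # off-diagonal is negative here -> inverse relationship between returns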
Diversification benefit increases with decreasing covariance
  • As long as security returns are not perfectly positively correlated, benefits to diversification are possible; the smaller the correlation, the greater the cost of not diversifying
  • Here they go into how to calculate various things given a covariance matrix - this seems far too complex to devote time to - maybe come back later
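Rather than skip it entirely, here is a rough sketch of what those calculations boil down to for a two-asset portfolio - all of the weights, expected returns, and covariances below are made up, and the key formula is Var(Rp) = sum over all pairs (i, j) of w_i * w_j * Cov(Ri, Rj):

    from math import sqrt

    # Made-up two-asset portfolio
    weights = [0.60, 0.40]                # portfolio weights (sum to 1)
    exp_ret = [0.075, 0.065]              # expected returns on each asset
    cov = [[0.0010, -0.0008],             # covariance matrix: diagonal = individual variances,
           [-0.0008, 0.0007]]             # off-diagonal = Cov(Ri, Rj) = Cov(Rj, Ri)

    # Portfolio expected return: weighted average of the component expected returns
    port_er = sum(w * er for w, er in zip(weights, exp_ret))

    # Portfolio variance: sum over all pairs (i, j) of w_i * w_j * Cov(Ri, Rj)
    n = len(weights)
    port_var = sum(weights[i] * weights[j] * cov[i][j] for i in range(n) for j in range(n))

    print(port_er, port_var, sqrt(port_var))    # 0.071, 0.000088, ~0.0094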
Topics in Probability
 
Bayes' Formula, Principles of Counting, and Permutations - these are all interesting but very niche, so I'll consider them low priority for now.

End of Reading 8.

4:45 pm
About 3.25 hours
