Exam II End of Regression Chapter – Random Walks and MLE
September 12, 2011 at 8:56 pm – Anonymous (Guest)
These subjects are dealt with very briefly in the handbook, and I wonder how much we will actually need to know for the exam. I am also struggling with some of the terminology used, for example: “A random walk is a stochastic process for a level variable whose increments are determined by a zero mean.” Could someone please explain what this means in layman’s terms, and what we are likely to need to know about random walks and MLE for the exam? Thanks for your help.
November 7, 2011 at 2:47 am – Anonymous (Guest)
Pohara,
First, apologies for the delay in replying, but things have been a bit hectic at my end.
The sentence you have quoted is a particularly egregious example of how textbook writers can make even simple things sound extremely complex. All you need to do after that is replace the English words with Greek letters, pile on complex notation, and become completely incomprehensible. But let me stop the rant and answer your question.
A random walk simply means that returns are unknown in advance and can be whatever they want to be, and therefore prices, which are the previous period’s closing price plus the current period’s return, move randomly. That is all the quoted sentence is saying: the level variable (the price) evolves as P_t = P_(t-1) + e_t, where the increments e_t are random shocks with an expected value of zero, so on average the process drifts nowhere. There are a few terms you should be familiar with – geometric Brownian motion (a continuous-time relative of the random walk), the Markov property, etc. – and these are mentioned in the PRMIA handbook and also in this tutorial: http://www.riskprep.com/all-tutorials/35-exam-1/46-weiner-process-markov-property
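If seeing it in code helps, here is a minimal Python sketch (my own illustration, not from the handbook; the normal shocks and the starting level of 100 are arbitrary assumptions) of a random walk with zero-mean increments:

```python
import numpy as np

rng = np.random.default_rng(seed=42)

# Zero-mean increments: each period's change is a random shock
# whose expected value is zero (normality is just an assumption here).
n_periods = 250
increments = rng.normal(loc=0.0, scale=1.0, size=n_periods)

# The level variable is last period's level plus this period's shock:
# P_t = P_(t-1) + e_t, started from an arbitrary level of 100.
prices = 100 + np.cumsum(increments)

# Because the expected increment is zero, the best forecast of
# tomorrow's level is simply today's level.
print("Last observed level:", prices[-1])
print("Sample mean of increments (near 0):", increments.mean())
```

Nothing about the path is predictable beyond its current level, which is the Markov property that the tutorial linked above discusses.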
Quick refresher on MLE:
When you have a sample of observed values of correlated variables, then based on what you see in the sample you want to estimate the true parameters of your regression. In other words, you want to know the values of alpha and beta (where the dependent variable y = alpha + beta * x, and x is the independent variable). You can do that in two ways:
1. Method of Least Squares (or Ordinary Least Squares, OLS): Calculate alpha and beta such that the sum of the squared differences between the observed values and those predicted by your regression model is minimized.
2. Maximum Likelihood Estimation (MLE): Calculate alpha and beta such that the probability of having obtained the sample that you did is maximized.

MLE gives the same estimates for alpha and beta as OLS, assuming that the regression errors are normally distributed around the predicted values (a quick numerical sketch follows below).
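To see that equivalence concretely, here is a short Python sketch (my own illustration, not exam material; the simulated sample and the choice of scipy’s Nelder-Mead optimizer are assumptions) that fits the same data both ways:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(seed=0)

# Simulated sample: y = alpha + beta * x + normal noise
# (true_alpha, true_beta and the noise scale are arbitrary choices).
true_alpha, true_beta = 1.0, 2.0
x = rng.uniform(0.0, 10.0, size=200)
y = true_alpha + true_beta * x + rng.normal(0.0, 1.5, size=200)

# 1. OLS: choose alpha, beta to minimize the sum of squared residuals.
#    np.polyfit returns the slope first, then the intercept.
beta_ols, alpha_ols = np.polyfit(x, y, deg=1)

# 2. MLE: choose alpha, beta (and sigma) to maximize the normal
#    log-likelihood of the observed sample, i.e. minimize its negative.
def neg_log_likelihood(params):
    alpha, beta, sigma = params
    resid = y - (alpha + beta * x)
    n = len(y)
    return 0.5 * n * np.log(2.0 * np.pi * sigma**2) \
        + np.sum(resid**2) / (2.0 * sigma**2)

res = minimize(neg_log_likelihood, x0=[0.0, 1.0, 1.0],
               method="Nelder-Mead")
alpha_mle, beta_mle, sigma_mle = res.x

print(f"OLS: alpha={alpha_ols:.4f}, beta={beta_ols:.4f}")
print(f"MLE: alpha={alpha_mle:.4f}, beta={beta_mle:.4f}")
```

The two sets of alpha and beta estimates agree up to the optimizer’s tolerance, which is exactly the OLS/MLE equivalence under normally distributed errors; only the error variance differs slightly (MLE divides by n rather than n − 2).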
I have not heard of any complicated questions being asked on MLE.