Interdependence between inputs, impact of different inputs relative to the outcome, and other nuances
are ignored, oversimplifying the model and reducing its accuracy.
Yet despite its drawbacks and inaccuracies, many organizations operate using this type of analysis.
Triangular - The user defines the minimum, most likely, and maximum values. Values around the most likely are
more likely to occur. Variables that could be described by a triangular distribution include past sales history per
unit of time and inventory levels.
PERT - The user defines the minimum, most likely, and maximum values, just like the triangular distribution.
Values around the most likely are more likely to occur. However, values between the most likely and the extremes
are more likely to occur than under the triangular distribution; that is, the extremes are not as emphasized. An
example of the use of a PERT distribution is to describe the duration of a task in a project management model.
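As a minimal sketch, triangular samples can be drawn with Python's standard library; the minimum, most likely, and maximum values below (a task duration of 2 to 12 days, most likely 5) are hypothetical:

```python
import random

# Hypothetical task duration in days: min = 2, most likely = 5, max = 12.
random.seed(1)
samples = [random.triangular(2, 12, 5) for _ in range(10_000)]

mean = sum(samples) / len(samples)
# The mean of a triangular distribution is (min + mode + max) / 3, about 6.33 here.
print(round(mean, 2))
```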
Discrete - The user defines specific values that may occur and the likelihood of each. An example might be the
results of a lawsuit: 20% chance of a positive verdict, 30% chance of a negative verdict, 40% chance of settlement,
and 10% chance of mistrial.
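The lawsuit example can be sampled directly; the probabilities below are the ones given in the text:

```python
import random

# Discrete distribution from the lawsuit example; probabilities sum to 1.
outcomes = ["positive verdict", "negative verdict", "settlement", "mistrial"]
weights = [0.20, 0.30, 0.40, 0.10]

random.seed(42)
draws = random.choices(outcomes, weights=weights, k=10_000)

# With many samples, observed frequencies approach the assigned probabilities.
for outcome, p in zip(outcomes, weights):
    freq = draws.count(outcome) / len(draws)
    print(f"{outcome}: {freq:.3f} (expected {p})")
```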
During a Monte Carlo simulation, values are sampled at random from the input probability distributions. Each set
of samples is called an iteration, and the resulting outcome from that sample is recorded. Monte Carlo simulation
does this hundreds or thousands of times, and the result is a probability distribution of possible outcomes. In this
way, Monte Carlo simulation provides a much more comprehensive view of what may happen. It tells you not only
what could happen, but how likely it is to happen.
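The sampling loop just described can be sketched in a few lines; the profit model and its input distributions below are hypothetical, chosen only to illustrate the iteration-and-record pattern:

```python
import random
import statistics

# Hypothetical profit model with two uncertain inputs.
random.seed(7)

def one_iteration():
    revenue = random.gauss(100, 15)        # normal: mean 100, sd 15
    cost = random.triangular(60, 95, 75)   # triangular: min, max, most likely
    return revenue - cost

# Each sampled input set is one iteration; run thousands and record each outcome.
outcomes = [one_iteration() for _ in range(10_000)]

print(f"mean profit: {statistics.mean(outcomes):.1f}")
# The probability of a loss is read directly off the outcome distribution:
print(f"P(loss): {sum(o < 0 for o in outcomes) / len(outcomes):.3f}")
```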
Monte Carlo simulation provides a number of advantages over deterministic analysis:
Probabilistic Results. Results show not only what could happen, but how likely each outcome is.
Graphical Results. Because of the data a Monte Carlo simulation generates, it's easy to create graphs
of different outcomes and their chances of occurrence. This is important for communicating findings to
other stakeholders.
Sensitivity Analysis. With just a few cases, deterministic analysis makes it difficult to see which variables
impact the outcome the most. In Monte Carlo simulation, it's easy to see which inputs had the biggest
effect on bottom-line results.
Scenario Analysis. In deterministic models, it's very difficult to model different combinations of values for
different inputs to see the effects of truly different scenarios. Using Monte Carlo simulation, analysts
can see exactly which inputs had which values together when certain outcomes occurred. This is
invaluable for pursuing further analysis.
Correlation of Inputs. In Monte Carlo simulation, it's possible to model interdependent relationships
between input variables. It's important for accuracy to represent how, in reality, when some factors
go up, others go up or down accordingly.
Research analysts use multivariate models to forecast investment outcomes to understand the
possibilities surrounding their investment exposures and to better mitigate risks. Monte Carlo analysis
is one specific multivariate modeling technique that allows researchers to run multiple trials and define
all potential outcomes of an event or investment. Running a Monte Carlo model creates a probability
distribution or risk assessment for a given investment or event under review. By comparing results
against risk tolerances, managers can decide whether to proceed with certain investments or projects.
(To learn more about Monte Carlo basics, see Introduction To Monte Carlo Simulation and Monte
Carlo Simulation With GBM.)
Multivariate Models
Multivariate models can be thought of as complex "what if?" scenarios. By changing the values of
multiple variables, the modeler can ascertain their combined impact on the estimate being evaluated.
These models are used by financial analysts to estimate cash flows and new product ideas. Portfolio
managers and financial advisors use these models to determine the impact of investments on portfolio
performance and risk. Insurance companies use these models to estimate the potential for claims and
to price policies. Some of the best-known multivariate models are those used to value stock options.
Multivariate models also help analysts determine the true drivers of value.
Monte Carlo Analysis
Monte Carlo analysis is named after the principality made famous by its casinos. With games of
chance, all the possible outcomes and probabilities are known, but with most investments the set of
future outcomes is unknown. It is up to the analyst to determine the set of outcomes and the
probability that they will occur. In Monte Carlo modeling, the analyst runs multiple trials (often
thousands) to determine all the possible outcomes and the probability that they will take place.
Monte Carlo analysis is useful for analysts because many investment and business decisions are
made on the basis of one outcome. In other words, many analysts derive one possible scenario and
then compare it to return hurdles to decide whether to proceed. Most pro forma estimates start with a
base case. By inputting the highest probability assumption for each factor, an analyst can
actually derive the highest probability outcome. However, making any decisions on the basis of a base
case is problematic, and creating a forecast with only one outcome is insufficient because it says
nothing about any other possible values that could occur. It also says nothing about the very real
chance that the actual future value will be something other than the base case prediction. It is
impossible to hedge or insure against a negative occurrence if the drivers and probabilities of these
events are not calculated in advance. (To learn more about how to manage the risk in your portfolio,
see our Risk and Diversification tutorial.)
Creating the Model
Once designed, executing a Monte Carlo model requires a tool that will randomly select factor values
that are bound by certain predetermined conditions. By running a number of trials with variables
constrained by their own independent probability of occurrence, an analyst creates a distribution that
includes all the possible outcomes and the probability that they will occur. There are many random
number generators in the marketplace. The two most common tools for designing and executing
Monte Carlo models are @Risk and Crystal Ball. Both of these can be used as add-ins for
spreadsheets and allow random sampling to be incorporated into established spreadsheet models.
The art in developing an appropriate Monte Carlo model is to determine the correct constraints for
each variable and the correct relationship between variables. For example, because portfolio
diversification is based on the correlation between assets, any model developed to create expected
portfolio values must include the correlation between investments. (To learn more, read The
Importance of Diversification.)
In order to choose the correct distribution for a variable, one must understand each of the possible
distributions available. For example, the most common one is a normal distribution, also known as a
bell curve. In a normal distribution, occurrences are distributed symmetrically around the mean, and
the mean is the most probable value. Natural phenomena such as people's heights, and economic
quantities such as inflation, are examples of inputs that are approximately normally distributed.
In the Monte Carlo analysis, a random-number generator picks a random value for each variable
(within the constraints set by the model) and produces a probability distribution for all possible
outcomes. The standard deviation of that probability is a statistic that denotes the likelihood that the
actual outcome being estimated will be something other than the mean or most probable event.
Assuming a probability distribution is normally distributed, approximately 68% of the values will
fall within one standard deviation of the mean, about 95% of the values will fall within two standard
deviations and about 99.7% will lie within three standard deviations of the mean. This is known as the
"68-95-99.7 rule" or the "empirical rule".
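The empirical rule can be checked with simulated normal draws:

```python
import random

# Verify the 68-95-99.7 rule empirically with standard normal samples.
random.seed(0)
mu, sigma = 0.0, 1.0
xs = [random.gauss(mu, sigma) for _ in range(100_000)]

for k in (1, 2, 3):
    share = sum(abs(x - mu) <= k * sigma for x in xs) / len(xs)
    print(f"within {k} sd: {share:.3f}")
```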
Examples
Let us take, for example, two separate, normally distributed probability distributions derived from
random-factor analysis or from multiple scenarios of a Monte Carlo model.
A Monte Carlo Simulation refers to a computer-generated series of trials where the probabilities for
both risk and reward are tested repeatedly in an effort to help define these parameters. These
simulations are characterized by large numbers of trials - typically hundreds or even thousands of
iterations, which is why it is typically described as "computer generated". Note also that Monte Carlo
simulations rely on random numbers to generate each series of samples.
Monte Carlo simulations are used in a number of applications, often as a complement to other risk-assessment techniques in an effort to further define potential risk. For example, a pension-benefit
administrator in charge of managing assets and liabilities for a large plan may use computer software
with Monte Carlo simulation to help understand any potential downside risk over time, and how
changes in investment policy (e.g. higher or lower allocations to certain asset classes, or the
introduction of a new manager) may affect the plan. While traditional analysis focuses on returns,
variances and correlations between assets, a Monte Carlo simulation can help introduce other
pertinent economic variables (e.g. interest rates, GDP growth and foreign exchange rates) into the
simulation.
Monte Carlo simulations are also important in pricing derivative securities for which there are no
existing analytical methods. European- and Asian-style options are priced with Monte Carlo methods,
as are certain mortgage-backed securities for which the embedded options (e.g. prepayment
assumptions) are very complex.
A general outline for developing a Monte Carlo simulation involves the following steps (please note
that we are oversimplifying a process that is often highly technical):
1. Identify all variables of interest, the time horizon of the analysis and the distribution of all
risk factors associated with each variable.
2. Draw K random numbers using a spreadsheet generator. Each random variable would then
be standardized so we have Z1, Z2, Z3... ZK.
3. Simulate the possible values of the random variable by calculating its observed value with Z1,
Z2, Z3... ZK.
4. Following a large number of iterations, estimate each variable and quantity of interest to
complete one trial. Go back and complete additional trials to develop more accurate
estimates.
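The four steps above can be sketched for a single risk factor; the return and volatility figures below are assumptions chosen for illustration, not values from the text:

```python
import random

# Steps 2-4 for one risk factor: an annual portfolio return.
# Assumed inputs: mean return 7%, volatility 18%, K = 5,000 draws.
MU, SIGMA, K = 0.07, 0.18, 5_000

random.seed(3)
# Step 2: draw K standard normal variates Z1..ZK.
z = [random.gauss(0, 1) for _ in range(K)]

# Step 3: map each standardized draw to an observed value of the variable.
returns = [MU + SIGMA * zi for zi in z]

# Step 4: aggregate the trials into the estimate of interest.
mean_est = sum(returns) / K
print(f"estimated mean return: {mean_est:.3f}")
```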
Historical Simulation
Historical simulation, or back simulation, follows a similar process for large numbers of iterations, with
historical simulation drawing from the previous record of that variable (e.g. past returns for a mutual
fund). While both of these methods are very useful in developing a more meaningful and in-depth
analysis of a complex system, it's important to recognize that they are basically statistical estimates;
that is, they are not as analytical as (for example) the use of a correlation matrix to understand
portfolio returns. Such simulations tend to work best when the input risk parameters are well defined.
One of the most common ways to estimate risk is the use of a Monte Carlo simulation (MCS). For
example, to calculate the value at risk (VaR) of a portfolio, we can run a Monte Carlo simulation that
attempts to predict the worst likely loss for a portfolio given a confidence interval over a specified time
horizon - we always need to specify two conditions for VaR: confidence and horizon. (For related
reading, see The Uses And Limits Of Volatility and Introduction To Value At Risk (VAR) - Part 1 and
Part 2.)
If we rearrange the formula to solve just for the change in stock price, we see that GBM says the
change in stock price is the stock price "S" multiplied by the two terms found inside the parentheses
below:

ΔS = S × (m Δt + s e √Δt)
The first term is a "drift" and the second term is a "shock". For each time period, our model assumes
the price will "drift" up by the expected return. But the drift will be shocked (added or subtracted) by a
random shock. The random shock will be the standard deviation "s" multiplied by a random number
"e". This is simply a way of scaling the standard deviation.
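A single GBM time step can be coded as follows; the drift and volatility parameters are hypothetical, since the text does not specify them:

```python
import math
import random

# One GBM time step: drift plus a random shock.
# Assumed parameters: 15% expected annual return, 30% annual volatility.
mu, sigma = 0.15, 0.30
dt = 1 / 252          # one trading day
s = 10.0              # starting price

random.seed(11)
eps = random.gauss(0, 1)                            # the random number "e"
delta_s = s * (mu * dt + sigma * eps * math.sqrt(dt))
print(f"new price: {s + delta_s:.4f}")
```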
That is the essence of GBM, as illustrated in Figure 1. The stock price follows a series of steps, where
each step is a drift plus/minus a random shock (itself a function of the stock's standard deviation):
Figure 1
2. Generate Random Trials
Armed with a model specification, we then proceed to run random trials. To illustrate, we've used
Microsoft Excel to run 40 trials. Keep in mind that this is an unrealistically small sample; most
simulations or "sims" run at least several thousand trials.
In this case, let's assume that the stock begins on day zero with a price of $10. Here is a chart of the
outcome where each time step (or interval) is one day and the series runs for ten days (in summary:
forty trials with daily steps over ten days):
The simulation produced a distribution of hypothetical future outcomes. We could do several things
with the output. If, for example, we want to estimate VaR with 95% confidence, then we only need to
locate the thirty-eighth-ranked outcome (the third-worst outcome). That's because 2/40 equals 5%, so
the two worst outcomes are in the lowest 5%.
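This illustration can be re-created in code; the drift and volatility parameters below are assumptions, as the text does not state them:

```python
import math
import random

# 40 GBM trials, daily steps, ten days, starting price $10.
mu, sigma, dt, days, trials = 0.15, 0.30, 1 / 252, 10, 40
random.seed(5)

finals = []
for _ in range(trials):
    s = 10.0
    for _ in range(days):
        s += s * (mu * dt + sigma * random.gauss(0, 1) * math.sqrt(dt))
    finals.append(s)

# 95% VaR from 40 trials: the third-worst outcome (2/40 = 5% of trials lie below it).
third_worst = sorted(finals)[2]
print(f"95% VaR cutoff price: {third_worst:.2f}")
```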
If we stack the illustrated outcomes into bins (each bin is one-third of $1, so three bins cover the
interval from $9 to $10), we'll get the following histogram:
Figure 3
Remember that our GBM model assumes normality: price returns are normally distributed with
expected return (mean) "m" and standard deviation "s". Interestingly, our histogram isn't looking
normal. In fact, with more trials, it will not tend toward normality. Instead, it will tend toward a
lognormal distribution: a sharp drop-off to the left of the mean and a highly skewed "long tail" to the right
of the mean. This often leads to a potentially confusing dynamic for first-time students:
Price returns are normally distributed.
Price levels are log-normally distributed.
Think about it this way: A stock can return up or down 5% or 10%, but after a certain period of time,
the stock price cannot be negative. Further, price increases on the upside have a compounding effect,
while price decreases on the downside reduce the base: lose 10% and you are left with less to lose
the next time. Here is a chart of the lognormal distribution superimposed on our illustrated
assumptions (e.g. starting price of $10):
Figure 4
Summary
A Monte Carlo simulation applies a selected model (a model that specifies the behavior of an
instrument) to a large set of random trials in an attempt to produce a plausible set of possible future
outcomes. In regard to simulating stock prices, the most common model is geometric Brownian
motion (GBM). GBM assumes that a constant drift is accompanied by random shocks. While the
period returns under GBM are normally distributed, the consequent multi-period (for example, ten
days) price levels are lognormally distributed.
Check out David Harper's movie tutorial, Monte Carlo Simulation with Geometric Brownian Motion, to
learn more on this topic.
It is important to keep in mind that when a company analyzes a potential project, it is forecasting
potential not actual cash flows for a project. As we all know, forecasts are based on assumptions that
may be incorrect. It is therefore important for a company to perform a sensitivity analysis on its
assumptions to get a better sense of the overall risk of the project the company is about to take.
There are three risk-analysis techniques that should be known for the exam:
1. Sensitivity analysis
2. Scenario analysis
3. Monte Carlo simulation
1. Sensitivity Analysis
Sensitivity analysis is simply the method for determining how sensitive our NPV analysis is to changes
in our variable assumptions. To begin a sensitivity analysis, we must first come up with a base-case
scenario. This is typically the NPV using assumptions we believe are most accurate. From there, we
can change various assumptions we had initially made based on other potential assumptions. NPV is
then recalculated, and the sensitivity of the NPV based on the change in assumptions is determined.
Depending on our confidence in our assumptions, we can determine how potentially risky a project
can be.
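The recalculation step can be sketched as follows; the cash flows, discount rates, and initial outlay are hypothetical figures used only to show the mechanics:

```python
# Sensitivity of NPV to one assumption at a time (all figures hypothetical).
def npv(rate, cash_flows, initial_outlay):
    pv = sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))
    return pv - initial_outlay

base_flows = [40.0, 40.0, 40.0]   # base-case annual cash flows
base = npv(0.10, base_flows, 90.0)
print(f"base-case NPV: {base:.2f}")

# Shock the discount-rate assumption and recompute NPV each time.
for rate in (0.08, 0.10, 0.12):
    print(f"rate {rate:.0%}: NPV = {npv(rate, base_flows, 90.0):.2f}")
```

The wider the swing in NPV for a given change in an assumption, the more sensitive (and risky) the project is with respect to that assumption.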
2. Scenario Analysis
Scenario analysis takes sensitivity analysis a step further. Rather than just looking at the sensitivity of
our NPV analysis to changes in our variable assumptions, scenario analysis also looks at the
probability distribution of the variables. Like sensitivity analysis, scenario analysis starts with the
construction of a base case scenario. From there, other scenarios are considered, known as the
"best-case scenario" and the "worst-case scenario". Probabilities are assigned to the scenarios and
computed to arrive at an expected value. Given its simplicity, scenario analysis is one of the most
commonly used risk-analysis techniques.
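The probability-weighting step can be sketched in a few lines; the scenario NPVs and probabilities below are hypothetical:

```python
# Scenario analysis sketch: probability-weighted NPVs (all figures hypothetical).
scenarios = {
    "worst case": (-20.0, 0.25),
    "base case":  (10.0, 0.50),
    "best case":  (35.0, 0.25),
}

# Expected value = sum of each scenario's NPV times its probability.
expected_npv = sum(value * p for value, p in scenarios.values())
print(f"expected NPV: {expected_npv:.2f}")   # -5.00 + 5.00 + 8.75 = 8.75
```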
Stochastic Modeling
What Does Stochastic Modeling Mean?
A method of financial modeling in which one or more variables within the model are random.
Stochastic modeling is for the purpose of estimating the probability of outcomes within a forecast
to predict what conditions might be like under different situations. The random variables are usually
constrained by historical data, such as past market returns.
Quantitative Analysis
What Does Quantitative Analysis Mean?
A business or financial analysis technique that seeks to understand behavior by using complex
mathematical and statistical modeling, measurement and research. By assigning a numerical value to
variables, quantitative analysts try to replicate reality mathematically.
Quantitative analysis can be done for a number of reasons such as measurement, performance
evaluation or valuation of a financial instrument. It can also be used to predict real world events such
as changes in a share price.
Bell Curve
What Does Bell Curve Mean?
The most common type of distribution for a variable. The term "bell curve" comes from the fact that
the graph used to depict a normal distribution consists of a bell-shaped line.
The bell curve is also known as a normal distribution. The bell curve is less commonly referred to as a
Gaussian distribution, after German mathematician and physicist Carl Friedrich Gauss, who popularized the
model in the scientific community by using it to analyze astronomical data.
Mean
Arithmetic Mean
What Does Arithmetic Mean Mean?
A mathematical representation of the typical value of a series of numbers, computed as the sum of all
the numbers in the series divided by the count of all numbers in the series.
Arithmetic mean is commonly referred to as "average" or simply as "mean".
Winsorized Mean
What Does Winsorized Mean Mean?
A method of averaging that initially replaces the smallest and largest values with the observations
closest to them. After replacing the values, a simple arithmetic averaging formula is used to calculate
the winsorized mean.
Winsorized means are expressed in two ways. A "kth" winsorized mean refers to the replacement of
the 'k' smallest and largest observations, where 'k' is an integer. A "X%" winsorized mean involves
replacing a given percentage of values from both ends of the data.
Let's calculate the first winsorized mean for the following data set: 1, 5, 7, 8, 9, 10, 14. Because the
winsorized mean is in the first order, we replace the smallest and largest values with their nearest
observations. The data set now appears as follows: 5, 5, 7, 8, 9, 10, 10. Taking an arithmetic average
of the new set produces a winsorized mean of 7.71 ( (5+5+7+8+9+10+10) / 7 ).
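The replacement-then-average procedure can be written as a short function, using the data set from the worked example above:

```python
# kth winsorized mean: replace the k smallest and k largest values with
# their nearest remaining observations, then take a simple arithmetic mean.
def winsorized_mean(data, k=1):
    s = sorted(data)
    s[:k] = [s[k]] * k          # replace the k smallest values
    s[-k:] = [s[-k - 1]] * k    # replace the k largest values
    return sum(s) / len(s)

# The worked example: [1, 5, 7, 8, 9, 10, 14] becomes [5, 5, 7, 8, 9, 10, 10].
print(round(winsorized_mean([1, 5, 7, 8, 9, 10, 14]), 2))  # 7.71
```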
Geometric Mean
What Does Geometric Mean Mean?
The average of a set of products, the calculation of which is commonly used to determine the
performance results of an investment or portfolio. Technically defined as "the 'n'th root of the product
of 'n' numbers", the formula for calculating the geometric mean is most easily written as:

Geometric Mean = (x1 × x2 × ... × xn)^(1/n)
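A small sketch of this definition; the yearly growth factors below are hypothetical:

```python
import math

# Geometric mean: the nth root of the product of n numbers.
def geometric_mean(values):
    return math.prod(values) ** (1 / len(values))

# Typical use: averaging growth factors, e.g. returns of +10%, -5%, +20%.
factors = [1.10, 0.95, 1.20]
g = geometric_mean(factors)
print(f"average annual growth factor: {g:.4f}")
```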
Average Price
What Does Average Price Mean?
1. A representative measure of a range of prices that is calculated by taking the sum of the values and
dividing it by the number of prices being examined. The average price reduces the range into a
single value, which can then be compared to any point to determine if the value is higher or lower than
what would be expected.
2. A bond's average price is calculated by adding its face value to the price paid for it and dividing the
sum by two. The average price is sometimes used in determining a bond's yield to maturity where the
average price replaces the purchase price in the yield to maturity calculation.
Double Exponential Moving Average - DEMA
What Does Double Exponential Moving Average Mean?
A technical indicator developed by Patrick Mulloy that first appeared in the February 1994 issue of
Technical Analysis of Stocks & Commodities. The DEMA is a calculation based on both a single exponential
moving average (EMA) and a double EMA.
Weighted Average
What Does Weighted Average Mean?
An average in which each quantity to be averaged is assigned a weight. These weightings determine
the relative importance of each quantity on the average. Weightings are the equivalent of having that
many like items with the same value involved in the average.
Consider the point values of Scrabble tiles and the number of tiles carrying each value:

Value   Number of tiles
10      2
8       2
5       1
4       10
3       8
2       7
1       68
0       2
To average these values, do a weighted average using the number of occurrences of each value as
the weight. To calculate a weighted average:
1. Multiply each value by its weight. (Ans: 20, 16, 5, 40, 24, 14, 68, and 0)
2. Add up the products of value times weight to get the total value. (Ans: Sum=187)
3. Add the weights themselves to get the total weight. (Ans: Sum=100)
4. Divide the total value by the total weight. (Ans: 187/100 = 1.87 = average value of a Scrabble
tile)
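The four steps above reduce to a few lines of code, using the tile counts as weights:

```python
# Weighted average of Scrabble tile values, weighted by tile counts.
values = [10, 8, 5, 4, 3, 2, 1, 0]
weights = [2, 2, 1, 10, 8, 7, 68, 2]   # 100 tiles in total

total_value = sum(v * w for v, w in zip(values, weights))   # 187
total_weight = sum(weights)                                 # 100
print(total_value / total_weight)                           # 1.87
```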
Price-Weighted Index
What Does Price-Weighted Index Mean?
A stock index in which each stock influences the index in proportion to its price per share. The value
of the index is generated by adding the prices of each of the stocks in the index and dividing them by
the total number of stocks. Stocks with a higher price will be given more weight and, therefore, will
have a greater influence over the performance of the index.
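The calculation is simple enough to sketch directly; the ticker symbols and prices below are hypothetical:

```python
# Price-weighted index level: sum of share prices divided by the stock count.
prices = {"AAA": 120.0, "BBB": 45.0, "CCC": 30.0}   # hypothetical stocks

index_level = sum(prices.values()) / len(prices)
print(f"index level: {index_level:.2f}")   # 65.00
```

Note that a 10% move in AAA shifts the index four times as much as a 10% move in CCC, which is exactly the price-weighting effect described above.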
Capitalization-Weighted Index
What Does Capitalization-Weighted Index Mean?
A type of market index whose individual components are weighted according to their market
capitalization, so that larger components carry a larger percentage weighting. The value of a
capitalization-weighted index can be computed by adding up the collective market capitalizations of its
members and dividing it by the number of securities in the index.
Also known as a "market-value weighted index".
The S&P 500, maintained by economists at Standard & Poor's, is a market value weighted index - each
stock's weight is proportionate to its market value.
Free-Float Methodology
What Does Free-Float Methodology Mean?
A method by which the market capitalization of an index's underlying companies is calculated. Free-float methodology market capitalization is calculated by taking the equity's price and multiplying it by
the number of shares readily available in the market. Instead of using all of the shares outstanding like
the full-market capitalization method, the free-float method excludes locked-in shares such as those
held by promoters and governments.
Calculated as:
Free-Float Market Capitalization = Share Price × Number of Shares Readily Available in the Market
Nasdaq
What Does Nasdaq Mean?
A computerized system that facilitates trading and provides price quotations on more than 5,000 of
the more actively traded over the counter stocks. Created in 1971, the Nasdaq was the world's first
electronic stock market.
Stocks on the Nasdaq are traditionally listed under four or five letter ticker symbols. If the company is
a transfer from the New York Stock Exchange, the symbol may consist of three letters.
Because of the way people throw around the words "Dow" and "Nasdaq," both terms have
become synonymous with "the market," giving people a hazy idea of what each term actually
means. In this question, "the Dow" refers to the famous figure that peppers almost all business
news reports: the Dow Jones Industrial Average (DJIA), an important index that many people
watch to get an indication of how well the overall stock market is performing. The Dow, or the
DJIA, is not exactly the same as Dow Jones and Company, the firm that publishes the Wall
Street Journal. However, the editors of the Wall Street Journal are the people who maintain the
DJIA, along with other Dow Jones indices. The Nasdaq is also a term that can refer to two
different things: first, it is the National Association of Securities Dealers Automated Quotations
System, which is the first electronic exchange, where investors can buy and sell stock. Second,
when you hear people say that the "the Nasdaq is up today," they are referring to the Nasdaq
Composite Index, which, like the DJIA, is a statistical measure of a portion of the market.
Both the Dow and the Nasdaq, then, refer to an index, or an average of a bunch of numbers
derived from the price movements of certain stocks. The DJIA tracks the performance of 30
different companies that are considered major players in their industries. The Nasdaq Composite,
on the other hand, tracks approximately 4,000 stocks, all of which are traded on the Nasdaq
exchange. The DJIA is composed mainly of companies found on the NYSE, with only a couple of
Nasdaq-listed stocks.
Remember, although both "the Dow" and the "Nasdaq" refer to market indices, only the Nasdaq
also refers to an exchange where investors can buy and sell stock. Furthermore, an investor can't
trade the Dow or the Nasdaq indexes because they each represent merely a mathematical
average that people use to try and make sense of the stock market. You can, however, purchase
index funds, which are a kind of mutual fund, or exchange traded funds, which are securities that
track the indexes.
Blue Chip