
CHAPTER 10

DECISION THEORY

Reporters:
Harid, Jamaira
Mamaclay, Angelica
Gonzales, Katherine
10.4 DECISION TREES

• A decision tree is a graphical diagram consisting of nodes and branches.
• The user computes the expected value of each outcome and makes a decision based on these expected values.
• The main benefit of a decision tree is that it provides a picture of the decision-making process.
• Decision trees represent the sequence of events in a decision situation.
• Circles and squares are referred to as nodes.
• Decision variable: a variable whose value represents a potential decision on the part of the decision maker.
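The rollback idea behind the bullet points above can be sketched in a few lines of Python (a minimal illustration; the node structure and names are mine, not from the text). Chance (circle) nodes take the probability-weighted average of their branches, and decision (square) nodes take the best alternative:

```python
# Minimal sketch of rolling back a decision tree (illustrative structure).
# Circle (chance) nodes -> expected value; square (decision) nodes -> best child.

def rollback(node):
    """Return the expected value of a decision-tree node."""
    kind = node["kind"]
    if kind == "payoff":                      # leaf: terminal payoff
        return node["value"]
    if kind == "chance":                      # circle node: weighted average
        return sum(p * rollback(child) for p, child in node["branches"])
    if kind == "decision":                    # square node: best alternative
        return max(rollback(child) for child in node["choices"])
    raise ValueError(f"unknown node kind: {kind}")

# Tiny example: choose between a sure 10 and a 50/50 gamble on 30 or 0.
tree = {
    "kind": "decision",
    "choices": [
        {"kind": "payoff", "value": 10},
        {"kind": "chance", "branches": [
            (0.5, {"kind": "payoff", "value": 30}),
            (0.5, {"kind": "payoff", "value": 0}),
        ]},
    ],
}
print(rollback(tree))  # the gamble's EV is 15, which beats the sure 10
```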
POSTERIOR PROBABILITIES

• The concept of conditional probability given statistical dependence forms the necessary foundation for an area of probability known as Bayesian analysis.
• The basic principle of Bayesian analysis is that additional information (if available) can sometimes enable one to alter (improve) the marginal probabilities of the occurrence of an event.
Steps in Implementing Posterior Probabilities
1. Obtain the conditional sample outcome probabilities.
2. Obtain the sample outcome.
3. Calculate the posterior probabilities of the events. The numerator is the prior probability times the likelihood; the denominator is the predictive probability.
4. Use the posterior probabilities calculated in step 3 to determine the Bayes decision.
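The four steps above can be sketched in Python (a minimal illustration; the variable names and the "positive"/"negative" report labels are mine, with the prior and conditional probabilities taken from the example that follows):

```python
# Hedged sketch of steps 1-3: posterior = prior x likelihood / predictive.
priors = {"good": 0.75, "bad": 0.25}            # prior probabilities
likelihood = {                                   # conditional sample outcome probs
    "positive": {"good": 0.80, "bad": 0.30},
    "negative": {"good": 0.20, "bad": 0.70},
}

def posterior(report):
    """Posterior P(state | report) via Bayes' rule."""
    joint = {s: priors[s] * likelihood[report][s] for s in priors}
    predictive = sum(joint.values())             # denominator: P(report)
    return {s: joint[s] / predictive for s in joint}

post = posterior("positive")
print(round(post["good"], 2), round(post["bad"], 2))  # 0.89 0.11
```

Step 4 (the Bayes decision) then uses these posteriors in place of the priors when rolling back the decision tree.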
EXAMPLE

The Manufacturing Company, a corporate raider, has acquired a shoe company and is contemplating the future of one of its major plants located in Marikina. Three alternative decisions are being considered:
(1) expand the plant and produce lightweight, durable shoes for possible sale to the department store, a market with little foreign competition;
(2) maintain the status quo at the plant, continuing production of shoes that are subject to heavy foreign competition;
(3) sell the plant now.
If one of the first two alternatives is chosen, the plant will still be sold at the end of the year. The amount of profit that could be earned by selling the plant in a year depends on foreign market conditions, including the status of a trade embargo bill in Congress.
                              STATES OF NATURE
DECISION               Good Foreign            Bad Foreign
                       Competitive Condition   Competitive Condition
Expand                 30,000,000              20,000,000
Maintain status quo    50,000,000              -7,500,000
Sell now               15,000,000              15,000,000
b. Assume that it is now possible to estimate a probability of 0.25 that bad conditions will exist. Determine the best decision using expected value and expected opportunity loss.
c. Compute the expected value of perfect information.
d. Develop a decision tree for this decision situation, with expected values at the probability nodes.
e. The company has hired a consulting firm to provide a report on the political and market situation in the future. The conditional probability of each report outcome given each state of nature is as follows:

P(P/g) = 0.80   P(N/g) = 0.20   P(P/b) = 0.30   P(N/b) = 0.70

Determine the posterior probabilities using Bayes's rule.

f. Perform a decision tree analysis using the posterior probabilities obtained.
SOLUTION

Decision with Expected Value and Expected Opportunity Loss

Expected value:
Expand     = 30,000,000(.75) + 20,000,000(.25)
           = 22,500,000 + 5,000,000
           = 27,500,000
Status quo = 50,000,000(.75) + (-7,500,000)(.25)
           = 37,500,000 - 1,875,000
           = 35,625,000
Sell       = 15,000,000(.75) + 15,000,000(.25)
           = 11,250,000 + 3,750,000
           = 15,000,000

Select the highest expected value and make the decision: 35,625,000 (maintain the status quo).
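The expected-value computation above can be checked with a short Python sketch (the payoff figures come from the example's table; the dictionary layout is mine):

```python
# Expected value of each alternative under P(good) = .75, P(bad) = .25.
P_GOOD, P_BAD = 0.75, 0.25
payoffs = {                                  # (good condition, bad condition)
    "expand":     (30_000_000, 20_000_000),
    "status quo": (50_000_000, -7_500_000),
    "sell":       (15_000_000, 15_000_000),
}

expected = {d: good * P_GOOD + bad * P_BAD for d, (good, bad) in payoffs.items()}
best = max(expected, key=expected.get)       # alternative with the highest EV
print(expected)
print(best)  # status quo, with EV 35,625,000
```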
EXPECTED OPPORTUNITY LOSS

Step 1. Determine the opportunity loss table.

                              STATES OF NATURE
DECISION               Good Foreign            Bad Foreign
                       Competitive Condition   Competitive Condition
Expand                 20,000,000              0
Maintain status quo    0                       27,500,000
Sell now               35,000,000              5,000,000
Step 2. Multiply each opportunity loss by its state-of-nature probability.

Expand     = 20,000,000(.75) + 0(.25)
           = 15,000,000 + 0
           = 15,000,000
Status quo = 0(.75) + 27,500,000(.25)
           = 0 + 6,875,000
           = 6,875,000
Sell       = 35,000,000(.75) + 5,000,000(.25)
           = 26,250,000 + 1,250,000
           = 27,500,000

Select the lowest result: 6,875,000 (maintain the status quo).
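The two steps above (build the regret table, then weight by the probabilities) can be sketched as follows; each opportunity loss is the best payoff in that column minus the alternative's payoff (payoffs from the example; the code layout is mine):

```python
# Expected opportunity loss: regret = column best - payoff, weighted by probability.
P_GOOD, P_BAD = 0.75, 0.25
payoffs = {                                  # (good condition, bad condition)
    "expand":     (30_000_000, 20_000_000),
    "status quo": (50_000_000, -7_500_000),
    "sell":       (15_000_000, 15_000_000),
}
best_good = max(g for g, _ in payoffs.values())   # 50,000,000
best_bad = max(b for _, b in payoffs.values())    # 20,000,000

eol = {
    d: (best_good - g) * P_GOOD + (best_bad - b) * P_BAD
    for d, (g, b) in payoffs.items()
}
print(eol)
print(min(eol, key=eol.get))  # status quo, with EOL 6,875,000
```

The best decision by minimum expected opportunity loss always agrees with the best decision by maximum expected value.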
Expected Value of Perfect Information

Step 1. Expected value with perfect information
= 50,000,000(.75) + 20,000,000(.25)
= 37,500,000 + 5,000,000
= 42,500,000

Step 2. Expected value without perfect information
= 50,000,000(.75) + (-7,500,000)(.25)
= 37,500,000 - 1,875,000
= 35,625,000

Step 3. Solve for the expected value of perfect information.
EVPI = 42,500,000 - 35,625,000
     = 6,875,000
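The three EVPI steps can be sketched in Python (payoffs and probabilities from the example; the code layout is mine). With perfect information, the decision maker picks the best payoff in each state; without it, the best single alternative:

```python
# EVPI = EV with perfect information - best EV without it.
P_GOOD, P_BAD = 0.75, 0.25
payoffs = {                                  # (good condition, bad condition)
    "expand":     (30_000_000, 20_000_000),
    "status quo": (50_000_000, -7_500_000),
    "sell":       (15_000_000, 15_000_000),
}

ev_with_pi = (max(g for g, _ in payoffs.values()) * P_GOOD
              + max(b for _, b in payoffs.values()) * P_BAD)          # 42,500,000
ev_without = max(g * P_GOOD + b * P_BAD for g, b in payoffs.values())  # 35,625,000
evpi = ev_with_pi - ev_without
print(evpi)  # 6,875,000 -- equal to the minimum expected opportunity loss
```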
d. Decision Tree
Step 1. Construct a decision tree.
Step 2. Compute the expected value at nodes B, C, and D.
Node B = 30,000,000(.75) + 20,000,000(.25) = 22,500,000 + 5,000,000 = 27,500,000
Node C = 50,000,000(.75) + (-7,500,000)(.25) = 37,500,000 - 1,875,000 = 35,625,000
Node D = 15,000,000(.75) + 15,000,000(.25) = 11,250,000 + 3,750,000 = 15,000,000
Step 3. Select the highest value among the circle nodes and make the decision.
e. Decision Trees with Posterior Probabilities
Step 2. Compute the posterior probabilities, with tables and by formula.

P(Pg) = P(g)*P(P/g) = (.75)(.80) = .60
P(Pb) = P(b)*P(P/b) = (.25)(.30) = .075

P(g/P) = P(P/g)*P(g) / [P(P/g)*P(g) + P(P/b)*P(b)]
       = (.80)(.75) / [(.80)(.75) + (.30)(.25)]
       = .60 / (.60 + .075)
       = .89

P(b/P) = P(P/b)*P(b) / [P(P/b)*P(b) + P(P/g)*P(g)]
       = (.30)(.25) / [(.30)(.25) + (.80)(.75)]
       = .075 / (.075 + .60)
       = .11
State of     Prior           Conditional     Prior x Conditional   Posterior
Nature       Probabilities   Probabilities   Probabilities         Probabilities
Good         P(g) = .75      P(P/g) = .80    P(Pg) = .60           P(g/P) = .89
Conditions
Bad          P(b) = .25      P(P/b) = .30    P(Pb) = .075          P(b/P) = .11
Conditions
Computation of Posterior Probabilities for a Negative Result

P(Ng) = P(g)*P(N/g) = (.75)(.20) = .15
P(Nb) = P(b)*P(N/b) = (.25)(.70) = .175

P(g/N) = P(N/g)*P(g) / [P(N/g)*P(g) + P(N/b)*P(b)]
       = (.20)(.75) / [(.20)(.75) + (.70)(.25)]
       = .15 / (.15 + .175)
       = .46

P(b/N) = P(N/b)*P(b) / [P(N/b)*P(b) + P(N/g)*P(g)]
       = (.70)(.25) / [(.70)(.25) + (.20)(.75)]
       = .175 / (.175 + .15)
       = .54
State of     Prior           Conditional     Prior x Conditional   Posterior
Nature       Probabilities   Probabilities   Probabilities         Probabilities
Good         P(g) = .75      P(N/g) = .20    P(Ng) = .15           P(g/N) = .46
Conditions
Bad          P(b) = .25      P(N/b) = .70    P(Nb) = .175          P(b/N) = .54
Conditions
Step 3. Place the posterior probabilities on the decision tree.
Step 4. Solve for circle nodes D to I.
• Node D = (.89)(30,000,000) + (.11)(20,000,000) = 26,700,000 + 2,200,000 = 28,900,000
• Node E = (.89)(50,000,000) + (.11)(-7,500,000) = 44,500,000 - 825,000 = 43,675,000
• Node F = (.89)(15,000,000) + (.11)(15,000,000) = 13,350,000 + 1,650,000 = 15,000,000
• Node G = (.46)(30,000,000) + (.54)(20,000,000) = 13,800,000 + 10,800,000 = 24,600,000
• Node H = (.46)(50,000,000) + (.54)(-7,500,000) = 23,000,000 - 4,050,000 = 18,950,000
• Node I = (.46)(15,000,000) + (.54)(15,000,000) = 6,900,000 + 8,100,000 = 15,000,000
Step 5. Identify the largest value on each report branch.
Step 6. Compute the probability of a positive report and of a negative report.

P(P) = P(P/g)*P(g) + P(P/b)*P(b)
     = (.80)(.75) + (.30)(.25)
     = .60 + .075
     = .675  (probability of a positive report)

P(N) = P(N/g)*P(g) + P(N/b)*P(b)
     = (.20)(.75) + (.70)(.25)
     = .15 + .175
     = .325  (probability of a negative report)
Step 7. Compute node A to generate the expected value of the decision strategy given the additional information.
Node A = (.675)(43,675,000) + (.325)(24,600,000)
       = 29,480,625 + 7,995,000
       = 37,475,625
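Steps 3 through 7 can be rolled into one short Python sketch: for each report outcome, evaluate the three alternatives with the posterior probabilities, keep the best, and weight the two bests by the report probabilities (payoffs and probabilities from the example; the code layout is mine):

```python
# Rollback of the posterior-probability tree to node A.
payoffs = {                                  # (good condition, bad condition)
    "expand":     (30_000_000, 20_000_000),
    "status quo": (50_000_000, -7_500_000),
    "sell":       (15_000_000, 15_000_000),
}
posteriors = {"positive": (0.89, 0.11),      # (P(g|report), P(b|report))
              "negative": (0.46, 0.54)}
report_prob = {"positive": 0.675, "negative": 0.325}

node_a = 0.0
for report, (pg, pb) in posteriors.items():
    best = max(g * pg + b * pb for g, b in payoffs.values())  # best branch EV
    node_a += report_prob[report] * best
print(round(node_a))  # 37,475,625 -- EV of the strategy with sample information
```

Note that a positive report favors maintaining the status quo (43,675,000 at node E), while a negative report favors expanding (24,600,000 at node G).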
THANK YOU!