Module 10
Reasoning with Uncertainty - Probabilistic reasoning
Lesson 29
A Basic Idea of Inferencing with Bayes Networks
10.5.5 Inferencing in Bayesian Networks 
10.5.5.1 Exact Inference 
The basic inference problem in BNs is described as follows: 
 
Given 
 
1. A Bayesian network BN 
 
2. Evidence e - an instantiation of some of the variables in BN (e can be empty) 
 
3. A query variable Q 
 
Compute P(Q|e) - the (marginal) conditional distribution over Q 
 
Given what we know, compute the distribution over what we do not. Four categories of inference tasks are usually encountered, illustrated here on the familiar burglary-alarm example (a code sketch follows the list):

1. Diagnostic inferences (from effects to causes). Given that John calls, what is the probability of burglary? I.e., find P(B|J).

2. Causal inferences (from causes to effects). Given Burglary, what is the probability that John calls, i.e. P(J|B), and that Mary calls, i.e. P(M|B)?

3. Intercausal inferences (between causes of a common event). Given Alarm, what is the probability of burglary, i.e. P(B|A)? Now given Earthquake as well, what is the probability of burglary, i.e. P(B|A,E)?

4. Mixed inferences (some causes and some effects known). Given that John calls and there is no Earthquake, what is the probability of Alarm, i.e. P(A|J,~E)?
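On a network this small, every one of these queries can be answered by brute-force summation over the joint distribution. The following Python sketch does exactly that for the burglary network; the CPT numbers are the usual textbook values and are assumptions here, not taken from these notes.

```python
from itertools import product

# CPTs for the burglary network (assumed illustrative values).
P_B = {True: 0.001, False: 0.999}            # P(Burglary)
P_E = {True: 0.002, False: 0.998}            # P(Earthquake)
P_A = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}    # P(A=t | B, E)
P_J = {True: 0.90, False: 0.05}              # P(J=t | A)
P_M = {True: 0.70, False: 0.01}              # P(M=t | A)

def joint(b, e, a, j, m):
    """Probability of one complete assignment, by the chain rule."""
    pa, pj, pm = P_A[(b, e)], P_J[a], P_M[a]
    return (P_B[b] * P_E[e] * (pa if a else 1 - pa)
            * (pj if j else 1 - pj) * (pm if m else 1 - pm))

def query(target, evidence):
    """P(target=t | evidence), by summing the full joint."""
    num = den = 0.0
    for b, e, a, j, m in product([True, False], repeat=5):
        world = dict(zip("BEAJM", (b, e, a, j, m)))
        if any(world[v] != val for v, val in evidence.items()):
            continue                         # inconsistent with evidence
        p = joint(b, e, a, j, m)
        den += p
        if world[target]:
            num += p
    return num / den

print(query('B', {'J': True}))               # 1. diagnostic: P(B|J)
print(query('J', {'B': True}))               # 2. causal: P(J|B)
print(query('B', {'A': True, 'E': True}))    # 3. intercausal: P(B|A,E)
print(query('A', {'J': True, 'E': False}))   # 4. mixed: P(A|J,~E)
```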
 
We will demonstrate below the inference procedure for BNs. As an example, consider the following linear BN, with no a priori evidence.

(Figure: a linear Bayes net, the chain A → B → C → D → E.)

Consider computing all the marginals (with no evidence). P(A) is given, and

P(B) = Σ_a P(B|A=a) P(A=a)
 
We don't need any conditional independence assumption for this.

For example, suppose A and B are binary; then we have

P(B=t) = P(B=t|A=t) P(A=t) + P(B=t|A=f) P(A=f)

Now,

P(C) = Σ_b P(C|B=b) P(B=b)

P(B) (the marginal distribution over B) was not given originally, but we just computed it in the last step, so we're OK (assuming we remembered to store P(B) somewhere).

If C were not independent of A given B, we would have a CPT for P(C|A,B), not P(C|B). Note that we had to wait for P(B) before P(C) was calculable.

If each node has k values, and the chain has n nodes, this algorithm has complexity O(nk^2). Summing over the joint instead has complexity O(k^n).

Complexity can be reduced by more efficient summation, by "pushing sums into products".
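To see what pushing sums into products buys, here is a minimal sketch, assuming the chain A → B → C → D → E with invented CPTs. The naive route enumerates all k^n joint assignments; the pushed form uses n-1 matrix-vector products, i.e. O(nk^2) work. Both compute the same marginal P(E).

```python
import numpy as np
from itertools import product

# Sketch: "pushing sums into products" on the chain A -> B -> C -> D -> E.
# Every node is binary (k = 2); all CPT numbers are invented.
k, n = 2, 5
p_a = np.array([0.3, 0.7])                         # P(A)
cpts = [np.array([[0.9, 0.1], [0.2, 0.8]]),        # P(B|A): rows indexed by A
        np.array([[0.6, 0.4], [0.3, 0.7]]),        # P(C|B)
        np.array([[0.5, 0.5], [0.1, 0.9]]),        # P(D|C)
        np.array([[0.8, 0.2], [0.4, 0.6]])]        # P(E|D)

# Naive: sum the full joint -- k^n terms.
p_e_naive = np.zeros(k)
for vals in product(range(k), repeat=n):
    p = p_a[vals[0]]
    for i in range(n - 1):
        p *= cpts[i][vals[i], vals[i + 1]]
    p_e_naive[vals[-1]] += p

# Pushed sums: P(next) = sum_x P(x) P(next|x), one k x k
# matrix-vector product per edge -- O(n k^2) in total.
p = p_a
for t in cpts:
    p = p @ t

print(p_e_naive, p)                                # identical marginals P(E)
```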
 
 
Dynamic programming may also be used for the problem of exact inference in the above Bayes net. The steps are as follows:

1. We first compute f1(B) = Σ_a P(A=a) P(B|A=a).
2. f1(B) is a function representable by a table of numbers, one for each possible value of B.

3. Here, f1(B) = P(B).

4. We then use f1(B) to calculate f2(C) by summation over B: f2(C) = Σ_b f1(B=b) P(C|B=b).

One more step of the same form yields the goal, P(D). This method of solving a problem (i.e. finding P(D)) by solving subproblems and storing the results is characteristic of dynamic programming.
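The four steps translate directly into code. Below is a minimal sketch with invented binary CPTs for the chain; the names f1 and f2 mirror the tables above, and each stored table is reused by the next step.

```python
# A sketch of the four steps above with invented binary CPTs.
# f1 and f2 are the stored subproblem tables of the dynamic program.
P_A = {0: 0.4, 1: 0.6}
P_B_given_A = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.1, 1: 0.9}}   # P_B_given_A[a][b]
P_C_given_B = {0: {0: 0.8, 1: 0.2}, 1: {0: 0.4, 1: 0.6}}
P_D_given_C = {0: {0: 0.5, 1: 0.5}, 1: {0: 0.2, 1: 0.8}}

# Step 1: f1(B) = sum_a P(a) P(B|a); as step 3 notes, f1(B) = P(B).
f1 = {b: sum(P_A[a] * P_B_given_A[a][b] for a in (0, 1)) for b in (0, 1)}

# Step 4: reuse the stored f1 to get f2(C) by summing over B.
f2 = {c: sum(f1[b] * P_C_given_B[b][c] for b in (0, 1)) for c in (0, 1)}

# One more identical step yields the goal, P(D).
p_d = {d: sum(f2[c] * P_D_given_C[c][d] for c in (0, 1)) for d in (0, 1)}
print(f1, f2, p_d)          # each table happens to sum to 1 here
```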
 
The above methodology may be generalized. We eliminated variables starting from the root, but we don't have to. We might also have done the following computation, eliminating the downstream variable D first:

P(E) = Σ_c P(C=c) Σ_d P(D=d|C=c) P(E|D=d)

The following points are to be noted about the above algorithm. The algorithm computes intermediate results which are not individual probabilities, but entire tables, such as f1(C,E) = Σ_d P(D=d|C) P(E|D=d). It so happens that f1(C,E) = P(E|C), but we will see examples where the intermediate tables do not represent probability distributions.
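A quick check of this identity, using invented CPTs for the last two edges of the chain: eliminating D first produces a table indexed by (C,E), and each row happens to be a conditional distribution here, though that is an accident of this example.

```python
import numpy as np

# Invented CPTs for the tail of the chain C -> D -> E.
P_D_given_C = np.array([[0.5, 0.5], [0.2, 0.8]])   # rows indexed by C
P_E_given_D = np.array([[0.8, 0.2], [0.4, 0.6]])   # rows indexed by D

# Eliminate D first: f1(C,E) = sum_d P(D=d|C) P(E|D=d).
f1 = P_D_given_C @ P_E_given_D

print(f1)                 # a 2 x 2 table indexed by (C, E)
print(f1.sum(axis=1))     # each row sums to 1, so f1(C,E) = P(E|C) here
```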
 
Dealing with Evidence

Dealing with evidence is easy. Suppose {A,B,C,D,E} are all binary and we want P(C|A=t, E=t). Computing P(C, A=t, E=t) is enough: it is a table of numbers, one for each value of C, and we need only renormalise it so that its entries add up to 1, since

P(C|A=t, E=t) = P(C, A=t, E=t) / Σ_c P(C=c, A=t, E=t)
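A short sketch of this renormalisation, again on the chain with invented CPTs: we tabulate the unnormalised P(C, A=t, E=t) by summing out the hidden variables B and D, then divide by the total.

```python
from itertools import product

# Invented CPTs for the chain A -> B -> C -> D -> E.
P_A = [0.4, 0.6]
cpt = {('B', 'A'): [[0.7, 0.3], [0.1, 0.9]],
       ('C', 'B'): [[0.8, 0.2], [0.4, 0.6]],
       ('D', 'C'): [[0.5, 0.5], [0.2, 0.8]],
       ('E', 'D'): [[0.9, 0.1], [0.3, 0.7]]}   # cpt[(X, parent)][parent][x]

unnorm = [0.0, 0.0]                    # one entry per value of C
a, e = 1, 1                            # the evidence: A = t, E = t
for b, c, d in product((0, 1), repeat=3):
    p = (P_A[a] * cpt[('B', 'A')][a][b] * cpt[('C', 'B')][b][c]
         * cpt[('D', 'C')][c][d] * cpt[('E', 'D')][d][e])
    unnorm[c] += p                     # accumulates P(C=c, A=t, E=t)

z = sum(unnorm)                        # = P(A=t, E=t)
print([u / z for u in unnorm])         # P(C | A=t, E=t), sums to 1
```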