Artificial Intelligence
Chap. 5: Uncertainty
Prepared by:
Prof. Khushali B Kathiriya
Outline
 Acting under uncertainty
 Basic probability notation
 The axioms of probability
 Inference using full joint distributions
Acting under uncertainty
 An agent working in a real environment almost never has access to the whole truth about its environment. Therefore, the agent needs to act under uncertainty.
 With knowledge representation we might write A→B, which means that if A is true then B is true. But consider a situation where we are not sure whether A is true or not; then we cannot express this statement. This situation is called uncertainty.
 When an agent works with uncertain knowledge, it might be impossible to construct a complete and correct description of how its actions will work.
Sources of Uncertainty
1. Uncertain input
2. Uncertain knowledge
3. Uncertain output
Sources of Uncertainty (Cont.)
1. Uncertain input
1. Missing data
2. Noisy data
2. Uncertain knowledge
1. Multiple causes lead to multiple effects
2. Incomplete knowledge
3. Theoretical ignorance
4. Practical ignorance
3. Uncertain output
1. Abduction and induction are uncertain
2. Default reasoning
3. Incomplete deductive inference
Sources of Uncertainty (Cont.)
 Uncertainty may be caused by problems with data such as:
1. Missing data
2. Incomplete data
3. Unreliable data
4. Inconsistent data
5. Imprecise data
6. Guessed data
7. Default data
What's the solution for uncertainty?
 Probabilistic reasoning is a way of knowledge representation where we
apply the concept of probability to indicate the uncertainty in knowledge.
Basic probability notation
Probability
 Probability can be defined as the chance that an uncertain event will occur. It is the numerical measure of the likelihood that an event will occur. The value of a probability always lies between 0 and 1, where 0 represents an impossible event and 1 a certain event.
Basic probability notation
1. Propositions
2. Atomic events
3. Unconditional (prior) probability
4. Independence
5. Conditional probability
6. Inference using full joint distribution
7. Bayes' rule
Basic probability notation
1. Propositions
 Complex propositions can be formed using standard logical connectives.
 For example:
1. [(cavity=true) ^ (toothache = false)]
2. [(cavity ^ ~toothache)]
 Random variables:
 Random variables are used to represent events and objects in the real world.
 Random variables are like symbols in propositional logic.
 For example:
 P(a)= 1- P(~a)
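 As an illustration (a short Python sketch added here, not from the slides), the complement rule P(a) = 1 - P(~a) for a Boolean random variable; the prior value 0.1 is the cavity example used below:

```python
# A minimal sketch: a Boolean random variable's two values partition the
# probability mass, so P(a) = 1 - P(~a).
P_cavity = {True: 0.1, False: 0.9}  # prior over Cavity

assert abs(P_cavity[True] - (1 - P_cavity[False])) < 1e-9
print(P_cavity[True])  # P(cavity) = 0.1
```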
Basic probability notation
2. Atomic event
 An atomic event is a complete specification of the state of the world about which the agent is uncertain.
 Example:
 If the world consists of Cavity and Toothache, then there are four distinct atomic events:
1. Cavity = false ^ Toothache = true
2. Cavity = false ^ Toothache = false
3. Cavity = true ^ Toothache = false
4. Cavity = true ^ Toothache = true
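 A small Python sketch (an illustration added here) that enumerates the atomic events of this two-variable world:

```python
from itertools import product

# Each atomic event assigns a truth value to every variable in the world.
variables = ["Cavity", "Toothache"]
for values in product([False, True], repeat=len(variables)):
    event = dict(zip(variables, values))
    print(event)  # prints the four distinct atomic events listed above
```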
Basic probability notation
3. Unconditional probability
 It is the degree of belief accorded to a proposition in the absence of
any other information.
 Written as P(a)
 Example
 Ram has a cavity:
 P(cavity = true) = 0.1, or equivalently P(cavity) = 0.1
 When we want to express the probabilities of all possible values of a random variable, a vector of values is used:
 P(WEATHER) = <0.7, 0.2, 0.08, 0.02>
 P(WEATHER=sunny)=0.7
 P(WEATHER=rain)=0.2
 P(WEATHER=cloudy)=0.08
 P(WEATHER=cold)=0.02
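 A minimal Python sketch (illustrative, not part of the original slides) of the prior vector P(WEATHER) as a value-to-probability map:

```python
# Prior distribution over the random variable WEATHER, mirroring the
# vector notation on the slide.
P_weather = {"sunny": 0.7, "rain": 0.2, "cloudy": 0.08, "cold": 0.02}

assert abs(sum(P_weather.values()) - 1.0) < 1e-9  # a distribution sums to 1
print(P_weather["sunny"])  # P(WEATHER = sunny) = 0.7
```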
Basic probability notation
4. Independence
 Independence is a relation between two different sets of variables in a full joint distribution. It is also called marginal or absolute independence of the variables.
 Independence indicates whether two variables (or sets of variables) affect each other's probabilities.
 The weather is independent of one's dental problem:
 P(toothache, catch, cavity, weather) = P(toothache, catch, cavity) P(weather)
[Figure: the full joint distribution over {Toothache, Catch, Cavity, Weather} decomposes into P(Toothache, Catch, Cavity) and P(Weather).]
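 A short Python sketch of this decomposition; the dental numbers come from the full joint table shown later in this chapter, and P_weather is the prior from earlier (both reproduced so the sketch is self-contained):

```python
# P(Cavity, Toothache, Catch), keyed as (cavity, toothache, catch).
P_dental = {
    (True,  True,  True):  0.108, (True,  True,  False): 0.012,
    (True,  False, True):  0.072, (True,  False, False): 0.008,
    (False, True,  True):  0.016, (False, True,  False): 0.064,
    (False, False, True):  0.144, (False, False, False): 0.576,
}
P_weather = {"sunny": 0.7, "rain": 0.2, "cloudy": 0.08, "cold": 0.02}

# Under independence the 32-entry joint never has to be stored explicitly:
# P(toothache, catch, cavity, weather) = P(toothache, catch, cavity) * P(weather)
p = P_dental[(True, True, True)] * P_weather["sunny"]
print(round(p, 4))  # 0.108 * 0.7 = 0.0756
```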
Basic probability notation
5. Conditional Probability
 Conditional probability is the probability of one event occurring given that another event has already happened.
 Suppose we want to calculate the probability of event A when event B has already occurred, "the probability of A under the conditions of B". It can be written as:
 P(A|B) = P(A ^ B) / P(B)
 where P(A ^ B) = joint probability of A and B
 P(B) = marginal probability of B
Basic probability notation
5. Conditional Probability (Cont.)
 If the probability of A is given and we need to find the probability of B, it is given analogously as:
 P(B|A) = P(A ^ B) / P(A)
Basic probability notation
5. Conditional Probability (Cont.)
 P(A|B) = P(A ^ B) / P(B)
 P(B) = 30/100 = 0.3
 P(A ^ B) = 20/100 = 0.2
 P(A|B) = 0.2/0.3 = 0.67
[Venn diagram of events A and B, with region counts 50, 20, and 30 out of 100 outcomes.]
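 The same calculation as a short Python sketch, assuming 100 equally likely outcomes (as the /100 fractions on the slide suggest):

```python
# Conditional probability from the counts on this slide.
p_b = 30 / 100        # P(B)
p_a_and_b = 20 / 100  # P(A ^ B)

p_a_given_b = p_a_and_b / p_b  # P(A|B) = P(A ^ B) / P(B)
print(round(p_a_given_b, 2))   # 0.67
```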
Basic probability notation
6. Inference using Full joint Distribution
 Probabilistic inference means computing, from observed evidence, posterior probabilities for query propositions. The knowledge base used to answer such queries is represented as a full joint distribution.
            Toothache            ~Toothache
         Catch    ~Catch      Catch    ~Catch
 Cavity  0.108    0.012       0.072    0.008
~Cavity  0.016    0.064       0.144    0.576
Basic probability notation
6. Inference using Full joint Distribution
 One particularly common inference task is to extract the distribution over some subset of variables or a single variable. This distribution is called a marginal probability, and computing it is called marginalization (or summing out).
 P(Cavity) = 0.108 + 0.012 + 0.072 + 0.008 = 0.2
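 The marginalization can be sketched in Python (an illustration; the dictionary reproduces the full joint table above):

```python
# Full joint distribution P(Cavity, Toothache, Catch),
# keyed as (cavity, toothache, catch).
joint = {
    (True,  True,  True):  0.108, (True,  True,  False): 0.012,
    (True,  False, True):  0.072, (True,  False, False): 0.008,
    (False, True,  True):  0.016, (False, True,  False): 0.064,
    (False, False, True):  0.144, (False, False, False): 0.576,
}

# P(Cavity = true): sum every atomic event in which cavity is true.
p_cavity = sum(p for (cavity, _, _), p in joint.items() if cavity)
print(round(p_cavity, 3))  # 0.2
```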
Basic probability notation
6. Inference using Full joint Distribution
 Computing the probability of a cavity, given evidence of a toothache:
 P(Cavity | Toothache) = P(Cavity ^ Toothache) / P(Toothache)
= (0.108 + 0.012) / (0.108 + 0.012 + 0.016 + 0.064)
= 0.12 / 0.2 = 0.6
 Just as a check, also compute the probability that there is no cavity given a toothache:
 P(~Cavity | Toothache) = P(~Cavity ^ Toothache) / P(Toothache)
= (0.016 + 0.064) / (0.108 + 0.012 + 0.016 + 0.064)
= 0.08 / 0.2 = 0.4
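 Both checks as a Python sketch over the same full joint table:

```python
# Full joint table again, keyed as (cavity, toothache, catch).
joint = {
    (True,  True,  True):  0.108, (True,  True,  False): 0.012,
    (True,  False, True):  0.072, (True,  False, False): 0.008,
    (False, True,  True):  0.016, (False, True,  False): 0.064,
    (False, False, True):  0.144, (False, False, False): 0.576,
}

p_toothache = sum(p for (_, toothache, _), p in joint.items() if toothache)
p_cavity_toothache = sum(p for (c, t, _), p in joint.items() if c and t)

print(round(p_cavity_toothache / p_toothache, 1))      # P(Cavity|Toothache)  = 0.6
print(round(1 - p_cavity_toothache / p_toothache, 1))  # P(~Cavity|Toothache) = 0.4
```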
Basic probability notation
6. Inference using Full joint Distribution
 Notice that in these two calculations the term 1/P(Toothache) remains constant, no matter which value of Cavity we calculate. It acts as a normalization constant, written α. With this notation we can write the two equations above as one:
 P(Cavity | Toothache)
= α P(Cavity, Toothache)
= α [P(Cavity, Toothache, Catch) + P(Cavity, Toothache, ~Catch)]
= α [<0.108, 0.016> + <0.012, 0.064>]
= α <0.12, 0.08> = <0.6, 0.4>
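 Normalization as a Python sketch: add the two vectors componentwise over (cavity, ~cavity), then rescale so the result sums to 1; α = 1/P(Toothache) never has to be computed from P(Toothache) up front:

```python
# Unnormalized vectors over (cavity, ~cavity), read off the joint table.
with_catch    = [0.108, 0.016]  # P(Cavity, Toothache, Catch)
without_catch = [0.012, 0.064]  # P(Cavity, Toothache, ~Catch)

unnormalized = [a + b for a, b in zip(with_catch, without_catch)]  # <0.12, 0.08>
alpha = 1 / sum(unnormalized)  # normalization constant
print([round(alpha * p, 1) for p in unnormalized])  # [0.6, 0.4]
```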
Basic probability notation
7. Bayes’ Rule
 Bayes' theorem is also known as Bayes' rule, Bayes' law, or Bayesian reasoning. It determines the probability of an event given uncertain knowledge.
 In probability theory, it relates the conditional and marginal probabilities of two random events:
 P(A|B) = P(B|A) P(A) / P(B)
 Bayes' theorem is named after the British mathematician Thomas Bayes. Bayesian inference is an application of Bayes' theorem and is fundamental to Bayesian statistics.
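 A Python sketch of Bayes' rule with hypothetical numbers (chosen only to illustrate the computation, not taken from the slides):

```python
# Bayes' rule: P(A|B) = P(B|A) * P(A) / P(B).
p_a = 0.1          # prior P(A)          (hypothetical)
p_b_given_a = 0.8  # likelihood P(B|A)   (hypothetical)
p_b = 0.2          # evidence P(B)       (hypothetical)

p_a_given_b = p_b_given_a * p_a / p_b
print(p_a_given_b)  # 0.4
```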
Refer to the E-notes for a worked Bayes' rule example.