6. Reasoning Under Uncertainty
Introduction 1
- Agents must often operate in open-world environments where complete knowledge of the environment is not possible.
- When decisions must be made on incomplete information, we turn to probability theory to guide those decisions.
Conditional Probability:
- We rely on a degree of belief in a proposition, expressed as a probability that the proposition is true.
- We use Bayes’ theorem to compute the probability of a proposition from all of the evidence related to it.
- A prior probability is an initial probability obtained before any additional information is taken into account. For example, a child has roughly a 51% chance of being born male and a 49% chance of being born female, so the prior probability of being born male is 51%.
- A posterior probability is a probability that has been revised using additional information. The probability that a randomly selected person living on the planet is male is slightly less than 50%, partly because men tend not to live as long as women; this additional information revises the 51% prior downward.
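The prior-to-posterior update described above is exactly what Bayes’ theorem computes. A minimal sketch in Python, using hypothetical numbers for a diagnostic-test style example (1% prior, 90% true-positive rate, 5% false-positive rate):

```python
def posterior(prior, likelihood, false_positive_rate):
    """Bayes' theorem: P(H|E) = P(E|H) * P(H) / P(E).

    P(E) is expanded by total probability over H and not-H.
    """
    evidence = likelihood * prior + false_positive_rate * (1 - prior)
    return likelihood * prior / evidence

# Hypothetical numbers: 1% prior belief, 90% likelihood, 5% false-positive rate.
p = posterior(0.01, 0.90, 0.05)
print(round(p, 3))  # 0.154
```

Even with a 90% likelihood, the low 1% prior keeps the posterior modest, which is the point of the prior/posterior distinction above.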
Belief Networks:
- Belief networks provide us with a way of visualizing the conditional probabilities between items or events.
- Each node has a conditional probability table (CPT) that gives the probability of each of its values given every possible combination of values for its parents.
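A CPT can be sketched as a table keyed by parent-value combinations. The node names and numbers below are hypothetical, in the style of the classic sprinkler example:

```python
# Hypothetical CPT for a node "WetGrass" with parents "Rain" and "Sprinkler":
# each (rain, sprinkler) row gives P(WetGrass = true | parents).
cpt_wet = {
    (True, True): 0.99,
    (True, False): 0.80,
    (False, True): 0.90,
    (False, False): 0.00,
}

def p_wet(rain, sprinkler, wet):
    """Look up P(WetGrass = wet | Rain = rain, Sprinkler = sprinkler)."""
    p_true = cpt_wet[(rain, sprinkler)]
    return p_true if wet else 1 - p_true

print(p_wet(True, False, True))  # 0.8
```

Each row of the table is a distribution over the node's values, so the entries for `wet=True` and `wet=False` in any row sum to 1.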
Chapter 9: Reasoning under uncertainty 2
9.3 Belief networks
- The set of locally affecting variables is called the Markov blanket. This locality is exploited in a belief network.
- A belief network is a directed acyclic graph representing conditional dependence among a set of random variables. The random variables are the nodes. The arcs represent direct dependence.
- The conditional independence implied by a belief network is determined by an ordering of the variables; each variable is independent of its predecessors in the total ordering given a subset of the predecessors called its parents. Independence in the graph is indicated by missing arcs.
- Thus Xi probabilistically depends on each of its parents, but is independent of its other predecessors.
Bayes’ Theorem 3
Conditional Probability 4
Bayes’ Theorem Explained 5
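The three sources above all build on the same two identities, which can be stated compactly (for events A and B with P(B) > 0):

```latex
P(A \mid B) = \frac{P(A \cap B)}{P(B)}
\qquad\text{and}\qquad
P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)},
```

where the denominator expands by total probability as P(B) = P(B | A)P(A) + P(B | ¬A)P(¬A), matching the prior/posterior update in the Introduction.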
References

1. Learning Guide Unit 6: Introduction | Home. (2025). Uopeople.edu. https://my.uopeople.edu/mod/book/view.php?id=454711&chapterid=555056
2. Poole, D. L., & Mackworth, A. K. (2017). Artificial intelligence: Foundations of computational agents (2nd ed.). Cambridge University Press. https://artint.info/2e/html/ArtInt2e.html Chapter 9: Reasoning under uncertainty.
3. Bayes’ Theorem. (2024). Mathsisfun.com. https://www.mathsisfun.com/data/bayes-theorem.html
4. Bazett, T. (2017, November 18). Intro to conditional probability [Video]. YouTube. https://youtu.be/ibINrxJLvlM
5. 3Blue1Brown. (2019, December 22). Bayes theorem [Video]. YouTube. https://youtu.be/HZGCoVF3YvM