# Technical Glossary

This page provides formal and informal explanations of terms used on this website.

## Conditional Independence

Two sets of variables, A and B, are conditionally independent of a third, C, if and only if, for all values a, b and c with P(C=c) ≠ 0, either:

P(A=a|B=b, C=c) = P(A=a|C=c), where P(A=a|C=c) ≠ 0 and P(B=b|C=c) ≠ 0

or

P(A=a|C=c) = 0 or P(B=b|C=c) = 0

In words, the key part of the definition is this: two sets of variables are conditionally independent of a third if and only if the probability of the first set taking any given values, given the values taken by the second and third sets, equals the probability of the first set taking those values given the values taken by the third set alone. The remainder of the definition handles degenerate cases in which probabilities are zero.
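
The definition can be checked numerically. The sketch below (all numbers invented for illustration) constructs a joint distribution over three binary variables in which A and B are conditionally independent given C, then verifies the equality above for every assignment of values.

```python
from itertools import product

# Made-up distributions P(C), P(A|C) and P(B|C). Building the joint as
# their product guarantees that A and B are independent given C.
p_c = {0: 0.4, 1: 0.6}
p_a_given_c = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.2, 1: 0.8}}  # indexed [c][a]
p_b_given_c = {0: {0: 0.5, 1: 0.5}, 1: {0: 0.9, 1: 0.1}}  # indexed [c][b]

joint = {(a, b, c): p_c[c] * p_a_given_c[c][a] * p_b_given_c[c][b]
         for a, b, c in product([0, 1], repeat=3)}

def prob(**fixed):
    """Marginal probability of an assignment, e.g. prob(a=1, c=0)."""
    return sum(p for (a, b, c), p in joint.items()
               if all(dict(a=a, b=b, c=c)[k] == v for k, v in fixed.items()))

# Verify P(A=a | B=b, C=c) = P(A=a | C=c) for every assignment. All the
# conditioning probabilities are non-zero here, so the degenerate cases
# in the definition do not arise.
for a, b, c in product([0, 1], repeat=3):
    lhs = prob(a=a, b=b, c=c) / prob(b=b, c=c)
    rhs = prob(a=a, c=c) / prob(c=c)
    assert abs(lhs - rhs) < 1e-12
```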

Because Bayesian networks meet the Markov Condition, the existence of conditional independencies is indicated by the graph theoretic criterion of d-separation. The most useful conditional independency to understand when interpreting Bayesian network based technologies is the Markov Blanket.

## Markov Condition

A directed acyclic graph and a joint probability distribution meet the Markov Condition if and only if there is a node in the graph representing each variable, and each variable is conditionally independent of the variables represented by its node's non-descendants in the graph, given the variables represented by the node's parents.
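
The condition can be illustrated on the smallest interesting graph, the chain A → B → C: node C's parent is B, and its only non-descendant other than B is A, so the Markov Condition requires P(C=c|A=a, B=b) = P(C=c|B=b). A sketch, with invented numbers:

```python
from itertools import product

# Made-up distributions P(A), P(B|A) and P(C|B) for the chain A -> B -> C.
p_a = {0: 0.3, 1: 0.7}
p_b_given_a = {0: {0: 0.6, 1: 0.4}, 1: {0: 0.1, 1: 0.9}}    # indexed [a][b]
p_c_given_b = {0: {0: 0.8, 1: 0.2}, 1: {0: 0.25, 1: 0.75}}  # indexed [b][c]

# Factorising the joint along the graph makes the pair meet the
# Markov Condition by construction.
joint = {(a, b, c): p_a[a] * p_b_given_a[a][b] * p_c_given_b[b][c]
         for a, b, c in product([0, 1], repeat=3)}

def prob(sel):
    """Marginal of an assignment given as {position: value}; 0=A, 1=B, 2=C."""
    return sum(p for key, p in joint.items()
               if all(key[i] == v for i, v in sel.items()))

# C is independent of its non-descendant A given its parent B.
for a, b, c in product([0, 1], repeat=3):
    given_ab = prob({0: a, 1: b, 2: c}) / prob({0: a, 1: b})
    given_b = prob({1: b, 2: c}) / prob({1: b})
    assert abs(given_ab - given_b) < 1e-12
```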

## Markov Blanket

The Markov Blanket of a node consists of the node's parents, its children, and any other parents of its children. In a Bayesian network, the variable represented by a node is conditionally independent of all other variables in the domain given the variables in its Markov Blanket.

Put another way, if you know the values of the variables in a variable's Markov Blanket, then the remaining variables provide no additional information about the state of that variable.
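
Reading a Markov Blanket off the graph structure is mechanical. The sketch below uses a small invented graph, stored as a mapping from each node to the set of its parents:

```python
# A small invented graph, stored as node -> set of parents:
#   A -> B, A -> C, B -> D, E -> D
parents = {
    "A": set(),
    "B": {"A"},
    "C": {"A"},
    "D": {"B", "E"},
    "E": set(),
}

def markov_blanket(node):
    """The node's parents, its children, and its children's other parents."""
    children = {n for n, ps in parents.items() if node in ps}
    co_parents = {p for child in children for p in parents[child]} - {node}
    return parents[node] | children | co_parents

# B's blanket: parent A, child D, and D's other parent E.
assert markov_blanket("B") == {"A", "D", "E"}
```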

## D-Separation

D-separation is a graph theoretic criterion for establishing conditional independencies in a Bayesian network. The most important d-separation is the one given by the Markov Blanket of a node.

It is not necessary to understand the formal definition of d-separation to be able to understand Bayesian network based technologies. It is, though, useful to understand conditional independence and the Markov Blanket.

To formally define d-separation, we need some preliminary definitions. Firstly, let a 'chain' be any n-tuple of nodes (X1, X2, ..., Xn), such that n ≥ 2 and there is an edge between Xi−1 and Xi for all 2 ≤ i ≤ n. We will denote a directed edge between two nodes, X and Y, as X → Y. If we do not care about the direction of the edge, we will use X − Y. Where there is an edge X → Y, we will say the tail of the edge is at X and the head of the edge is at Y. We will also say:

- X → Z → Y is a head-to-tail meeting.
- X ← Z → Y is a tail-to-tail meeting.
- X → Z ← Y is a head-to-head meeting.
- X − Z − Y, where X and Y are not adjacent, is an uncoupled meeting.
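
Since the type of meeting at Z is determined entirely by the directions of the two edges, it can be classified mechanically. A sketch, assuming the graph is stored as a set of directed (tail, head) edge pairs:

```python
# Classify the meeting at Z on a chain X - Z - Y. The graph is assumed
# to be stored as a set of directed (tail, head) edge pairs.
def meeting_type(edges, x, z, y):
    into_z_from_x = (x, z) in edges  # edge X -> Z
    into_z_from_y = (y, z) in edges  # edge Y -> Z
    if into_z_from_x and into_z_from_y:
        return "head-to-head"
    if not into_z_from_x and not into_z_from_y:
        return "tail-to-tail"
    return "head-to-tail"

assert meeting_type({("X", "Z"), ("Z", "Y")}, "X", "Z", "Y") == "head-to-tail"
assert meeting_type({("Z", "X"), ("Z", "Y")}, "X", "Z", "Y") == "tail-to-tail"
assert meeting_type({("X", "Z"), ("Y", "Z")}, "X", "Z", "Y") == "head-to-head"
```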

Finally, we will say a chain, c, between two distinct nodes, X and Y, is blocked by a set of nodes, A (where X ∉ A and Y ∉ A), if one of the following holds:

- There is a node Z ∈ A on c, and the edges to Z on c meet head-to-tail at Z.
- There is a node Z ∈ A on c, and the edges to Z on c meet tail-to-tail at Z.
- There is a node Z on c such that neither Z nor any of Z's descendants is in A, and the edges to Z on c meet head-to-head at Z.

We can now define d-separation. Let A be a set of nodes in a Bayesian network. Two nodes, X and Y, are d-separated by A if and only if every chain between X and Y is blocked by A. Likewise, two sets of nodes, B and C, are d-separated by A if and only if all nodes X ∈ B and Y ∈ C are d-separated by A.
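
For small graphs, the definition can be applied directly: enumerate every chain between X and Y and test whether A blocks each one. The sketch below does exactly that on a small invented network; the function and node names are illustrative only.

```python
# A small invented network: U -> V, U -> W, V -> X, Y -> X.
edges = {("U", "V"), ("U", "W"), ("V", "X"), ("Y", "X")}

def parents(n):
    return {t for t, h in edges if h == n}

def descendants(n):
    found, frontier = set(), {n}
    while frontier:
        frontier = {h for t, h in edges if t in frontier} - found
        found |= frontier
    return found

def chains(x, y, path=None):
    """Yield every simple chain from x to y, ignoring edge direction."""
    path = path or [x]
    if x == y:
        yield path
        return
    for t, h in edges:
        for here, there in ((t, h), (h, t)):
            if here == x and there not in path:
                yield from chains(there, y, path + [there])

def blocked(chain, A):
    """Does the set A block this chain, per the three rules above?"""
    for i in range(1, len(chain) - 1):
        prev, z, nxt = chain[i - 1], chain[i], chain[i + 1]
        if prev in parents(z) and nxt in parents(z):  # head-to-head at Z
            if z not in A and not (descendants(z) & A):
                return True
        elif z in A:  # head-to-tail or tail-to-tail meeting at a node in A
            return True
    return False

def d_separated(x, y, A):
    return all(blocked(c, A) for c in chains(x, y))

assert d_separated("W", "X", {"U"})      # tail-to-tail at U, which is in A
assert d_separated("W", "Y", set())      # head-to-head at X blocks by default
assert not d_separated("W", "Y", {"X"})  # conditioning on X opens the chain
```

Enumerating chains is exponential in the worst case, so practical implementations use more efficient reachability-based algorithms instead; the brute-force version, though, mirrors the definition exactly.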