
BAYESIAN NETWORKS in Machine Learning

So basically, this topic is a mix of three theories, namely:

  1. Graph Theory
  2. Probability Theory
  3. Bayes' Theorem

1. Graph Theory

Graph theory is the mathematical study of graphs.

Graph = Nodes (or vertices) + Arcs (or lines, or edges!)

So a node can be anything you want: a company, a profit figure, or a band.

The thing is, you can have many nodes, but they have to have some form of relation between them, and the arcs capture those relations. Graphs can be directed (each arc has a direction) or undirected.
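A directed graph can be sketched as a simple adjacency list mapping each node to the nodes its arcs point to. The node names below (a classic rain/sprinkler/wet-grass setup) are made up for illustration:

```python
# A minimal sketch of a directed graph as an adjacency list.
# Each arc points from a node to one of its children.
graph = {
    "Rain":      ["WetGrass"],
    "Sprinkler": ["WetGrass"],
    "WetGrass":  [],
}

# Print every directed arc in the graph.
for node, children in graph.items():
    for child in children:
        print(f"{node} -> {child}")
```

In an undirected graph, by contrast, each relation would appear in both nodes' lists.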


2. Probability Theory

Probability is a measure of the likelihood of something happening, and it is always a value between 0 and 1. The more likely an event, the closer its probability is to 1, and vice versa.

Probability = (outcomes in the event) ÷ (all possible outcomes)

An example is a coin flip. The probability of getting heads is 0.50, and this brings us to our next idea, conditional probability.
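The formula above can be applied directly to the coin flip by counting outcomes in the sample space:

```python
# Probability as (outcomes in the event) / (all possible outcomes),
# using a single coin flip as the sample space.
sample_space = ["heads", "tails"]
favourable = [o for o in sample_space if o == "heads"]
p_heads = len(favourable) / len(sample_space)
print(p_heads)  # 0.5
```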

Instead of one event at a time, what is the probability of getting heads twice in a row? To answer that we need the conditional probability

P(Heads|Heads)

The '|' simply means 'given that' (the previous flip was heads). Since coin flips are independent, P(Heads|Heads) = 0.50, and the probability of two heads in a row is P(Heads) * P(Heads|Heads) = 0.50 * 0.50 = 0.25. You might ask: why multiply? What does multiplying mean? Can't we just add? We multiply because we want the probability of the intersection of the two events, that is, both flips landing heads.
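The two-heads calculation can be checked by enumerating every outcome of two flips:

```python
from itertools import product

# All outcomes of two coin flips: HH, HT, TH, TT.
outcomes = list(product(["H", "T"], repeat=2))

# Count the outcomes where both flips are heads.
p_two_heads = sum(1 for o in outcomes if o == ("H", "H")) / len(outcomes)
print(p_two_heads)  # 0.25, i.e. 0.50 * 0.50
```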

3. Bayes' Theorem

This is an extension of conditional probability: Bayes' theorem lets you use one conditional probability, P(B|A), to calculate its reverse, P(A|B).

P(A|B) = P(B|A) * P(A)/P(B)

i.e. Posterior = Likelihood × Prior / Evidence
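Here is a small worked sketch of the formula. The numbers are made up for illustration: a test for a rare condition, where we know the prior and the likelihoods and want the posterior.

```python
# Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B).
# All numbers below are hypothetical, chosen only to illustrate the formula.
p_a = 0.01              # prior: P(condition)
p_b_given_a = 0.90      # likelihood: P(positive test | condition)
p_b_given_not_a = 0.05  # P(positive test | no condition)

# Evidence P(B) via the law of total probability.
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)

# Posterior: P(condition | positive test).
p_a_given_b = p_b_given_a * p_a / p_b
print(round(p_a_given_b, 3))  # 0.154
```

Note how the posterior (about 15%) is far below the likelihood (90%), because the prior is so small.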

If you want more, I refer you to this article.


For great examples, follow this link. 

Sample code:

How AI can help fight Cholera. Feel free to contribute! https://goo.gl/kmcvKv


 

