In this post I walk through what I learned building a simple Gaussian Bayes classifier on the MNIST data. The objective is to show the capabilities of a "generative" model as a prelude to Generative Adversarial Networks and their applications.
The solution is built around computing the mean and covariance of the data belonging to each class, where a class is a single digit (0-9) from the MNIST dataset.
We use SciPy's scipy.stats.multivariate_normal to generate samples from the mean and covariance of a given class. In the plot above, 'Sample' indicates a draw from the class '8'.
“For each class y, we model p(x | y), rather than directly modeling p(y | x). This is the case where a 'Generative model' is very appropriate, as we will be using it to generate samples.”
Each 'Sample' plot is a draw from a multivariate normal distribution whose mean and covariance are estimated from that digit's images in the dataset, as in the short sketch below.
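As a rough sketch of that sampling step (assuming the class-'8' images are flattened into rows of an array X8; the variable names and the small ridge added to the covariance are illustrative choices, not necessarily what my project uses):

```python
import numpy as np
from scipy import stats

# X8: N x 784 array of flattened 28x28 MNIST images of the digit '8' (illustrative name)
mean_8 = X8.mean(axis=0)
cov_8 = np.cov(X8, rowvar=False) + 1e-3 * np.eye(X8.shape[1])  # small ridge keeps the covariance invertible

# Draw one new '8' from the fitted Gaussian and reshape it back to an image
sample = stats.multivariate_normal.rvs(mean=mean_8, cov=cov_8).reshape(28, 28)
```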
Python Code
A sample of the Python code is shown below. I am using Visual Studio Code from Microsoft, which has turned out to be an excellent tool for writing and debugging Python.
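Here is a minimal sketch of the idea (not the exact code from my project): one Gaussian is fitted per digit, classification picks the class with the largest log p(x | y) + log p(y) by Bayes' rule, and the same fitted Gaussian is reused to draw samples. The class name GaussianBayes and the smoothing constant are illustrative choices.

```python
import numpy as np
from scipy import stats

class GaussianBayes:
    """One multivariate Gaussian per class; classify with Bayes' rule, sample for generation."""

    def fit(self, X, y, smoothing=1e-2):
        self.gaussians = {}
        self.priors = {}
        for c in np.unique(y):
            Xc = X[y == c]
            # Per-class mean and covariance; the ridge keeps the covariance well conditioned
            mean = Xc.mean(axis=0)
            cov = np.cov(Xc, rowvar=False) + smoothing * np.eye(X.shape[1])
            self.gaussians[c] = (mean, cov)
            self.priors[c] = len(Xc) / len(X)

    def predict(self, X):
        classes = sorted(self.gaussians)
        # log p(x | y) + log p(y) for each class, then pick the argmax
        scores = np.column_stack([
            stats.multivariate_normal.logpdf(X, mean=self.gaussians[c][0], cov=self.gaussians[c][1])
            + np.log(self.priors[c])
            for c in classes
        ])
        return np.array(classes)[np.argmax(scores, axis=1)]

    def sample(self, c):
        # Generate a new image for class c from its fitted Gaussian
        mean, cov = self.gaussians[c]
        return stats.multivariate_normal.rvs(mean=mean, cov=cov)
```

Usage would look something like model.fit(Xtrain, ytrain), then model.predict(Xtest) to check accuracy, and model.sample(8).reshape(28, 28) to draw the kind of '8' shown in the plot.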
More to come soon once I finish the multi-modal Gaussian classifier. (#python #bayesian #generative #classifier) Stay tuned.