
Ensemble Methods

Ensemble methods are machine learning techniques that combine several base models in order to produce one optimal predictive model. An ensemble is itself a supervised learning algorithm, because it can be trained and then used to make predictions. Ensembles tend to yield better results when there is significant diversity among the base models. Ensemble techniques (especially bagging) tend to reduce problems related to over-fitting of the training data. Ensembling reduces variance and bias, the two sources of error that can cause large differences between predicted and actual results.

Types of ensembles:
1) Bayes optimal classifier
2) Bagging
3) Boosting
4) Bayesian parameter averaging
5) Bayesian model combination
6) Bucket of models
7) Stacking

1) Bayes optimal classifier (or optimal Bayes classifier): The optimal Bayes classifier chooses the class that has the greatest a posteriori probability of occurrence (the so-called maximum a posteriori estimate).
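As a concrete illustration of one of these techniques, here is a minimal bagging sketch using scikit-learn's BaggingClassifier; the dataset and hyperparameters are assumptions chosen for illustration, not something from the post.

```python
# Minimal bagging sketch (illustrative; dataset and settings are assumptions).
# Bagging fits many base models on bootstrap samples of the training data
# and combines their votes, which mainly reduces variance / over-fitting.
from sklearn.datasets import load_iris
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# 50 base estimators (scikit-learn's default base estimator is a decision tree),
# each trained on a bootstrap sample of the training set
bagged = BaggingClassifier(n_estimators=50, random_state=0)
bagged.fit(X_train, y_train)
print("test accuracy:", bagged.score(X_test, y_test))
```

The same pattern applies to the other ensemble types: boosting trains the base models sequentially so each one focuses on the previous models' errors, while stacking trains a meta-model on the base models' predictions.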

What is the Naive Bayes Theorem?

Naive Bayes is a supervised classification algorithm based on Bayes' theorem, with an assumption of independence among features, used to predict the category of a sample. It is a simple technique for constructing classifiers: it assumes that any feature in a class is independent of any other feature in the class. The main assumptions in Naive Bayes are that the features are independent and that every feature has equal importance for a class. Naive Bayes classifiers are probabilistic: they calculate the probability of each category using Bayes' theorem, and the category with the highest probability is the output.

Multinomial Naive Bayes: Feature vectors represent the frequencies with which certain events have been generated by a multinomial distribution. This is the event model typically used for document classification.

Bernoulli Naive Bayes: In the multivariate Bernoulli event model, features are independent Booleans (binary variables).
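Concretely, Bayes' theorem computes P(class | features) = P(features | class) * P(class) / P(features), and the naive independence assumption lets P(features | class) factor into a product of per-feature probabilities. Below is a minimal sketch of the multinomial event model for document classification using scikit-learn's MultinomialNB; the tiny corpus and labels are made-up assumptions for illustration, not from the post.

```python
# Minimal Multinomial Naive Bayes sketch for document classification.
# The tiny corpus and labels below are made-up illustrations.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

docs = [
    "free offer click now",       # spam
    "limited free prize now",     # spam
    "meeting agenda for monday",  # ham
    "project status and agenda",  # ham
]
labels = ["spam", "spam", "ham", "ham"]

# Word counts serve as the multinomial event counts
vectorizer = CountVectorizer()
X = vectorizer.fit_transform(docs)

model = MultinomialNB()  # learns per-class word frequencies with Laplace smoothing
model.fit(X, labels)

test = vectorizer.transform(["free agenda now"])
print(model.predict(test))  # prints the most probable category
```

For the Bernoulli event model described above, scikit-learn's BernoulliNB on binarized (present/absent) features would be the analogous choice.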