This post will help you get a better understanding of Naive Bayes.

Picture and edits by my sister (https://www.instagram.com/the_snap_artistry/).

This is a continuation of Part 1.

In this post, I will be discussing the following things (in bold). If you are new to Naive Bayes or want a quick revision, please check my notes to get started.

1. Why does Naive Bayes assume that the features are conditionally independent?

2. Sklearn has GaussianNB, MultinomialNB, CategoricalNB, and BernoulliNB → given data with categorical, numerical, and binary features, which model will you choose?

3. How do you implement Multinomial Naive Bayes from scratch for text data and match the results with Sklearn's MultinomialNB? (A minimal sketch follows this list.)

4. How do you implement Categorical Naive Bayes from scratch…
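To make question 3 concrete, here is a minimal sketch (under my own assumptions, not the exact code from the post) of Multinomial Naive Bayes written from scratch on a tiny bag-of-words matrix and checked against Sklearn's MultinomialNB. The toy corpus, labels, and alpha=1.0 are illustrative choices of mine:

```python
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

# Toy corpus and labels (illustrative, not from the post).
docs = ["good movie", "bad movie", "good good film", "bad acting"]
y = np.array([1, 0, 1, 0])
X = CountVectorizer().fit_transform(docs).toarray()
alpha = 1.0  # Laplace smoothing

# Class log priors: log P(c) = log(n_docs_in_class_c / n_docs)
classes, class_counts = np.unique(y, return_counts=True)
log_prior = np.log(class_counts / class_counts.sum())

# Smoothed likelihoods: log P(w_j | c) = log((count_jc + alpha) / (total_count_c + alpha * n_features))
feature_counts = np.array([X[y == c].sum(axis=0) for c in classes])
log_likelihood = np.log(
    (feature_counts + alpha) /
    (feature_counts.sum(axis=1, keepdims=True) + alpha * X.shape[1])
)

# Posterior score (up to a constant): log P(c) + sum_j count_j * log P(w_j | c)
scores = X @ log_likelihood.T + log_prior
print(scores.argmax(axis=1))                             # from-scratch predictions
print(MultinomialNB(alpha=alpha).fit(X, y).predict(X))   # should match
```

With the smoothing defined the same way on both sides, the two sets of predictions line up exactly.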


This post will help you get a better understanding of Naive Bayes.

Picture and edits by my sister (https://www.instagram.com/the_snap_artistry/).

Just as Naive Bayes makes the naive assumption that the features we give to the model are independent, I had also made several naive assumptions about how NB works.

  • I assumed that MultinomialNB works for any given data (and used it for whatever data I had) → the post below explains in detail why this is a bad practice.
  • Gaussian Naive Bayes assumes that the features are Gaussian → the features themselves are not assumed to be Gaussian; it assumes the class-conditional likelihood probabilities follow a Gaussian distribution. (A minimal sketch follows this list.)
  • Calculating the likelihood probability is the same in MultinomialNB and CategoricalNB → No…
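To make the second point concrete, here is a minimal sketch showing that GaussianNB fits a per-class mean and variance for each feature, i.e. it models the class-conditional likelihood P(x_j | y = c) as a Gaussian, regardless of what the raw feature distribution looks like. The toy data is made up, and the recent scikit-learn attribute names (theta_, var_, class_prior_) are assumed:

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

# Toy 2-feature, 2-class data (illustrative assumption).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

gnb = GaussianNB().fit(X, y)

# gnb.theta_[c, j] and gnb.var_[c, j] are the fitted mean and variance of the
# class-conditional likelihood P(x_j | y = c).
x = X[0]
log_likelihood = -0.5 * np.sum(
    np.log(2 * np.pi * gnb.var_) + (x - gnb.theta_) ** 2 / gnb.var_, axis=1
)
log_posterior = np.log(gnb.class_prior_) + log_likelihood

print(log_posterior.argmax())   # manual Gaussian-likelihood prediction
print(gnb.predict([x])[0])      # should agree with sklearn
```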


A simple NLP dataset to quickly get your hands on text cleaning, pre-processing, and training.


If you are a beginner in Machine Learning and want to work on a simple NLP dataset, I definitely recommend checking out this Kaggle challenge and trying to solve it on your own before reading further.

In this challenge, I got my hands on cleaning HTML tags, analyzing histograms, building a custom W2V and TFIDF-W2V for vectorization, dealing with ordinal target labels, generating n-grams, using SVD for dimensionality reduction, and deploying to a cloud server.
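As one example of the vectorization step mentioned above, here is a minimal sketch of TFIDF-weighted Word2Vec: each document becomes a TF-IDF-weighted average of its word vectors. The tiny corpus is made up, and the gensim 4.x / recent scikit-learn APIs (vector_size=, get_feature_names_out) are assumptions on my side:

```python
import numpy as np
from gensim.models import Word2Vec
from sklearn.feature_extraction.text import TfidfVectorizer

# Tiny illustrative corpus (not the Kaggle data).
corpus = ["the camera quality is great",
          "battery life is poor",
          "great battery great camera"]
tokens = [s.split() for s in corpus]

# Custom W2V trained on our own corpus.
w2v = Word2Vec(tokens, vector_size=50, min_count=1, epochs=50)

# IDF weight per word; iterating over tokens (with repeats) brings in the TF part.
tfidf = TfidfVectorizer().fit(corpus)
idf = dict(zip(tfidf.get_feature_names_out(), tfidf.idf_))

def tfidf_w2v(sentence):
    words = [w for w in sentence.split() if w in w2v.wv and w in idf]
    if not words:
        return np.zeros(w2v.vector_size)
    weights = np.array([idf[w] for w in words])
    vectors = np.array([w2v.wv[w] for w in words])
    return (weights[:, None] * vectors).sum(axis=0) / weights.sum()

doc_vectors = np.array([tfidf_w2v(s) for s in corpus])
print(doc_vectors.shape)   # (3, 50): one 50-d vector per document
```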

Table of contents:

  1. Introduction.
  2. Exploratory Data Analysis.
  3. Performance metric.
  4. First-cut approach.
  5. Text cleaning.
  6. Text…


Sequence-to-sequence models, with an end-to-end machine translation example in TensorFlow.

The detailed architecture of the encoder-decoder model with the sample input.

Prerequisites:

A basic understanding of LSTMs and TensorFlow will suffice to follow this post to the end.

Table of contents:

  1. Introduction.
  2. High-level Architecture explanation.
  3. Data Preprocessing.
  4. Training the Encoder-Decoder Model.
  5. How to write Inference.
  6. Results.
  7. Shortcomings of the Encoder-Decoder Model.

1. Introduction:

Encoder-decoder models are used in machine translation and conversational chatbots. They are also used in image captioning (given an image, briefly describe it), video captioning (given a video file, briefly describe it), and LaTeX mathematical expressions (given an image of a formula, generate the mathematical expression in LaTeX format).
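Before getting to the translation task, here is a minimal sketch of the encoder-decoder wiring this post builds on, using Keras LSTMs. The vocabulary sizes, embedding dimension, and hidden units below are illustrative placeholders, not the values used later in the post:

```python
from tensorflow.keras import layers, Model

# Illustrative sizes (assumptions, not the post's actual hyperparameters).
src_vocab, tgt_vocab, embed_dim, units = 5000, 5000, 128, 256

# Encoder: embed the source sentence and keep only the final LSTM states.
enc_inputs = layers.Input(shape=(None,), name="encoder_inputs")
enc_emb = layers.Embedding(src_vocab, embed_dim)(enc_inputs)
_, state_h, state_c = layers.LSTM(units, return_state=True)(enc_emb)

# Decoder: start from the encoder states and predict the target sequence,
# token by token, with teacher forcing during training.
dec_inputs = layers.Input(shape=(None,), name="decoder_inputs")
dec_emb = layers.Embedding(tgt_vocab, embed_dim)(dec_inputs)
dec_out, _, _ = layers.LSTM(units, return_sequences=True, return_state=True)(
    dec_emb, initial_state=[state_h, state_c])
outputs = layers.Dense(tgt_vocab, activation="softmax")(dec_out)

model = Model([enc_inputs, dec_inputs], outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()
```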

In this post, I will work on machine translation. Our goal is given an Italian…


Simple encoder-decoder model

In this post, we will learn the basics of sequence-to-sequence models and build a simple calculator application.

Sequence-to-sequence models are widely used in machine translation and image captioning. In our calculator sequence-to-sequence model, given an input sequence (‘10+21’), we will try to predict an output sequence (‘31’). We will limit our scope to three-digit addition, multiplication, subtraction, and division.
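As a preview of the data preparation section, here is a minimal sketch of how such (expression, result) string pairs could be generated; the sample count, the random seed, and rounding division to two decimals are my own illustrative choices:

```python
import random

random.seed(42)

# Supported operations for the calculator model.
OPS = {
    "+": lambda a, b: a + b,
    "-": lambda a, b: a - b,
    "*": lambda a, b: a * b,
    "/": lambda a, b: round(a / b, 2),
}

def make_pair():
    a, b = random.randint(0, 999), random.randint(1, 999)  # b >= 1 avoids division by zero
    op = random.choice(list(OPS))
    return f"{a}{op}{b}", str(OPS[op](a, b))

pairs = [make_pair() for _ in range(50000)]
print(pairs[:3])   # a few (expression, result) string pairs
```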

Table of contents:

  1. Data preparation.
  2. Pre-processing.
  3. Building a sequence to sequence model using LSTM.
  4. Inference.
  5. Results.
  6. Shortcomings of the current approach.

1. Data Preparation:

At the end of this section, we will generate input-output pairs as shown…


Image credits: https://wallpaperaccess.com

In middleware companies and startups, people expect data scientists to do everything from data pre-processing to model productionization. So if you are a beginner-level ML engineer who wants to build a product, or someone interested in deploying their model, you are in the right place.

Before going further: I will be using the Amazon AWS free-tier EC2 service to launch an instance, and you need a valid debit or credit card to create an AWS account.

Let’s get started!

At a high level, we will be doing the following:

1. Writing inference code for our model.
2. Creating a simple web API using Flask.
3. Containerizing the application with…
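Here is a minimal sketch of step 2: a small Flask API around a trained model. The model file name ('model.pkl'), the /predict route, and the assumption that the pickled object is a scikit-learn pipeline accepting raw text are all illustrative choices, not the exact setup from this post:

```python
import pickle
from flask import Flask, request, jsonify

app = Flask(__name__)

# Assumed: a pickled scikit-learn pipeline that accepts raw text input.
with open("model.pkl", "rb") as f:
    model = pickle.load(f)

@app.route("/predict", methods=["POST"])
def predict():
    text = request.get_json().get("text", "")
    prediction = model.predict([text])[0]
    return jsonify({"prediction": str(prediction)})

if __name__ == "__main__":
    # 0.0.0.0 so the API is reachable from outside the EC2 instance.
    app.run(host="0.0.0.0", port=8080)
```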

GOWTHAM CH

Trying to make machines smarter.
