
Posts

A Brief Overview of GPT-3 by OpenAI

Recent posts

Machine Learning Loss Functions in Practice

Error (loss) functions estimate how wrong a model's predictions are so that the weights can be updated to reduce the error on the next iteration. Since you clicked on this article, I assume you already know the fundamentals of machine learning pipelines and want to learn about loss functions specifically, so let's jump straight to them. I will also show you how to use these loss functions in scikit-learn/PyTorch. Broadly, loss functions fall into two categories: loss functions for regression problems and loss functions for classification problems. Regression problems: the two most common loss functions are MSE (Mean Squared Error) and MAE (Mean Absolute Error). MSE / Quadratic Loss / L2 Loss: if the target values follow a Gaussian (normal) distribution, MSE is the preferred loss function for regression problems. MSE is the mean of the squared differences between the target values (ground truth) and the predicted values. The implemen…
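Before reaching for a library, both losses can be sketched in a few lines of plain Python (the example values are invented; scikit-learn exposes the same metrics as `mean_squared_error` and `mean_absolute_error` in `sklearn.metrics`):

```python
# Pure-Python sketch of MSE and MAE; example values are made up.
def mse(y_true, y_pred):
    # mean of the squared differences between targets and predictions
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def mae(y_true, y_pred):
    # mean of the absolute differences
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

y_true = [3.0, -0.5, 2.0, 7.0]
y_pred = [2.5, 0.0, 2.0, 8.0]
print(mse(y_true, y_pred))  # → 0.375
print(mae(y_true, y_pred))  # → 0.5
```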

Calculate Confusion Matrix Manually

This article was originally published at Medium. To understand the terminology properly, I will use a simple binary classification problem. Let's say our dataset contains product reviews from an e-commerce website. Each review has a label, either positive (1) or negative (0), and our task is to classify whether a review is positive or negative. Let's assume that, using various NLP techniques, we have built a model that can predict the labels. For example, the CSV snapshot below shows a sample of our actual and predicted labels after our model made its predictions. fig 1: our sample product review predictions against the actual true labels. In this dataset, 0 means a negative review and 1 means a positive review. We obtained the predicted labels using a machine learning model; I won't explain any machine learning model or the training/testing phase in this article. If you calculate the tr…
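Counting the four confusion-matrix cells by hand can be sketched like this (the labels below are invented stand-ins for the review dataset):

```python
# Hypothetical actual/predicted labels; 1 = positive review, 0 = negative.
actual    = [1, 0, 1, 1, 0, 0, 1, 0]
predicted = [1, 0, 0, 1, 0, 1, 1, 0]

pairs = list(zip(actual, predicted))
tp = sum(1 for a, p in pairs if a == 1 and p == 1)  # true positives
tn = sum(1 for a, p in pairs if a == 0 and p == 0)  # true negatives
fp = sum(1 for a, p in pairs if a == 0 and p == 1)  # false positives
fn = sum(1 for a, p in pairs if a == 1 and p == 0)  # false negatives
print(tp, tn, fp, fn)  # → 3 3 1 1
```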

Regularization in Deep Learning / Machine Learning - Prevent Overfitting

image source: mlexplained. Overfitting happens in every machine learning (ML) problem, and learning how to deal with it is essential to mastering machine learning. The fundamental issue in machine learning is the tension between optimization and generalization. Optimization refers to the process of adjusting a model to get the best performance possible on the training data (the learning in machine learning), whereas generalization refers to how well the trained model performs on data it has never seen before. The goal of the game is good generalization, of course, but you don't control generalization; you can only adjust the model based on its training data. The process of fighting overfitting is called regularization [1]. How do you know whether a model is overfitting? The best initial method is to measure the error on a training set and a test set. If you see low error on the training set but high error on the test and validation sets, then you have like…
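One common regularization technique (not the only one) is to add an L2 weight penalty to the training loss; here is a minimal sketch, with invented weights and an invented data loss value:

```python
# Sketch of L2 (weight-decay) regularization: penalize large weights
# by adding lam * sum(w^2) to the data loss. All values are made up.
def l2_penalty(weights, lam=0.01):
    return lam * sum(w ** 2 for w in weights)

weights = [0.5, -1.2, 0.3]
data_loss = 0.42  # hypothetical loss measured on the training data
total_loss = data_loss + l2_penalty(weights)
print(total_loss)  # → 0.4378
```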

Encapsulation VS Abstraction in Object-Oriented Programming

Encapsulation binds together the data and the functions that manipulate that data, keeping both safe from interference and misuse. Real-world example: every time you log into your email account (Gmail, Yahoo, Hotmail, or an official mail account), a whole lot of processing takes place in the backend that you have no control over. Your password is probably retrieved in an encrypted form and verified, and only then are you given access. You have no control over how the password is verified, and this keeps it safe from misuse. Abstraction is the process of hiding the implementation from the user; only the functionality is exposed. You are aware only of what the application does, not how it does it. Real-world example: when you log into your email, then compose and send a mail, there is again a whole lot of background processing involved: verifying the recipient, sending a request to the email server, sending your email. Here you are only interested in composing and clicking o…
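The login example above can be sketched as a toy class (class and method names are illustrative, and real systems use salted password hashing, not this simplified scheme):

```python
# Toy sketch of encapsulation: the password check is hidden behind a
# method, and callers never touch the stored hash directly.
import hashlib

class EmailAccount:
    def __init__(self, password):
        # store only a hash of the password, never the raw value
        self._password_hash = hashlib.sha256(password.encode()).hexdigest()

    def log_in(self, password):
        # verification logic is internal; callers only see True/False
        return hashlib.sha256(password.encode()).hexdigest() == self._password_hash

acct = EmailAccount("s3cret")
print(acct.log_in("s3cret"))  # → True
print(acct.log_in("wrong"))   # → False
```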

Lambda, Map, and Filter in Python

Lambda: lambda functions are known as anonymous functions in Python. They are declared without a name, can accept any number of arguments, are restricted to a single expression, and are typically used only once. A lambda function is useful when you need a function for a short period of time. Basic syntax: lambda arguments: expression. Map: map is a built-in function that takes a function object and one or more iterables as its parameters. Basic syntax: map(function_object, iterable_1, iterable_2, ...). Filter: filter works in a similar way to map. The difference is that it builds a new sequence from the elements of the iterable that satisfy some condition, and it takes exactly one function and one iterable as parameters. Basic syntax: filter(function, iterable). It is normally used with lambda functions.
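The three constructs can be shown together in a short snippet (the values are illustrative):

```python
# lambda: an unnamed single-expression function
square = lambda x: x * x
print(square(4))  # → 16

nums = [1, 2, 3, 4, 5]

# map: apply a function to every element of an iterable
squares = list(map(lambda x: x * x, nums))
print(squares)  # → [1, 4, 9, 16, 25]

# filter: keep only the elements that satisfy a condition
evens = list(filter(lambda x: x % 2 == 0, nums))
print(evens)  # → [2, 4]
```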

Differences between MVC and MVT framework

MVC (Model-View-Controller) is a software design pattern for developing web applications. It consists of the following parts. Model: responsible for managing the application's data; it can be a database, a single object, or some structure of objects, and it can also contain logic to update the controller when its data changes. View: represents the visualization of the data that a model contains and is responsible for deciding which data should be displayed for a particular request. Controller: connects the View with the Model; it controls the data flow, keeps the Model and View separate, and is responsible for responding to user input and performing interactions on the data model objects. Django's MVT (Model-View-Template) pattern is slightly different from MVC. The main difference is that Django itself takes care of the Controller part, and a Template is just an HTML file mixed with DTL, the Django Template Language.
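The MVC separation described above can be illustrated with a tiny framework-free sketch (this is not Django code; all class and method names are invented for illustration):

```python
# Plain-Python sketch of the Model / View / Controller roles.
class Model:
    """Manages the application's data (here, a hard-coded list)."""
    def __init__(self):
        self.items = ["apple", "banana"]

class View:
    """Renders the data it is given; knows nothing about storage."""
    def render(self, items):
        return "\n".join(f"- {item}" for item in items)

class Controller:
    """Connects View and Model and controls the data flow."""
    def __init__(self, model, view):
        self.model = model
        self.view = view

    def handle_request(self):
        # pull data from the model and hand it to the view
        return self.view.render(self.model.items)

app = Controller(Model(), View())
print(app.handle_request())  # → "- apple" and "- banana" on two lines
```

In Django's MVT variant, the framework itself plays the Controller role here, and the `View.render` step is replaced by an HTML template written in DTL.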