Machine Learning (ML)

Machine Learning is one of the fastest growing fields and enables computers to perform specific tasks better than humans. It is actively used at companies such as Apple, Tesla, Google and Facebook. We cover the latest developments in the field.


Idea of Pruning in Machine Learning (ML)

Pruning in Machine Learning is an optimization technique for Neural Network models. We have explained how pruning is done, the different types of pruning (including random pruning), and its advantages and disadvantages.
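
A common variant is magnitude-based pruning: the smallest-magnitude weights are set to zero, on the assumption that they contribute least to the output. A minimal pure-Python sketch (the weight values are toy numbers, not from the article):

```python
def magnitude_prune(weights, sparsity):
    """Zero out the fraction `sparsity` of weights with the smallest magnitude."""
    ranked = sorted(abs(w) for w in weights)
    k = int(len(ranked) * sparsity)          # number of weights to prune
    if k == 0:
        return list(weights)
    threshold = ranked[k - 1]
    # Ties at the threshold may prune slightly more than k weights.
    return [0.0 if abs(w) <= threshold else w for w in weights]

print(magnitude_prune([0.1, -0.5, 0.05, 0.9, -0.2, 0.3], 0.5))
# -> [0.0, -0.5, 0.0, 0.9, 0.0, 0.3]
```

In practice the surviving weights are then fine-tuned so the network recovers accuracy lost to pruning.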

Srishti Guleria

One Shot Learning in ML

In this article, we have explained the idea of One Shot Learning in Machine Learning (ML) and where and how it is used along with the limitation of One Shot Learning.

Akshay Atam

SqueezeNet Model [with Architecture]

SqueezeNet is a CNN architecture that has 50 times fewer parameters than AlexNet while maintaining AlexNet-level accuracy. We also showcase the architecture of the model along with its implementation on the ImageNet dataset.

Akshay Atam

Back Propagation (Intuitively)

Back Propagation is one of the most important topics in Neural Network training. It is the process of tuning the weights of a neural network based on the error rate from the previous epoch.
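
The weight update can be sketched for the simplest possible case, a single linear neuron with squared-error loss (toy values, assumed for illustration):

```python
def train_step(w, b, x, y, lr=0.1):
    """One backpropagation step for a single linear neuron with squared error."""
    y_hat = w * x + b          # forward pass
    err = y_hat - y            # error on this example
    dw = err * x               # gradient of 0.5 * err**2 w.r.t. w
    db = err                   # gradient w.r.t. b
    return w - lr * dw, b - lr * db

w, b = 0.0, 0.0
for _ in range(50):            # repeated epochs shrink the error
    w, b = train_step(w, b, x=1.0, y=2.0)
print(round(w + b, 3))         # prediction for x=1 converges to the target: 2.0
```

Real networks apply the same idea layer by layer via the chain rule, propagating the error gradient backwards from the output.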

Srishti Guleria

Stacking in Machine Learning

Stacking (a.k.a. Stacked Generalization) is an ensemble technique that uses meta-learning to generate predictions. We have explained stacking in Machine Learning (ML) in depth.
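
The core idea can be sketched in pure Python: a meta-learner is fit on the base models' out-of-fold predictions. Here the meta-learner is a no-intercept linear blend solved in closed form via the 2x2 normal equations (the prediction and target values are toy numbers, assumed for illustration):

```python
def fit_meta_weights(preds_a, preds_b, targets):
    """Meta-learner for stacking two base models: least-squares blending
    weights (no intercept), solved via the 2x2 normal equations."""
    saa = sum(a * a for a in preds_a)
    sbb = sum(b * b for b in preds_b)
    sab = sum(a * b for a, b in zip(preds_a, preds_b))
    say = sum(a * y for a, y in zip(preds_a, targets))
    sby = sum(b * y for b, y in zip(preds_b, targets))
    det = saa * sbb - sab * sab
    return (say * sbb - sby * sab) / det, (sby * saa - say * sab) / det

# Out-of-fold predictions from two base models on three examples:
w_a, w_b = fit_meta_weights([1.0, 2.0, 3.0], [3.0, 2.0, 1.0], [2.0, 2.0, 2.0])
print(w_a, w_b)   # -> 0.5 0.5 (each base model weighted equally here)
```

Production stacking typically uses a stronger meta-model (e.g. logistic regression over cross-validated base predictions), but the structure is the same.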

MAYANK PATEL

SAME and VALID padding

SAME and VALID padding are two common types of padding used in CNN (Convolution Neural Network) models. SAME padding pads the input so that, at stride 1, the output has the same spatial size as the input, while VALID padding adds no padding at all.
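
The resulting output sizes follow TensorFlow's conventions and can be computed directly (a small sketch, with example sizes chosen for illustration):

```python
import math

def conv_output_size(n, k, s, padding):
    """Spatial output size of a convolution over an n-wide input with
    kernel size k and stride s, following TensorFlow's SAME/VALID rules."""
    if padding == "SAME":
        return math.ceil(n / s)        # padded so every input position is covered
    if padding == "VALID":
        return (n - k) // s + 1        # no padding; the kernel must fit fully
    raise ValueError(padding)

print(conv_output_size(28, 3, 1, "SAME"))    # -> 28 (size preserved at stride 1)
print(conv_output_size(28, 3, 1, "VALID"))   # -> 26
print(conv_output_size(28, 3, 2, "SAME"))    # -> 14
print(conv_output_size(28, 3, 2, "VALID"))   # -> 13
```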

Ue Kiao, PhD

Xception: Deep Learning with Depth-wise Separable Convolutions

Xception is a deep convolutional neural network architecture built on Depthwise Separable Convolutions. The network was introduced by François Chollet, who works at Google, Inc.

Zuhaib Akhtar

Understand Autoencoders by implementing in TensorFlow

Autoencoders are a type of unsupervised neural network with two components: an encoder and a decoder. We have provided an intuitive explanation of how an autoencoder works, along with a step-by-step TensorFlow implementation.

Surya Pratap Singh

Concept Whitening in Machine Learning

Concept Whitening is a technique for interpreting the inner workings of a neural network. Rather than searching for explanations among the trained parameters, it tries to find answers by interpreting the network itself.

Srishti Guleria

ELMo: Deep contextualized word representations

ELMo is a state-of-the-art NLP model developed by researchers at the Paul G. Allen School of Computer Science & Engineering, University of Washington. In this article, we go through ELMo in depth and understand its working.

Zuhaib Akhtar

Differentiating fake faces using simple ML and computer vision

We have explored a technique to differentiate fake faces using simple Machine Learning (ML) and computer vision, using the Large-scale CelebFaces Attributes (CelebA) dataset.

Akshay Atam

How a ML Dataset is designed?

In this article, we have explored how a Machine Learning (ML) dataset is designed, using the example of the PASCAL Visual Object Classes (VOC) dataset: how the initial dataset was prepared and how it evolved over the years.

Ayush Mehar

Different Basic Operations in CNN

We have explored the different operations in CNN (Convolution Neural Network) such as Convolution operation, Pooling, Flattening, Padding, Fully connected layers, Activation function (like Softmax) and Batch Normalization.
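
The two most fundamental of these operations, convolution and pooling, can be sketched in pure Python on nested lists (toy input values, assumed for illustration):

```python
def conv2d_valid(img, kernel):
    """2D convolution (cross-correlation, VALID padding) on nested lists."""
    kh, kw = len(kernel), len(kernel[0])
    return [[sum(img[i + di][j + dj] * kernel[di][dj]
                 for di in range(kh) for dj in range(kw))
             for j in range(len(img[0]) - kw + 1)]
            for i in range(len(img) - kh + 1)]

def max_pool_2x2(fmap):
    """2x2 max pooling with stride 2: keep the largest value per block."""
    return [[max(fmap[i][j], fmap[i][j + 1], fmap[i + 1][j], fmap[i + 1][j + 1])
             for j in range(0, len(fmap[0]) - 1, 2)]
            for i in range(0, len(fmap) - 1, 2)]

img = [[1, 2, 3, 4],
       [5, 6, 7, 8],
       [9, 10, 11, 12],
       [13, 14, 15, 16]]
print(max_pool_2x2(img))   # -> [[6, 8], [14, 16]]
```

Flattening, padding, fully connected layers and batch normalization then operate on the feature maps these two operations produce.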

Apoorva Kandpal

ResNet50 v1.5 architecture

ResNet50 v1.5 is a modified version of the original ResNet50 model. It is slightly more accurate (by about 0.5%) but has slightly lower throughput (down by about 5%) compared to ResNet50.

Ayush Mehar

Multilayer Perceptron

Multilayer perceptron is a fundamental concept in Machine Learning (ML) that led to the first successful ML model, the Artificial Neural Network (ANN). We have explored the idea of the Multilayer Perceptron in depth.
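
A minimal sketch of the forward pass of a one-hidden-layer perceptron with ReLU activation, in pure Python (the weights and input are toy values chosen for illustration):

```python
def mlp_forward(x, w1, b1, w2, b2):
    """Forward pass of a one-hidden-layer perceptron with ReLU activation."""
    hidden = [max(0.0, sum(w * xi for w, xi in zip(row, x)) + b)
              for row, b in zip(w1, b1)]               # hidden layer + ReLU
    return [sum(w * h for w, h in zip(row, hidden)) + b
            for row, b in zip(w2, b2)]                 # linear output layer

# Toy weights: the hidden layer passes inputs through, the output sums them.
out = mlp_forward([1.0, 2.0],
                  w1=[[1.0, 0.0], [0.0, 1.0]], b1=[0.0, 0.0],
                  w2=[[1.0, 1.0]], b2=[0.0])
print(out)   # -> [3.0]
```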

Ayush Mehar

Multilayer Perceptrons vs CNN

We have explored the key differences between the Multilayer Perceptron and the CNN in depth. Both are fundamental concepts in Machine Learning. When we apply activations to multilayer perceptrons, we get an Artificial Neural Network (ANN), one of the earliest ML models.

Ayush Mehar

Disadvantages of RNN

We have explored the disadvantages of RNNs in depth. Recurrent Neural Networks (RNNs) are the first neural networks of their kind that can analyze and learn from sequences of data, rather than just individual instances.

Dishant Parikh

BERT base vs BERT large

BERT base model has 12 encoder layers stacked on top of each other whereas BERT large has 24 layers of encoders stacked on top of each other.
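
The published configurations of the two models can be summarized as follows (parameter counts are approximate figures from the original BERT paper):

```python
# Published BERT configurations; parameter counts are approximate.
BERT_CONFIGS = {
    "base":  {"encoder_layers": 12, "hidden_size": 768,
              "attention_heads": 12, "params_millions": 110},
    "large": {"encoder_layers": 24, "hidden_size": 1024,
              "attention_heads": 16, "params_millions": 340},
}

print(BERT_CONFIGS["large"]["encoder_layers"])   # -> 24
```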

Zuhaib Akhtar

Applications of NLP: Extraction from PDF, Language Translation and more

In this article, we have explored core NLP applications such as text extraction, language translation, text classification, question answering, text to speech, speech to text and more.

Shubham Sood

Applications of NLP: Text Generation, Text Summarization and Sentiment Analysis

In this article, we have explored 3 core NLP applications such as Text Generation using GPT models, Text summarization and Sentiment Analysis.

Shubham Sood

Differences between Standardization, Regularization, Normalization in ML

We have covered the differences between standardization, regularization and normalization in depth, along with introductory background and a complete explanation of the key terms.
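
The two data transforms among these can be sketched directly in pure Python (example values chosen for illustration):

```python
def standardize(xs):
    """Z-score standardization: rescale to zero mean and unit variance."""
    mean = sum(xs) / len(xs)
    std = (sum((x - mean) ** 2 for x in xs) / len(xs)) ** 0.5
    return [(x - mean) / std for x in xs]

def min_max_normalize(xs):
    """Min-max normalization: rescale values into the range [0, 1]."""
    lo, hi = min(xs), max(xs)
    return [(x - lo) / (hi - lo) for x in xs]

print(min_max_normalize([1, 2, 3, 4, 5]))   # -> [0.0, 0.25, 0.5, 0.75, 1.0]
```

Regularization, by contrast, is not a data transform at all: it is a penalty added to the training loss on the model's weights, so it has no analogous one-liner here.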

Ayush Mehar

ALBERT (A Lite BERT) NLP model

ALBERT stands for A Lite BERT and is a modified version of the BERT NLP model. It builds on three key ideas: Parameter Sharing, Embedding Factorization and Sentence Order Prediction (SOP).

Zuhaib Akhtar

Different core topics in NLP (with Python NLTK library code)

In this article, we have covered different NLP tasks and topics such as Tokenization of Sentences and Words, Stemming, Lemmatization, POS Tagging, Named Entity Recognition and more.

Shubham Sood

XLNet, RoBERTa, ALBERT models for Natural Language Processing (NLP)

We have explored some advanced NLP models such as XLNet, RoBERTa and ALBERT, and compare them with the fundamental model, BERT, to see how they differ.

Shubham Sood

LSTM & BERT models for Natural Language Processing (NLP)

LSTM was the fundamental model initially used for NLP, but because of its drawbacks, BERT became the favored model for NLP tasks.

Shubham Sood
OpenGenus IQ © 2025 All rights reserved ™
Contact - Email: team@opengenus.org
Primary Address: JR Shinjuku Miraina Tower, Tokyo, Shinjuku 160-0022, JP
Office #2: Commercial Complex D4, Delhi, Delhi 110017, IN