Deep Learning

Deep Learning is a subset of Machine Learning that leverages core concepts such as Neural Networks to perform tasks with precision comparable to that of humans.

Deep Learning

Concept and Data Drift in Deep Learning

"Drift" is a concept frequently employed in machine learning to delineate the gradual deterioration in the operational efficacy of a machine learning model over an extended period.

Rishi Shivhare
Deep Learning

Polyak Averaging

Polyak averaging (also known as Polyak-Ruppert averaging, after Boris Polyak and David Ruppert) is a method of model averaging that helps produce a more robust and accurate final model by combining parameter values from different training iterations.
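
As a minimal sketch (assuming plain NumPy and a toy quadratic objective, both chosen here purely for illustration), the running average of SGD iterates can be maintained incrementally:

    import numpy as np

    theta = np.zeros(3)          # current parameters (toy example)
    theta_avg = np.zeros(3)      # Polyak average of the iterates
    for t in range(1, 101):
        grad = theta - np.array([1.0, 2.0, 3.0])   # gradient of a toy quadratic
        theta -= 0.1 * grad                         # plain SGD step
        theta_avg += (theta - theta_avg) / t        # incremental mean of iterates
    print(theta_avg)  # close to the optimum [1, 2, 3], with iterate noise averaged out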

Rishi Shivhare
Deep Learning

Neural Scaling Law: A Brief Introduction

Neural scaling law is a term that describes how the performance of a neural network model depends on various factors such as the size of the model, the size of the training dataset, the cost of training, and the complexity of the task.
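
For intuition, here is a toy illustration of such a power-law fit in Python; the constants are illustrative stand-ins loosely modeled on published LLM scaling fits, not values taken from this article:

    # Illustrative power-law form L(N) ~ (Nc / N)**alpha, where N is parameter count.
    # Nc and alpha below are made-up illustrative constants.
    Nc, alpha = 8.8e13, 0.076
    for N in [1e6, 1e8, 1e10]:
        print(f"N={N:.0e}  loss~{(Nc / N) ** alpha:.2f}")  # loss falls as N grows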

Alexander Nilsson
Deep Learning

Gradient / Activation checkpointing

Discover Gradient Checkpointing, a memory-saving technique in deep learning. Learn how it trades extra computation for lower memory usage, enabling the training of deeper networks, and explore its real-world applications, advantages, and key takeaways.
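
A minimal PyTorch sketch, assuming the built-in torch.utils.checkpoint API (the block and sizes below are arbitrary examples):

    import torch
    from torch.utils.checkpoint import checkpoint

    # Wrap an expensive sub-network so its activations are recomputed during
    # the backward pass instead of being stored in memory.
    block = torch.nn.Sequential(
        torch.nn.Linear(512, 512), torch.nn.ReLU(),
        torch.nn.Linear(512, 512), torch.nn.ReLU(),
    )
    x = torch.randn(32, 512, requires_grad=True)
    y = checkpoint(block, x, use_reentrant=False)  # activations recomputed on backward
    y.sum().backward()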

Abhikalp Srivastava
Machine Learning (ML)

Understanding of Correlation vs. Causation

The ideas of correlation and causation are essential for understanding data and drawing conclusions from it. This article at OpenGenus examines the definitions of these terms, the reasons why they are frequently confused, and how to tell them apart.
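
A quick hedged illustration in NumPy: a hypothetical confounder (temperature) makes two otherwise unrelated quantities correlate strongly:

    import numpy as np

    rng = np.random.default_rng(0)
    # Hot weather drives both ice cream sales and sunburns, so the two
    # correlate strongly even though neither causes the other.
    temperature = rng.normal(25, 5, 1000)
    ice_cream = temperature + rng.normal(0, 2, 1000)
    sunburn = temperature + rng.normal(0, 2, 1000)
    print(np.corrcoef(ice_cream, sunburn)[0, 1])  # high (~0.85) despite no causal link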

Abraham Roy
Deep Learning

Multi-Head Attention in Deep Learning

Multi-Head Attention, a foundational component in modern neural networks, has transformed the way we process and understand sequential and structured data.
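
As a small sketch using PyTorch's built-in torch.nn.MultiheadAttention module (the layer sizes here are arbitrary):

    import torch

    mha = torch.nn.MultiheadAttention(embed_dim=64, num_heads=8, batch_first=True)
    x = torch.randn(2, 10, 64)          # (batch, sequence, embedding)
    out, weights = mha(x, x, x)         # self-attention: query = key = value = x
    print(out.shape, weights.shape)     # (2, 10, 64) and (2, 10, 10)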

Abhikalp Srivastava
Deep Learning

Gradient clipping in DL

Gradient clipping is a technique commonly used in machine learning, particularly in the training of deep neural networks, to address the issue of exploding gradients.
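
A minimal PyTorch sketch using the standard torch.nn.utils.clip_grad_norm_ helper:

    import torch

    model = torch.nn.Linear(10, 1)
    loss = model(torch.randn(4, 10)).pow(2).mean()
    loss.backward()
    # Rescale gradients so their global L2 norm is at most 1.0 before the optimizer step.
    torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)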

Rishi Shivhare
Deep Learning

Undistillable class in Deep Learning

An undistillable class is a class in a teacher model's knowledge that cannot be distilled, or transferred, to a student model.

Rudransh Deshmukh
Deep Learning

Capacity Mismatch in Deep Learning

In this article at OpenGenus.org, we have explored the concept of Capacity Mismatch in Deep Learning and Knowledge Distillation and discussed some solutions towards it.

Aditya Chatterjee
Deep Learning

Teach Less, Learn More (TLLM): Inspiration from Singapore Gov to DL

Teach Less, Learn More (TLLM) is a teaching initiative started by the Singapore Government in 2005, and it has inspired a popular Deep Learning paper that attempts to solve the problem of Undistillable Classes in Knowledge Distillation.

Rudransh Deshmukh
Machine Learning (ML)

Mastering Multi-Label Classification

Dive into the realm of multi-label classification, where AI tackles the intricacies of assigning multiple labels to data points. Explore label correlations, ethical dimensions, and algorithmic strategies in this captivating journey of AI complexity.
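
A minimal scikit-learn sketch of one common strategy, one-vs-rest with a binary classifier per label (the synthetic dataset is a stand-in):

    from sklearn.datasets import make_multilabel_classification
    from sklearn.multiclass import OneVsRestClassifier
    from sklearn.linear_model import LogisticRegression

    # Each sample can carry several of the 4 labels at once.
    X, Y = make_multilabel_classification(n_samples=200, n_classes=4, random_state=0)
    clf = OneVsRestClassifier(LogisticRegression(max_iter=1000)).fit(X, Y)
    print(clf.predict(X[:2]))   # each row is a 0/1 vector over the 4 labels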

Abhikalp Srivastava
Machine Learning (ML)

Non-negative matrix factorization (NMF) vs Principal Component Analysis (PCA)

In the field of data analysis and dimensionality reduction, Non-negative Matrix Factorization (NMF) and Principal Component Analysis (PCA) are two powerful techniques that play an important role in uncovering patterns, reducing noise, and extracting essential features from complex datasets.
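
A small scikit-learn sketch contrasting the two (toy random data; note that NMF requires non-negative inputs while PCA components may be negative):

    import numpy as np
    from sklearn.decomposition import NMF, PCA

    X = np.abs(np.random.RandomState(0).randn(100, 20))  # NMF needs non-negative data
    W = NMF(n_components=5, init="nndsvda", random_state=0, max_iter=500).fit_transform(X)
    Z = PCA(n_components=5).fit_transform(X)             # centered; scores can be negative
    print(W.min() >= 0, Z.min() >= 0)                    # True False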

Abdesselam Benameur
Deep Learning

Inference process in Deep Learning [Complete Guide]

Inference is the process of applying trained models to new data for predictions. It plays a vital role in real-world applications, enabling insights, automation, and real-time responses.
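
A minimal PyTorch sketch of inference mode, assuming an already-trained model (the toy network here is untrained, for illustration only):

    import torch

    model = torch.nn.Sequential(torch.nn.Linear(8, 4), torch.nn.ReLU(), torch.nn.Linear(4, 2))
    model.eval()                   # switch off dropout/batch-norm training behavior
    with torch.no_grad():          # skip gradient bookkeeping for speed and memory
        prediction = model(torch.randn(1, 8))
    print(prediction)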

Agniva Maiti
Natural Language Processing (NLP)

Retrieval Augmented Generation (RAG): Basics

In this article at OpenGenus, we have explored Retrieval Augmented Generation (RAG), a technique developed by Meta (formerly Facebook) that augments Large Language Models (LLMs) with documents retrieved from an external knowledge source.
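
A hedged, toy sketch of the retrieve-then-generate flow; retrieve() and generate() below are hypothetical placeholders for a real vector-store lookup and a real LLM call, not Meta's actual API:

    # Hypothetical placeholders, not a real library API.
    def retrieve(query, documents, k=2):
        # Toy lexical scorer: rank documents by word overlap with the query.
        score = lambda d: len(set(query.lower().split()) & set(d.lower().split()))
        return sorted(documents, key=score, reverse=True)[:k]

    def generate(prompt):
        return f"[LLM answer conditioned on]: {prompt}"

    docs = ["RAG augments generation with retrieval.", "Paris is the capital of France."]
    question = "What is the capital of France?"
    context = " ".join(retrieve(question, docs))
    print(generate(f"Context: {context}\nQuestion: {question}"))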

Devansh Biswal
Deep Learning

Deep Learning for traffic prediction [project with source code]

This article at OpenGenus explores the development of a Deep Learning (DL) traffic predictor using a comprehensive dataset.

Alexander Nilsson
Deep Learning

Time Complexity of im2row and im2col

In this article at OpenGenus, we have explored the time and space complexity of the im2row and im2col algorithms that are frequently used for GEMM-based convolution algorithms.
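
For reference, a minimal stride-1, 2-D im2col in NumPy; the number of elements it materializes, O(out_h * out_w * k * k), is what drives the complexity discussed in the article:

    import numpy as np

    def im2col(x, k):
        # Unroll every k x k patch of a 2-D input into one column.
        h, w = x.shape
        out_h, out_w = h - k + 1, w - k + 1
        cols = np.empty((k * k, out_h * out_w))
        for i in range(out_h):
            for j in range(out_w):
                cols[:, i * out_w + j] = x[i:i + k, j:j + k].ravel()
        return cols

    x = np.arange(16.0).reshape(4, 4)
    kernel = np.ones((3, 3))
    y = kernel.ravel() @ im2col(x, 3)   # convolution expressed as a single GEMM
    print(y.reshape(2, 2))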

Rudransh Deshmukh
Deep Learning

Journey through Backpropagation algorithm

This article walks through Backpropagation, an algorithm crucial to refining Neural Networks. It delves into its core process, explaining how it enables networks to learn from their errors and improve their accuracy, from the underlying math to practical application.
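
A minimal NumPy sketch of one backpropagation step through a two-layer network (toy shapes and data):

    import numpy as np

    rng = np.random.default_rng(0)
    x, target = rng.normal(size=(1, 3)), np.array([[1.0]])
    W1, W2 = rng.normal(size=(3, 4)), rng.normal(size=(4, 1))

    # Forward pass
    h = np.tanh(x @ W1)
    y = h @ W2
    loss = ((y - target) ** 2).mean()

    # Backward pass: apply the chain rule layer by layer
    dy = 2 * (y - target)              # dL/dy
    dW2 = h.T @ dy                     # dL/dW2
    dh = dy @ W2.T                     # dL/dh
    dW1 = x.T @ (dh * (1 - h ** 2))    # dL/dW1, through the tanh derivative
    W1 -= 0.1 * dW1                    # one gradient-descent step
    W2 -= 0.1 * dW2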

Abhikalp Srivastava
Deep Learning

Winograd's Convolution Theorem [Explained]

Winograd's convolution theorem proved a lower bound on the number of multiplications required for convolution and used the Chinese Remainder Theorem to construct optimal algorithms that achieve this minimum number of multiplications.
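
A quick NumPy check of the F(2,3) case, which produces two outputs of a 3-tap convolution with four multiplications instead of six, using the standard Winograd construction:

    import numpy as np

    d = np.array([1.0, 2.0, 3.0, 4.0])         # 4 input samples
    g = np.array([0.5, 0.25, 0.125])           # 3 filter taps
    m1 = (d[0] - d[2]) * g[0]
    m2 = (d[1] + d[2]) * (g[0] + g[1] + g[2]) / 2
    m3 = (d[2] - d[1]) * (g[0] - g[1] + g[2]) / 2
    m4 = (d[1] - d[3]) * g[2]
    # The two Winograd outputs match the direct sliding-dot-product results:
    print(m1 + m2 + m3, m2 - m3 - m4)
    print(d[0]*g[0] + d[1]*g[1] + d[2]*g[2], d[1]*g[0] + d[2]*g[1] + d[3]*g[2])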

Priyanshi Sharma
Deep Learning

Training process of Deep Learning models

Deep learning has revolutionized artificial intelligence, enabling machines to learn and make decisions autonomously, much as humans do. The training process, vital for model generalization and performance, is explored in this article at OpenGenus, along with how it varies across model types.
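
A hedged sketch of a generic supervised training loop in PyTorch (the data here is random stand-in material):

    import torch

    model = torch.nn.Linear(10, 1)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
    loss_fn = torch.nn.MSELoss()
    X, y = torch.randn(64, 10), torch.randn(64, 1)
    for epoch in range(5):
        optimizer.zero_grad()          # clear gradients from the previous step
        loss = loss_fn(model(X), y)    # forward pass
        loss.backward()                # backward pass (backpropagation)
        optimizer.step()               # parameter update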

Agniva Maiti
Deep Learning

6 Uses of Deep Learning in Laptops

Deep Learning, a powerful offshoot of machine learning and artificial intelligence, has been transforming everyday devices, laptops among them. This article at OpenGenus explores notable applications of deep learning in laptops, enhancing our interactions with these vital devices.

Anirudh Edpuganti
Deep Learning

5 uses of Deep Learning for Media: A Revolution in Content Creation

This article at OpenGenus delves into how deep learning is being applied in the media industry, revolutionizing the way we create, consume, and interact with media content.

Anirudh Edpuganti
Deep Learning

Deep Learning mock interview

In this article at OpenGenus, we will understand the flow of a deep learning interview for a data science based role.

Sanjana Babu
Deep Learning

Tensor Operations: Flatten and Squeeze

In this article at OpenGenus, we have explored two fundamental operations in Deep Learning models that modify tensor structure: Flatten and Squeeze.
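
A minimal PyTorch illustration of the two ops:

    import torch

    x = torch.zeros(2, 1, 3, 4)
    print(torch.flatten(x, start_dim=1).shape)  # (2, 12): collapse all dims after batch
    print(x.squeeze(1).shape)                   # (2, 3, 4): drop the size-1 dimension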

Sai Vamsi Karnam
Deep Learning

Adagrad: An Adaptive Gradient Algorithm for Optimization

Adaptive Gradient Algorithm, abbreviated as Adagrad, is a gradient-based optimization algorithm first introduced in 2011.
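
A minimal NumPy sketch of the Adagrad update rule on a toy quadratic:

    import numpy as np

    theta, G, lr, eps = np.array([5.0, -3.0]), np.zeros(2), 0.5, 1e-8
    for _ in range(100):
        grad = 2 * theta                         # gradient of ||theta||^2
        G += grad ** 2                           # accumulate squared gradients per parameter
        theta -= lr * grad / (np.sqrt(G) + eps)  # per-parameter shrinking step size
    print(theta)  # values shrink toward [0, 0]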

Sai Vamsi Karnam
Deep Learning

Adam Optimizer

This article introduces the Adam optimizer, an adaptive algorithm widely used in machine learning and deep learning. It combines ideas from Adagrad and RMSprop, offering faster convergence and improved performance on tasks such as image classification and object detection.
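
A minimal NumPy sketch of the Adam update with the commonly cited default constants, on a toy quadratic:

    import numpy as np

    theta, m, v = np.array([5.0, -3.0]), np.zeros(2), np.zeros(2)
    lr, b1, b2, eps = 0.05, 0.9, 0.999, 1e-8
    for t in range(1, 1001):
        grad = 2 * theta                       # gradient of ||theta||^2
        m = b1 * m + (1 - b1) * grad           # first-moment (momentum) estimate
        v = b2 * v + (1 - b2) * grad ** 2      # second-moment estimate
        m_hat, v_hat = m / (1 - b1 ** t), v / (1 - b2 ** t)  # bias correction
        theta -= lr * m_hat / (np.sqrt(v_hat) + eps)
    print(theta)  # approaches the optimum [0, 0]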

Agniva Maiti