Zuhaib Akhtar

Zuhaib is an Applied Scientist at Amazon, California, and has been a Machine Learning Developer Intern at OpenGenus. He completed his B.Tech from Aligarh Muslim University, Aligarh, in 2019.

Uttar Pradesh, India •
11 posts •
Machine Learning (ML)

Xception: Deep Learning with Depth-wise Separable Convolutions

Xception is a deep convolutional neural network architecture built around depthwise separable convolutions. The network was introduced by Francois Chollet, who works at Google, Inc.
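Below is a minimal, illustrative PyTorch sketch (not code from the article) of the building block Xception relies on: a per-channel depthwise convolution followed by a 1x1 pointwise convolution.

```python
# Illustrative sketch only: depthwise separable convolution in PyTorch.
import torch
import torch.nn as nn

class DepthwiseSeparableConv(nn.Module):
    def __init__(self, in_channels, out_channels, kernel_size=3):
        super().__init__()
        # Depthwise: one spatial filter per input channel (groups=in_channels).
        self.depthwise = nn.Conv2d(in_channels, in_channels, kernel_size,
                                   padding=kernel_size // 2, groups=in_channels)
        # Pointwise: 1x1 convolution that mixes information across channels.
        self.pointwise = nn.Conv2d(in_channels, out_channels, kernel_size=1)

    def forward(self, x):
        return self.pointwise(self.depthwise(x))

x = torch.randn(1, 32, 56, 56)                  # (batch, channels, height, width)
print(DepthwiseSeparableConv(32, 64)(x).shape)  # torch.Size([1, 64, 56, 56])
```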

Zuhaib Akhtar
Machine Learning (ML)

ELMo: Deep contextualized word representations

ELMo is a state-of-the-art NLP model developed by researchers at the Paul G. Allen School of Computer Science & Engineering, University of Washington. In this article, we go through ELMo in depth and understand how it works.

Zuhaib Akhtar
Machine Learning (ML)

BERT base vs BERT large

The BERT base model has 12 encoder layers stacked on top of each other, whereas BERT large has 24.
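For quick reference, a small sketch of the published hyperparameters of the two variants (standard figures from the BERT paper, not code from the article):

```python
# Published BERT hyperparameters, for quick comparison.
configs = {
    "BERT base":  {"encoder_layers": 12, "hidden_size": 768,  "attention_heads": 12},
    "BERT large": {"encoder_layers": 24, "hidden_size": 1024, "attention_heads": 16},
}
for name, cfg in configs.items():
    print(f"{name}: {cfg['encoder_layers']} layers, hidden size {cfg['hidden_size']}, "
          f"{cfg['attention_heads']} attention heads")
```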

Zuhaib Akhtar
Machine Learning (ML)

ALBERT (A Lite BERT) NLP model

ALBERT stands for A Lite BERT and is a modified version of the BERT NLP model. It builds on three key ideas: Parameter Sharing, Embedding Factorization and Sentence Order Prediction (SOP).
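As a rough illustration of the embedding factorization idea, the sketch below compares parameter counts for a single V x H embedding matrix versus ALBERT's factorized V x E plus E x H matrices (sizes assumed here: V=30,000, E=128, H=768, roughly ALBERT base):

```python
# Illustrative parameter count for ALBERT's embedding factorization.
V, E, H = 30_000, 128, 768     # assumed vocab, embedding and hidden sizes

bert_style   = V * H           # single V x H embedding matrix
albert_style = V * E + E * H   # factorized: V x E followed by E x H

print(f"V x H        : {bert_style:,} parameters")    # 23,040,000
print(f"V x E + E x H: {albert_style:,} parameters")  #  3,938,304
```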

Zuhaib Akhtar
Machine Learning (ML)

RoBERTa: Robustly Optimized BERT pre-training Approach

RoBERTa (Robustly Optimized BERT pre-training Approach) is an NLP model from Facebook that modifies the popular NLP model BERT. It is less a new architecture than an approach to better train and optimize BERT (Bidirectional Encoder Representations from Transformers).

Zuhaib Akhtar
Machine Learning (ML)

Introduction to GPT models

Generative Pre-Training (GPT) models are trained on unlabeled datasets (which are available in abundance). There are different variants, such as GPT-1, GPT-2 and GPT-3, which we have explored.

Zuhaib Akhtar
Machine Learning (ML)

BERT and SEARCH: How is BERT used to improve search?

In this article, we have explored how the BERT model can be used to improve search results in search engines like Google Search, Bing and others.

Zuhaib Akhtar
Machine Learning (ML)

Introduction to Multilingual BERT (M-BERT)

We explore what Multilingual BERT (M-BERT) is and give a general introduction to this NLP model.

Zuhaib Akhtar
Machine Learning (ML)

A Deep Learning Approach for Native Language Identification (NLI)

Native language identification (NLI) is the task of determining an author's native language based only on their writings or speeches in a second language. In this article, we will implement a model to identify the native language of an author.
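As a point of comparison only (the article itself builds a deep learning model), a classical NLI baseline can be sketched with character n-gram TF-IDF features and logistic regression; the texts and labels below are made up for illustration.

```python
# Illustrative classical baseline for NLI, not the article's deep-learning model.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts  = ["I am agree with this opinion",
          "He explained me the problem",
          "I look forward to hearing from you",
          "The results are very good"]
labels = ["ES", "FR", "EN", "EN"]        # hypothetical native-language labels

model = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
    LogisticRegression(max_iter=1000),
)
model.fit(texts, labels)
print(model.predict(["She suggested me a good book"]))
```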

Zuhaib Akhtar
Machine Learning (ML)

Native Language Identification (NLI)

Native language identification (NLI) is the task of determining an author's native language based only on their writings or speeches in a second language. This is an application of Machine Learning.

Zuhaib Akhtar
Machine Learning (ML)

An Introduction to Recommendation System

This is an introduction to recommendation systems, how they work and more. There are different approaches, such as content-based systems, collaborative filtering systems and hybrid systems.
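A tiny, self-contained sketch of the collaborative filtering idea (the ratings matrix is invented for illustration): score unrated items for a user as a similarity-weighted average over the other users' ratings.

```python
# Illustrative user-based collaborative filtering with cosine similarity.
import numpy as np

# Rows = users, columns = items, 0 = not yet rated (made-up data).
ratings = np.array([
    [5, 4, 0, 1],
    [4, 5, 1, 0],
    [1, 0, 5, 4],
], dtype=float)

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

target = 0                                    # recommend for the first user
sims = np.array([cosine(ratings[target], r) for r in ratings])
sims[target] = 0.0                            # ignore self-similarity

weights = sims / sims.sum()                   # normalize neighbour weights
scores = weights @ ratings                    # predicted score per item
unrated = ratings[target] == 0
print("predicted scores for unrated items:", scores * unrated)
```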

Zuhaib Akhtar