Natural Language Processing (NLP) Tokenization in NLP [Complete Guide] In this article, we will look at the different approaches to tokenization in Natural Language Processing (NLP) and their pros and cons.
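As a quick illustration, here is a minimal sketch contrasting two simple tokenization approaches (whitespace splitting vs. a regex); the regex and example sentence are illustrative choices, and the article covers further approaches such as subword tokenization.

```python
# Two simple tokenization approaches; note how they disagree on
# punctuation and contractions.
import re

text = "Don't split me badly, please!"
print(text.split())                      # whitespace: ["Don't", 'split', 'me', 'badly,', 'please!']
print(re.findall(r"\w+|[^\w\s]", text))  # regex: punctuation and contractions become separate tokens
```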
Natural Language Processing (NLP) N-gram language model in NLP In this article, we will explore what N-gram models are, how they work, and their advantages and disadvantages; finally, we will provide an example of how to implement one.
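For a taste of that implementation, here is a minimal sketch of extracting n-grams from a tokenized sentence in plain Python (the article's own example may differ):

```python
# A minimal n-gram extractor: slide a window of size n over the tokens.
def ngrams(tokens, n):
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

tokens = "the cat sat on the mat".split()
print(ngrams(tokens, 2))
# bigrams: [('the', 'cat'), ('cat', 'sat'), ('sat', 'on'), ('on', 'the'), ('the', 'mat')]
```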
Natural Language Processing (NLP) Lemmatization in NLP In this article, we have explored lemmatization approaches in NLP in depth and presented their Python implementations with code examples.
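For a flavor of one such approach, here is a minimal sketch using NLTK's WordNetLemmatizer (one common choice among the approaches covered):

```python
# Lemmatization with NLTK's WordNetLemmatizer; requires the 'wordnet' corpus.
import nltk
nltk.download("wordnet", quiet=True)
from nltk.stem import WordNetLemmatizer

lemmatizer = WordNetLemmatizer()
print(lemmatizer.lemmatize("running", pos="v"))  # 'run'
print(lemmatizer.lemmatize("better", pos="a"))   # 'good'
```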
Natural Language Processing (NLP) Bag of Words (BoW) in NLP The Bag of Words technique is a text representation method in NLP in which words are converted to numerical values that algorithms can understand and use.
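As a minimal sketch of that conversion, assuming scikit-learn (version 1.0+ for get_feature_names_out):

```python
# Bag of Words: each document becomes a vector of word counts.
from sklearn.feature_extraction.text import CountVectorizer

corpus = ["the cat sat", "the dog sat on the mat"]
vectorizer = CountVectorizer()
X = vectorizer.fit_transform(corpus)
print(vectorizer.get_feature_names_out())  # learned vocabulary
print(X.toarray())                         # per-document word counts
```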
Natural Language Processing (NLP) CBOW and Skip-gram This article at OpenGenus explains CBOW (Continuous Bag of Words) and Skip-gram in detail, along with the differences between the two concepts.
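In gensim's Word2Vec, for instance, the two training modes are toggled by a single flag; a minimal sketch (the tiny corpus is illustrative):

```python
# sg=0 trains CBOW (predict a word from its context);
# sg=1 trains Skip-gram (predict the context from a word).
from gensim.models import Word2Vec

sentences = [["the", "cat", "sat"], ["the", "dog", "sat"], ["cats", "chase", "dogs"]]
cbow = Word2Vec(sentences, vector_size=20, min_count=1, sg=0)
skipgram = Word2Vec(sentences, vector_size=20, min_count=1, sg=1)
print(cbow.wv["cat"].shape, skipgram.wv["cat"].shape)  # (20,) (20,)
```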
Natural Language Processing (NLP) Stop Words in NLP In this article, we shall focus on the concept of stop words and the implementation of stopword removal in NLP.
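A minimal sketch of stopword removal using NLTK (other libraries such as spaCy work too; NLTK resource names may vary slightly across versions):

```python
# Filter common function words using NLTK's English stopword list.
import nltk
nltk.download("stopwords", quiet=True)
nltk.download("punkt", quiet=True)
from nltk.corpus import stopwords
from nltk.tokenize import word_tokenize

stop_words = set(stopwords.words("english"))
tokens = word_tokenize("This is a simple example of stopword removal.")
print([t for t in tokens if t.lower() not in stop_words])
# ['simple', 'example', 'stopword', 'removal', '.']
```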
Natural Language Processing (NLP) POS Tagging in NLP using Python POS tagging is a text preprocessing task in Natural Language Processing (NLP) whose goal is to analyze the syntactic structure of a given sentence and better understand the input text.
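A minimal sketch with NLTK's perceptron tagger (one common choice; the article may use others):

```python
# Tag each token with its part of speech (Penn Treebank tags).
import nltk
nltk.download("punkt", quiet=True)
nltk.download("averaged_perceptron_tagger", quiet=True)

tokens = nltk.word_tokenize("The quick brown fox jumps over the lazy dog")
print(nltk.pos_tag(tokens))
# [('The', 'DT'), ('quick', 'JJ'), ('brown', 'JJ'), ('fox', 'NN'), ('jumps', 'VBZ'), ...]
```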
Natural Language Processing (NLP) 40 Cutting-Edge NLP Project Ideas with source code In this article, we have explored 40 cutting-edge NLP project ideas with source code and associated research papers. These projects make a strong addition to a Machine Learning Engineer's portfolio.
Natural Language Processing (NLP) BERT for Legal Document Classification: A Study on Adaptation and Pretraining In this work, we aim to address these challenges by investigating how to effectively adapt BERT to handle long legal documents, and how important pre-training on in-domain documents is.
Deep Learning Discover the Revolutionary Instruct GPT InstructGPT is a version of the GPT (Generative Pre-trained Transformer) model fine-tuned with human feedback so that its generated language follows user instructions more reliably.
Deep Learning DistilBERT: The Compact NLP Powerhouse DistilBERT, developed by Hugging Face and introduced in 2019, is a smaller, faster, and lighter version of the popular BERT (Bidirectional Encoder Representations from Transformers) model.
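For a quick feel of the model, here is a minimal sketch using the Hugging Face transformers library with a public DistilBERT checkpoint fine-tuned for sentiment analysis:

```python
# Sentiment analysis with a DistilBERT checkpoint from the Hugging Face Hub.
from transformers import pipeline

classifier = pipeline("sentiment-analysis",
                      model="distilbert-base-uncased-finetuned-sst-2-english")
print(classifier("DistilBERT keeps most of BERT's accuracy at a fraction of the size."))
# [{'label': 'POSITIVE', 'score': ...}]
```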
Deep Learning Battle of the Titans: Comparing BART and BERT in NLP In this article, we have explored the differences between two state-of-the-art NLP models, namely BERT and BART.
Deep Learning GPT-3.5 model architecture The GPT-3.5 model is a fine-tuned version of the GPT-3 (Generative Pre-Trained Transformer) model. GPT-3.5 was developed in January 2022 and has 3 variants with 1.3B, 6B, and 175B parameters, respectively. One of its main goals was to reduce toxic output to a certain extent.
Deep Learning Self-attention in Transformer Today we will discuss one of the revolutionary concepts in artificial intelligence, used not only in Natural Language Processing but nowadays also in Computer Vision: the Transformer, and at its heart, self-attention.
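To make the core idea concrete, here is a minimal NumPy sketch of single-head, unmasked scaled dot-product self-attention; the shapes and random weights are illustrative:

```python
# Scaled dot-product self-attention: every token attends to every other token.
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    Q, K, V = X @ Wq, X @ Wk, X @ Wv           # project inputs to queries, keys, values
    scores = Q @ K.T / np.sqrt(K.shape[-1])    # scaled similarity between all token pairs
    return softmax(scores) @ V                 # attention-weighted sum of values

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))                    # 4 tokens, 8-dim embeddings
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)     # (4, 8)
```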
List of Interview Questions BERT Interview Questions (NLP) In this article, we will go over various questions that cover the fundamentals and inner workings of the BERT model.
Natural Language Processing (NLP) NLP Project: Compare Text Summarization Models In this article, we will go over the basics of Text Summarization, the different approaches to generating automatic summaries, some of the real world applications of Text Summarization, and finally, we will compare various Text Summarization models with the help of ROUGE.
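As a glimpse of that evaluation step, here is a minimal sketch assuming the rouge-score package (pip install rouge-score); the sentences are illustrative:

```python
# Score a candidate summary against a reference with ROUGE-1 and ROUGE-L.
from rouge_score import rouge_scorer

scorer = rouge_scorer.RougeScorer(["rouge1", "rougeL"], use_stemmer=True)
scores = scorer.score("the cat sat on the mat",        # reference
                      "a cat was sitting on the mat")  # candidate summary
print(scores["rouge1"].fmeasure, scores["rougeL"].fmeasure)
```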
Natural Language Processing (NLP) Text Summarization Interview Questions (NLP) In this article, we will go over 70 questions that cover everything from the very basics of Text Summarization to the evaluation of summarized pieces of text using various metrics.
Natural Language Processing (NLP) Types of NLP models Natural Language Processing (NLP) refers to a branch of Artificial Intelligence (AI) in Computer Science that gives computers the ability to analyze and interpret human language.
Machine Learning (ML) Text Summarization using Transformers In this article, we will learn about the fundamentals of Text Summarization, some of the different ways in which we can summarize text, Transformers, the BART model, and finally, we will practically implement some of these concepts.
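A minimal sketch of one such implementation, using the transformers summarization pipeline with a public BART checkpoint (the input text is illustrative):

```python
# Abstractive summarization with BART via the Hugging Face pipeline API.
from transformers import pipeline

summarizer = pipeline("summarization", model="facebook/bart-large-cnn")
text = ("Text summarization condenses a document into a shorter version while "
        "preserving its key information. Extractive methods select existing "
        "sentences, while abstractive methods such as BART generate new ones.")
print(summarizer(text, max_length=40, min_length=10)[0]["summary_text"])
```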
Machine Learning (ML) Embeddings in BERT We will see what BERT (Bidirectional Encoder Representations from Transformers) is, how it actually works, and which embeddings in BERT make it so special and effective compared to other NLP techniques.
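A minimal sketch of pulling contextual embeddings out of BERT with the transformers library (the checkpoint choice is ours):

```python
# Extract one contextual embedding vector per token from BERT.
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

inputs = tokenizer("BERT builds contextual embeddings.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (1, num_tokens, 768)
```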
Natural Language Processing (NLP) Word Embedding [Complete Guide] We have explained the idea behind Word Embedding, why it is important, and different Word Embedding algorithms such as Embedding layers and Word2Vec.
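As a minimal sketch of the Embedding-layer idea, here is PyTorch's nn.Embedding mapping token IDs to trainable vectors (the sizes are illustrative):

```python
# An embedding layer is a trainable lookup table: token ID -> dense vector.
import torch

embedding = torch.nn.Embedding(num_embeddings=10_000, embedding_dim=50)
token_ids = torch.tensor([[2, 45, 901]])  # a batch with one 3-token sequence
print(embedding(token_ids).shape)         # (1, 3, 50)
```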
Natural Language Processing (NLP) Why SpaCy over NLTK? We listed 10 aspects where spaCy outshines NLTK, along with the cases where NLTK outperforms spaCy.
Machine Learning (ML) Applications of NLP: Extraction from PDF, Language Translation and more In this article, we have explored core NLP applications such as text extraction, language translation, text classification, question answering, text to speech, speech to text and more.
Machine Learning (ML) Applications of NLP: Text Generation, Text Summarization and Sentiment Analysis In this article, we have explored 3 core NLP applications: Text Generation using GPT models, Text Summarization, and Sentiment Analysis.
Machine Learning (ML) ALBERT (A Lite BERT) NLP model ALBERT stands for A Lite BERT and is a modified version of the BERT NLP model. It builds on three key ideas: Parameter Sharing, Embedding Factorization, and Sentence Order Prediction (SOP).
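The effect of parameter sharing and embedding factorization shows up directly in the parameter count; a minimal sketch comparing the public base checkpoints with the transformers library:

```python
# Compare parameter counts of BERT-base and ALBERT-base.
from transformers import AlbertModel, BertModel

bert = BertModel.from_pretrained("bert-base-uncased")
albert = AlbertModel.from_pretrained("albert-base-v2")

count = lambda m: sum(p.numel() for p in m.parameters())
print(f"BERT-base:   {count(bert):,} parameters")    # ~110M
print(f"ALBERT-base: {count(albert):,} parameters")  # ~12M, thanks to sharing/factorization
```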