Machine Learning (ML) Jensen-Shannon Divergence Jensen-Shannon Divergence is a technique for comparing probability distributions that can be readily used in statistical tests in ML.
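As a quick illustration of the idea, the Jensen-Shannon divergence between two discrete distributions p and q is the average KL divergence of each to their mixture m = (p + q)/2. A minimal NumPy sketch (the `eps` smoothing term is an assumption to avoid log(0)):

```python
import numpy as np

def js_divergence(p, q, eps=1e-12):
    """Jensen-Shannon divergence between two discrete distributions."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)  # mixture distribution
    kl = lambda a, b: np.sum(a * np.log((a + eps) / (b + eps)))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

same = js_divergence([0.5, 0.5], [0.5, 0.5])   # identical distributions -> 0
diff = js_divergence([1.0, 0.0], [0.0, 1.0])   # disjoint distributions -> log(2)
```

Unlike KL divergence, this quantity is symmetric and bounded (by log 2 in nats), which is what makes it convenient as a distance-like measure between distributions.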
TensorFlow Depthwise Convolution op in TensorFlow (tf.nn.depthwise_conv2d) This article discusses the Depthwise Convolution operation and how it is implemented in the TensorFlow framework (tf.nn.depthwise_conv2d).
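For a feel of the op's shapes: in a depthwise convolution each input channel is convolved with its own set of filters instead of mixing channels, so the output has `in_channels * channel_multiplier` channels. A minimal sketch with toy random tensors (the sizes are assumptions for illustration):

```python
import tensorflow as tf

# One 4x4 image with 3 input channels (toy data)
x = tf.random.normal([1, 4, 4, 3])
# Depthwise filter shape: [filter_h, filter_w, in_channels, channel_multiplier]
w = tf.random.normal([2, 2, 3, 2])
# Each of the 3 input channels gets its own 2 filters,
# so the output has 3 * 2 = 6 channels
y = tf.nn.depthwise_conv2d(x, w, strides=[1, 1, 1, 1], padding="SAME")
# y.shape == (1, 4, 4, 6)
```

Note that, unlike `tf.nn.conv2d`, there is no summation across input channels; that is what makes depthwise convolution much cheaper and the building block of depthwise-separable convolutions.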
TensorFlow New features in TensorFlow v2.8 TensorFlow 2.8 has finally been released. Let us take a look at some of the new features and improvements rolled out in this version, which comes with many additions, bug fixes, and changes.
Machine Learning (ML) One hot encoding in TensorFlow (tf.one_hot) This article discusses One-Hot Encoding, one of the commonly used data pre-processing techniques in Feature Engineering, and its use in TensorFlow.
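To make the technique concrete: one-hot encoding maps each integer class label to a vector that is all zeros except for a single 1 at the label's index. A minimal sketch with assumed toy labels:

```python
import tensorflow as tf

labels = [0, 1, 2]                    # integer class labels (toy example)
encoded = tf.one_hot(labels, depth=3)  # depth = number of classes
# encoded is a 3x3 matrix; row i has a 1.0 at column labels[i]
# [[1., 0., 0.],
#  [0., 1., 0.],
#  [0., 0., 1.]]
```

The `depth` argument sets the length of each one-hot vector, so it must be at least as large as the largest label.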
TensorFlow Dropout operation in TensorFlow (tf.nn.dropout) This article discusses a special kind of layer called the Dropout layer in TensorFlow (tf.nn.dropout), which is used in Deep Neural Networks as a measure for preventing or correcting the problem of overfitting.
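A quick sketch of the op's behaviour: during training, `tf.nn.dropout` zeroes each element with probability `rate` and scales the survivors by 1 / (1 - rate) so the expected sum is unchanged (the seed and rate below are assumptions for illustration):

```python
import tensorflow as tf

tf.random.set_seed(0)
x = tf.ones([1, 10])
# With rate=0.5, each element is dropped with probability 0.5;
# surviving elements are scaled by 1 / (1 - 0.5) = 2.0
y = tf.nn.dropout(x, rate=0.5)
# Every element of y is therefore either 0.0 or 2.0
```

The rescaling is what lets the same network be used unchanged at inference time, where dropout is simply disabled.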
Machine Learning (ML) Scaled-YOLOv4 model The authors of YOLOv4 pushed the model forward by scaling its design and size, outperforming the benchmarks of EfficientDet. The result is the Scaled-YOLOv4 model.
Machine Learning (ML) YOLOv4 model architecture This article discusses YOLOv4's architecture. It outperforms other object detection models in terms of inference speed, making it the ideal choice for real-time object detection, where the input is a video stream.
Machine Learning (ML) ReLU (Rectified Linear Unit) Activation Function We take a look at the most widely used activation function, ReLU (Rectified Linear Unit), and understand why it is the default choice for Neural Networks. This article covers the most important points about this function.
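The function itself is simple enough to state in one line: ReLU(x) = max(0, x), applied element-wise. A minimal NumPy sketch:

```python
import numpy as np

def relu(x):
    """ReLU activation: max(0, x), applied element-wise."""
    return np.maximum(0, x)

out = relu(np.array([-2.0, -0.5, 0.0, 3.0]))
# negative inputs are clamped to 0; non-negative inputs pass through:
# [0., 0., 0., 3.]
```

Because the gradient is exactly 1 for positive inputs and 0 otherwise, ReLU is cheap to compute and avoids the saturation that slows learning with sigmoid or tanh, which is a large part of why it is the default choice.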