Projects I have completed in the Machine Learning and Deep Learning Specializations
Implemented Transformer from scratch using TensorFlow and Keras, including scaled dot-product attention, multi-head attention, positional encodings, and encoder-decoder blocks.
Course: Deep Learning Specialization - Course 5, Week 4
End-to-End Transformer Model with Attention in TensorFlow
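A minimal sketch of the scaled dot-product attention at the core of this assignment, assuming the usual (..., seq_len, depth) tensor shapes; the function name and mask convention are illustrative rather than the exact notebook code:

```python
import tensorflow as tf

def scaled_dot_product_attention(q, k, v, mask=None):
    """Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V."""
    matmul_qk = tf.matmul(q, k, transpose_b=True)          # (..., seq_len_q, seq_len_k)
    dk = tf.cast(tf.shape(k)[-1], tf.float32)
    logits = matmul_qk / tf.math.sqrt(dk)                  # scale to keep softmax gradients healthy
    if mask is not None:
        logits += (1.0 - mask) * -1e9                      # mask == 0 marks positions to ignore
    weights = tf.nn.softmax(logits, axis=-1)               # attention distribution over the keys
    return tf.matmul(weights, v), weights                  # weighted sum of the value vectors
```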
Built a Named-Entity Recognition model using Transformers and TensorFlow, part of DeepLearning.AI's Sequence Models course (Week 4).
Course: Deep Learning Specialization - Course 5, Week 4
Entity Recognition with Transformer + TensorFlow
Covers input embeddings, positional encoding, and token handling as a preprocessing step for Transformers in the Sequence Models course.
Course: Deep Learning Specialization - Course 5, Week 4
Embeddings & Positional Encoding for Transformers
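The sinusoidal positional encoding covered here fits in a few lines of NumPy; the function name and argument names below are illustrative:

```python
import numpy as np

def positional_encoding(max_len, d_model):
    """Sinusoidal positional encodings of shape (max_len, d_model)."""
    positions = np.arange(max_len)[:, np.newaxis]               # (max_len, 1)
    dims = np.arange(d_model)[np.newaxis, :]                    # (1, d_model)
    angle_rates = 1.0 / np.power(10000.0, (2 * (dims // 2)) / np.float32(d_model))
    angles = positions * angle_rates                            # (max_len, d_model)
    angles[:, 0::2] = np.sin(angles[:, 0::2])                   # sine on even indices
    angles[:, 1::2] = np.cos(angles[:, 1::2])                   # cosine on odd indices
    return angles
```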
Built an audio-based trigger word detection system using spectrogram features and LSTM-based sequence modeling to detect "activate" in streaming speech.
Course: Deep Learning Specialization - Course 5, Week 3
Detecting the Word "Activate" from Live Audio with LSTMs
Implemented a seq2seq model with attention to translate English sentences into French. Demonstrates attention-enhanced decoding for improved context and accuracy.
Course: Deep Learning Specialization - Course 5, Week 3
English-to-French Translation with Seq2Seq + Attention
Built LSTM-based text classification models that map input sentences to emojis using pre-trained GloVe vectors and sequence modeling.
Course: Deep Learning Specialization - Course 5, Week 2
Mapping Sentences to Emojis using Word Embeddings + LSTM
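A rough Keras sketch of this kind of model; the sizes are assumptions, and in the actual project the Embedding layer is loaded from GloVe vectors and frozen rather than trained from scratch:

```python
import tensorflow as tf

# Illustrative sizes; the project initializes the Embedding layer from 50-d GloVe vectors.
vocab_size, embed_dim, n_classes = 400_001, 50, 5

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(vocab_size, embed_dim),
    tf.keras.layers.LSTM(128, return_sequences=True),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.LSTM(128),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(n_classes, activation="softmax"),   # one probability per emoji class
])
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
```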
Implemented vector arithmetic and bias mitigation on GloVe embeddings. Applied neutralization and equalization to address gender bias in NLP tasks.
Course: Deep Learning Specialization - Course 5, Week 2
Exploring Fairness and Bias Mitigation in Word Embeddings
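A NumPy sketch of the two operations named above, assuming unit-norm embeddings and a precomputed bias direction (e.g. e_woman - e_man); function names are illustrative:

```python
import numpy as np

def neutralize(word_vec, bias_axis):
    """Remove the component of word_vec along the (normalized) bias direction."""
    bias_axis = bias_axis / np.linalg.norm(bias_axis)
    projection = np.dot(word_vec, bias_axis) * bias_axis     # component along the bias axis
    return word_vec - projection                             # debiased, orthogonal to the axis

def equalize(vec_a, vec_b, bias_axis):
    """Make a word pair (e.g. 'actor'/'actress') symmetric about the bias axis."""
    bias_axis = bias_axis / np.linalg.norm(bias_axis)
    mu = (vec_a + vec_b) / 2.0
    mu_orth = mu - np.dot(mu, bias_axis) * bias_axis         # shared, bias-free component
    a_bias = np.dot(vec_a - mu, bias_axis) * bias_axis       # bias components of each word
    b_bias = np.dot(vec_b - mu, bias_axis) * bias_axis
    scale = np.sqrt(max(1.0 - np.linalg.norm(mu_orth) ** 2, 0.0))
    return (mu_orth + scale * a_bias / (np.linalg.norm(a_bias) + 1e-8),
            mu_orth + scale * b_bias / (np.linalg.norm(b_bias) + 1e-8))
```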
Implemented an LSTM-based model to generate jazz music sequences, capturing long-term musical patterns and improvisation using deep learning.
Course: Deep Learning Specialization - Course 5, Week 1
AI-Powered Jazz Sequence Generation using LSTM Networks
Built a character-level RNN from scratch using NumPy to generate dinosaur names. Demonstrates sequence modeling, sampling, and training stabilization techniques.
Course: Deep Learning Specialization - Course 5, Week 1
Generating Dinosaur Names Using RNNs and NumPy
Step-by-step implementation of a character-level RNN using NumPy to understand sequence modeling and backpropagation through time (BPTT) without deep learning libraries.
Course: Deep Learning Specialization - Course 5, Week 1
Character-Level Sequence Generation via Custom RNN
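For reference, one forward step of a vanilla RNN cell of the kind built here, with the common Wax/Waa/Wya parameter naming assumed:

```python
import numpy as np

def rnn_cell_forward(x_t, a_prev, params):
    """Single forward step of a vanilla RNN cell.

    x_t:    input at time t, shape (n_x, m)
    a_prev: previous hidden state, shape (n_a, m)
    params: dict with Wax (n_a, n_x), Waa (n_a, n_a), Wya (n_y, n_a), ba (n_a, 1), by (n_y, 1)
    """
    Wax, Waa, Wya = params["Wax"], params["Waa"], params["Wya"]
    ba, by = params["ba"], params["by"]
    a_next = np.tanh(Waa @ a_prev + Wax @ x_t + ba)                 # new hidden state
    z = Wya @ a_next + by
    y_hat = np.exp(z - z.max(axis=0)) / np.exp(z - z.max(axis=0)).sum(axis=0)  # softmax over characters
    return a_next, y_hat
```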
Implemented neural style transfer to blend content and artistic style using deep convolutional features from a VGG-19 model.
Course: Deep Learning Specialization - Course 4, Week 4
Artistic Image Generation via Deep Style Transfer
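The style half of the objective reduces to comparing Gram matrices of VGG-19 feature maps; a minimal sketch assuming feature tensors with a known static shape (1, H, W, C):

```python
import tensorflow as tf

def gram_matrix(features):
    """Gram matrix of feature maps (1, H, W, C) -> (C, C)."""
    _, h, w, c = features.shape
    flat = tf.reshape(features, (h * w, c))            # each row is one spatial position
    return tf.matmul(flat, flat, transpose_a=True)     # channel-by-channel correlations

def style_layer_cost(style_features, generated_features):
    """Squared Gram-matrix difference for one layer, normalized by layer size."""
    _, h, w, c = style_features.shape
    gs, gg = gram_matrix(style_features), gram_matrix(generated_features)
    return tf.reduce_sum(tf.square(gs - gg)) / (4.0 * (h * w) ** 2 * c ** 2)
```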
Built a face recognition system using FaceNet embeddings to perform identity verification and recognition.
Course: Deep Learning Specialization - Course 4, Week 4
Identity Verification Using Deep Face Embeddings
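Verification hinges on the triplet loss over embedding batches; a minimal sketch with an assumed margin of 0.2:

```python
import tensorflow as tf

def triplet_loss(anchor, positive, negative, alpha=0.2):
    """Triplet loss over embedding batches of shape (batch, embed_dim)."""
    pos_dist = tf.reduce_sum(tf.square(anchor - positive), axis=-1)   # distance to same identity
    neg_dist = tf.reduce_sum(tf.square(anchor - negative), axis=-1)   # distance to different identity
    return tf.reduce_sum(tf.maximum(pos_dist - neg_dist + alpha, 0.0))
```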
Implemented a U-Net architecture for road segmentation from dashcam images using pixel-wise classification.
Course: Deep Learning Specialization - Course 4, Week 3
Semantic Segmentation for Road Scenes with U-Net
Used a pre-trained YOLO model to detect cars in images and video frames using non-max suppression and bounding box visualization.
Course: Deep Learning Specialization - Course 4, Week 3
Object Detection using YOLOv2 & Non-Max Suppression
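A sketch of the score-filtering plus non-max suppression step, leaning on TensorFlow's built-in tf.image.non_max_suppression rather than a hand-rolled version; shapes and thresholds are assumptions:

```python
import tensorflow as tf

def filter_and_suppress(boxes, box_confidence, box_class_probs,
                        score_threshold=0.6, iou_threshold=0.5, max_boxes=10):
    """boxes: (N, 4) corners as [y1, x1, y2, x2]; confidence: (N, 1); class probs: (N, num_classes)."""
    scores_all = box_confidence * box_class_probs                 # per-box, per-class scores
    classes = tf.argmax(scores_all, axis=-1)
    scores = tf.reduce_max(scores_all, axis=-1)
    keep = scores >= score_threshold                              # drop low-confidence detections
    boxes = tf.boolean_mask(boxes, keep)
    scores = tf.boolean_mask(scores, keep)
    classes = tf.boolean_mask(classes, keep)
    idx = tf.image.non_max_suppression(boxes, scores, max_boxes,
                                       iou_threshold=iou_threshold)  # greedy IoU-based suppression
    return tf.gather(boxes, idx), tf.gather(scores, idx), tf.gather(classes, idx)
```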
Applied a pretrained MobileNet model for image classification, using feature extraction and fine-tuning for edge-efficient inference.
Course: Deep Learning Specialization - Course 4, Week 2
Lightweight CNNs for Mobile & Embedded AI
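A typical feature-extraction setup of this kind; the input size, dropout rate, and binary classification head are assumptions, not the notebook's exact configuration:

```python
import tensorflow as tf

base = tf.keras.applications.MobileNetV2(input_shape=(160, 160, 3),
                                         include_top=False, weights="imagenet")
base.trainable = False                                  # freeze the pretrained backbone first

inputs = tf.keras.Input(shape=(160, 160, 3))
x = tf.keras.applications.mobilenet_v2.preprocess_input(inputs)
x = base(x, training=False)                             # keep BatchNorm in inference mode
x = tf.keras.layers.GlobalAveragePooling2D()(x)
x = tf.keras.layers.Dropout(0.2)(x)
outputs = tf.keras.layers.Dense(1)(x)                   # single logit for binary classification
model = tf.keras.Model(inputs, outputs)
model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
              loss=tf.keras.losses.BinaryCrossentropy(from_logits=True),
              metrics=["accuracy"])
# For fine-tuning, unfreeze the top layers of the backbone and recompile with a lower learning rate.
```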
Implemented a deep residual network (ResNet) with skip connections to solve vanishing gradient issues and build deep CNNs.
Course: Deep Learning Specialization - Course 4, Week 2
Skip Connections for Training Deep CNNs
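The key idea is a shortcut that lets gradients bypass the convolutional stack; a simplified two-conv identity block (the course's ResNet-50 blocks use three convs with 1x1 bottlenecks):

```python
import tensorflow as tf
from tensorflow.keras import layers

def identity_block(x, filters, kernel_size=3):
    """Residual block whose shortcut skips two conv layers; x must already have `filters` channels."""
    shortcut = x
    y = layers.Conv2D(filters, kernel_size, padding="same")(x)
    y = layers.BatchNormalization()(y)
    y = layers.Activation("relu")(y)
    y = layers.Conv2D(filters, kernel_size, padding="same")(y)
    y = layers.BatchNormalization()(y)
    y = layers.Add()([y, shortcut])        # skip connection: add the input back in
    return layers.Activation("relu")(y)
```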
Applied a pretrained CNN using transfer learning to classify images into categories with Keras and TensorFlow.
Course: Deep Learning Specialization - Course 4, Week 1
Real-World Image Classification using Transfer Learning
Manually implemented CNN architecture from scratch including forward and backward passes for convolution, pooling, and dense layers using NumPy.
Course: Deep Learning Specialization - Course 4, Week 1
Low-Level CNN Components Coded from Scratch
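The heart of the from-scratch forward pass is a quadruple loop over examples, output positions, and filters; a condensed NumPy sketch (the bias is stored as a 1-D array here, a simplification):

```python
import numpy as np

def zero_pad(x, pad):
    """Pad the spatial dims of a batch of images, shape (m, n_H, n_W, n_C)."""
    return np.pad(x, ((0, 0), (pad, pad), (pad, pad), (0, 0)))

def conv_single_step(a_slice, W, b):
    """One filter applied to one (f, f, n_C_prev) slice: multiply, sum, add the bias scalar."""
    return np.sum(a_slice * W) + b

def conv_forward(a_prev, W, b, stride=1, pad=1):
    """Naive convolution: a_prev (m, h, w, c_in), W (f, f, c_in, c_out), b (c_out,)."""
    m, n_h, n_w, _ = a_prev.shape
    f, _, _, n_c = W.shape
    out_h = (n_h + 2 * pad - f) // stride + 1
    out_w = (n_w + 2 * pad - f) // stride + 1
    a_pad = zero_pad(a_prev, pad)
    z = np.zeros((m, out_h, out_w, n_c))
    for i in range(m):
        for h in range(out_h):
            for w in range(out_w):
                for c in range(n_c):
                    hs, ws = h * stride, w * stride
                    z[i, h, w, c] = conv_single_step(a_pad[i, hs:hs + f, ws:ws + f, :],
                                                     W[:, :, :, c], b[c])
    return z
```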
Introduced core TensorFlow concepts like tensors, sessions, placeholders, and computation graphs for simple model training.
Course: Deep Learning Specialization - Course 2, Week 3
Build Simple Models using TensorFlow Basics
Implemented and compared mini-batch gradient descent, momentum, and Adam optimizers for efficient deep network training and convergence.
Course: Deep Learning Specialization - Course 2, Week 2
Optimize Deep Learning with Momentum & Adam
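For a single parameter array, the momentum and RMSprop ideas combine into the Adam step below; the hyperparameter defaults are the usual ones and assumed here:

```python
import numpy as np

def adam_update(w, grad, m, v, t, lr=0.01, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam step for a single parameter array; returns updated (w, m, v)."""
    m = beta1 * m + (1 - beta1) * grad              # first-moment (momentum) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2         # second-moment (RMS) estimate
    m_hat = m / (1 - beta1 ** t)                    # bias correction for early steps
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v
```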
Implemented numerical gradient checking to validate backpropagation in deep neural networks by comparing analytical and approximated gradients.
Course: Deep Learning Specialization - Course 2, Week 1
Validate Backprop with Analytical vs Numerical Gradients
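The check is a centered finite difference compared against the analytic gradient; a minimal sketch for a flattened parameter vector:

```python
import numpy as np

def gradient_check(f, grad_f, theta, epsilon=1e-7):
    """Compare the analytic gradient grad_f(theta) with a two-sided numerical approximation.

    f: scalar cost function of a 1-D float parameter vector theta.
    Returns the relative difference; values around 1e-7 or below suggest a correct backprop.
    """
    analytic = grad_f(theta)
    numeric = np.zeros_like(theta)
    for i in range(theta.size):
        plus, minus = theta.copy(), theta.copy()
        plus[i] += epsilon
        minus[i] -= epsilon
        numeric[i] = (f(plus) - f(minus)) / (2 * epsilon)     # centered difference
    return np.linalg.norm(analytic - numeric) / (np.linalg.norm(analytic) + np.linalg.norm(numeric))
```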
Implemented L2 regularization and dropout to prevent overfitting in deep neural networks, improving generalization and reducing test error.
Course: Deep Learning Specialization - Course 2, Week 1
Prevent Overfitting with L2 & Dropout Regularization
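Two small pieces capture both techniques: the L2 penalty added to the cost and inverted dropout in the forward pass (parameter naming is assumed):

```python
import numpy as np

def l2_cost_term(parameters, lambd, m):
    """L2 penalty added to the cross-entropy cost: (lambda / 2m) * sum of squared weights."""
    return (lambd / (2 * m)) * sum(np.sum(np.square(W)) for name, W in parameters.items()
                                   if name.startswith("W"))

def dropout_forward(a, keep_prob, rng=np.random.default_rng(0)):
    """Inverted dropout: zero units with probability 1 - keep_prob and rescale the survivors."""
    mask = (rng.random(a.shape) < keep_prob).astype(a.dtype)
    return (a * mask) / keep_prob, mask               # keep the mask for the backward pass
```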
Explored Zero, Random, and He initialization methods and their impact on gradient behavior, training speed, and convergence in deep neural networks.
Course: Deep Learning Specialization - Course 2, Week 1
Comparing Weight Initialization Methods in DNNs
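He initialization only changes the scale of the random weights; a sketch for an arbitrary list of layer sizes:

```python
import numpy as np

def he_initialize(layer_dims, rng=np.random.default_rng(1)):
    """He initialization: scale weights by sqrt(2 / fan_in), suited to ReLU layers."""
    params = {}
    for l in range(1, len(layer_dims)):
        params[f"W{l}"] = rng.standard_normal((layer_dims[l], layer_dims[l - 1])) \
                          * np.sqrt(2.0 / layer_dims[l - 1])
        params[f"b{l}"] = np.zeros((layer_dims[l], 1))
    return params
```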
Implemented a deep L-layer neural network to classify images as cat or non-cat, applying full forward and backward propagation, cost optimization, and prediction using NumPy.
Course: Deep Learning Specialization - Course 1, Week 4
L-Layer DNN for Binary Image Classification
Constructed a modular L-layer deep neural network from scratch using only NumPy, including all core functions from initialization to backpropagation and parameter updates.
Course: Deep Learning Specialization - Course 1, Week 4
Layered Architecture Built from Scratch with NumPy
Built a shallow neural network with one hidden layer to classify non-linearly separable 2D data using NumPy, including full forward/backward propagation.
Course: Deep Learning Specialization - Course 1, Week 3
Shallow Neural Network for 2D Nonlinear Classification
Built a binary image classifier using logistic regression and NumPy with a neural network perspective. Applied vectorized computation and gradient descent to train the model on cat/non-cat data.
Course: Deep Learning Specialization - Course 1, Week 2
Binary Image Classifier with Vectorized NumPy
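The whole training loop vectorizes into a handful of matrix operations; a sketch assuming X of shape (n_features, m) and Y of shape (1, m):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logistic_regression(X, Y, lr=0.005, num_iters=2000):
    """Vectorized logistic regression trained with batch gradient descent."""
    n, m = X.shape
    w, b = np.zeros((n, 1)), 0.0
    for _ in range(num_iters):
        A = sigmoid(w.T @ X + b)                  # predictions for the whole batch at once
        dw = (X @ (A - Y).T) / m                  # gradient of the cross-entropy cost w.r.t. w
        db = np.sum(A - Y) / m
        w -= lr * dw
        b -= lr * db
    return w, b
```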
Practiced vectorized operations, broadcasting, and avoiding loops with NumPy to enable efficient deep learning computations.
Course: Deep Learning Specialization - Course 1, Week 2
Foundation for Efficient Neural Network Computation
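For example, the same dot product written as a loop and as a single NumPy call, plus a broadcasted row normalization:

```python
import numpy as np

rng = np.random.default_rng(0)
x, y = rng.random(1_000_000), rng.random(1_000_000)

# Loop version: explicit accumulation, slow in pure Python.
total = 0.0
for a, b in zip(x, y):
    total += a * b

# Vectorized version: one call, same result up to floating-point rounding.
vectorized = np.dot(x, y)

# Broadcasting: normalize each row of a matrix without a loop.
M = rng.random((3, 4))
row_normalized = M / M.sum(axis=1, keepdims=True)
```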
Applied Q-learning with function approximation to learn optimal policies in continuous state spaces using scalable policy and value function updates.
Course: ML Specialization - Course 3, Week 3
Q-learning in Continuous State Spaces
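A sketch of the core update assuming a Q-network/target-network setup with an experience batch; function and argument names are illustrative:

```python
import tensorflow as tf

def q_targets(q_target_net, rewards, next_states, dones, gamma=0.995):
    """Bellman targets y = r + gamma * max_a' Q_target(s', a'); no bootstrap at terminal states."""
    max_next_q = tf.reduce_max(q_target_net(next_states), axis=-1)
    return rewards + gamma * max_next_q * (1.0 - tf.cast(dones, tf.float32))

def q_loss(q_net, q_target_net, states, actions, rewards, next_states, dones):
    """Mean-squared TD error between Q(s, a) for the actions taken and the Bellman targets."""
    y = q_targets(q_target_net, rewards, next_states, dones)
    q_all = q_net(states)                                              # (batch, num_actions)
    idx = tf.stack([tf.range(tf.shape(actions)[0]),
                    tf.cast(actions, tf.int32)], axis=1)
    q_sa = tf.gather_nd(q_all, idx)                                    # Q-value of each taken action
    return tf.reduce_mean(tf.square(y - q_sa))
```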
Implemented Principal Component Analysis (PCA) for dimensionality reduction and data visualization, including eigen-decomposition, SVD, and high-dimensional data projection.
Course: ML Specialization - Course 3, Week 2
PCA for Dimensionality Reduction and Visualization
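PCA via SVD of the mean-centered data matrix; a compact sketch:

```python
import numpy as np

def pca(X, k):
    """Project X (m samples x n features) onto its top-k principal components via SVD."""
    X_centered = X - X.mean(axis=0)                       # PCA assumes zero-mean features
    U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)
    components = Vt[:k]                                   # top-k principal directions
    Z = X_centered @ components.T                         # reduced (m x k) representation
    X_approx = Z @ components                             # reconstruction back in n-D
    explained = (S[:k] ** 2).sum() / (S ** 2).sum()       # fraction of variance retained
    return Z, X_approx, explained
```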
Implemented statistical anomaly detection using the multivariate Gaussian distribution to detect outliers in datasets through probability estimation and threshold tuning.
Course: ML Specialization - Course 3, Week 1
Gaussian-Based Outlier Detection
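A sketch of the independent-per-feature Gaussian version (the full multivariate form swaps in a covariance matrix); epsilon is the tuned threshold:

```python
import numpy as np

def estimate_gaussian(X):
    """Per-feature mean and variance for the density model p(x) = prod_j N(x_j; mu_j, var_j)."""
    return X.mean(axis=0), X.var(axis=0)

def gaussian_probability(X, mu, var):
    """Probability of each example under the independent-feature Gaussian model."""
    coeff = 1.0 / np.sqrt(2.0 * np.pi * var)
    exponent = -((X - mu) ** 2) / (2.0 * var)
    return np.prod(coeff * np.exp(exponent), axis=1)

def flag_anomalies(X, mu, var, epsilon):
    """Examples whose probability falls below the tuned threshold epsilon are flagged."""
    return gaussian_probability(X, mu, var) < epsilon
```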
Trained a decision tree classifier on real-world data, visualized decision boundaries, and introduced ensemble methods like Random Forest and Gradient Boosting.
Course: ML Specialization - Course 2, Week 4
Decision Trees, Random Forests, and Gradient Boosting
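The splitting criterion underneath these tree methods is information gain measured with entropy; a small NumPy sketch for binary labels:

```python
import numpy as np

def entropy(y):
    """Entropy of a binary (0/1) label array."""
    p = np.mean(y) if len(y) else 0.0
    if p in (0.0, 1.0):
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def information_gain(y, left_idx, right_idx):
    """Reduction in entropy from splitting node labels y into the two given index sets."""
    w_left = len(left_idx) / len(y)
    w_right = len(right_idx) / len(y)
    return entropy(y) - (w_left * entropy(y[left_idx]) + w_right * entropy(y[right_idx]))
```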
Explored strategies for handling skewed data and evaluating model performance using precision, recall, F1-score, and error analysis.
Course: ML Specialization - Course 2, Week 3
Evaluating Models on Skewed Data with Precision, Recall, and F1
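The metrics reduce to counts from the confusion matrix; a minimal sketch for binary 0/1 label arrays:

```python
import numpy as np

def precision_recall_f1(y_true, y_pred):
    """Precision, recall, and F1 from binary label arrays (1 = positive/rare class)."""
    tp = np.sum((y_pred == 1) & (y_true == 1))
    fp = np.sum((y_pred == 1) & (y_true == 0))
    fn = np.sum((y_pred == 0) & (y_true == 1))
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    return precision, recall, f1
```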
Built a neural network using softmax activation and cross-entropy loss for multiclass classification, with full vectorization for efficiency.
Course: ML Specialization - Course 2, Week 2
Multiclass Classification with Softmax and Cross-Entropy
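The output layer pairs a numerically stable softmax with cross-entropy over integer labels; a vectorized NumPy sketch:

```python
import numpy as np

def softmax(z):
    """Row-wise softmax with the max-subtraction trick for numerical stability."""
    z_shift = z - z.max(axis=1, keepdims=True)
    e = np.exp(z_shift)
    return e / e.sum(axis=1, keepdims=True)

def cross_entropy(probs, labels):
    """Mean negative log-likelihood; labels are integer class indices."""
    m = labels.shape[0]
    return -np.mean(np.log(probs[np.arange(m), labels] + 1e-12))
```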
Implemented a two-layer neural network using vectorized operations for binary classification with forward/backward propagation and visualization.
Course: ML Specialization - Course 2, Week 1
Two-Layer Neural Network for Binary Classification
Implemented logistic regression for binary classification using sigmoid activation and gradient descent, including L2 regularization to reduce overfitting.
Course: ML Specialization - Course 1, Week 3
Regularized Logistic Regression for Binary Classification
Implemented univariate linear regression using gradient descent to fit a line to data and visualize optimization with matplotlib.
Course: ML Specialization - Course 1, Week 2
Univariate Linear Regression with Gradient Descent
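The fit repeatedly applies the two gradient formulas below; the learning rate and iteration count are assumed defaults:

```python
import numpy as np

def gradient_descent(x, y, alpha=0.01, num_iters=1500):
    """Fit y ~ w*x + b by minimizing mean squared error with batch gradient descent."""
    w, b = 0.0, 0.0
    m = x.shape[0]
    for _ in range(num_iters):
        preds = w * x + b
        dw = np.sum((preds - y) * x) / m       # dJ/dw for the squared-error cost
        db = np.sum(preds - y) / m             # dJ/db
        w -= alpha * dw
        b -= alpha * db
    return w, b
```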