<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Posts | anurajmohan.in</title><link>https://anurajmohan.in/read/</link><atom:link href="https://anurajmohan.in/read/index.xml" rel="self" type="application/rss+xml"/><description>Posts</description><generator>Wowchemy (https://wowchemy.com)</generator><language>en-us</language><image><url>https://anurajmohan.in/media/icon_hu0b7a4cb9992c9ac0e91bd28ffd38dd00_9727_512x512_fill_lanczos_center_2.png</url><title>Posts</title><link>https://anurajmohan.in/read/</link></image><item><title>NLP → Transformers: Prerequisite Learning Path</title><link>https://anurajmohan.in/read/llm-pre/</link><pubDate>Sat, 07 Mar 2026 00:00:00 +0000</pubDate><guid>https://anurajmohan.in/read/llm-pre/</guid><description>&lt;h2 id="nlp--transformers-prerequisite-learning-path">NLP → Transformers: Prerequisite Learning Path&lt;/h2>
&lt;p>Before learning &lt;strong>Large Language Models (LLMs)&lt;/strong> such as &lt;strong>GPT, BERT, Gemini, or Claude&lt;/strong>, it is important to understand some foundational concepts in &lt;strong>Natural Language Processing (NLP)&lt;/strong> and &lt;strong>Deep Learning&lt;/strong>.&lt;/p>
&lt;p>This roadmap provides a &lt;strong>quick step-by-step learning path&lt;/strong> from basic NLP concepts to &lt;strong>Transformers&lt;/strong>, which are the backbone of modern LLMs.&lt;/p>
&lt;hr>
&lt;h2 id="1-introduction-to-natural-language-processing">1. Introduction to Natural Language Processing&lt;/h2>
&lt;p>Understand what &lt;strong>Natural Language Processing (NLP)&lt;/strong> is and why it is important in AI.&lt;/p>
&lt;h3 id="topics-to-learn">Topics to Learn&lt;/h3>
&lt;ul>
&lt;li>What is NLP&lt;/li>
&lt;li>Applications of NLP&lt;/li>
&lt;li>Text preprocessing&lt;/li>
&lt;li>Tokenization&lt;/li>
&lt;li>Stopword removal&lt;/li>
&lt;li>Stemming and Lemmatization&lt;/li>
&lt;/ul>
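&lt;p>As a quick illustration of these preprocessing steps, here is a minimal pure-Python sketch. In practice you would use a library such as NLTK or spaCy; the crude suffix rule below only stands in for a real stemmer like Porter's:&lt;/p>

```python
import re

# Tiny illustrative stopword list; real lists are much longer
STOPWORDS = {"the", "is", "are", "a", "an", "of"}

def preprocess(text):
    # Lowercase and tokenize on runs of letters
    tokens = re.findall(r"[a-z]+", text.lower())
    # Stopword removal
    tokens = [t for t in tokens if t not in STOPWORDS]
    # Crude suffix stripping as a stand-in for stemming
    return [re.sub(r"(ing|ed|s)$", "", t) for t in tokens]

print(preprocess("The cats are chasing the mice"))  # ['cat', 'chas', 'mice']
```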
&lt;h3 id="learning-resource">Learning Resource&lt;/h3>
&lt;ul>
&lt;li>&lt;a href="https://youtu.be/CMrHM8a3hqw?si=9HqK18KYFYLCyBDp" target="_blank" rel="noopener">Natural Language Processing In 5 Minutes | What Is NLP And How Does It Work?&lt;/a>&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h2 id="2-word-representations">2. Word Representations&lt;/h2>
&lt;p>Machines cannot understand raw text directly.&lt;br>
Words must be converted into &lt;strong>numerical vectors&lt;/strong>.&lt;/p>
&lt;h3 id="topics-to-learn-1">Topics to Learn&lt;/h3>
&lt;ul>
&lt;li>Bag of Words&lt;/li>
&lt;li>TF-IDF&lt;/li>
&lt;li>Word Embeddings&lt;/li>
&lt;/ul>
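&lt;p>To make the idea of numerical vectors concrete, here is a from-scratch TF-IDF sketch on a toy corpus (libraries like scikit-learn provide this via &lt;code>TfidfVectorizer&lt;/code>, with extra smoothing and normalisation):&lt;/p>

```python
import math
from collections import Counter

docs = [
    "the cat sat on the mat",
    "the dog sat on the log",
    "cats and dogs",
]
tokenized = [d.split() for d in docs]
vocab = sorted(set(w for d in tokenized for w in d))

def tf_idf(doc_tokens):
    counts = Counter(doc_tokens)
    n = len(doc_tokens)
    vec = []
    for w in vocab:
        tf = counts[w] / n  # term frequency within this document
        df = sum(1 for d in tokenized if w in d)
        # Inverse document frequency: words in every document score 0
        idf = math.log(len(tokenized) / df) if df else 0.0
        vec.append(tf * idf)
    return vec

vec = tf_idf(tokenized[0])  # "cat" gets a high weight, "the" a low one
```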
&lt;h3 id="learning-resource-1">Learning Resource&lt;/h3>
&lt;ul>
&lt;li>&lt;a href="https://youtu.be/wgfSDrqYMJ4?si=vWtJwBxkmqKKcpMq" target="_blank" rel="noopener">What are Word Embeddings?&lt;/a>&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h2 id="3-neural-networks-for-nlp">3. Neural Networks for NLP&lt;/h2>
&lt;p>Deep learning models are widely used for NLP tasks.&lt;/p>
&lt;h3 id="topics-to-learn-2">Topics to Learn&lt;/h3>
&lt;ul>
&lt;li>Basics of Neural Networks&lt;/li>
&lt;li>Forward propagation&lt;/li>
&lt;li>Backpropagation&lt;/li>
&lt;li>Activation functions&lt;/li>
&lt;li>Neural networks for text processing&lt;/li>
&lt;/ul>
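&lt;p>Forward propagation and backpropagation can be seen end to end on a single sigmoid neuron. This toy sketch fits one weight and bias to one example by applying the chain rule by hand:&lt;/p>

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# One neuron: y_hat = sigmoid(w*x + b), squared-error loss
w, b = 0.5, 0.0
x, y = 1.0, 1.0
lr = 0.1

for step in range(100):
    # Forward propagation
    z = w * x + b
    y_hat = sigmoid(z)
    loss = (y_hat - y) ** 2
    # Backpropagation: chain rule dL/dw = dL/dy_hat * dy_hat/dz * dz/dw
    dL_dyhat = 2 * (y_hat - y)
    dyhat_dz = y_hat * (1 - y_hat)
    dL_dw = dL_dyhat * dyhat_dz * x
    dL_db = dL_dyhat * dyhat_dz
    # Gradient descent update
    w -= lr * dL_dw
    b -= lr * dL_db
```

After 100 updates the prediction moves close to the target and the loss shrinks, which is all a deep network does, repeated over many layers and examples.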
&lt;h3 id="learning-resource-2">Learning Resource&lt;/h3>
&lt;ul>
&lt;li>&lt;a href="https://youtu.be/bfmFfD2RIcg?si=CmpB0hfWh7emJJtM" target="_blank" rel="noopener">Neural Network In 5 Minutes | What Is A Neural Network?&lt;/a>&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h2 id="4-sequence-models">4. Sequence Models&lt;/h2>
&lt;p>Text is sequential in nature, so models must understand &lt;strong>context and order of words&lt;/strong>.&lt;/p>
&lt;h3 id="topics-to-learn-3">Topics to Learn&lt;/h3>
&lt;ul>
&lt;li>Recurrent Neural Networks (RNN)&lt;/li>
&lt;li>Long Short-Term Memory (LSTM)&lt;/li>
&lt;li>Gated Recurrent Units (GRU)&lt;/li>
&lt;li>Sequence modeling&lt;/li>
&lt;/ul>
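&lt;p>The core of every recurrent model is a single recurrence: the new hidden state mixes the current input with the previous state. A scalar sketch (real RNNs use weight matrices and vectors, but the loop is the same):&lt;/p>

```python
import math

def rnn_step(x_t, h_prev, w_x, w_h, b):
    # New hidden state combines current input with the carried-over state
    return math.tanh(w_x * x_t + w_h * h_prev + b)

# Process a sequence one element at a time; h carries context forward
sequence = [1.0, 0.5, -0.3, 0.8]
h = 0.0
for x_t in sequence:
    h = rnn_step(x_t, h, w_x=0.7, w_h=0.4, b=0.0)
```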
&lt;h3 id="learning-resource-3">Learning Resource&lt;/h3>
&lt;ul>
&lt;li>&lt;a href="https://youtu.be/LHXXI4-IEns?si=IiUFY9F5uMYyCptP" target="_blank" rel="noopener">Illustrated Guide to Recurrent Neural Networks&lt;/a>&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h2 id="5-transformers">5. Transformers&lt;/h2>
&lt;p>Transformers are the &lt;strong>foundation of modern LLMs&lt;/strong>.&lt;/p>
&lt;p>They overcome the limitations of RNNs and allow models to process sequences &lt;strong>in parallel using attention mechanisms&lt;/strong>.&lt;/p>
&lt;h3 id="topics-to-learn-4">Topics to Learn&lt;/h3>
&lt;ul>
&lt;li>Attention mechanism&lt;/li>
&lt;li>Self-attention&lt;/li>
&lt;li>Transformer architecture&lt;/li>
&lt;li>Encoder–decoder structure&lt;/li>
&lt;li>Positional encoding&lt;/li>
&lt;/ul>
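&lt;p>Attention itself is only a few lines of arithmetic. This toy sketch computes scaled dot-product attention on plain lists, so you can see each query score every key and then take a weighted average of the values:&lt;/p>

```python
import math

def softmax(xs):
    m = max(xs)
    exps = [math.exp(v - m) for v in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention on nested lists (toy-sized)."""
    d_k = len(K[0])
    out = []
    for q in Q:
        # Similarity of this query with every key, scaled by sqrt(d_k)
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k) for k in K]
        weights = softmax(scores)
        # Weighted sum of the value vectors
        out.append([sum(w * v[j] for w, v in zip(weights, V)) for j in range(len(V[0]))])
    return out

Q = [[1.0, 0.0]]                       # one query
K = [[1.0, 0.0], [0.0, 1.0]]           # two keys
V = [[10.0, 0.0], [0.0, 10.0]]         # two values
result = attention(Q, K, V)            # leans toward the first value
```

Because all queries can be scored against all keys at once, this computation parallelises across the whole sequence, which is exactly the advantage over RNNs mentioned above.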
&lt;h3 id="learning-resource-4">Learning Resource&lt;/h3>
&lt;ul>
&lt;li>&lt;a href="https://youtu.be/ZXiruGOCn9s?si=tk_lmrSi3iX1g2iB" target="_blank" rel="noopener">What are Transformers (Machine Learning Model)?&lt;/a>&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h2 id="next-step-large-language-models">Next Step: Large Language Models&lt;/h2>
&lt;p>Once you understand Transformers, you can move to:&lt;/p>
&lt;ul>
&lt;li>BERT&lt;/li>
&lt;li>GPT models&lt;/li>
&lt;li>Generative AI&lt;/li>
&lt;li>Retrieval Augmented Generation (RAG)&lt;/li>
&lt;li>Agentic AI&lt;/li>
&lt;/ul>
&lt;p>These concepts are built on the &lt;strong>Transformer architecture&lt;/strong>.&lt;/p>
&lt;hr></description></item><item><title>A Self-Learning Roadmap to Deep Learning</title><link>https://anurajmohan.in/read/dl/</link><pubDate>Wed, 18 Feb 2026 00:00:00 +0000</pubDate><guid>https://anurajmohan.in/read/dl/</guid><description>&lt;p>This roadmap is designed for undergraduate students transitioning from Machine Learning to Deep Learning. It covers Neural Networks, CNNs, RNNs, LSTMs, and Transformers using PyTorch.&lt;/p>
&lt;hr>
&lt;h1 id="-phase-1-deep-learning-foundations">🔵 Phase 1: Deep Learning Foundations&lt;/h1>
&lt;h2 id="1-neural-network-fundamentals">1️⃣ Neural Network Fundamentals&lt;/h2>
&lt;h3 id="-core-concepts">✅ Core Concepts&lt;/h3>
&lt;ul>
&lt;li>What is a Neural Network?&lt;/li>
&lt;li>Perceptron&lt;/li>
&lt;li>Activation Functions (ReLU, Sigmoid, Tanh, Softmax)&lt;/li>
&lt;li>Forward Propagation&lt;/li>
&lt;li>Loss Functions (MSE, Cross-Entropy)&lt;/li>
&lt;li>Backpropagation&lt;/li>
&lt;li>Gradient Descent&lt;/li>
&lt;li>Overfitting &amp;amp; Regularization (Dropout, L2)&lt;/li>
&lt;/ul>
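&lt;p>Two of the concepts above, the softmax activation and the cross-entropy loss, fit in a few lines and are worth computing by hand once:&lt;/p>

```python
import math

def softmax(logits):
    m = max(logits)                     # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def cross_entropy(probs, true_index):
    # Negative log-probability assigned to the correct class
    return -math.log(probs[true_index])

logits = [2.0, 1.0, 0.1]
probs = softmax(logits)                 # sums to 1.0
loss = cross_entropy(probs, 0)          # small when the model is confident and correct
```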
&lt;hr>
&lt;h2 id="2-math-behind-deep-learning-revision">2️⃣ Math Behind Deep Learning (Revision)&lt;/h2>
&lt;h3 id="-linear-algebra">📌 Linear Algebra&lt;/h3>
&lt;ul>
&lt;li>Matrix multiplication&lt;/li>
&lt;li>Dot product&lt;/li>
&lt;li>Vector spaces&lt;/li>
&lt;li>Eigenvalues (basic intuition)&lt;/li>
&lt;/ul>
&lt;h3 id="-calculus">📌 Calculus&lt;/h3>
&lt;ul>
&lt;li>Partial derivatives&lt;/li>
&lt;li>Chain rule&lt;/li>
&lt;li>Gradient computation&lt;/li>
&lt;/ul>
&lt;h3 id="-probability">📌 Probability&lt;/h3>
&lt;ul>
&lt;li>Softmax as probability distribution&lt;/li>
&lt;li>Log-likelihood&lt;/li>
&lt;li>Cross-entropy loss&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h1 id="-learning-resources-foundations">📚 Learning Resources (Foundations)&lt;/h1>
&lt;h2 id="-youtube-playlists">🎥 YouTube Playlists&lt;/h2>
&lt;ul>
&lt;li>&lt;a href="https://youtube.com/playlist?list=PLblh5JKOoLUIxGDQs4LFFD--41Vzf-ME1" target="_blank" rel="noopener">StatQuest – Neural Networks&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://youtube.com/playlist?list=PLZHQObOWTQDNU6R1_67000Dx_ZCJB-3pi" target="_blank" rel="noopener">3Blue1Brown – Neural Networks&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://www.deeplearning.ai/" target="_blank" rel="noopener">DeepLearning.AI (Andrew Ng)&lt;/a>&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h2 id="-courses">🎓 Courses&lt;/h2>
&lt;ul>
&lt;li>&lt;a href="https://developers.google.com/machine-learning/crash-course/neural-networks" target="_blank" rel="noopener">Google Deep Learning Crash Course&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://developers.google.com/machine-learning/crash-course/embeddings" target="_blank" rel="noopener">Embeddings Module (Google ML Crash Course)&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://www.coursera.org/specializations/deep-learning" target="_blank" rel="noopener">Coursera – Deep Learning Specialization&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://course.fast.ai/" target="_blank" rel="noopener">Fast.ai – Practical Deep Learning&lt;/a>&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h1 id="3-pytorch-basics">3️⃣ PyTorch Basics&lt;/h1>
&lt;p>Students must learn:&lt;/p>
&lt;ul>
&lt;li>Working with tensors&lt;/li>
&lt;li>Autograd (automatic differentiation)&lt;/li>
&lt;li>Implementing a three-layer neural network&lt;/li>
&lt;li>Writing a full training loop&lt;/li>
&lt;li>Model evaluation&lt;/li>
&lt;li>GPU training&lt;/li>
&lt;/ul>
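&lt;p>The training loop that the PyTorch tutorials build always has the same shape: forward pass, loss, backward pass, parameter update. A framework-agnostic sketch of that pattern on a tiny linear model, where the hand-written gradient stands in for what &lt;code>loss.backward()&lt;/code> computes via autograd:&lt;/p>

```python
# Skeleton of a training loop: forward, loss, backward, update.
# In PyTorch, loss.backward() and optimizer.step() replace the manual
# gradient and update below; the loop structure is identical.
data = [(1.0, 3.0), (2.0, 5.0), (3.0, 7.0)]  # exactly y = 2x + 1
w, b, lr = 0.0, 0.0, 0.05

for epoch in range(200):
    total_loss = 0.0
    for x, y in data:
        y_hat = w * x + b              # forward pass
        loss = (y_hat - y) ** 2        # MSE loss
        grad_w = 2 * (y_hat - y) * x   # backward pass (autograd does this)
        grad_b = 2 * (y_hat - y)
        w -= lr * grad_w               # optimizer step (SGD)
        b -= lr * grad_b
        total_loss += loss
```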
&lt;h2 id="-resources">📚 Resources&lt;/h2>
&lt;ul>
&lt;li>&lt;a href="https://pytorch.org/tutorials/" target="_blank" rel="noopener">PyTorch Official Tutorials&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://youtube.com/playlist?list=PLKnIA16_Rmvboy8bmDCjwNHgTaYH2puK7" target="_blank" rel="noopener">PyTorch Playlist (YouTube)&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://pytorch.org/tutorials/beginner/deep_learning_60min_blitz.html" target="_blank" rel="noopener">PyTorch 60 Minute Blitz&lt;/a>&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h1 id="-phase-2-core-deep-learning-architectures">🔵 Phase 2: Core Deep Learning Architectures&lt;/h1>
&lt;h2 id="-feed-forward-neural-networks-fnn--mlp">📘 Feed-Forward Neural Networks (FNN / MLP)&lt;/h2>
&lt;h3 id="topics">Topics&lt;/h3>
&lt;ul>
&lt;li>Multi-layer neural networks&lt;/li>
&lt;li>Activation functions&lt;/li>
&lt;li>Loss functions&lt;/li>
&lt;li>Backpropagation&lt;/li>
&lt;li>Weight initialization&lt;/li>
&lt;/ul>
&lt;h3 id="recommended-resources">Recommended Resources&lt;/h3>
&lt;ul>
&lt;li>&lt;a href="https://www.coursera.org/learn/neural-networks-deep-learning" target="_blank" rel="noopener">Neural Networks &amp;amp; Deep Learning (Coursera)&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://d2l.ai/" target="_blank" rel="noopener">Dive into Deep Learning (Free Book with PyTorch Code)&lt;/a>&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h2 id="-convolutional-neural-networks-cnn">📘 Convolutional Neural Networks (CNN)&lt;/h2>
&lt;h3 id="topics-1">Topics&lt;/h3>
&lt;ul>
&lt;li>Convolution operation&lt;/li>
&lt;li>Filters and feature maps&lt;/li>
&lt;li>Stride and padding&lt;/li>
&lt;li>Pooling layers&lt;/li>
&lt;li>CNN architectures&lt;/li>
&lt;li>Training CNN in PyTorch&lt;/li>
&lt;/ul>
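&lt;p>The convolution operation itself is just a sliding elementwise product. A from-scratch sketch on nested lists, with no padding (&amp;quot;valid&amp;quot; mode) and a configurable stride; PyTorch's &lt;code>nn.Conv2d&lt;/code> does the same thing over batches of multi-channel tensors:&lt;/p>

```python
def conv2d(image, kernel, stride=1):
    """Valid (no padding) 2D convolution on nested lists."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = (len(image) - kh) // stride + 1
    out_w = (len(image[0]) - kw) // stride + 1
    out = []
    for i in range(out_h):
        row = []
        for j in range(out_w):
            # Slide the filter and take the elementwise product sum
            acc = 0.0
            for a in range(kh):
                for b in range(kw):
                    acc += image[i * stride + a][j * stride + b] * kernel[a][b]
            row.append(acc)
        out.append(row)
    return out

image = [[1, 2, 3],
         [4, 5, 6],
         [7, 8, 9]]
edge_kernel = [[1, -1],
               [1, -1]]                  # responds to vertical edges
feature_map = conv2d(image, edge_kernel) # [[-2.0, -2.0], [-2.0, -2.0]]
```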
&lt;h3 id="recommended-resources-1">Recommended Resources&lt;/h3>
&lt;ul>
&lt;li>&lt;a href="https://www.coursera.org/learn/convolutional-neural-networks" target="_blank" rel="noopener">Convolutional Neural Networks (Coursera)&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://d2l.ai/chapter_convolutional-neural-networks/index.html" target="_blank" rel="noopener">Dive into Deep Learning – CNN Chapter&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://pytorch.org/tutorials/beginner/blitz/cifar10_tutorial.html" target="_blank" rel="noopener">PyTorch CIFAR-10 Tutorial&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://www.datacamp.com/tutorial/pytorch-cnn-tutorial" target="_blank" rel="noopener">MNIST CNN Tutorial (DataCamp)&lt;/a>&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h2 id="-recurrent-neural-networks-rnn-lstm-gru">📘 Recurrent Neural Networks (RNN, LSTM, GRU)&lt;/h2>
&lt;h3 id="topics-2">Topics&lt;/h3>
&lt;ul>
&lt;li>Sequential data&lt;/li>
&lt;li>Vanishing gradient problem&lt;/li>
&lt;li>Basic RNN&lt;/li>
&lt;li>LSTM&lt;/li>
&lt;li>GRU&lt;/li>
&lt;li>Next-word prediction&lt;/li>
&lt;/ul>
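&lt;p>The vanishing gradient problem can be demonstrated in three lines. Backpropagation through time multiplies one Jacobian factor per step, and with saturating activations like tanh each factor's magnitude is typically below 1:&lt;/p>

```python
# Backprop through time multiplies one derivative factor per time step.
# When each factor is below 1, the product shrinks geometrically,
# so the gradient signal from early time steps nearly vanishes.
factor = 0.9            # a typical per-step derivative magnitude
gradient = 1.0
for step in range(100):
    gradient *= factor

print(gradient)         # ~2.7e-5: the signal from 100 steps back is almost gone
```

LSTM and GRU cells add gated additive paths precisely so this product does not shrink so quickly.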
&lt;h3 id="recommended-resources-2">Recommended Resources&lt;/h3>
&lt;ul>
&lt;li>&lt;a href="https://www.coursera.org/learn/nlp-sequence-models" target="_blank" rel="noopener">Sequence Models (Coursera)&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://d2l.ai/chapter_recurrent-neural-networks/index.html" target="_blank" rel="noopener">Dive into Deep Learning – RNN Chapter&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://pytorch.org/tutorials/beginner/nlp/sequence_models_tutorial.html" target="_blank" rel="noopener">PyTorch NLP Tutorial&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://medium.com/@abhishekjainindore24/session-14-next-word-predictor-using-lstm-in-pytorch-bddd2068a909" target="_blank" rel="noopener">Next Word Prediction using LSTM&lt;/a>&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h2 id="-transformers">📘 Transformers&lt;/h2>
&lt;h3 id="topics-3">Topics&lt;/h3>
&lt;ul>
&lt;li>Attention mechanism&lt;/li>
&lt;li>Self-attention&lt;/li>
&lt;li>Scaled dot-product attention&lt;/li>
&lt;li>Multi-head attention&lt;/li>
&lt;li>Positional encoding&lt;/li>
&lt;li>Encoder–Decoder architecture&lt;/li>
&lt;li>Fine-tuning pretrained models&lt;/li>
&lt;/ul>
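&lt;p>Positional encoding is the easiest of these pieces to compute directly. A sketch of the sinusoidal scheme from &amp;quot;Attention Is All You Need&amp;quot;: each position gets a distinct pattern of sines and cosines that the model can learn to exploit, since self-attention by itself is order-blind:&lt;/p>

```python
import math

def positional_encoding(position, d_model):
    """Sinusoidal positional encoding for one position (d_model even)."""
    pe = []
    for i in range(0, d_model, 2):
        # Wavelength grows geometrically with the dimension index
        angle = position / (10000 ** (i / d_model))
        pe.append(math.sin(angle))   # even dimensions use sine
        pe.append(math.cos(angle))   # odd dimensions use cosine
    return pe

pe0 = positional_encoding(0, 8)  # [0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0]
pe5 = positional_encoding(5, 8)  # a different, position-specific pattern
```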
&lt;h3 id="recommended-resources-3">Recommended Resources&lt;/h3>
&lt;ul>
&lt;li>&lt;a href="https://arxiv.org/abs/1706.03762" target="_blank" rel="noopener">Attention Is All You Need (Original Paper)&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://d2l.ai/chapter_attention-mechanisms/index.html" target="_blank" rel="noopener">Dive into Deep Learning – Attention Mechanisms Chapter&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://ketanhdoshi.github.io/Transformers-Arch/" target="_blank" rel="noopener">Transformers Explained (Visual Guide 1)&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://blog.londogard.com/posts/2021-02-18-transformers-explained/transformers-explained.html" target="_blank" rel="noopener">Transformers Explained (Visual Guide 2)&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://medium.com/data-science/transformers-explained-visually-part-1-overview-of-functionality-95a6dd460452" target="_blank" rel="noopener">Transformers Explained (Visual Guide 3)&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://www.geeksforgeeks.org/nlp/machine-translation-with-transformer-in-python/" target="_blank" rel="noopener">Transformers for Machine Translation&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://drive.google.com/file/d/1iftb63orgDV9Lmi_SOSqBjRM79cLn7wp/view?usp=sharing" target="_blank" rel="noopener">My Lecture Slides on Transformer&lt;/a>&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h1 id="-phase-3-mini-projects-choose-any-2">🔵 Phase 3: Mini Projects (Choose Any 2)&lt;/h1>
&lt;h2 id="-option-a-image-classification-cnn">🔹 Option A: Image Classification (CNN)&lt;/h2>
&lt;p>&lt;strong>Dataset:&lt;/strong>&lt;br>
&lt;a href="https://www.cs.toronto.edu/~kriz/cifar.html" target="_blank" rel="noopener">CIFAR-10 Dataset&lt;/a>&lt;/p>
&lt;h3 id="deliverables">Deliverables&lt;/h3>
&lt;ul>
&lt;li>Data preprocessing&lt;/li>
&lt;li>CNN architecture&lt;/li>
&lt;li>Training curves&lt;/li>
&lt;li>Accuracy&lt;/li>
&lt;li>Confusion matrix&lt;/li>
&lt;li>Conclusion&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h2 id="-option-b-sentiment-analysis-rnn--lstm">🔹 Option B: Sentiment Analysis (RNN / LSTM)&lt;/h2>
&lt;p>&lt;strong>Dataset:&lt;/strong>&lt;br>
&lt;a href="https://www.kaggle.com/datasets/lakshmi25npathi/imdb-dataset-of-50k-movie-reviews" target="_blank" rel="noopener">IMDB Reviews Dataset&lt;/a>&lt;/p>
&lt;h3 id="deliverables-1">Deliverables&lt;/h3>
&lt;ul>
&lt;li>Text preprocessing&lt;/li>
&lt;li>Tokenization&lt;/li>
&lt;li>LSTM model&lt;/li>
&lt;li>F1-score evaluation&lt;/li>
&lt;li>Error analysis&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h2 id="-option-c-text-summarization-transformer">🔹 Option C: Text Summarization (Transformer)&lt;/h2>
&lt;p>&lt;strong>Dataset:&lt;/strong>&lt;br>
&lt;a href="https://github.com/abisee/cnn-dailymail" target="_blank" rel="noopener">CNN/DailyMail Dataset&lt;/a>&lt;/p>
&lt;h3 id="deliverables-2">Deliverables&lt;/h3>
&lt;ul>
&lt;li>Fine-tuning pretrained model&lt;/li>
&lt;li>ROUGE evaluation&lt;/li>
&lt;li>Generated samples&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h1 id="-expected-outcomes">🎯 Expected Outcomes&lt;/h1>
&lt;p>After completing this roadmap, students will be able to:&lt;/p>
&lt;ul>
&lt;li>Build neural networks from scratch&lt;/li>
&lt;li>Train CNN, RNN, and LSTM models&lt;/li>
&lt;li>Understand and implement Transformers&lt;/li>
&lt;li>Fine-tune pretrained models&lt;/li>
&lt;li>Write structured DL project reports&lt;/li>
&lt;li>Apply Deep Learning in internships and projects&lt;/li>
&lt;/ul>
&lt;hr></description></item><item><title>A Self-Learning Roadmap to Machine Learning</title><link>https://anurajmohan.in/read/ml/</link><pubDate>Wed, 18 Feb 2026 00:00:00 +0000</pubDate><guid>https://anurajmohan.in/read/ml/</guid><description>&lt;h2 id="-overview">📌 Overview&lt;/h2>
&lt;p>This roadmap is designed for students starting Machine Learning from scratch.&lt;br>
It is split into two phases:&lt;/p>
&lt;ul>
&lt;li>&lt;strong>Phase 1:&lt;/strong> Foundations (Math + Python)&lt;/li>
&lt;li>&lt;strong>Phase 2:&lt;/strong> ML Concepts + Practice + Mini Projects&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h2 id="phase-1-machine-learning-foundations">Phase 1: Machine Learning Foundations&lt;/h2>
&lt;h3 id="1-math-for-machine-learning">1) Math for Machine Learning&lt;/h3>
&lt;p>Before starting ML, students should be comfortable with the following topics.&lt;/p>
&lt;hr>
&lt;h4 id="-topics-to-learn">✅ Topics to Learn&lt;/h4>
&lt;p>&lt;strong>Linear Algebra&lt;/strong>&lt;/p>
&lt;ul>
&lt;li>Vectors, matrices&lt;/li>
&lt;li>Eigenvalues, eigenvectors&lt;/li>
&lt;/ul>
&lt;p>&lt;strong>Calculus &amp;amp; Gradients&lt;/strong>&lt;/p>
&lt;ul>
&lt;li>Derivatives&lt;/li>
&lt;li>Chain rule&lt;/li>
&lt;li>Gradients&lt;/li>
&lt;/ul>
&lt;p>&lt;strong>Probability &amp;amp; Statistics&lt;/strong>&lt;/p>
&lt;ul>
&lt;li>Distributions&lt;/li>
&lt;li>Expectation&lt;/li>
&lt;li>Mean, variance&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h4 id="-best-resources">⭐ Best Resources&lt;/h4>
&lt;p>&lt;strong>Linear Algebra&lt;/strong>&lt;/p>
&lt;ul>
&lt;li>&lt;a href="https://youtube.com/playlist?list=PLZHQObOWTQDPD3MizzM2xVFitgF8hE_ab&amp;amp;si=ecPHeuQnKV83hPlc" target="_blank" rel="noopener">3Blue1Brown Playlist&lt;/a>&lt;/li>
&lt;/ul>
&lt;p>&lt;strong>Calculus&lt;/strong>&lt;/p>
&lt;ul>
&lt;li>&lt;a href="https://youtube.com/playlist?list=PLZHQObOWTQDMsr9K-rj53DwVRMYO3t5Yr&amp;amp;si=gPBc9ST2Ha5CJSiE" target="_blank" rel="noopener">3Blue1Brown Playlist&lt;/a>&lt;/li>
&lt;/ul>
&lt;p>&lt;strong>Probability and Statistics&lt;/strong>&lt;/p>
&lt;ul>
&lt;li>&lt;a href="https://youtube.com/playlist?list=PLblh5JKOoLUK0FLuzwntyYI10UQFUhsY9&amp;amp;si=xZdAjuw9JXCSSk6y" target="_blank" rel="noopener">StatQuest Playlist&lt;/a>&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h4 id="-courses-optional-but-recommended">🎓 Courses (Optional but Recommended)&lt;/h4>
&lt;ul>
&lt;li>&lt;a href="https://www.coursera.org/specializations/mathematics-machine-learning" target="_blank" rel="noopener">Coursera: Mathematics for Machine Learning&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://www.scaler.com/topics/course/mathematics-for-machine-learning-free-course/" target="_blank" rel="noopener">Scaler: Math for ML (Free)&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://www.linkedin.com/learning/paths/foundational-math-for-machine-learning" target="_blank" rel="noopener">LinkedIn Learning Path&lt;/a>&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h3 id="2-python-for-data-science">2) Python for Data Science&lt;/h3>
&lt;p>Students should become confident in using Python for data handling and visualization.&lt;/p>
&lt;hr>
&lt;h4 id="-skills-students-must-learn">✅ Skills Students Must Learn&lt;/h4>
&lt;ul>
&lt;li>NumPy — numerical computing&lt;/li>
&lt;li>Pandas — data manipulation&lt;/li>
&lt;li>Matplotlib / Seaborn — visualization&lt;/li>
&lt;li>scikit-learn — ML models (after ML basics)&lt;/li>
&lt;li>PyTorch — deep learning (after DL basics)&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h4 id="-resources">⭐ Resources&lt;/h4>
&lt;ul>
&lt;li>&lt;a href="https://youtube.com/playlist?list=PLWKjhJtqVAblvI1i46ScbKV2jH1gdL7VQ&amp;amp;si=ynhbspg4N5SuSG6b9JXCSSk6y" target="_blank" rel="noopener">Data Analysis with Python (YouTube Course)&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://github.com/veb-101/Numpy-Pandas-Matplotlib-Tutorial" target="_blank" rel="noopener">NumPy + Pandas + Matplotlib GitHub&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://www.kaggle.com/learn/python" target="_blank" rel="noopener">Kaggle Python&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://github.com/Probability-Statistics-Jupyter-Notebook/probability-statistics-notebook" target="_blank" rel="noopener">Probability and Statistics Notebook&lt;/a>&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h2 id="phase-2-machine-learning-concepts--practice">Phase 2: Machine Learning Concepts + Practice&lt;/h2>
&lt;h3 id="1-ml-core-concepts">1) ML Core Concepts&lt;/h3>
&lt;p>Students should clearly understand:&lt;/p>
&lt;ul>
&lt;li>Features (X) and Target (y)&lt;/li>
&lt;li>Train vs Test split&lt;/li>
&lt;li>Overfitting vs Underfitting&lt;/li>
&lt;/ul>
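&lt;p>A train/test split is simple enough to write once by hand before reaching for scikit-learn's &lt;code>train_test_split&lt;/code>. The sketch below shuffles the indices (with a fixed seed for reproducibility) and holds out a fraction of the pairs for testing:&lt;/p>

```python
import random

def train_test_split(X, y, test_ratio=0.2, seed=42):
    """Shuffle, then hold out test_ratio of the (X, y) pairs for testing."""
    idx = list(range(len(X)))
    random.Random(seed).shuffle(idx)
    n_test = int(len(X) * test_ratio)
    test_idx, train_idx = idx[:n_test], idx[n_test:]
    X_train = [X[i] for i in train_idx]
    X_test = [X[i] for i in test_idx]
    y_train = [y[i] for i in train_idx]
    y_test = [y[i] for i in test_idx]
    return X_train, X_test, y_train, y_test

X = [[i] for i in range(10)]     # features
y = [i * 2 for i in range(10)]   # target, paired with X
X_train, X_test, y_train, y_test = train_test_split(X, y)
```

Shuffling X and y by the same indices keeps each feature row paired with its target, which is the detail beginners most often get wrong.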
&lt;hr>
&lt;h3 id="2-models-to-learn-beginner-level">2) Models to Learn (Beginner Level)&lt;/h3>
&lt;p>Start with these 3 models:&lt;/p>
&lt;ul>
&lt;li>Linear Regression&lt;/li>
&lt;li>Logistic Regression&lt;/li>
&lt;li>Decision Trees&lt;/li>
&lt;/ul>
&lt;p>Also learn the basics of:&lt;/p>
&lt;p>&lt;strong>Representation → Loss Function → Optimization&lt;/strong>&lt;/p>
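&lt;p>These three pieces line up directly in code. A from-scratch linear regression sketch, with the representation, the loss, and the optimizer each labelled (scikit-learn's &lt;code>LinearRegression&lt;/code> solves the same problem in closed form):&lt;/p>

```python
# Representation: y_hat = w*x + b
# Loss:           mean squared error over the dataset
# Optimization:   batch gradient descent on w and b
xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.1, 4.9, 7.2, 8.8]    # roughly y = 2x + 1
w, b, lr = 0.0, 0.0, 0.02

for epoch in range(2000):
    n = len(xs)
    # Gradients of the MSE loss, averaged over all samples
    grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
    grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
    w -= lr * grad_w
    b -= lr * grad_b
```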
&lt;hr>
&lt;h3 id="3-evaluation-metrics">3) Evaluation Metrics&lt;/h3>
&lt;h4 id="regression-metrics">Regression Metrics&lt;/h4>
&lt;ul>
&lt;li>MAE&lt;/li>
&lt;li>MSE&lt;/li>
&lt;li>RMSE&lt;/li>
&lt;/ul>
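&lt;p>All three regression metrics are one-liners, so it helps to compute them once by hand on toy predictions (scikit-learn exposes them in &lt;code>sklearn.metrics&lt;/code>):&lt;/p>

```python
import math

y_true = [3.0, 5.0, 7.0, 9.0]
y_pred = [2.5, 5.0, 8.0, 8.5]

errors = [p - t for p, t in zip(y_pred, y_true)]
mae = sum(abs(e) for e in errors) / len(errors)   # mean absolute error: 0.5
mse = sum(e ** 2 for e in errors) / len(errors)   # mean squared error: 0.375
rmse = math.sqrt(mse)                             # back in the target's units
```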
&lt;h4 id="classification-metrics">Classification Metrics&lt;/h4>
&lt;ul>
&lt;li>Accuracy&lt;/li>
&lt;li>Confusion Matrix&lt;/li>
&lt;li>Precision, Recall, F1-score&lt;/li>
&lt;/ul>
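&lt;p>The classification metrics all derive from the four confusion-matrix counts, as this small worked example shows:&lt;/p>

```python
# Confusion-matrix counts for a binary classifier
tp, fp, fn, tn = 40, 10, 5, 45

accuracy = (tp + tn) / (tp + tn + fp + fn)   # 0.85
precision = tp / (tp + fp)   # of predicted positives, how many were right
recall = tp / (tp + fn)      # of actual positives, how many were found
f1 = 2 * precision * recall / (precision + recall)   # harmonic mean of the two
```

On imbalanced data, accuracy alone can look good while recall is poor, which is why the confusion matrix and F1-score matter.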
&lt;hr>
&lt;h2 id="-resources-for-phase-2">📚 Resources for Phase 2&lt;/h2>
&lt;h3 id="theory-resources">Theory Resources&lt;/h3>
&lt;ul>
&lt;li>&lt;a href="https://developers.google.com/machine-learning/crash-course" target="_blank" rel="noopener">Google ML Crash Course&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://statquest.org/" target="_blank" rel="noopener">StatQuest&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://machinelearningmastery.com/" target="_blank" rel="noopener">Machine Learning Mastery&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://www.coursera.org/learn/machine-learning" target="_blank" rel="noopener">Coursera: Machine Learning&lt;/a>&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h3 id="hands-on-resources">Hands-on Resources&lt;/h3>
&lt;ul>
&lt;li>&lt;a href="https://www.kaggle.com/learn/intro-to-machine-learning" target="_blank" rel="noopener">Kaggle Intro to ML&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://github.com/jakevdp/PythonDataScienceHandbook" target="_blank" rel="noopener">Python Data Science Handbook&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://github.com/microsoft/ML-For-Beginners" target="_blank" rel="noopener">Microsoft ML for Beginners&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://youtube.com/playlist?list=PL5-da3qGB5ICeMbQuqbbCOQWcS6OYBr5A&amp;amp;si=UcZQGrRt7LJKirts" target="_blank" rel="noopener">YouTube ML Playlist&lt;/a>&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h2 id="-mini-projects-choose-any-2">🧪 Mini Projects (Choose Any 2)&lt;/h2>
&lt;p>Students must complete &lt;strong>any 2 projects&lt;/strong> from the list below.&lt;/p>
&lt;hr>
&lt;h3 id="option-a-house-price-prediction-regression">Option A: House Price Prediction (Regression)&lt;/h3>
&lt;p>Dataset:&lt;/p>
&lt;ul>
&lt;li>&lt;a href="https://www.kaggle.com/c/house-prices-advanced-regression-techniques" target="_blank" rel="noopener">House Prices Kaggle Dataset&lt;/a>&lt;/li>
&lt;/ul>
&lt;p>Deliverables:&lt;/p>
&lt;ul>
&lt;li>Preprocessing&lt;/li>
&lt;li>Model training&lt;/li>
&lt;li>RMSE evaluation&lt;/li>
&lt;li>Conclusion&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h3 id="option-b-titanic-survival-prediction-classification">Option B: Titanic Survival Prediction (Classification)&lt;/h3>
&lt;p>Dataset:&lt;/p>
&lt;ul>
&lt;li>&lt;a href="https://www.kaggle.com/c/titanic" target="_blank" rel="noopener">Titanic Kaggle Dataset&lt;/a>&lt;/li>
&lt;/ul>
&lt;p>Deliverables:&lt;/p>
&lt;ul>
&lt;li>Encoding + preprocessing&lt;/li>
&lt;li>Model training&lt;/li>
&lt;li>Confusion matrix&lt;/li>
&lt;li>Precision/Recall&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h3 id="option-c-student-performance-prediction">Option C: Student Performance Prediction&lt;/h3>
&lt;p>Dataset:&lt;/p>
&lt;ul>
&lt;li>&lt;a href="https://www.kaggle.com/datasets/spscientist/students-performance-in-exams" target="_blank" rel="noopener">Students Performance Dataset&lt;/a>&lt;/li>
&lt;/ul>
&lt;p>Deliverables:&lt;/p>
&lt;ul>
&lt;li>EDA&lt;/li>
&lt;li>Correlation analysis&lt;/li>
&lt;li>Model training&lt;/li>
&lt;li>Evaluation&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h3 id="option-d-diabetes-prediction">Option D: Diabetes Prediction&lt;/h3>
&lt;p>Dataset:&lt;/p>
&lt;ul>
&lt;li>&lt;a href="https://www.kaggle.com/datasets/uciml/pima-indians-diabetes-database" target="_blank" rel="noopener">Pima Indians Diabetes Dataset&lt;/a>&lt;/li>
&lt;/ul>
&lt;p>Deliverables:&lt;/p>
&lt;ul>
&lt;li>Classification model&lt;/li>
&lt;li>F1-score evaluation&lt;/li>
&lt;li>Conclusion&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h2 id="-final-submission-format-phase-1--phase-2">📦 Final Submission Format (Phase 1 + Phase 2)&lt;/h2>
&lt;p>Students must submit &lt;strong>one notebook/report&lt;/strong> containing:&lt;/p>
&lt;ul>
&lt;li>Dataset loading&lt;/li>
&lt;li>Data cleaning + missing value handling&lt;/li>
&lt;li>Exploratory Data Analysis (EDA) + plots&lt;/li>
&lt;li>Feature engineering (basic)&lt;/li>
&lt;li>Model training&lt;/li>
&lt;li>Evaluation metrics&lt;/li>
&lt;li>Final conclusion (5–10 lines)&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h2 id="-outcome">✅ Outcome&lt;/h2>
&lt;p>By the end of Phase 1 and Phase 2, students will be able to:&lt;/p>
&lt;ul>
&lt;li>Understand ML fundamentals clearly&lt;/li>
&lt;li>Build beginner ML models using scikit-learn&lt;/li>
&lt;li>Evaluate models properly&lt;/li>
&lt;li>Complete 2 end-to-end mini projects&lt;/li>
&lt;li>Write a clean ML notebook/report for submission&lt;/li>
&lt;/ul>
&lt;hr></description></item><item><title>Introduction to Geometric Deep Learning</title><link>https://anurajmohan.in/read/gdl/</link><pubDate>Tue, 01 Feb 2022 00:00:00 +0000</pubDate><guid>https://anurajmohan.in/read/gdl/</guid><description>&lt;p>A Collection of good reads on Geometric Deep Learning&lt;/p>
&lt;ul>
&lt;li>👉 &lt;a href="https://geometricdeeplearning.com/" target="_blank" rel="noopener">&lt;strong>https://geometricdeeplearning.com/&lt;/strong>&lt;/a>&lt;/li>
&lt;li>👉 &lt;a href="https://flawnsontong.medium.com/what-is-geometric-deep-learning-b2adb662d91d" target="_blank" rel="noopener">&lt;strong>Blog on Medium&lt;/strong>&lt;/a>&lt;/li>
&lt;li>👉 &lt;a href="https://arxiv.org/pdf/1611.08097.pdf" target="_blank" rel="noopener">&lt;strong>Paper on Arxiv&lt;/strong>&lt;/a>&lt;/li>
&lt;li>👉 &lt;a href="https://pytorch-geometric.readthedocs.io/en/latest/" target="_blank" rel="noopener">&lt;strong>Python based Tool - Pytorch Geometric&lt;/strong>&lt;/a>&lt;/li>
&lt;/ul></description></item><item><title>Introduction to Network Representation Learning</title><link>https://anurajmohan.in/read/nrl/</link><pubDate>Sat, 19 Jun 2021 00:00:00 +0000</pubDate><guid>https://anurajmohan.in/read/nrl/</guid><description>&lt;p>Get familiarized with Network Representation Learning&lt;/p>
&lt;ul>
&lt;li>👉 &lt;a href="http://snap.stanford.edu/proj/embeddings-www/" target="_blank" rel="noopener">&lt;strong>Stanford Link&lt;/strong>&lt;/a>&lt;/li>
&lt;li>👉 &lt;a href="https://towardsdatascience.com/introduction-to-graph-representation-learning-a51c963d8d11" target="_blank" rel="noopener">&lt;strong>Blog on towardsdatascience&lt;/strong>&lt;/a>&lt;/li>
&lt;li>👉 &lt;a href="https://www.cs.mcgill.ca/~wlh/grl_book/" target="_blank" rel="noopener">&lt;strong>E-book&lt;/strong>&lt;/a>&lt;/li>
&lt;li>👉 &lt;a href="https://github.com/thunlp/NRLPapers" target="_blank" rel="noopener">&lt;strong>Must Read Papers - Collection in Github&lt;/strong>&lt;/a>&lt;/li>
&lt;li>👉 &lt;a href="https://ai.googleblog.com/2019/06/innovations-in-graph-representation.html" target="_blank" rel="noopener">&lt;strong>Google AI blog&lt;/strong>&lt;/a>&lt;/li>
&lt;li>👉 &lt;a href="https://www.dgl.ai/" target="_blank" rel="noopener">&lt;strong>Python based Tool on Graph Neural Network - DGL&lt;/strong>&lt;/a>&lt;/li>
&lt;li>👉 &lt;a href="https://www.stellargraph.io/" target="_blank" rel="noopener">&lt;strong>Python based Tool for Graph Machine Learning - StellarGraph&lt;/strong>&lt;/a>&lt;/li>
&lt;/ul></description></item><item><title>Concepts and Tools for Social Network Analysis &amp; Mining</title><link>https://anurajmohan.in/read/snm/</link><pubDate>Fri, 09 Apr 2021 00:00:00 +0000</pubDate><guid>https://anurajmohan.in/read/snm/</guid><description>&lt;p>Concepts and Tools for Social Network Analysis &amp;amp; Mining&lt;/p>
&lt;ul>
&lt;li>👉 &lt;a href="https://www.sciencedirect.com/topics/computer-science/social-network-analysis" target="_blank" rel="noopener">&lt;strong>Papers on Social Network Analysis&lt;/strong>&lt;/a>&lt;/li>
&lt;li>👉 &lt;a href="https://towardsdatascience.com/how-to-get-started-with-social-network-analysis-6d527685d374" target="_blank" rel="noopener">&lt;strong>Blog on towardsdatascience&lt;/strong>&lt;/a>&lt;/li>
&lt;li>👉 &lt;a href="https://www.springer.com/gp/book/9781441962867" target="_blank" rel="noopener">&lt;strong>E-book&lt;/strong>&lt;/a>&lt;/li>
&lt;li>👉 &lt;a href="https://research.google/teams/graph-mining/" target="_blank" rel="noopener">&lt;strong>Google AI Research Page on Graph Mining&lt;/strong>&lt;/a>&lt;/li>
&lt;li>👉 &lt;a href="https://snap.stanford.edu/data/" target="_blank" rel="noopener">&lt;strong>Stanford Network Datasets&lt;/strong>&lt;/a>&lt;/li>
&lt;li>👉 &lt;a href="https://networkx.org//" target="_blank" rel="noopener">&lt;strong>Python based Tool for Network Analysis - NetworkX&lt;/strong>&lt;/a>&lt;/li>
&lt;/ul></description></item></channel></rss>