EntangledQuantum/LearningAI


🚀 LearningAI - My AI/ML Learning Journey


A comprehensive repository documenting my journey through AI and Machine Learning fundamentals, from basic neural networks to physics-based models.

✨ Features • 📁 Directory Structure • 🎓 Learning Path • 🔗 Resources


🎯 Overview

This repository serves as my personal learning hub for AI/ML concepts and implementations. It contains:

  • Foundational concepts from tutorials and textbooks
  • Hands-on implementations using PyTorch
  • Physics-based models and simulations
  • Progressive complexity from basics to advanced topics

Each project includes detailed notebooks with explanations, code, and results.


✨ Features

  • 📚 Tutorial-Based Learning - Follow along with industry-standard courses
  • 🔬 Practical Implementations - Real code, not just theory
  • 📊 Jupyter Notebooks - Interactive learning with visualizations
  • 🎓 Well-Documented - Comments and explanations throughout
  • 🧠 Progressive Difficulty - Start simple, build complexity
  • 🚀 Clean, Modular Code - Readable, reusable implementations

๐Ÿ“ Directory Structure

```
LearningAI/
├── Basics/
│   ├── biagram_model/
│   │   ├── makemore.ipynb              # 📌 [Andrej Karpathy - Makemore Tutorial]
│   │   └── names.txt                   # Dataset: 32K English baby names
│   └── py_torch_basics.py              # PyTorch fundamentals
│
├── PhysicsBased/
│   └── basics.py                       # Physics-based model implementations
│
└── README.md                           # You are here! 👈
```

📚 Basics Directory

The Basics folder contains foundational concepts and implementations:

🎬 Bigram Model (Makemore)

  • Tutorial: Makemore - Andrej Karpathy
  • Content: Building a character-level language model to generate baby names
  • Key Concepts:
    • Bigram probability distributions
    • Character encoding/decoding
    • PyTorch tensor operations
    • Probability sampling with generators
  • Dataset: 32,033 English baby names
  • Output: Generated synthetic names based on learned patterns
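The core idea the notebook works through can be sketched in a few lines: count bigram transitions into a matrix, normalize each row into a probability distribution, and sample the next character with a seeded generator. This is an illustrative sketch (the toy word list and variable names are mine, not the notebook's):

```python
import torch

# Toy corpus standing in for names.txt; '.' marks start/end of a name.
words = ["emma", "olivia", "ava"]
chars = sorted(set("".join(words)))
stoi = {s: i + 1 for i, s in enumerate(chars)}
stoi["."] = 0
itos = {i: s for s, i in stoi.items()}
V = len(stoi)

# Count bigram occurrences into a V x V matrix.
N = torch.zeros((V, V), dtype=torch.int32)
for w in words:
    seq = ["."] + list(w) + ["."]
    for a, b in zip(seq, seq[1:]):
        N[stoi[a], stoi[b]] += 1

# Normalize rows into probability distributions (add-one smoothing).
P = (N + 1).float()
P = P / P.sum(dim=1, keepdim=True)

# Sample a name: repeatedly draw the next character from the current row.
g = torch.Generator().manual_seed(2147483647)
ix, out = 0, []
while True:
    ix = torch.multinomial(P[ix], num_samples=1, generator=g).item()
    if ix == 0:  # drew the end token
        break
    out.append(itos[ix])
print("".join(out))
```

With the real 32K-name dataset the same loop produces name-like strings; with a three-word corpus the samples are mostly recombinations of the inputs.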

🔧 PyTorch Basics

  • Fundamental PyTorch operations
  • Tensor manipulations
  • Basic neural network concepts
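The kind of operations covered here look roughly like the following (my guess at the script's contents, not a copy of `py_torch_basics.py`):

```python
import torch

# Creating tensors
a = torch.arange(6).reshape(2, 3)   # shape (2, 3)
b = torch.ones(3)                   # shape (3,)

# Broadcasting: b is stretched across a's rows
c = a + b                           # shape (2, 3)

# Reductions and reshaping
row_sums = a.sum(dim=1)             # shape (2,)
flat = a.flatten()                  # shape (6,)

# Autograd: gradient of a scalar w.r.t. a tensor
x = torch.tensor([1.0, 2.0], requires_grad=True)
y = (x ** 2).sum()
y.backward()
print(x.grad)                       # d/dx of x^2 is 2x -> tensor([2., 4.])
```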

🧬 PhysicsBased Directory

Advanced implementations incorporating physics principles:

  • Physics-informed neural networks
  • Conservation laws
  • Differential equations
  • Simulation-based learning
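To illustrate the physics-informed idea (this is a generic sketch, not the code in `PhysicsBased/basics.py`): instead of fitting labeled data, the network is trained so that its autograd derivative satisfies a differential equation. Here a tiny MLP learns y' = -y with y(0) = 1, whose exact solution is exp(-x):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Tiny MLP approximating the ODE solution y(x); architecture is illustrative.
model = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)

for step in range(500):
    # Collocation points in [0, 1] where the physics residual is enforced.
    x = torch.rand(64, 1, requires_grad=True)
    y = model(x)
    # dy/dx via autograd -- this is the "physics-informed" ingredient.
    dydx = torch.autograd.grad(y, x, torch.ones_like(y), create_graph=True)[0]
    residual = (dydx + y).pow(2).mean()                      # enforce y' = -y
    boundary = (model(torch.zeros(1, 1)) - 1.0).pow(2).mean()  # enforce y(0) = 1
    loss = residual + boundary
    opt.zero_grad()
    loss.backward()
    opt.step()

# If training converged, model(x) approximates exp(-x) on [0, 1].
print(model(torch.tensor([[1.0]])).item())
```

The only "data" is the boundary condition; everything else comes from penalizing violations of the governing equation at random collocation points.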

🎓 Learning Path

Phase 1: Foundations (Current) ✅

  • Character-level language models
  • Probability distributions
  • PyTorch basics
  • Multi-layer perceptrons

Phase 2: Intermediate (Upcoming)

  • Recurrent Neural Networks (RNNs)
  • Attention mechanisms
  • Transformer architectures
  • Fine-tuning pre-trained models

Phase 3: Advanced (Future)

  • Physics-informed neural networks (PINNs)
  • Graph neural networks
  • Reinforcement learning
  • Diffusion models

🚀 Quick Start

Prerequisites

```
python >= 3.12
uv (fast Python package installer)
```

Installation

```bash
# Clone the repository
git clone https://github.com/yourusername/LearningAI.git
cd LearningAI

# Create and sync the virtual environment with uv
uv sync

# Activate the virtual environment
source .venv/bin/activate  # On Windows: .venv\Scripts\activate
```

Running the Notebooks

```bash
# Start Jupyter with uv
uv run jupyter notebook

# Open and run:
# - Basics/biagram_model/makemore.ipynb
# - Other notebooks as you explore
```

Running Python Scripts

```bash
# Run PyTorch basics
uv run python Basics/py_torch_basics.py

# Run physics-based models
uv run python PhysicsBased/basics.py
```

Installing Additional Dependencies

```bash
# Install a package into the current environment
uv pip install package_name

# Or use uv add to record it as a managed dependency in pyproject.toml
uv add package_name
```

📊 Project Highlights

Makemore - Character-Level Language Model

This project implements a simple but elegant language model:

```
📈 Model Architecture:
├── Input Layer: Character indices (lookup into the count matrix)
├── Bigram Statistics: Count-based probability distributions
└── Output Layer: Next-character prediction

📊 Results:
├── Vocabulary Size: 27 characters (a-z + '.' start/end token)
├── Training Data: 32,033 names
└── Sample Output: Generated realistic names from learned patterns
```

Key Learnings:

  • Building probability distributions from data
  • Character encoding strategies
  • Sampling from distributions
  • Data visualization with matplotlib
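The encoding strategy above boils down to two lookup tables. A minimal sketch, mirroring the scheme the tutorial uses ('.' at index 0 as the start/end token; function names are mine):

```python
# Build stoi/itos lookup tables over the 27-symbol vocabulary.
chars = [chr(c) for c in range(ord("a"), ord("z") + 1)]
stoi = {s: i + 1 for i, s in enumerate(chars)}
stoi["."] = 0
itos = {i: s for s, i in stoi.items()}

def encode(name: str) -> list[int]:
    """Wrap a name with start/end tokens and map characters to ints."""
    return [stoi[c] for c in "." + name + "."]

def decode(ixs: list[int]) -> str:
    """Inverse mapping; strips the boundary tokens."""
    return "".join(itos[i] for i in ixs).strip(".")

print(encode("ava"))          # [0, 1, 22, 1, 0]
print(decode(encode("ava")))  # 'ava'
```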

🔗 Resources

Recommended Tutorials & Courses

Books

  • "Deep Learning" by Goodfellow, Bengio, and Courville
  • "Grokking Deep Learning" by Andrew Trask
  • "Physics-Informed Machine Learning" - Recent research papers

Tools & Libraries

  • 🔥 PyTorch - Deep learning framework
  • 📓 Jupyter Notebooks - Interactive computing
  • 📊 Matplotlib - Data visualization
  • 🔢 NumPy - Numerical computing

💡 Key Concepts Covered

| Concept | Status | Location |
| --- | --- | --- |
| Character Encoding | ✅ Complete | `Basics/biagram_model/` |
| Probability Distributions | ✅ Complete | `Basics/biagram_model/` |
| PyTorch Tensors | ✅ Complete | `Basics/py_torch_basics.py` |
| Physics-Based Models | 🔄 In Progress | `PhysicsBased/` |
| RNNs | ⏳ Planned | TBD |
| Transformers | ⏳ Planned | TBD |

๐Ÿค Contributing

This is a personal learning repository, but I welcome suggestions! Feel free to:

  • Report issues or corrections
  • Suggest improvements
  • Share learning resources
  • Discuss concepts

๐Ÿ“ Notes & Documentation

Each file includes:

  • Comments: Inline explanations of complex logic
  • Docstrings: Function and module documentation
  • Markdown cells (in notebooks): Concept explanations
  • Output examples: Expected results and visualizations

🎯 Goals & Objectives

Short Term (Next 3 months):

  • ✅ Master character-level language models
  • ⏳ Implement RNNs from scratch
  • ⏳ Build a simple transformer

Medium Term (Next 6 months):

  • โณ Implement attention mechanisms
  • โณ Explore transfer learning
  • โณ Create physics-informed models

Long Term (Next Year):

  • โณ Build advanced neural architectures
  • โณ Contribute to open-source ML projects
  • โณ Create production-ready models

🔮 Future Additions

  • Recurrent Neural Networks (RNNs)
  • Long Short-Term Memory (LSTM) networks
  • Gated Recurrent Units (GRUs)
  • Attention Mechanisms
  • Transformer from scratch
  • Vision Transformers (ViT)
  • Physics-Informed Neural Networks (PINNs)
  • Graph Neural Networks (GNNs)
  • Reinforcement Learning fundamentals
  • Generative models (VAE, GAN, Diffusion)

📚 Notebook Descriptions

Basics/biagram_model/makemore.ipynb

Status: ✅ Complete
Time to Complete: ~2 hours
Difficulty: Beginner

A comprehensive walkthrough of building a character-level language model using bigram statistics. Starting from raw data loading to generating synthetic names, this notebook covers all the fundamentals needed to understand how language models work at the most basic level.

Topics Covered:

  • Data loading and preprocessing
  • Bigram extraction and counting
  • Probability matrix construction
  • Visualization of statistics
  • Sampling from distributions
  • Name generation

๐Ÿ› ๏ธ Tech Stack

```
Backend:
├── Python 3.12+
├── PyTorch 2.0+
├── NumPy
└── Matplotlib

Development:
├── Jupyter Notebook
├── Git & GitHub
└── VS Code / Cursor IDE
```

📞 Get In Touch

  • GitHub: [Your GitHub Profile]
  • Twitter: [@YourHandle]
  • LinkedIn: [Your LinkedIn]

โญ If This Repo Helped You!

If you found this repository useful for your own learning journey, please consider:

  • โญ Starring this repository
  • ๐Ÿ”„ Sharing it with others
  • ๐Ÿ’ฌ Leaving feedback and suggestions

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.


🚀 Happy Learning! Keep Building, Keep Improving!

Last Updated: December 22, 2025

