
dair-ai / ML-Course-Notes

🎓 Sharing machine learning course / lecture notes.

Top Related Projects

12 weeks, 26 lessons, 52 quizzes, classic Machine Learning for all

A series of Jupyter notebooks that walk you through the fundamentals of Machine Learning and Deep Learning in Python using Scikit-Learn, Keras and TensorFlow 2.

VIP cheatsheets for Stanford's CS 229 Machine Learning

The "Python Machine Learning (3rd edition)" book code repository

Machine Learning From Scratch. Bare bones NumPy implementations of machine learning models and algorithms with a focus on accessibility. Aims to cover everything from linear regression to deep learning.

A complete daily plan for studying to become a machine learning engineer.

Quick Overview

The dair-ai/ML-Course-Notes repository is a collection of comprehensive notes and resources for various machine learning courses. It covers a wide range of topics, from introductory machine learning concepts to advanced deep learning techniques, making it a valuable reference for students and practitioners in AI and machine learning.

Pros

  • Extensive coverage of multiple ML courses from reputable institutions
  • Well-organized and structured content, making it easy to navigate
  • Regular updates with new course materials and resources
  • Free and open-source, accessible to anyone interested in learning ML

Cons

  • May lack interactive elements or hands-on exercises
  • Could be overwhelming for complete beginners due to the breadth of content
  • Might require additional context or background knowledge for some topics
  • Dependent on external course materials, which may change or become unavailable

Note: Because this repository is a collection of notes rather than a code library, the usual code example and quick start sections are omitted.

Competitor Comparisons

12 weeks, 26 lessons, 52 quizzes, classic Machine Learning for all

Pros of ML-For-Beginners

  • More comprehensive curriculum covering a wide range of ML topics
  • Includes hands-on projects and quizzes for practical learning
  • Well-structured with clear learning paths and lesson plans

Cons of ML-For-Beginners

  • May be overwhelming for absolute beginners due to its breadth
  • Less focus on mathematical foundations compared to ML-Course-Notes
  • Requires more time commitment to complete the entire course

Code Comparison

ML-For-Beginners:

from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Split the data, then fit a random forest on the training portion
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
model = RandomForestClassifier(n_estimators=100)
model.fit(X_train, y_train)

ML-Course-Notes:

import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def forward_propagation(X, W1, b1, W2, b2):
    Z1 = np.dot(W1, X) + b1   # hidden-layer pre-activation
    A1 = sigmoid(Z1)          # hidden-layer activation
    Z2 = np.dot(W2, A1) + b2  # output-layer pre-activation
    A2 = sigmoid(Z2)          # output activation
    return A2

The code snippets demonstrate that ML-For-Beginners focuses more on practical implementation using popular libraries, while ML-Course-Notes emphasizes understanding the underlying algorithms and mathematics.
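As a concrete illustration (not code taken from either repository), the from-scratch forward pass above can be exercised end to end, assuming it returns the output activation as sketched and using small, made-up dimensions:

import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(3, 5))                         # 3 features, 5 examples as columns
W1, b1 = rng.normal(size=(4, 3)), np.zeros((4, 1))  # hidden layer with 4 units
W2, b2 = rng.normal(size=(1, 4)), np.zeros((1, 1))  # single sigmoid output unit
A2 = forward_propagation(X, W1, b1, W2, b2)
print(A2.shape)                                     # (1, 5): one prediction per example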

A series of Jupyter notebooks that walk you through the fundamentals of Machine Learning and Deep Learning in Python using Scikit-Learn, Keras and TensorFlow 2.

Pros of handson-ml2

  • Comprehensive, hands-on approach with practical code examples
  • Covers a wide range of ML topics, including deep learning and neural networks
  • Regularly updated to reflect the latest developments in ML libraries and techniques

Cons of handson-ml2

  • May be overwhelming for absolute beginners due to its depth and breadth
  • Focuses primarily on scikit-learn and TensorFlow, potentially limiting exposure to other ML frameworks
  • Requires more time investment to work through all the examples and exercises

Code Comparison

ML-Course-Notes (PyTorch example):

import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(input_size, hidden_size),
    nn.ReLU(),
    nn.Linear(hidden_size, output_size)
)

handson-ml2 (TensorFlow example):

import tensorflow as tf

model = tf.keras.models.Sequential([
    tf.keras.layers.Dense(hidden_size, activation="relu", input_shape=[input_size]),
    tf.keras.layers.Dense(output_size)
])

Both repositories provide valuable resources for learning machine learning, with ML-Course-Notes offering a broader overview of various ML concepts and handson-ml2 providing a more in-depth, practical approach using specific libraries.
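To make the comparison concrete, here is a small sketch (illustrative sizes, not code from either repository, and assuming both frameworks are installed) that pushes the same random batch through both model definitions and confirms they produce the same output shape:

import numpy as np
import torch
import torch.nn as nn
import tensorflow as tf

input_size, hidden_size, output_size = 20, 64, 3          # illustrative sizes
batch = np.random.rand(8, input_size).astype("float32")   # one batch of 8 examples

torch_model = nn.Sequential(
    nn.Linear(input_size, hidden_size),
    nn.ReLU(),
    nn.Linear(hidden_size, output_size),
)
keras_model = tf.keras.models.Sequential([
    tf.keras.layers.Dense(hidden_size, activation="relu", input_shape=[input_size]),
    tf.keras.layers.Dense(output_size),
])

print(torch_model(torch.from_numpy(batch)).shape)          # torch.Size([8, 3])
print(keras_model(batch).shape)                            # (8, 3)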

VIP cheatsheets for Stanford's CS 229 Machine Learning

Pros of stanford-cs-229-machine-learning

  • Comprehensive coverage of Stanford's CS229 course content
  • Well-organized cheatsheets for quick reference
  • Available in multiple languages

Cons of stanford-cs-229-machine-learning

  • Focused on a single course, potentially limiting broader ML topics
  • Less frequent updates compared to ML-Course-Notes

Code Comparison

ML-Course-Notes:

import numpy as np
from sklearn.linear_model import LinearRegression

X = np.array([[1, 1], [1, 2], [2, 2], [2, 3]])
y = np.dot(X, np.array([1, 2])) + 3
reg = LinearRegression().fit(X, y)

stanford-cs-229-machine-learning:

import numpy as np

def gradient_descent(X, y, theta, alpha, num_iters):
    m = len(y)
    for i in range(num_iters):
        h = np.dot(X, theta)
        theta = theta - (alpha / m) * np.dot(X.T, (h - y))
    return theta

Both repositories provide valuable resources for machine learning students. ML-Course-Notes offers a broader range of topics and more frequent updates, while stanford-cs-229-machine-learning provides in-depth coverage of a specific course with well-organized cheatsheets. The code examples demonstrate different approaches, with ML-Course-Notes using scikit-learn for implementation and stanford-cs-229-machine-learning focusing on fundamental algorithms.
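For illustration only, the from-scratch gradient_descent above can be run on the same toy data as the scikit-learn snippet; the code below reuses X, y, and reg from the first snippet, and the intercept column, learning rate, and iteration count are assumptions chosen so the two results are comparable:

X_b = np.c_[np.ones(len(X)), X]                  # prepend an intercept column
theta = gradient_descent(X_b, y, np.zeros(3), alpha=0.1, num_iters=10000)
print(theta)                                     # approaches [3., 1., 2.]
print(reg.intercept_, reg.coef_)                 # ~3.0 [1. 2.]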

The "Python Machine Learning (3rd edition)" book code repository

Pros of python-machine-learning-book-3rd-edition

  • Comprehensive coverage of machine learning concepts with practical examples
  • Includes Jupyter notebooks for hands-on learning and experimentation
  • Based on a published book, offering a structured learning path

Cons of python-machine-learning-book-3rd-edition

  • May be less frequently updated compared to community-driven course notes
  • Focuses primarily on Python, which might limit exposure to other languages/frameworks
  • Could be more challenging for absolute beginners in programming

Code Comparison

ML-Course-Notes (PyTorch example):

import torch
model = torch.nn.Linear(10, 1)
criterion = torch.nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

python-machine-learning-book-3rd-edition (scikit-learn example):

from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=1)
lr = LogisticRegression(random_state=1)
lr.fit(X_train, y_train)

Both repositories provide valuable resources for learning machine learning. ML-Course-Notes offers a broader overview of various ML topics and frameworks, while python-machine-learning-book-3rd-edition provides a more structured, in-depth approach focused on Python implementations. The choice between them depends on the learner's preferences and prior experience.
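As a rough sketch of how the PyTorch snippet above could be trained (reusing its model, criterion, and optimizer; the synthetic data and epoch count are made-up assumptions, not code from either repository):

import torch

X = torch.randn(100, 10)                         # 100 synthetic examples, 10 features
y = X @ torch.randn(10, 1) + 0.1 * torch.randn(100, 1)

for epoch in range(500):
    optimizer.zero_grad()                        # reset gradients from the previous step
    loss = criterion(model(X), y)                # mean-squared error on the batch
    loss.backward()                              # backpropagate
    optimizer.step()                             # SGD update with lr=0.01

print(loss.item())                               # should be far below its initial value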

Machine Learning From Scratch. Bare bones NumPy implementations of machine learning models and algorithms with a focus on accessibility. Aims to cover everything from linear regression to deep learning.

Pros of ML-From-Scratch

  • Provides hands-on implementation of ML algorithms from scratch, offering a deeper understanding of their inner workings
  • Includes a wide range of algorithms, from basic to advanced, covering various aspects of machine learning
  • Code is well-organized and documented, making it easy to follow and learn from

Cons of ML-From-Scratch

  • Lacks comprehensive theoretical explanations and mathematical foundations
  • May not cover the latest advancements in machine learning techniques and architectures
  • Focuses primarily on implementation, which may not be suitable for those seeking a broader overview of ML concepts

Code Comparison

ML-From-Scratch:

class LinearRegression:
    def __init__(self):
        self.w = None   # feature weights
        self.b = None   # intercept

    def fit(self, X, y):
        # Prepend a column of ones so the intercept is learned with the weights
        X = np.insert(X, 0, 1, axis=1)
        # Closed-form least-squares solution via the normal equation
        self.w = np.linalg.inv(X.T.dot(X)).dot(X.T).dot(y)
        self.b = self.w[0]      # first coefficient is the intercept
        self.w = self.w[1:]     # remaining coefficients are the feature weights

ML-Course-Notes:

# No direct code implementation available in the repository
# The repository focuses on providing lecture notes and explanations

ML-Course-Notes primarily offers theoretical explanations and course notes, while ML-From-Scratch provides practical implementations of algorithms. The choice between the two depends on whether the user is looking for theoretical understanding or hands-on coding experience in machine learning.
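For readers who want to try the from-scratch class above, here is a minimal usage sketch (made-up data; predictions are computed manually because the snippet only defines fit):

import numpy as np

X = np.array([[1.0], [2.0], [3.0], [4.0]])
y = np.array([3.0, 5.0, 7.0, 9.0])               # generated by y = 2x + 1

model = LinearRegression()                       # the from-scratch class above
model.fit(X, y)
print(model.b, model.w)                          # ~1.0 [~2.0]
print(X.dot(model.w) + model.b)                  # ~[3. 5. 7. 9.]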

A complete daily plan for studying to become a machine learning engineer.

Pros of machine-learning-for-software-engineers

  • Provides a comprehensive roadmap for software engineers to learn machine learning
  • Includes a wide range of resources, from beginner to advanced levels
  • Offers practical advice and tips for applying ML in software engineering contexts

Cons of machine-learning-for-software-engineers

  • Less structured than ML-Course-Notes, which follows specific course curricula
  • May lack the depth of academic-focused resources found in ML-Course-Notes
  • Could be overwhelming for beginners due to the vast amount of information

Code Comparison

ML-Course-Notes typically includes more academic-style code snippets:

import numpy as np

def gradient_descent(X, y, theta, alpha, num_iters):
    m = len(y)
    for _ in range(num_iters):
        h = np.dot(X, theta)                                 # current predictions
        theta = theta - (alpha / m) * np.dot(X.T, (h - y))   # batch gradient step
    return theta

machine-learning-for-software-engineers focuses on practical implementations:

from sklearn.linear_model import LinearRegression

model = LinearRegression()
model.fit(X_train, y_train)
predictions = model.predict(X_test)

Both repositories serve different purposes: ML-Course-Notes provides in-depth academic knowledge, while machine-learning-for-software-engineers offers a practical roadmap for software engineers to integrate ML into their skillset.


README

🎓 Machine Learning Course Notes

A place to collaborate and share lecture notes on all topics related to machine learning, NLP, and AI.

WIP denotes work in progress.


Machine Learning Specialization (2022)

Website | Instructor: Andrew Ng

Lecture | Description | Video | Notes | Author
Introduction to Machine Learning | Supervised Machine Learning: Regression and Classification | Videos | Notes | Elvis
Advanced Learning Algorithms | Advanced Learning Algorithms | Videos | WIP | Elvis
Unsupervised Learning, Recommenders, Reinforcement Learning | Unsupervised Learning, Recommenders, Reinforcement Learning | Videos | WIP | Elvis

MIT 6.S191 Introduction to Deep Learning (2022)

Website | Lectures by: Alexander Amini and Ava Soleimany

Lecture | Description | Video | Notes | Author
Introduction to Deep Learning | Basic fundamentals of neural networks and deep learning. | Video | Notes | Elvis
RNNs and Transformers | Introduction to recurrent neural networks and transformers. | Video | Notes | Elvis
Deep Computer Vision | Deep Neural Networks for Computer Vision. | Video | Notes | Elvis
Deep Generative Modeling | Autoencoders and GANs. | Video | Notes | Elvis
Deep Reinforcement Learning | Deep RL key concepts and DQNs. | Video | Notes | Elvis

CMU Neural Nets for NLP (2021)

Website | Instructor: Graham Neubig

Lecture | Description | Video | Notes | Author
Introduction to Simple Neural Networks for NLP | Provides an introduction to neural networks for NLP covering concepts like BOW, CBOW, and Deep CBOW | Video | Notes | Elvis

CS224N: Natural Language Processing with Deep Learning (2022)

Website | Instructor: Christopher Manning

Lecture | Description | Video | Notes | Author
Introduction and Word Vectors | Introduction to NLP and Word Vectors. | Video | Notes | Elvis
Neural Classifiers | Neural Classifiers for NLP. | Video | WIP | Elvis

CS25: Transformers United

Website | Instructors: Div Garg, Chetanya Rastogi, Advay Pal

Lecture | Description | Video | Notes | Author
Introduction to Transformers | A short summary of attention and Transformers. | Video | Notes | Elvis
Transformers in Language: GPT-3, Codex | The development of GPT Models including GPT3. | Video | WIP | Elvis

Neural Networks: Zero to Hero

Lectures | Instructor: Andrej Karpathy

Lecture | Description | Video | Notes | Author
Let's build GPT: from scratch, in code, spelled out | Detailed walkthrough of GPT | Video | WIP | Elvis

Miscellaneous Lectures

Lecture | Description | Video | Notes | Author
Introduction to Diffusion Models | Technical overview of Diffusion Models | Video | WIP | Elvis
Reinforcement Learning from Human Feedback (RLHF) | Overview of RLHF | Video | WIP | Elvis

How To Contribute

  1. Identify a course and lecture from this list. If you are working on notes for a lecture, please say so by opening an issue; this avoids duplicate work.
  2. Write your notes, preferably in a Google document, Notion document, or GitHub repo.
  3. We care about quality, so make sure to revise your notes before submitting.
  4. Once you are finished, open a PR here.

If you have any questions, open an issue or reach out to me on Twitter.

Join our Discord.