pytorch/tutorials

PyTorch tutorials.

Top Related Projects

  • TensorFlow documentation
  • Keras documentation, hosted live at keras.io
  • 12 weeks, 26 lessons, 52 quizzes, classic Machine Learning for all
  • 🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
  • scikit-learn: machine learning in Python
  • The fastai deep learning library

Quick Overview

PyTorch Tutorials is an official repository containing a comprehensive set of tutorials and examples for PyTorch, a popular open-source machine learning library. These tutorials cover a wide range of topics, from basic tensor operations to advanced deep learning techniques, providing both beginners and experienced practitioners with valuable resources to learn and master PyTorch.

Pros

  • Extensive coverage of PyTorch features and applications
  • Well-structured and regularly updated content
  • Officially maintained, ensuring accuracy and reliability
  • Includes both text explanations and interactive Jupyter notebooks

Cons

  • Some tutorials may be overwhelming for absolute beginners
  • Occasional inconsistencies in coding style across different tutorials
  • Limited coverage of certain specialized topics or cutting-edge techniques

Code Examples

  1. Basic tensor operations:
import torch

# Create a tensor
x = torch.tensor([1, 2, 3])

# Perform operations
y = x + 2               # element-wise addition -> tensor([3, 4, 5])
z = torch.matmul(x, y)  # dot product of two 1-D tensors -> tensor(26)

print(y)
print(z)
  2. Simple neural network:
import torch
import torch.nn as nn

class SimpleNet(nn.Module):
    def __init__(self):
        super(SimpleNet, self).__init__()
        self.fc1 = nn.Linear(10, 5)
        self.fc2 = nn.Linear(5, 2)

    def forward(self, x):
        x = torch.relu(self.fc1(x))
        x = self.fc2(x)
        return x

model = SimpleNet()
print(model)
  3. Training loop example:
import torch.optim as optim

# Reuses `nn` and `model` (SimpleNet) from the previous example; `dataloader`
# is assumed to yield (inputs, labels) batches (see the sketch below).
criterion = nn.CrossEntropyLoss()
optimizer = optim.SGD(model.parameters(), lr=0.01)

for epoch in range(10):
    for inputs, labels in dataloader:
        optimizer.zero_grad()
        outputs = model(inputs)
        loss = criterion(outputs, labels)
        loss.backward()
        optimizer.step()
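
The loop above assumes a dataloader is already defined. A minimal sketch of how one could be built with torch.utils.data, using small random tensors purely for illustration (the sizes match the SimpleNet defined earlier):

import torch
from torch.utils.data import TensorDataset, DataLoader

# Hypothetical toy data: 100 samples, 10 features, 2 classes (matching SimpleNet)
inputs = torch.randn(100, 10)
labels = torch.randint(0, 2, (100,))

dataset = TensorDataset(inputs, labels)
dataloader = DataLoader(dataset, batch_size=16, shuffle=True)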

Getting Started

To get started with PyTorch Tutorials:

  1. Clone the repository:

    git clone https://github.com/pytorch/tutorials.git
    
  2. Install dependencies:

    pip install -r requirements.txt
    
  3. Navigate to a specific tutorial directory and run the Jupyter notebook:

    jupyter notebook
    
  4. Alternatively, you can view the tutorials online at the official PyTorch website: https://pytorch.org/tutorials/

Competitor Comparisons

TensorFlow documentation

Pros of docs

  • More comprehensive documentation covering a wider range of TensorFlow features
  • Better organization with clear categorization of topics
  • Includes API reference alongside tutorials and guides

Cons of docs

  • Less focus on practical, hands-on examples compared to tutorials
  • May be overwhelming for beginners due to the sheer volume of information
  • Slower to update with the latest features and changes

Code Comparison

tutorials:

import torch

x = torch.rand(5, 3)
print(x)

docs:

import tensorflow as tf

x = tf.random.uniform((5, 3))
print(x)

Both repositories provide code examples, but tutorials tends to offer more complete, runnable examples, while docs often provides shorter snippets to illustrate specific concepts or API usage.

Summary

While docs offers a more comprehensive and well-organized resource for TensorFlow, tutorials provides a more hands-on, practical approach to learning PyTorch. The choice between the two depends on the user's preferred learning style and the specific framework they're working with.

Keras documentation, hosted live at keras.io

Pros of keras-io

  • More comprehensive documentation with detailed guides and examples
  • Better organized structure with clear categorization of topics
  • Includes interactive Colab notebooks for hands-on learning

Cons of keras-io

  • Less frequent updates compared to pytorch/tutorials
  • Focuses primarily on Keras, limiting exposure to other deep learning frameworks
  • May not cover some advanced topics found in pytorch/tutorials

Code Comparison

keras-io example:

from tensorflow import keras

model = keras.Sequential([
    keras.layers.Dense(64, activation='relu'),
    keras.layers.Dense(10, activation='softmax')
])
model.compile(optimizer='adam', loss='categorical_crossentropy')

PyTorch Tutorials example:

import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.fc1 = nn.Linear(784, 64)
        self.fc2 = nn.Linear(64, 10)

    def forward(self, x):
        x = F.relu(self.fc1(x))
        return F.softmax(self.fc2(x), dim=1)

Both repositories offer valuable resources for learning their respective frameworks. Keras-io provides a more structured and comprehensive learning experience, while PyTorch Tutorials offers a wider range of advanced topics and more frequent updates. The code examples demonstrate the different approaches to model creation in Keras and PyTorch.
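
To make the comparison concrete, here is a rough sketch of the PyTorch counterpart to Keras's compile step, with the loss and optimizer declared explicitly (the learning rate is an arbitrary illustrative value; note that nn.CrossEntropyLoss expects raw logits, so the final softmax in the Net above would normally be dropped when training this way):

import torch
import torch.nn as nn

net = Net()  # the Net defined above, ideally returning raw logits
criterion = nn.CrossEntropyLoss()                        # plays the role of Keras's loss='categorical_crossentropy'
optimizer = torch.optim.Adam(net.parameters(), lr=1e-3)  # plays the role of Keras's optimizer='adam'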

12 weeks, 26 lessons, 52 quizzes, classic Machine Learning for all

Pros of ML-For-Beginners

  • Comprehensive curriculum covering various ML topics
  • Beginner-friendly with clear explanations and hands-on projects
  • Language-agnostic approach, not tied to a specific framework

Cons of ML-For-Beginners

  • Less focus on deep learning compared to PyTorch Tutorials
  • May not cover advanced topics in as much depth
  • Fewer code examples specific to production-ready implementations

Code Comparison

ML-For-Beginners (using scikit-learn):

from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression

# X (features) and y (labels) are assumed to be loaded already
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)
model = LogisticRegression()
model.fit(X_train, y_train)

PyTorch Tutorials:

import torch
import torch.nn as nn

# input_size and output_size are placeholders for the data's dimensions
model = nn.Linear(input_size, output_size)
criterion = nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

ML-For-Beginners provides a broader introduction to machine learning concepts using various tools, while PyTorch Tutorials focuses specifically on deep learning with PyTorch. The former is more suitable for beginners looking to understand ML fundamentals, while the latter is better for those wanting to dive deep into PyTorch and neural networks.
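
To highlight the difference in workflow, the single model.fit(...) call above roughly corresponds to an explicit loop in PyTorch. A minimal sketch, assuming X_train and y_train are already float tensors of compatible shapes:

for epoch in range(100):
    optimizer.zero_grad()
    predictions = model(X_train)            # forward pass through nn.Linear
    loss = criterion(predictions, y_train)  # mean-squared error
    loss.backward()                         # backpropagation
    optimizer.step()                        # gradient-descent update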

🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.

Pros of transformers

  • Extensive library of pre-trained models for various NLP tasks
  • High-level APIs for easy model fine-tuning and deployment
  • Active community and frequent updates

Cons of transformers

  • Steeper learning curve for beginners
  • More focused on NLP tasks, less general-purpose than tutorials

Code Comparison

transformers:

from transformers import BertTokenizer, BertForSequenceClassification
tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
model = BertForSequenceClassification.from_pretrained('bert-base-uncased')
inputs = tokenizer("Hello, my dog is cute", return_tensors="pt")
outputs = model(**inputs)

tutorials:

import torch
import torch.nn as nn

class SimpleModel(nn.Module):
    def __init__(self):
        super(SimpleModel, self).__init__()
        self.linear = nn.Linear(10, 1)

    def forward(self, x):
        return self.linear(x)

The transformers example showcases the ease of using pre-trained models, while the tutorials example demonstrates basic PyTorch model creation. transformers provides a higher-level abstraction, whereas tutorials offers more flexibility for custom architectures.
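
As a small follow-up to the transformers snippet, the classifier output can be reduced to a predicted class index as sketched below (the model is not fine-tuned here, so the prediction itself carries no meaning):

import torch

logits = outputs.logits                        # raw scores for each class
predicted_class = torch.argmax(logits, dim=-1)
print(predicted_class)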

scikit-learn: machine learning in Python

Pros of scikit-learn

  • Comprehensive library for traditional machine learning algorithms
  • Easier to use for beginners and non-deep learning tasks
  • Extensive documentation and community support

Cons of scikit-learn

  • Limited support for deep learning and neural networks
  • Less flexibility for custom model architectures
  • Not optimized for GPU acceleration

Code Comparison

scikit-learn:

from sklearn.ensemble import RandomForestClassifier
from sklearn.datasets import make_classification

X, y = make_classification(n_samples=1000, n_features=4)
clf = RandomForestClassifier()
clf.fit(X, y)

PyTorch:

import torch
import torch.nn as nn

class SimpleNet(nn.Module):
    def __init__(self):
        super(SimpleNet, self).__init__()
        self.fc = nn.Linear(4, 2)

    def forward(self, x):
        return self.fc(x)

model = SimpleNet()

The scikit-learn example shows a simple random forest classifier, while the PyTorch example demonstrates a basic neural network structure. scikit-learn's code is more concise for traditional ML tasks, whereas PyTorch offers more flexibility for deep learning architectures.
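
Continuing from the two snippets above, a quick sketch of how inference looks in each case (the slice of five samples is arbitrary):

import torch

# scikit-learn: predictions come straight from the fitted estimator
print(clf.predict(X[:5]))

# PyTorch: convert the NumPy features to a tensor and call the module
with torch.no_grad():
    logits = model(torch.tensor(X[:5], dtype=torch.float32))
print(logits.argmax(dim=1))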

The fastai deep learning library

Pros of fastai

  • Provides a higher-level API, making it easier for beginners to get started with deep learning
  • Includes built-in best practices and optimized defaults for common tasks
  • Offers a comprehensive library of pre-built models and techniques

Cons of fastai

  • Less flexibility for customizing low-level components compared to PyTorch
  • Steeper learning curve for understanding fastai-specific concepts and abstractions
  • Smaller community and ecosystem compared to PyTorch

Code Comparison

fastai:

from fastai.vision.all import *
path = untar_data(URLs.PETS)
dls = ImageDataLoaders.from_folder(path, valid_pct=0.2, seed=42)
learn = cnn_learner(dls, resnet34, metrics=error_rate)
learn.fine_tune(5)

PyTorch tutorials:

import torch
import torch.nn as nn
import torchvision

model = torchvision.models.resnet34(pretrained=True)
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.001, momentum=0.9)
for epoch in range(5):
    for images, labels in train_loader:  # train_loader is assumed to be defined
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()

The fastai code is more concise and abstracts away many details, while the PyTorch tutorials code provides more explicit control over the training process.

README

PyTorch Tutorials

All the tutorials are now presented as Sphinx-style documentation at:

https://pytorch.org/tutorials

Asking a question

If you have a question about a tutorial, post in https://dev-discuss.pytorch.org/ rather than creating an issue in this repo. Your question will be answered much faster on the dev-discuss forum.

Submitting an issue

You can submit the following types of issues:

  • Feature request - request a new tutorial to be added. Please explain why this tutorial is needed and how it demonstrates PyTorch value.
  • Bug report - report a failure or outdated information in an existing tutorial. When submitting a bug report, please run: python3 -m torch.utils.collect_env to get information about your environment and add the output to the bug report.

Contributing

We use sphinx-gallery's notebook-styled examples to create the tutorials. The syntax is very simple: in essence, you write a specially formatted Python file and it shows up as an HTML page. In addition, a Jupyter notebook is autogenerated and available to run in Google Colab.
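
For illustration, here is a minimal sketch of what such a file can look like (the file name, title, and content are hypothetical); sphinx-gallery renders the module docstring and comment blocks introduced by a line of # characters as reStructuredText, while the remaining lines run as code:

# your_tutorial.py
"""
Your Tutorial Title
===================

A short description of what the tutorial covers.
"""

###############################################################################
# Comment blocks like this one become formatted text in the generated HTML
# page and notebook; ordinary code lines become executable cells.

import torch

x = torch.rand(3, 3)
print(x)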

Here is how you can create a new tutorial (for a detailed description, see CONTRIBUTING.md):

NOTE: Before submitting a new tutorial, read PyTorch Tutorial Submission Policy.

  1. Create a Python file. If you want it executed while inserted into documentation, save the file with the suffix tutorial so that the file name is your_tutorial.py.
  2. Put it in one of the beginner_source, intermediate_source, or advanced_source directories based on the level of difficulty. If it is a recipe, add it to recipes_source. For tutorials demonstrating unstable prototype features, add it to prototype_source.
  3. For Tutorials (except if it is a prototype feature), include it in the toctree directive and create a customcarditem in index.rst.
  4. For Tutorials (except if it is a prototype feature), create a thumbnail in the index.rst file using a command like .. customcarditem:: beginner/your_tutorial.html. For Recipes, create a thumbnail in recipes_index.rst.

If you are starting off with a Jupyter notebook, you can use this script to convert the notebook to a Python file. After conversion and addition to the project, please make sure that section headings and other things are in logical order.

Building locally

The tutorial build is very large and requires a GPU. If your machine does not have a GPU device, you can preview your HTML build without actually downloading the data and running the tutorial code:

  1. Install required dependencies by running: pip install -r requirements.txt.

Typically, you would run either in a conda environment or a virtualenv. If you want to use virtualenv, run virtualenv venv in the root of the repo, then source venv/bin/activate.

  • If you have a GPU-powered laptop, you can build using make docs. This will download the data, execute the tutorials, and build the documentation to the docs/ directory. This might take about 60-120 minutes for systems with GPUs. If you do not have a GPU installed on your system, then see the next step.
  • You can skip the computationally intensive graph generation by running make html-noplot to build basic HTML documentation in _build/html. This way, you can quickly preview your tutorial.
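
Putting the preview workflow together (using virtualenv, as described above):

virtualenv venv
source venv/bin/activate
pip install -r requirements.txt
make html-noplot   # quick preview without executing the tutorial code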

Building a single tutorial

You can build a single tutorial by using the GALLERY_PATTERN environment variable. For example, to run only neural_style_transfer_tutorial.py, run:

GALLERY_PATTERN="neural_style_transfer_tutorial.py" make html

or

GALLERY_PATTERN="neural_style_transfer_tutorial.py" sphinx-build . _build

The GALLERY_PATTERN variable respects regular expressions.
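
Because the variable is treated as a regular expression, one pattern can match several tutorials at once; for example (the pattern below is purely illustrative):

GALLERY_PATTERN="neural_style.*" make html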

About contributing to PyTorch Documentation and Tutorials

  • You can find information about contributing to PyTorch documentation in the PyTorch Repo README.md file.
  • Additional information can be found in PyTorch CONTRIBUTING.md.

License

PyTorch Tutorials is BSD licensed, as found in the LICENSE file.