
pyro-ppl / pyro

Deep universal probabilistic programming with Python and PyTorch


Top Related Projects

  • PyTorch (82,049 stars): Tensors and Dynamic neural networks in Python with strong GPU acceleration
  • TensorFlow Probability: Probabilistic reasoning and statistical analysis in TensorFlow
  • Ludwig (11,137 stars): Low-code framework for building custom LLMs, neural networks, and other AI models
  • JAX (30,218 stars): Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more
  • Stan (2,589 stars): Stan development repository. The master branch contains the current release. The develop branch contains the latest stable development. See the Developer Process Wiki for details.
  • pyprobml: Python code for the "Probabilistic Machine Learning" book by Kevin Murphy

Quick Overview

Pyro is a probabilistic programming language built on top of PyTorch. It provides a flexible and expressive framework for defining probabilistic models, performing inference, and making predictions. Pyro is designed to be scalable and supports both variational inference and Markov chain Monte Carlo (MCMC) methods.

Pros

  • Seamless integration with PyTorch, allowing for easy use of deep learning models in probabilistic programming
  • Supports a wide range of inference algorithms, including stochastic variational inference (SVI) and MCMC
  • Highly extensible and customizable, enabling users to implement custom inference algorithms and distributions
  • Excellent documentation and tutorials for both beginners and advanced users

Cons

  • Steeper learning curve compared to some other probabilistic programming languages
  • Performance can be slower than specialized libraries for certain types of models
  • Limited support for non-PyTorch-based operations and models
  • Requires a good understanding of both probabilistic modeling and PyTorch

Code Examples

  1. Simple Bayesian regression model:
import pyro
import torch
import pyro.distributions as dist

def model(x, y):
    # Priors over the regression weight and bias
    weight = pyro.sample("weight", dist.Normal(0., 1.))
    bias = pyro.sample("bias", dist.Normal(0., 1.))
    mean = weight * x + bias
    # Condition on the observed responses, one per data point
    with pyro.plate("data", len(x)):
        pyro.sample("y", dist.Normal(mean, 0.1), obs=y)

# Generate synthetic data
x = torch.linspace(0, 1, 100)
y = 3 * x + 1 + torch.randn(100) * 0.1

# Perform inference
nuts_kernel = pyro.infer.NUTS(model)
mcmc = pyro.infer.MCMC(nuts_kernel, num_samples=1000, warmup_steps=200)
mcmc.run(x, y)
samples = mcmc.get_samples()  # posterior draws for "weight" and "bias"
  2. Variational Autoencoder (VAE):
import torch
import pyro
import pyro.distributions as dist
from pyro.infer import SVI, Trace_ELBO
from pyro.optim import Adam

class VAE(pyro.nn.PyroModule):
    def __init__(self, input_size, hidden_size, latent_size):
        super().__init__()
        self.latent_size = latent_size
        self.encoder = pyro.nn.PyroModule[torch.nn.Sequential](
            torch.nn.Linear(input_size, hidden_size),
            torch.nn.ReLU(),
            torch.nn.Linear(hidden_size, latent_size * 2)  # outputs loc and log-scale
        )
        self.decoder = pyro.nn.PyroModule[torch.nn.Sequential](
            torch.nn.Linear(latent_size, hidden_size),
            torch.nn.ReLU(),
            torch.nn.Linear(hidden_size, input_size)
        )

    def model(self, x):
        pyro.module("decoder", self.decoder)
        with pyro.plate("data", x.shape[0]):
            # Standard-normal prior over the latent code
            z = pyro.sample("z", dist.Normal(0., 1.).expand([self.latent_size]).to_event(1))
            x_hat = self.decoder(z)
            pyro.sample("obs", dist.Bernoulli(logits=x_hat).to_event(1), obs=x)

    def guide(self, x):
        pyro.module("encoder", self.encoder)
        with pyro.plate("data", x.shape[0]):
            z_loc, z_log_scale = self.encoder(x).chunk(2, dim=-1)
            # Exponentiate the log-scale so the scale is strictly positive
            pyro.sample("z", dist.Normal(z_loc, torch.exp(z_log_scale)).to_event(1))

vae = VAE(input_size=784, hidden_size=400, latent_size=20)
optimizer = Adam({"lr": 1e-3})
svi = SVI(vae.model, vae.guide, optimizer, loss=Trace_ELBO())
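
The SVI object above only sets up the optimizer and loss; parameters are actually learned by calling svi.step repeatedly. A minimal continuation of the example (the random binary tensor below is an illustrative stand-in for real flattened image data, not part of the original snippet):

# Stand-in data: 256 random binary vectors in place of flattened 28x28 images
data = torch.bernoulli(torch.rand(256, 784))

for step in range(1000):
    loss = svi.step(data)  # one gradient step on the negative ELBO
    if step % 100 == 0:
        print(f"step {step}: loss = {loss:.1f}")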

Getting Started

To get started with Pyro, first install it using pip:

pip install pyro-ppl

Then, import Pyro in your Python code and start defining models. The tutorials on the Pyro website (https://pyro.ai/examples/) walk through building models and running inference in detail.
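
As a quick check that the installation works (a minimal illustrative sketch, not an official quickstart), you can draw a sample from a toy model:

import pyro
import pyro.distributions as dist

def model():
    # A single latent variable drawn from a standard normal prior
    return pyro.sample("x", dist.Normal(0., 1.))

print(model())  # outside an inference context this simply draws a sample, e.g. tensor(0.4967)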

Competitor Comparisons

PyTorch

Tensors and Dynamic neural networks in Python with strong GPU acceleration

Pros of PyTorch

  • Broader scope and functionality for general deep learning tasks
  • Larger community and more extensive documentation
  • More frequent updates and releases

Cons of PyTorch

  • Steeper learning curve for beginners
  • Less specialized for probabilistic programming
  • Potentially more complex setup for specific use cases

Code Comparison

PyTorch (basic neural network):

import torch.nn as nn

model = nn.Sequential(
    nn.Linear(10, 20),
    nn.ReLU(),
    nn.Linear(20, 1)
)

Pyro (probabilistic model):

import pyro
import pyro.distributions as dist
import torch

def model(data):
    loc = pyro.param("loc", torch.tensor(0.0))
    scale = pyro.param("scale", torch.tensor(1.0))
    return pyro.sample("obs", dist.Normal(loc, scale), obs=data)

PyTorch is a more general-purpose deep learning framework, while Pyro is built on top of PyTorch and specializes in probabilistic programming. PyTorch offers greater flexibility for various machine learning tasks, whereas Pyro provides a more focused toolkit for Bayesian inference and probabilistic modeling. The choice between the two depends on the specific requirements of your project and your familiarity with probabilistic programming concepts.

TensorFlow Probability

Probabilistic reasoning and statistical analysis in TensorFlow

Pros of TensorFlow Probability

  • Seamless integration with TensorFlow ecosystem
  • Extensive documentation and tutorials
  • Strong support for distributed computing and GPU acceleration

Cons of TensorFlow Probability

  • Steeper learning curve for those unfamiliar with TensorFlow
  • Less flexibility in defining custom probabilistic models compared to Pyro

Code Comparison

Pyro:

import pyro
import torch

def model(data):
    loc = pyro.param("loc", torch.tensor(0.0))
    scale = pyro.param("scale", torch.tensor(1.0))
    return pyro.sample("obs", pyro.distributions.Normal(loc, scale), obs=data)

TensorFlow Probability:

import tensorflow as tf
import tensorflow_probability as tfp

def model(data):
    loc = tf.Variable(0.0, name="loc")
    scale = tf.Variable(1.0, name="scale")
    return tfp.distributions.Normal(loc, scale).log_prob(data)

Both libraries offer powerful probabilistic programming capabilities, but Pyro's syntax is often considered more intuitive for those familiar with PyTorch. TensorFlow Probability, on the other hand, provides tighter integration with the broader TensorFlow ecosystem and may be preferred by those already working with TensorFlow-based projects.

Ludwig

Low-code framework for building custom LLMs, neural networks, and other AI models

Pros of Ludwig

  • More user-friendly with a high-level API for non-experts
  • Supports a wider range of input data types (text, images, audio, etc.)
  • Offers automated machine learning capabilities for easier model selection and hyperparameter tuning

Cons of Ludwig

  • Less flexible for custom probabilistic modeling compared to Pyro
  • May not be as suitable for advanced statistical inference tasks
  • Limited support for specialized probabilistic programming techniques

Code Comparison

Ludwig example:

from ludwig.api import LudwigModel

model = LudwigModel(config)
results = model.train(dataset=train_data)
predictions = model.predict(dataset=test_data)

Pyro example:

import pyro
import torch
from pyro.infer import SVI, Trace_ELBO
from pyro.infer.autoguide import AutoNormal

def model(data):
    # Define probabilistic model
    theta = pyro.sample("theta", pyro.distributions.Normal(0., 1.))
    return pyro.sample("obs", pyro.distributions.Normal(theta, 1.), obs=data)

guide = AutoNormal(model)                  # automatic mean-field variational guide
optimizer = pyro.optim.Adam({"lr": 0.01})
svi = SVI(model, guide, optimizer, loss=Trace_ELBO())
svi.step(torch.randn(100))                 # one gradient step on the ELBO

Ludwig focuses on simplifying the machine learning workflow with a high-level API, while Pyro provides more flexibility for custom probabilistic modeling and inference. Ludwig is better suited for users who want quick results with minimal coding, whereas Pyro is ideal for researchers and practitioners who need fine-grained control over their probabilistic models.

JAX

Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more

Pros of JAX

  • Faster execution and better performance optimization, especially for large-scale computations
  • More flexible and lower-level API, allowing for fine-grained control over computations
  • Seamless integration with NumPy and support for automatic differentiation

Cons of JAX

  • Steeper learning curve due to its lower-level nature and functional programming paradigm
  • Less extensive probabilistic programming features compared to Pyro's specialized tools
  • Smaller ecosystem and fewer high-level abstractions for specific machine learning tasks

Code Comparison

Pyro example:

import pyro
import torch

def model(data):
    loc = pyro.param("loc", torch.tensor(0.0))
    scale = pyro.param("scale", torch.tensor(1.0))
    return pyro.sample("obs", pyro.distributions.Normal(loc, scale), obs=data)

JAX example:

import jax.numpy as jnp
from jax import random

def model(key, data):
    loc = jnp.array(0.0)
    scale = jnp.array(1.0)
    return random.normal(key, shape=data.shape) * scale + loc

Stan

Stan development repository. The master branch contains the current release. The develop branch contains the latest stable development. See the Developer Process Wiki for details.

Pros of Stan

  • More mature and established probabilistic programming language
  • Highly optimized for statistical inference and MCMC sampling
  • Extensive documentation and community support

Cons of Stan

  • Steeper learning curve, especially for those new to probabilistic programming
  • Less flexible for deep learning integration compared to Pyro
  • Slower development cycle for new features

Code Comparison

Stan:

data {
  int<lower=0> N;
  vector[N] x;
  vector[N] y;
}
parameters {
  real alpha;
  real beta;
  real<lower=0> sigma;
}
model {
  y ~ normal(alpha + beta * x, sigma);
}

Pyro:

import pyro
import pyro.distributions as dist

def model(x, y):
    alpha = pyro.sample('alpha', dist.Normal(0, 10))
    beta = pyro.sample('beta', dist.Normal(0, 10))
    sigma = pyro.sample('sigma', dist.HalfNormal(10))
    with pyro.plate('data', len(x)):
        pyro.sample('y', dist.Normal(alpha + beta * x, sigma), obs=y)

Both examples show a simple linear regression model, but Stan uses its own domain-specific language, while Pyro leverages Python syntax and PyTorch integration for more flexibility in model specification and deep learning capabilities.

pyprobml

Python code for the "Probabilistic Machine Learning" book by Kevin Murphy

Pros of pyprobml

  • Focuses on educational content and examples for probabilistic machine learning
  • Provides a wide range of code implementations for various ML algorithms
  • Includes Jupyter notebooks for interactive learning and experimentation

Cons of pyprobml

  • Less comprehensive probabilistic programming framework compared to Pyro
  • Not as optimized for large-scale, production-ready applications
  • Fewer built-in inference algorithms and abstractions

Code Comparison

pyprobml example (simple Gaussian mixture model):

import numpy as np
from sklearn.mixture import GaussianMixture

X = np.random.randn(100, 2)
gmm = GaussianMixture(n_components=2)
gmm.fit(X)

Pyro example (simple Gaussian mixture model):

import torch
import pyro
import pyro.distributions as dist

def model(data):
    mixture_weights = pyro.sample("weights", dist.Dirichlet(torch.ones(2)))
    with pyro.plate("components", 2):
        locs = pyro.sample("locs", dist.Normal(0., 10.))
        scales = pyro.sample("scales", dist.LogNormal(0., 1.))
    with pyro.plate("data", len(data)):
        assignment = pyro.sample("assignment", dist.Categorical(mixture_weights))
        pyro.sample("obs", dist.Normal(locs[assignment], scales[assignment]), obs=data)


README



Getting Started | Documentation | Community | Contributing

Pyro is a flexible, scalable deep probabilistic programming library built on PyTorch. Notably, it was designed with these principles in mind:

  • Universal: Pyro is a universal PPL - it can represent any computable probability distribution.
  • Scalable: Pyro scales to large data sets with little overhead compared to hand-written code.
  • Minimal: Pyro is agile and maintainable. It is implemented with a small core of powerful, composable abstractions.
  • Flexible: Pyro aims for automation when you want it, control when you need it. This is accomplished through high-level abstractions to express generative and inference models, while allowing experts easy access to customize inference.

Pyro was originally developed at Uber AI and is now actively maintained by community contributors, including a dedicated team at the Broad Institute. In 2019, Pyro became a project of the Linux Foundation, a neutral space for collaboration on open source software, open standards, open data, and open hardware.

For more information about the high level motivation for Pyro, check out our launch blog post. For additional blog posts, check out work on experimental design and time-to-event modeling in Pyro.

Installing

Installing a stable Pyro release

Install using pip:

pip install pyro-ppl

Install from source:

git clone git@github.com:pyro-ppl/pyro.git
cd pyro
git checkout master  # master is pinned to the latest release
pip install .

Install with extra packages:

To install the dependencies required to run the probabilistic models included in the examples/tutorials directories, please use the following command:

pip install pyro-ppl[extras] 

Make sure that the models come from the same release version of the Pyro source code as you have installed.
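
One way to confirm which release you have installed, so you can check out the matching tag of the examples/tutorials (a simple check):

import pyro
print(pyro.__version__)  # e.g. '1.9.1'; use the examples from the tag with the same version number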

Installing Pyro dev branch

For recent features you can install Pyro from source.

Install Pyro using pip:

pip install git+https://github.com/pyro-ppl/pyro.git

or, with the extras dependency to run the probabilistic models included in the examples/tutorials directories:

pip install git+https://github.com/pyro-ppl/pyro.git#egg=pyro-ppl[extras]

Install Pyro from source:

git clone https://github.com/pyro-ppl/pyro
cd pyro
pip install .  # pip install .[extras] for running models in examples/tutorials

Running Pyro from a Docker Container

Refer to the instructions here.

Citation

If you use Pyro, please consider citing:

@article{bingham2019pyro,
  author    = {Eli Bingham and
               Jonathan P. Chen and
               Martin Jankowiak and
               Fritz Obermeyer and
               Neeraj Pradhan and
               Theofanis Karaletsos and
               Rohit Singh and
               Paul A. Szerlip and
               Paul Horsfall and
               Noah D. Goodman},
  title     = {Pyro: Deep Universal Probabilistic Programming},
  journal   = {J. Mach. Learn. Res.},
  volume    = {20},
  pages     = {28:1--28:6},
  year      = {2019},
  url       = {http://jmlr.org/papers/v20/18-403.html}
}