mmp/pbrt-v4

Source code to pbrt, the ray tracer described in the forthcoming 4th edition of the "Physically Based Rendering: From Theory to Implementation" book.

Top Related Projects

  • OpenShadingLanguage: Advanced shading language for production GI renderers
  • OpenUSD: Universal Scene Description
  • Embree: Embree ray tracing kernels repository
  • LuxCore: LuxCore source repository

Quick Overview

PBRT-v4 is the fourth version of pbrt, an open-source physically based rendering system for creating photorealistic images. It serves as both a state-of-the-art renderer and a comprehensive resource for learning about modern rendering techniques, accompanying the book "Physically Based Rendering: From Theory to Implementation."

Pros

  • Highly educational, providing a complete implementation of modern rendering algorithms
  • Produces high-quality, physically-based rendered images
  • Extensively documented, with a companion book explaining the theory and implementation
  • Supports various advanced rendering techniques like path tracing, photon mapping, and volumetric rendering

Cons

  • Steep learning curve for beginners due to the complexity of rendering algorithms
  • Performance may not be as optimized as some commercial renderers
  • Limited GUI tools, primarily designed as a command-line application
  • May require significant computational resources for complex scenes

Code Examples

  1. Creating a simple scene:
#include <pbrt/pbrt.h>

int main(int argc, char *argv[]) {
    pbrt::Init(argc, argv);
    pbrt::ParseFile("simple_scene.pbrt");
    pbrt::Render();
    pbrt::CleanUp();
    return 0;
}
  2. Defining a material:
pbrt::Material *CreateMatte(const pbrt::TextureParams &mp) {
    std::shared_ptr<pbrt::Texture<pbrt::Spectrum>> Kd =
        mp.GetSpectrumTexture("Kd", pbrt::Spectrum(0.5f));
    std::shared_ptr<pbrt::Texture<pbrt::Float>> sigma =
        mp.GetFloatTexture("sigma", 0.f);
    return new pbrt::MatteMaterial(Kd, sigma);
}
  3. Setting up a camera:
pbrt::Camera *CreatePerspectiveCamera(const pbrt::AnimatedTransform &cam2world,
                                      const pbrt::CameraParams &params) {
    pbrt::Float fov = params.FindOneFloat("fov", 90.);
    pbrt::Float lensradius = params.FindOneFloat("lensradius", 0.);
    pbrt::Float focaldistance = params.FindOneFloat("focaldistance", 1e30f);
    return new pbrt::PerspectiveCamera(cam2world, fov, lensradius, focaldistance);
}

Getting Started

  1. Clone the repository:

    git clone --recursive https://github.com/mmp/pbrt-v4.git
    
  2. Build PBRT-v4:

    cd pbrt-v4
    mkdir build && cd build
    cmake ..
    make -j8
    
  3. Render a sample scene:

    ./pbrt ../scenes/simple.pbrt
    

This will generate a rendered image of the simple scene. For more complex usage and scene creation, refer to the PBRT book and documentation.

Competitor Comparisons

OpenShadingLanguage: Advanced shading language for production GI renderers

Pros of OpenShadingLanguage

  • Specialized shading language designed for production rendering
  • Extensive support for advanced shading techniques and material definitions
  • Widely adopted in the film and VFX industry

Cons of OpenShadingLanguage

  • Steeper learning curve for beginners compared to PBRT-v4
  • More complex setup and integration process
  • Focused primarily on shading, while PBRT-v4 offers a complete rendering system

Code Comparison

OpenShadingLanguage:

surface wood (
    float Kd = 0.5,
    color woodcolor = color(0.7, 0.5, 0.3),
    float ringscale = 1.0,
    float contrast = 1.0
) {
    // Shading logic here
}

PBRT-v4:

Spectrum WoodMaterial::Evaluate(const MaterialEvalContext &ctx) const {
    // Material evaluation logic here
}

OpenShadingLanguage provides a more specialized syntax for defining shaders, while PBRT-v4 uses C++ for material definitions within its rendering framework. OSL offers greater flexibility for complex shading, but PBRT-v4 integrates material definitions more tightly with its overall rendering system.

OpenUSD: Universal Scene Description

Pros of OpenUSD

  • Comprehensive framework for 3D scene description and interchange
  • Extensive industry adoption and support
  • Rich set of tools and APIs for asset management and collaboration

Cons of OpenUSD

  • Steeper learning curve due to complex architecture
  • Larger codebase and dependencies
  • Primarily focused on asset pipeline, not rendering

Code Comparison

OpenUSD (C++):

#include "pxr/usd/usd/stage.h"
#include "pxr/usd/usdGeom/sphere.h"

auto stage = pxr::UsdStage::CreateInMemory();
auto spherePrim = pxr::UsdGeomSphere::Define(stage, pxr::SdfPath("/mySphere"));
spherePrim.CreateRadiusAttr().Set(2.0);

pbrt-v4 (C++):

#include "pbrt.h"
#include "shapes.h"

Sphere *sphere = new Sphere(Transform(), Transform(), false, 2.0);
std::shared_ptr<Shape> spherePtr(sphere);
primitive = new GeometricPrimitive(spherePtr, material, areaLight);

While OpenUSD focuses on scene description and asset management, pbrt-v4 is primarily a physically-based renderer. OpenUSD provides a more extensive framework for managing complex scenes and assets across pipelines, while pbrt-v4 offers a focused, high-quality rendering solution. The choice between them depends on specific project requirements and workflow needs.

Embree: Embree ray tracing kernels repository

Pros of Embree

  • Highly optimized for CPU ray tracing performance
  • Supports advanced features like motion blur and hair/fur rendering
  • Extensive documentation and examples for integration

Cons of Embree

  • Focused solely on ray tracing, lacking full rendering pipeline
  • Steeper learning curve for integration into existing projects
  • Limited to CPU rendering, no GPU support

Code Comparison

PBRT-v4:

Ray ray(Point3f(0, 0, 0), Vector3f(1, 0, 0));
SurfaceInteraction isect;
if (scene.Intersect(ray, &isect)) {
    // Handle intersection
}

Embree:

RTCIntersectContext context;
rtcInitIntersectContext(&context);
rtcIntersect1(scene, &context, &rayhit);
if (rayhit.hit.geomID != RTC_INVALID_GEOMETRY_ID) {
    // Handle intersection
}

Both PBRT-v4 and Embree are powerful rendering tools, but they serve different purposes. PBRT-v4 is a complete physically-based renderer, offering a full pipeline from scene description to final image output. Embree, on the other hand, is a specialized library focusing on high-performance ray tracing operations.

PBRT-v4 provides a more comprehensive solution for rendering, including material models, light transport algorithms, and image output. Embree excels in raw ray tracing performance and can be integrated into existing rendering systems to accelerate intersection tests.

The code comparison shows the difference in API design. PBRT-v4 uses a higher-level, object-oriented approach, while Embree's API is more low-level and C-style, reflecting its focus on performance and integration flexibility.

LuxCore: LuxCore source repository

Pros of LuxCore

  • More advanced and feature-rich rendering engine
  • Supports real-time rendering and interactive viewport updates
  • Offers a wider range of material types and lighting techniques

Cons of LuxCore

  • Steeper learning curve due to its complexity
  • Potentially slower render times for simple scenes
  • Less educational value as a codebase compared to PBRT-v4

Code Comparison

LuxCore (C++):

Scene *Scene::FromProperties(const Properties &sceneProps, const Properties &cfgProps) {
    Scene *scene = new Scene(cfgProps);
    scene->Parse(sceneProps);
    return scene;
}

PBRT-v4 (C++):

std::unique_ptr<Scene> Scene::Create(const ParsedScene &parsedScene) {
    std::unique_ptr<Scene> scene(new Scene);
    scene->Initialize(parsedScene);
    return scene;
}

Both examples show scene creation, but LuxCore uses a property-based approach, while PBRT-v4 employs a parsed scene structure. LuxCore's method may offer more flexibility, while PBRT-v4's approach could be more straightforward for educational purposes.

README

pbrt, Version 4 (Early Release)

Transparent Machines frame, via @beeple

This is an early release of pbrt-v4, the rendering system that will be described in the forthcoming fourth edition of Physically Based Rendering: From Theory to Implementation. (The printed book will be available in mid-February 2023; a few chapters will be made available in late Fall of 2022; and the full contents of the book will be freely available six months after the book's release, like the third edition is already.)

We are making this code available for hardy adventurers; it's not yet extensively documented, but if you are familiar with previous versions of pbrt, you should be able to make your way around it. Our hope is that the system will be useful to some people in its current form and that any bugs in the current implementation might be found now, allowing us to correct them before the book is final.

Features

pbrt-v4 represents a substantial update to its predecessor, pbrt-v3. Major changes include:

  • Spectral rendering
    • Rendering computations are always performed using point-sampled spectra; the use of RGB color is limited to the scene description (e.g., image texture maps), and final image output.
  • Modernized volumetric scattering
    • An all-new VolPathIntegrator based on the null-scattering path integral formulation of Miller et al. 2019 has been added.
    • Tighter majorants are used for null-scattering with the GridDensityMedium via a separate low-resolution grid of majorants.
    • Both emissive volumes and volumes with RGB-valued absorption and scattering coefficients are now supported.
  • Support for rendering on GPUs is available on systems that have CUDA and OptiX.
    • The GPU path provides all of the functionality of the CPU-based VolPathIntegrator, including volumetric scattering, subsurface scattering, all of pbrt's cameras, samplers, shapes, lights, materials and BxDFs, etc.
    • Performance is substantially faster than rendering on the CPU.
  • New BxDFs and Materials
    • The provided BxDFs and Materials have been redesigned to be more closely tied to physical scattering processes, along the lines of Mitsuba's materials. (Among other things, the kitchen-sink UberMaterial is now gone.)
    • Measured BRDFs are now represented using Dupuy and Jakob's approach.
    • Scattering from layered materials is accurately simulated using Monte Carlo random walks (after Guo et al. 2018.)
  • A variety of light sampling improvements have been implemented.
    • "Many-light" sampling is available via light BVHs (Conty and Kulla 2018).
    • Solid angle sampling is used for triangle (Arvo 1995) and quadrilateral (Ureña et al. 2013) light sources.
    • A single ray is now traced for both indirect lighting and BSDF-sampled direct lighting.
    • Warp product sampling is used for approximate cosine-weighted solid angle sampling (Hart et al. 2019).
    • An implementation of Bitterli et al.'s environment light portal sampling technique is included.
  • Rendering can now be performed in absolute physical units with modelling of real cameras as per Langlands & Fascione 2020.
  • And also...
    • Various improvements have been made to the Sampler classes, including better randomization and a new sampler that implements Ahmed and Wonka's blue noise Sobol' sampler.
    • A new GBufferFilm that provides position, normal, albedo, etc., at each pixel is now available. (This is particularly useful for denoising and ML training.)
    • Path regularization (optionally).
    • A bilinear patch primitive has been added (Reshetov 2019).
    • Various improvements to ray-shape intersection precision.
    • Most of the low-level sampling code has been factored out into stand-alone functions for easier reuse. Also, functions that invert many sampling techniques are provided.
    • Unit test coverage has been substantially increased.

We have also made a refactoring pass throughout the entire system, cleaning up various APIs and data types to improve both readability and usability.

Finally, pbrt-v4 can work together with the tev image viewer to display the image as it's being rendered. As of recent versions, tev can display images provided to it via a network socket; by default, it listens to port 14158, though this can be changed via its --hostname command-line option. If you have an instance of tev running, you can run pbrt like:

$ pbrt --display-server localhost:14158 scene.pbrt

In that case, the image will be progressively displayed as it renders.

Building the code

As before, pbrt uses git submodules for a number of third-party libraries that it depends on. Therefore, be sure to use the --recursive flag when cloning the repository:

$ git clone --recursive https://github.com/mmp/pbrt-v4.git

If you accidentally clone pbrt without using --recursive (or to update the pbrt source tree after a new submodule has been added), run the following command to also fetch the dependencies:

$ git submodule update --init --recursive

pbrt uses cmake for its build system. Note that a release build is the default; provide -DCMAKE_BUILD_TYPE=Debug to cmake for a debug build.
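
For example, a debug build can be configured and built like this (the directory layout mirrors the clone step above, and make -j8 assumes a Makefile generator):

$ cd pbrt-v4
$ mkdir build && cd build
$ cmake -DCMAKE_BUILD_TYPE=Debug ..
$ make -j8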

pbrt should build on any system that has a C++ compiler with support for C++17; we have verified that it builds on Ubuntu 20.04, MacOS 10.14, and Windows 10. We welcome PRs that fix any issues that prevent it from building on other systems.

Bug Reports and PRs

Please use the pbrt-v4 github issue tracker to report bugs in pbrt-v4. (We have pre-populated it with a number of issues corresponding to known bugs in the initial release.)

We are always happy to receive pull requests that fix bugs, including bugs you find yourself or fixes for open issues in the issue tracker. We are also happy to hear suggestions about improvements to the implementations of the various algorithms we have implemented.

Note, however, that in the interests of finishing the book in a finite amount of time, the functionality of pbrt-v4 is basically fixed at this point. We therefore will not be accepting PRs that make major changes to the system's operation or structure (but feel free to keep them in your own forks!). Also, don't bother sending PRs for anything marked "TODO" or "FIXME" in the source code; we'll take care of those as we finish polishing things up.

Updating pbrt-v3 scenes

There are a variety of changes to the input file format and, as noted above, the new format is not yet documented. However, pbrt-v4 partially makes up for that by providing an automatic upgrade mechanism:

$ pbrt --upgrade old.pbrt > new.pbrt

Most scene files can be automatically updated. In some cases manual intervention is required; an error message will be printed in this case.

The environment map parameterization has also changed (from an equirectangular to an equi-area mapping); you can upgrade environment maps using

$ imgtool makeequiarea old.exr --outfile new.exr

Converting scenes to pbrt's file format

The best option for importing scenes to pbrt is to use assimp, which as of January 21, 2021 includes support for exporting to pbrt-v4's file format:

$ assimp export scene.fbx scene.pbrt

While the converter tries to convert materials to pbrt's material model, some manual tweaking may be necessary after export. Furthermore, area light sources are not always successfully detected; manual intervention may be required for them as well. Using pbrt's built-in support for converting meshes to the binary PLY format is also recommended after conversion (pbrt --toply scene.pbrt > newscene.pbrt).
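
Putting these steps together, a typical conversion might look like the following (the filenames are placeholders):

$ assimp export scene.fbx scene.pbrt
$ pbrt --toply scene.pbrt > newscene.pbrt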

Using pbrt on the GPU

To run on the GPU, pbrt requires:

  • C++17 support on the GPU, including kernel launch with C++ lambdas.
  • Unified memory so that the CPU can allocate and initialize data structures for code that runs on the GPU.
  • An API for ray-object intersections on the GPU.

These requirements are effectively what makes it possible to bring pbrt to the GPU with limited changes to the core system. As a practical matter, these capabilities are only available via CUDA and OptiX on NVIDIA GPUs today, though we'd be happy to see pbrt running on any other GPUs that provide those capabilities.

pbrt's GPU path currently requires CUDA 11.0 or later and OptiX 7.1 or later. Both Linux and Windows are supported.

The build scripts automatically attempt to find a CUDA compiler, looking in the usual places; the cmake output will indicate whether it was successful. It is necessary to manually set the cmake PBRT_OPTIX7_PATH configuration option to point at an OptiX installation. By default, the GPU shader model that pbrt targets is set automatically based on the GPU in the system. Alternatively, the PBRT_GPU_SHADER_MODEL option can be set manually (e.g., -DPBRT_GPU_SHADER_MODEL=sm_80).
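
As an illustration, a GPU-enabled configuration might be generated like this; the OptiX path is a placeholder for your local SDK installation, and the -DPBRT_GPU_SHADER_MODEL setting is optional, with sm_80 being just the example value mentioned above:

$ cmake -DPBRT_OPTIX7_PATH=<path to OptiX 7 SDK> -DPBRT_GPU_SHADER_MODEL=sm_80 ..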

Even when compiled with GPU support, pbrt uses the CPU by default unless the --gpu command-line option is given. Note that when rendering with the GPU, the --spp command-line flag can be helpful to easily crank up the number of samples per pixel. Also, it's extra fun to use tev to watch rendering progress.
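
For example, a GPU render that raises the sample count and streams the in-progress image to a running tev instance might be invoked like this (the sample count and scene file are placeholders; the display server address matches the tev example above):

$ pbrt --gpu --spp 1024 --display-server localhost:14158 scene.pbrt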

The imgtool program that is built as part of pbrt provides support for the OptiX denoiser in the GPU build. The denoiser is capable of operating on RGB-only images, but gives better results with "deep" images that include auxiliary channels like albedo and normal. Setting the scene's "Film" type to be "gbuffer" when rendering and using EXR for the image format causes pbrt to generate such a "deep" image. In either case, using the denoiser is straightforward:

$ imgtool denoise-optix noisy.exr --outfile denoised.exr