Top Related Projects
FlameScope is a visualization tool for exploring different time ranges as Flame Graphs.
Quick Overview
Go-torch is a tool for generating flame graphs from Go programs. It allows developers to visualize CPU performance and identify bottlenecks in their Go applications. Go-torch integrates with the pprof profiling tool and uses the FlameGraph library to create interactive SVG visualizations.
Pros
- Easy to use and integrate with existing Go projects
- Provides clear, interactive visualizations of CPU performance
- Helps identify performance bottlenecks quickly
- Supports both local and remote profiling
Cons
- Requires installation of additional dependencies (FlameGraph)
- Limited to CPU profiling, doesn't cover other performance aspects
- May have a slight performance impact when profiling is enabled
- Project is archived and no longer actively maintained
Code Examples
- Profiling a local Go program:
package main

import (
	"log"
	"net/http"
	_ "net/http/pprof"
)

func main() {
	go func() {
		log.Println(http.ListenAndServe("localhost:6060", nil))
	}()
	// Your application code here
}
- Generating a flame graph:
go-torch --url http://localhost:6060 -t 30
- Profiling a specific function:
import (
	"os"
	"runtime/pprof"
)

func someFunction() {
	f, _ := os.Create("cpu.prof")
	defer f.Close()
	pprof.StartCPUProfile(f)
	defer pprof.StopCPUProfile()
	// Function code here
}
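The fragment above can be expanded into a complete, runnable program with proper error handling. A minimal sketch; busyWork is a placeholder for the code you actually want to profile:

```go
package main

import (
	"fmt"
	"log"
	"os"
	"runtime/pprof"
)

// busyWork is a placeholder for the code you actually want to profile.
func busyWork() int {
	total := 0
	for i := 0; i < 1000000; i++ {
		total += i % 7
	}
	return total
}

func main() {
	// Write CPU samples to cpu.prof for the lifetime of main.
	f, err := os.Create("cpu.prof")
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()

	if err := pprof.StartCPUProfile(f); err != nil {
		log.Fatal(err)
	}
	defer pprof.StopCPUProfile()

	fmt.Println(busyWork())
}
```

The resulting cpu.prof can then be rendered with the -b flag described below, e.g. go-torch -b cpu.prof.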
Getting Started
- Install go-torch:

  go get github.com/uber/go-torch

- Install FlameGraph and add it to your PATH:

  git clone https://github.com/brendangregg/FlameGraph.git
  export PATH=$PATH:/path/to/FlameGraph

- Add profiling to your Go program:

  package main

  import (
  	"net/http"
  	_ "net/http/pprof"
  )

  func main() {
  	go func() {
  		http.ListenAndServe("localhost:6060", nil)
  	}()
  	// Your code here
  }

- Run your program and generate a flame graph:

  go-torch --url http://localhost:6060 -t 30
Competitor Comparisons
FlameScope is a visualization tool for exploring different time ranges as Flame Graphs.
Pros of FlameScope
- Web-based interface for easier visualization and interaction
- Supports multiple profile formats (perf, eBPF, etc.)
- Actively maintained with regular updates
Cons of FlameScope
- Requires more setup and dependencies (Python, Flask)
- Limited to Linux systems for data collection
- Steeper learning curve for non-developers
Code Comparison
go-torch:
func (p *Profile) Generate() ([]byte, error) {
	var buf bytes.Buffer
	if err := p.render(&buf); err != nil {
		return nil, err
	}
	return buf.Bytes(), nil
}
FlameScope:

@app.route('/profile/<profile_type>/<profile_name>')
def profile(profile_type, profile_name):
    return render_template('profile.html',
                           profile_type=profile_type,
                           profile_name=profile_name)
Summary
FlameScope offers a more comprehensive and user-friendly approach to flame graph exploration with its web interface and support for multiple profile formats. However, it comes with increased complexity and system requirements. go-torch, while simpler and more portable, lacks some advanced features and is no longer actively maintained. The choice between the two depends on your specific use case and system constraints.
README
go-torch
go-torch is deprecated, use pprof instead
As of Go 1.11, flame graph visualizations are available in go tool pprof directly!
# This will listen on :8081 and open a browser.
# Change :8081 to a port of your choice.
$ go tool pprof -http=":8081" [binary] [profile]
If you cannot use Go 1.11, you can get the latest pprof tool and use it instead:
# Get the pprof tool directly
$ go get -u github.com/google/pprof
$ pprof -http=":8081" [binary] [profile]
Synopsis
Tool for stochastically profiling Go programs. Collects stack traces and synthesizes them into a flame graph. Uses Go's built-in pprof library.
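Flame graph tooling works on "folded" stacks: each line is a semicolon-joined call stack followed by a sample count, which is the format Brendan Gregg's flamegraph.pl consumes. go-torch's real synthesis lives in its graph package; the sketch below, with made-up sample data, only illustrates the folding step:

```go
package main

import (
	"fmt"
	"sort"
	"strings"
)

// fold collapses raw stack samples (one semicolon-joined stack per
// sample) into folded-format counts: "frame1;frame2 count".
func fold(samples []string) map[string]int {
	counts := make(map[string]int)
	for _, s := range samples {
		counts[strings.TrimSpace(s)]++
	}
	return counts
}

func main() {
	// Illustrative samples, as if collected by pprof.
	samples := []string{
		"main;handleRequest;encodeJSON",
		"main;handleRequest;encodeJSON",
		"main;handleRequest;queryDB",
	}
	folded := fold(samples)

	// Print in sorted order for stable output.
	keys := make([]string, 0, len(folded))
	for k := range folded {
		keys = append(keys, k)
	}
	sort.Strings(keys)
	for _, k := range keys {
		fmt.Printf("%s %d\n", k, folded[k])
	}
	// Output:
	// main;handleRequest;encodeJSON 2
	// main;handleRequest;queryDB 1
}
```

flamegraph.pl reads exactly this kind of output to draw the SVG, which is what go-torch's -r/--raw mode emits.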
Example Flame Graph
Basic Usage
$ go-torch -h
Usage:
go-torch [options] [binary] <profile source>
pprof Options:
-u, --url= Base URL of your Go program (default: http://localhost:8080)
-s, --suffix= URL path of pprof profile (default: /debug/pprof/profile)
-b, --binaryinput= File path of previously saved binary profile. (binary profile is anything accepted by https://golang.org/cmd/pprof)
--binaryname= File path of the binary that the binaryinput is for, used for pprof inputs
-t, --seconds= Number of seconds to profile for (default: 30)
--pprofArgs= Extra arguments for pprof
Output Options:
-f, --file= Output file name (must be .svg) (default: torch.svg)
-p, --print Print the generated svg to stdout instead of writing to file
-r, --raw Print the raw call graph output to stdout instead of creating a flame graph; use with Brendan Gregg's flame graph perl script (see https://github.com/brendangregg/FlameGraph)
--title= Graph title to display in the output file (default: Flame Graph)
--width= Generated graph width (default: 1200)
--hash Colors are keyed by function name hash
--colors= Set color palette. Valid choices are: hot (default), mem, io, wakeup, chain, java,
js, perl, red, green, blue, aqua, yellow, purple, orange
--cp Graph use consistent palette (palette.map)
--inverted Icicle graph
Help Options:
-h, --help Show this help message
Write flamegraph using /debug/pprof endpoint
The default options will hit http://localhost:8080/debug/pprof/profile for a 30 second CPU profile, and write it out to torch.svg:
$ go-torch
INFO[19:10:58] Run pprof command: go tool pprof -raw -seconds 30 http://localhost:8080/debug/pprof/profile
INFO[19:11:03] Writing svg to torch.svg
You can customize the base URL by using -u:
$ go-torch -u http://my-service:8080/
INFO[19:10:58] Run pprof command: go tool pprof -raw -seconds 30 http://my-service:8080/debug/pprof/profile
INFO[19:11:03] Writing svg to torch.svg
Or change the number of seconds to profile using --seconds:
$ go-torch --seconds 5
INFO[19:10:58] Run pprof command: go tool pprof -raw -seconds 5 http://localhost:8080/debug/pprof/profile
INFO[19:11:03] Writing svg to torch.svg
Using pprof arguments
go-torch will pass through arguments to go tool pprof, which lets you take existing pprof commands and easily make them work with go-torch.
For example, after creating a CPU profile from a benchmark:
$ go test -bench . -cpuprofile=cpu.prof
# This creates a cpu.prof file, and the $PKG.test binary.
The same arguments that can be used with go tool pprof will also work with go-torch:
$ go tool pprof main.test cpu.prof
# Same arguments work with go-torch
$ go-torch main.test cpu.prof
INFO[19:00:29] Run pprof command: go tool pprof -raw -seconds 30 main.test cpu.prof
INFO[19:00:29] Writing svg to torch.svg
Flags that are not handled by go-torch are passed through as well:
$ go-torch --alloc_objects main.test mem.prof
INFO[19:00:29] Run pprof command: go tool pprof -raw -seconds 30 --alloc_objects main.test mem.prof
INFO[19:00:29] Writing svg to torch.svg
Integrating With Your Application
To add profiling endpoints to your application, follow the official Go docs for net/http/pprof. If your application is already running a server on the DefaultServeMux, just add this import to your application:
import _ "net/http/pprof"
If your application is not using the DefaultServeMux, you can still easily expose pprof endpoints by manually registering the net/http/pprof handlers on your own mux, or by using a third-party helper library.
Installation
$ go get github.com/uber/go-torch
You can also use go-torch using docker:
$ docker run uber/go-torch -u http://[address-of-host] -p > torch.svg
Using -p will print the SVG to standard out, which can then be redirected to a file. This avoids mounting volumes into the container.
Get the flame graph script:
When using the go-torch binary locally, you will need the FlameGraph scripts in your PATH:
$ cd $GOPATH/src/github.com/uber/go-torch
$ git clone https://github.com/brendangregg/FlameGraph.git
Development and Testing
Install the Go dependencies:
$ go get github.com/Masterminds/glide
$ cd $GOPATH/src/github.com/uber/go-torch
$ glide install
Run the Tests
$ go test ./...
ok github.com/uber/go-torch 0.012s
ok github.com/uber/go-torch/graph 0.017s
ok github.com/uber/go-torch/visualization 0.052s