python-betterproto
Clean, modern, Python 3.7+ code generator & library for Protobuf 3 and async gRPC
Top Related Projects
Protocol Buffers - Google's data interchange format
The C based gRPC (C++, Python, Ruby, Objective-C, PHP, C#)
Public interface definitions of Google APIs.
gRPC to JSON proxy generator following the gRPC HTTP spec
The best way of working with Protocol Buffers.
Protocol Buffers library for idiomatic .NET
Quick Overview
Python-betterproto is a library that provides a faster and more feature-rich implementation of Protocol Buffers for Python. It offers improved performance, better type hinting, and additional functionality compared to the official protobuf library.
Pros
- Significantly faster serialization and deserialization compared to the official protobuf library
- Better type hinting support, improving IDE integration and code completion
- More Pythonic API with dataclass-based messages and built-in dict/JSON conversion
- Supports Python 3.7+ and works well with asyncio
Cons
- Not as widely adopted as the official protobuf library
- May have compatibility issues with some existing protobuf-based systems
- Requires a separate compilation step for .proto files
- Documentation could be more comprehensive
Code Examples
- Defining a message:
from dataclasses import dataclass

import betterproto

@dataclass
class Person(betterproto.Message):
    name: str = betterproto.string_field(1)
    age: int = betterproto.int32_field(2)
    email: str = betterproto.string_field(3)
- Serializing and deserializing:
person = Person(name="Alice", age=30)
serialized = bytes(person)
deserialized = Person().parse(serialized)
- Converting to and from dicts:
person = Person(name="Bob", age=25)
data = person.to_dict()              # {'name': 'Bob', 'age': 25}
restored = Person().from_dict(data)
- Working with repeated fields:
from typing import List

@dataclass
class Team(betterproto.Message):
    name: str = betterproto.string_field(1)
    members: List[Person] = betterproto.message_field(2)

team = Team(name="Developers", members=[Person(name="Alice", age=30), Person(name="Bob", age=25)])
Getting Started
- Install the library:
pip install "betterproto[compiler]"
- Compile your .proto file:
python -m grpc_tools.protoc -I . --python_betterproto_out=. your_file.proto
- Use the generated code:
# betterproto generates a package named after the proto package, not a _pb2 module
from your_file import YourMessage

message = YourMessage(field1="value", field2=42)
serialized = bytes(message)
deserialized = YourMessage().parse(serialized)
Competitor Comparisons
Protocol Buffers - Google's data interchange format
Pros of protobuf
- Widely adopted and supported across multiple languages and platforms
- Extensive documentation and community resources
- Robust performance optimizations for large-scale systems
Cons of protobuf
- More complex setup and configuration process
- Steeper learning curve for beginners
- Less Pythonic syntax and API design
Code Comparison
protobuf:
message Person {
  string name = 1;
  int32 age = 2;
  repeated string hobbies = 3;
}
person = Person(name="Alice", age=30, hobbies=["reading", "hiking"])
python-betterproto:
from dataclasses import dataclass
from typing import List

import betterproto

@dataclass
class Person(betterproto.Message):
    name: str = betterproto.string_field(1)
    age: int = betterproto.int32_field(2)
    hobbies: List[str] = betterproto.string_field(3)

person = Person(name="Alice", age=30, hobbies=["reading", "hiking"])
python-betterproto offers a more Pythonic approach with native dataclasses, while protobuf uses a custom message format. The python-betterproto syntax is generally more familiar to Python developers, but protobuf's approach is consistent across multiple languages.
The C based gRPC (C++, Python, Ruby, Objective-C, PHP, C#)
Pros of grpc
- Mature, widely-adopted project with extensive language support
- High-performance RPC framework with built-in load balancing and security features
- Strong community support and extensive documentation
Cons of grpc
- Steeper learning curve and more complex setup compared to python-betterproto
- Heavier dependency footprint and larger codebase
- Less Pythonic API, requiring more boilerplate code
Code Comparison
grpc:
import grpc
from example_pb2 import ExampleRequest
from example_pb2_grpc import ExampleServiceStub
channel = grpc.insecure_channel('localhost:50051')
stub = ExampleServiceStub(channel)
response = stub.ExampleMethod(ExampleRequest(name='Alice'))
python-betterproto:
from grpclib.client import Channel
from example import ExampleRequest, ExampleServiceStub

channel = Channel(host="127.0.0.1", port=50051)
client = ExampleServiceStub(channel)
response = await client.example_method(ExampleRequest(name="Alice"))
Summary
grpc is a robust, high-performance RPC framework with broad language support and extensive features. However, it comes with a steeper learning curve and more complex setup. python-betterproto offers a more Pythonic and lightweight alternative, sacrificing some advanced features for simplicity and ease of use. The choice between the two depends on project requirements, performance needs, and developer preferences.
Public interface definitions of Google APIs.
Pros of googleapis
- Comprehensive collection of Google API definitions
- Official repository maintained by Google
- Extensive documentation and examples
Cons of googleapis
- Focused solely on Google APIs, less versatile for general use
- Larger repository size, potentially overwhelming for simple projects
- Steeper learning curve for developers unfamiliar with Google's ecosystem
Code Comparison
googleapis:
syntax = "proto3";

package google.pubsub.v1;

message Topic {
  string name = 1;
}
python-betterproto:
from dataclasses import dataclass

import betterproto

@dataclass
class Topic(betterproto.Message):
    name: str = betterproto.string_field(1)
Summary
googleapis is an official, comprehensive repository for Google API definitions, offering extensive documentation but focusing solely on Google services. python-betterproto, on the other hand, is a more versatile tool for working with Protocol Buffers in Python, providing a simpler syntax and broader applicability beyond Google APIs. The code comparison demonstrates the difference in approach, with googleapis using standard protobuf syntax and python-betterproto offering a more Pythonic implementation.
gRPC to JSON proxy generator following the gRPC HTTP spec
Pros of grpc-gateway
- Language-agnostic: Works with any gRPC implementation, not limited to Python
- Mature ecosystem: Part of the official gRPC ecosystem with extensive community support
- RESTful API generation: Automatically generates RESTful APIs from gRPC services
Cons of grpc-gateway
- More complex setup: Requires additional configuration and tooling
- Limited to HTTP/JSON: Primarily focused on HTTP/JSON gateway, less flexible for other protocols
- Performance overhead: Introduces an additional layer between client and gRPC service
Code Comparison
grpc-gateway:
syntax = "proto3";

package example;

import "google/api/annotations.proto";

service ExampleService {
  rpc Echo(StringMessage) returns (StringMessage) {
    option (google.api.http) = {
      post: "/v1/example/echo"
      body: "*"
    };
  }
}
python-betterproto:
from dataclasses import dataclass

import betterproto

@dataclass
class StringMessage(betterproto.Message):
    value: str = betterproto.string_field(1)

class ExampleService:
    async def echo(self, message: StringMessage) -> StringMessage:
        return StringMessage(value=message.value)
The grpc-gateway example shows how to define RESTful endpoints in the protobuf definition, while python-betterproto demonstrates a more Pythonic approach to defining messages and services.
The best way of working with Protocol Buffers.
Pros of buf
- Comprehensive Protocol Buffers ecosystem with tools for linting, breaking change detection, and code generation
- Language-agnostic approach, supporting multiple programming languages
- Robust CLI tool for managing Protocol Buffer workflows
Cons of buf
- Steeper learning curve due to its broader feature set
- May be overkill for simpler projects that only require Python-specific functionality
- Requires additional setup and configuration compared to Python-specific solutions
Code Comparison
buf:
buf lint
buf generate
buf breaking --against '.git#branch=main'
python-betterproto:
from dataclasses import dataclass

import betterproto

@dataclass
class MyMessage(betterproto.Message):
    name: str = betterproto.string_field(1)
Summary
buf offers a comprehensive suite of tools for working with Protocol Buffers across multiple languages, making it ideal for larger, multi-language projects. It provides advanced features like linting and breaking change detection but may have a steeper learning curve.
python-betterproto, on the other hand, is specifically tailored for Python developers, offering a simpler, more Pythonic approach to working with Protocol Buffers. It's easier to set up and use for Python-only projects but lacks the broader ecosystem and language support of buf.
Choose buf for complex, multi-language projects with advanced Protocol Buffer needs, and python-betterproto for simpler, Python-focused applications requiring a more streamlined approach.
Protocol Buffers library for idiomatic .NET
Pros of protobuf-net
- Mature and widely adopted in the .NET ecosystem
- Supports a broader range of .NET platforms and versions
- Offers more advanced features like custom type handling and inheritance support
Cons of protobuf-net
- Limited to .NET languages, not cross-platform like BetterProto
- May have a steeper learning curve for developers new to Protocol Buffers
- Less focus on generating idiomatic Python code
Code Comparison
BetterProto (Python):
from dataclasses import dataclass

import betterproto

@dataclass
class Person(betterproto.Message):
    name: str = betterproto.string_field(1)
    age: int = betterproto.int32_field(2)
protobuf-net (C#):
[ProtoContract]
public class Person
{
    [ProtoMember(1)]
    public string Name { get; set; }

    [ProtoMember(2)]
    public int Age { get; set; }
}
Both libraries aim to simplify working with Protocol Buffers, but they target different ecosystems. BetterProto focuses on providing a Pythonic interface with type hints and dataclasses, while protobuf-net integrates seamlessly with .NET conventions and features. The choice between them largely depends on the target platform and language preferences of the development team.
README
Better Protobuf / gRPC Support for Python
:octocat: If you're reading this on github, please be aware that it might mention unreleased features! See the latest released README on pypi.
This project aims to provide an improved experience when using Protobuf / gRPC in a modern Python environment by making use of modern language features and generating readable, understandable, idiomatic Python code. It will not support legacy features or environments (e.g. Protobuf 2). The following are supported:
- Protobuf 3 & gRPC code generation
- Both binary & JSON serialization is built-in
- Python 3.7+ making use of:
  - Enums
  - Dataclasses
  - async/await
  - Timezone-aware datetime and timedelta objects
  - Relative imports
  - Mypy type checking
- Pydantic Models generation (see #generating-pydantic-models)
This project is heavily inspired by, and borrows functionality from:
- https://github.com/protocolbuffers/protobuf/tree/master/python
- https://github.com/eigenein/protobuf/
- https://github.com/vmagamedov/grpclib
Motivation
This project exists because I am unhappy with the state of the official Google protoc plugin for Python.
- No `async` support (requires additional `grpclib` plugin)
- No typing support or code completion/intelligence (requires additional `mypy` plugin)
- No `__init__.py` module files get generated
- Output is not importable
  - Import paths break in Python 3 unless you mess with `sys.path`
- Bugs when names clash (e.g. `codecs` package)
- Generated code is not idiomatic
  - Completely unreadable runtime code-generation
  - Much code looks like C++ or Java ported 1:1 to Python
  - Capitalized function names like `HasField()` and `SerializeToString()`
  - Uses `SerializeToString()` rather than the built-in `__bytes__()`
- Special wrapped types don't use Python's `None`
- Timestamp/duration types don't use Python's built-in `datetime` module
This project is a reimplementation from the ground up focused on idiomatic modern Python to help fix some of the above. While it may not be a 1:1 drop-in replacement due to changed method names and call patterns, the wire format is identical.
Installation
First, install the package. Note that the `[compiler]` feature flag tells it to install extra dependencies only needed by the `protoc` plugin:
# Install both the library and compiler
pip install "betterproto[compiler]"
# Install just the library (to use the generated code output)
pip install betterproto
Betterproto is under active development. To install the latest beta version, use `pip install --pre betterproto`.
Getting Started
Compiling proto files
Given you installed the compiler and have a proto file, e.g. `example.proto`:
syntax = "proto3";

package hello;

// Greeting represents a message you can tell a user.
message Greeting {
  string message = 1;
}
You can run the following to invoke protoc directly:
mkdir lib
protoc -I . --python_betterproto_out=lib example.proto
or run the following to invoke protoc via grpcio-tools:
pip install grpcio-tools
python -m grpc_tools.protoc -I . --python_betterproto_out=lib example.proto
This will generate `lib/hello/__init__.py` which looks like:
# Generated by the protocol buffer compiler. DO NOT EDIT!
# sources: example.proto
# plugin: python-betterproto
from dataclasses import dataclass
import betterproto
@dataclass
class Greeting(betterproto.Message):
    """Greeting represents a message you can tell a user."""

    message: str = betterproto.string_field(1)
Now you can use it!
>>> from lib.hello import Greeting
>>> test = Greeting()
>>> test
Greeting(message='')
>>> test.message = "Hey!"
>>> test
Greeting(message="Hey!")
>>> serialized = bytes(test)
>>> serialized
b'\n\x04Hey!'
>>> another = Greeting().parse(serialized)
>>> another
Greeting(message="Hey!")
>>> another.to_dict()
{"message": "Hey!"}
>>> another.to_json(indent=2)
'{\n "message": "Hey!"\n}'
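The serialized bytes above can also be unpacked by hand, which is a nice way to see the proto3 wire format at work: the first byte is a tag combining the field number and wire type, followed by a varint length prefix and the UTF-8 payload. A stdlib-only sketch (no betterproto required):

```python
# Decode b'\n\x04Hey!' manually using the proto3 wire-format rules.
data = b"\n\x04Hey!"

tag = data[0]                 # 0x0A
field_number = tag >> 3       # upper bits hold the field number -> 1
wire_type = tag & 0x07        # lower 3 bits hold the wire type -> 2 (length-delimited)
length = data[1]              # single-byte varint length prefix -> 4
payload = data[2:2 + length]  # the string bytes

print(field_number, wire_type, payload.decode())  # 1 2 Hey!
```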
Async gRPC Support
The generated Protobuf `Message` classes are compatible with grpclib so you are free to use it if you like. That said, this project also includes support for async gRPC stub generation with better static type checking and code completion support. It is enabled by default.
Given an example service definition:
syntax = "proto3";

package echo;

message EchoRequest {
  string value = 1;
  // Number of extra times to echo
  uint32 extra_times = 2;
}

message EchoResponse {
  repeated string values = 1;
}

message EchoStreamResponse {
  string value = 1;
}

service Echo {
  rpc Echo(EchoRequest) returns (EchoResponse);
  rpc EchoStream(EchoRequest) returns (stream EchoStreamResponse);
}
Generate echo proto file:
python -m grpc_tools.protoc -I . --python_betterproto_out=. echo.proto
A client can be implemented as follows:
import asyncio

from grpclib.client import Channel

import echo


async def main():
    channel = Channel(host="127.0.0.1", port=50051)
    service = echo.EchoStub(channel)
    response = await service.echo(echo.EchoRequest(value="hello", extra_times=1))
    print(response)
    async for response in service.echo_stream(echo.EchoRequest(value="hello", extra_times=1)):
        print(response)
    # don't forget to close the channel when done!
    channel.close()


if __name__ == "__main__":
    asyncio.run(main())
which would output
EchoResponse(values=['hello', 'hello'])
EchoStreamResponse(value='hello')
EchoStreamResponse(value='hello')
This project also produces server-facing stubs that can be used to implement a Python gRPC server. To use them, simply subclass the base class in the generated files and override the service methods:
import asyncio
from typing import AsyncIterator

from grpclib.server import Server

from echo import EchoBase, EchoRequest, EchoResponse, EchoStreamResponse


class EchoService(EchoBase):
    async def echo(self, echo_request: "EchoRequest") -> "EchoResponse":
        return EchoResponse([echo_request.value for _ in range(echo_request.extra_times)])

    async def echo_stream(self, echo_request: "EchoRequest") -> AsyncIterator["EchoStreamResponse"]:
        for _ in range(echo_request.extra_times):
            yield EchoStreamResponse(echo_request.value)


async def main():
    server = Server([EchoService()])
    await server.start("127.0.0.1", 50051)
    await server.wait_closed()


if __name__ == "__main__":
    asyncio.run(main())
JSON
Both serializing and parsing are supported to/from JSON and Python dictionaries using the following methods:
- Dicts: `Message().to_dict()`, `Message().from_dict(...)`
- JSON: `Message().to_json()`, `Message().from_json(...)`

For compatibility the default is to convert field names to `camelCase`. You can control this behavior by passing a casing value, e.g.:
MyMessage().to_dict(casing=betterproto.Casing.SNAKE)
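Under the default casing, each snake_case field name is renamed to its camelCase JSON key. The rename can be sketched in a few lines of stdlib Python (the `snake_to_camel` helper is illustrative, not part of betterproto's public API):

```python
def snake_to_camel(name: str) -> str:
    """Rename a snake_case proto field name to its default camelCase JSON key."""
    head, *rest = name.split("_")
    return head + "".join(part.capitalize() for part in rest)

print(snake_to_camel("extra_times"))  # extraTimes
print(snake_to_camel("value"))        # value (single words are unchanged)
```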
Determining if a message was sent
Sometimes it is useful to be able to determine whether a message has been sent on the wire. This is how the Google wrapper types work to let you know whether a value is unset, set as the default (zero value), or set as something else, for example.
Use `betterproto.serialized_on_wire(message)` to determine if it was sent. This is a little bit different from the official Google generated Python code, and it lives outside the generated `Message` class to prevent name clashes. Note that it only supports proto3 and thus can only be used to check if `Message` fields are set. You cannot check if a scalar was sent on the wire.
# Old way (official Google Protobuf package)
>>> mymessage.HasField('myfield')
# New way (this project)
>>> betterproto.serialized_on_wire(mymessage.myfield)
One-of Support
Protobuf supports grouping fields in a `oneof` clause. Only one of the fields in the group may be set at a given time. For example, given the proto:
syntax = "proto3";

message Test {
  oneof foo {
    bool on = 1;
    int32 count = 2;
    string name = 3;
  }
}
On Python 3.10 and later, you can use a `match` statement to access the provided one-of field, which supports type-checking:
test = Test()
match test:
    case Test(on=value):
        print(value)  # value: bool
    case Test(count=value):
        print(value)  # value: int
    case Test(name=value):
        print(value)  # value: str
    case _:
        print("No value provided")
You can also use `betterproto.which_one_of(message, group_name)` to determine which of the fields was set. It returns a tuple of the field name and value, or a blank string and `None` if unset.
>>> test = Test()
>>> betterproto.which_one_of(test, "foo")
["", None]
>>> test.on = True
>>> betterproto.which_one_of(test, "foo")
["on", True]
# Setting one member of the group resets the others.
>>> test.count = 57
>>> betterproto.which_one_of(test, "foo")
["count", 57]
# Default (zero) values also work.
>>> test.name = ""
>>> betterproto.which_one_of(test, "foo")
["name", ""]
Again this is a little different than the official Google code generator:
# Old way (official Google protobuf package)
>>> message.WhichOneof("group")
"foo"
# New way (this project)
>>> betterproto.which_one_of(message, "group")
["foo", "foo's value"]
Well-Known Google Types
Google provides several well-known message types like a timestamp, duration, and several wrappers used to provide optional zero value support. Each of these has a special JSON representation and is handled a little differently from normal messages. The Python mapping for these is as follows:
| Google Message | Python Type | Default |
| --- | --- | --- |
| `google.protobuf.duration` | `datetime.timedelta` | `0` |
| `google.protobuf.timestamp` | Timezone-aware `datetime.datetime` | `1970-01-01T00:00:00Z` |
| `google.protobuf.*Value` | `Optional[...]` | `None` |
| `google.protobuf.*` | `betterproto.lib.google.protobuf.*` | `None` |
For the wrapper types, the Python type corresponds to the wrapped type, e.g. `google.protobuf.BoolValue` becomes `Optional[bool]` while `google.protobuf.Int32Value` becomes `Optional[int]`. All of the optional values default to `None`, so don't forget to check for that possible state. Given:
syntax = "proto3";

import "google/protobuf/duration.proto";
import "google/protobuf/timestamp.proto";
import "google/protobuf/wrappers.proto";

message Test {
  google.protobuf.BoolValue maybe = 1;
  google.protobuf.Timestamp ts = 2;
  google.protobuf.Duration duration = 3;
}
You can do stuff like:
>>> t = Test().from_dict({"maybe": True, "ts": "2019-01-01T12:00:00Z", "duration": "1.200s"})
>>> t
Test(maybe=True, ts=datetime.datetime(2019, 1, 1, 12, 0, tzinfo=datetime.timezone.utc), duration=datetime.timedelta(seconds=1, microseconds=200000))
>>> t.ts - t.duration
datetime.datetime(2019, 1, 1, 11, 59, 58, 800000, tzinfo=datetime.timezone.utc)
>>> t.ts.isoformat()
'2019-01-01T12:00:00+00:00'
>>> t.maybe = None
>>> t.to_dict()
{'ts': '2019-01-01T12:00:00Z', 'duration': '1.200s'}
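Because these well-known types map onto stdlib objects, the arithmetic in the session above is ordinary `datetime` math and works without betterproto at all:

```python
from datetime import datetime, timedelta, timezone

# Stdlib equivalents of the parsed Timestamp and Duration fields above.
ts = datetime(2019, 1, 1, 12, 0, tzinfo=timezone.utc)
duration = timedelta(seconds=1, microseconds=200_000)

print(ts - duration)   # 2019-01-01 11:59:58.800000+00:00
print(ts.isoformat())  # 2019-01-01T12:00:00+00:00
```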
Generating Pydantic Models
You can use python-betterproto to generate pydantic-based models, using pydantic dataclasses. This means the results of protobuf unmarshalling will be type-checked. The usage is the same, but you need to add a custom option when calling the protobuf compiler:
protoc -I . --python_betterproto_opt=pydantic_dataclasses --python_betterproto_out=lib example.proto
With the important change being `--python_betterproto_opt=pydantic_dataclasses`. This will swap the dataclass implementation from the builtin Python dataclass to the pydantic dataclass. You must have pydantic as a dependency in your project for this to work.
Configuration typing imports
By default, typing types will be imported directly from `typing`. This can sometimes lead to issues during generation if generated types conflict with those names. In that case, you can configure the way types are imported in one of three ways:
Direct
protoc -I . --python_betterproto_opt=typing.direct --python_betterproto_out=lib example.proto
This configuration is the default and will import types as follows:
from typing import (
List,
Optional,
Union
)
...
value: List[str] = []
value2: Optional[str] = None
value3: Union[str, int] = 1
Root
protoc -I . --python_betterproto_opt=typing.root --python_betterproto_out=lib example.proto
This configuration imports the root `typing` module and then accesses the types on it directly:
import typing
...
value: typing.List[str] = []
value2: typing.Optional[str] = None
value3: typing.Union[str, int] = 1
310
protoc -I . --python_betterproto_opt=typing.310 --python_betterproto_out=lib example.proto
This configuration avoids importing `typing` altogether where possible and uses the Python 3.10 pattern:
...
value: list[str] = []
value2: str | None = None
value3: str | int = 1
Development
- Join us on Slack!
- See how you can help → Contributing
Requirements
- Python (3.7 or higher)
- poetry: needed to install dependencies in a virtual environment
- poethepoet: for running development tasks as defined in pyproject.toml
  - Can be installed to your host environment via `pip install poethepoet` and then executed as simply `poe`
  - Or run from the poetry venv as `poetry run poe`
Setup
# Get set up with the virtual env & dependencies
poetry install -E compiler
# Activate the poetry environment
poetry shell
Code style
This project enforces black python code formatting.
Before committing changes run:
poe format
To avoid merge conflicts later, non-black-formatted Python code will fail in CI.
Tests
There are two types of tests:
- Standard tests
- Custom tests
Standard tests
Adding a standard test case is easy.
- Create a new directory `betterproto/tests/inputs/<name>`
- Add `<name>.proto` with a message called `Test`
- Add `<name>.json` with some test data (optional)

It will be picked up automatically when you run the tests.
- See also: Standard Tests Development Guide
Custom tests
Custom tests are found in `tests/test_*.py` and are run with pytest.
Running
Here's how to run the tests.
# Generate assets from sample .proto files required by the tests
poe generate
# Run the tests
poe test
To run tests as they are run in CI (with tox) run:
poe full-test
(Re)compiling Google Well-known Types
Betterproto includes compiled versions for Google's well-known types at src/betterproto/lib/google. Be sure to regenerate these files when modifying the plugin output format, and validate by running the tests.
Normally, the plugin does not compile any references to `google.protobuf`, since they are pre-compiled. To force compilation of `google.protobuf`, use the option `--custom_opt=INCLUDE_GOOGLE`.
Assuming your `google.protobuf` source files (included with all releases of `protoc`) are located in `/usr/local/include`, you can regenerate them as follows:
protoc \
--plugin=protoc-gen-custom=src/betterproto/plugin/main.py \
--custom_opt=INCLUDE_GOOGLE \
--custom_out=src/betterproto/lib \
-I /usr/local/include/ \
/usr/local/include/google/protobuf/*.proto
TODO
- Fixed length fields
- Packed fixed-length
- Zig-zag signed fields (sint32, sint64)
- Don't encode zero values for nested types
- Enums
- Repeated message fields
- Maps
- Maps of message fields
- Support passthrough of unknown fields
- Refs to nested types
- Imports in proto files
- Well-known Google types
- Support as request input
- Support as response output
- Automatically wrap/unwrap responses
- OneOf support
- Basic support on the wire
- Check which was set from the group
- Setting one unsets the others
- JSON that isn't completely naive.
- 64-bit ints as strings
- Maps
- Lists
- Bytes as base64
- Any support
- Enum strings
- Well known types support (timestamp, duration, wrappers)
- Support different casing (orig vs. camel vs. others?)
- Async service stubs
- Unary-unary
- Server streaming response
- Client streaming request
- Renaming messages and fields to conform to Python name standards
- Renaming clashes with language keywords
- Python package
- Automate running tests
- Cleanup!
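The zig-zag item in the list above refers to the encoding `sint32`/`sint64` use so that small negative numbers still produce small varints. A stdlib sketch of the scheme (illustrative, not betterproto's internal implementation):

```python
def zigzag_encode(n: int, bits: int = 64) -> int:
    """Interleave signed values: 0, -1, 1, -2, 2 ... -> 0, 1, 2, 3, 4 ..."""
    return (n << 1) ^ (n >> (bits - 1))

def zigzag_decode(z: int) -> int:
    """Invert the interleaving back to a signed value."""
    return (z >> 1) ^ -(z & 1)

print([zigzag_encode(n) for n in (0, -1, 1, -2, 2)])  # [0, 1, 2, 3, 4]
print(zigzag_decode(3))                               # -2
```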
Community
Join us on Slack!
License
Copyright © 2019 Daniel G. Taylor