Modern gRPC Microservices, Part 3: Managing Plugins With Buf.build
gRPC's plugin system, while vast, has a cumbersome management process. In this post, learn more about Buf.build - a modern tool for managing gRPC plugins.
We concluded the last article by generating a gRPC Gateway service to act as a proxy in front of our gRPC chat service. This service converts familiar REST/HTTP requests from clients to/from our gRPC services. We also briefly discussed how the protoc utility orchestrates its various plugins to generate artifacts, communicating with them solely via stdin and stdout.
In the same vein, our REST/HTTP gateway was generated using the grpc-gateway plugin and by annotating our RPC methods with HTTP methods, bindings, and parameter details (for further customization). We also generated OpenAPI specs for the generated gateway service so it could be consumed via the Swagger UI, too.
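As a quick recap of how those annotations look, the HTTP bindings are declared directly on the RPC methods via the google.api.http option. The message and service names below are illustrative rather than copied from Onehub's actual topics.proto:

syntax = "proto3";
package onehub.v1;

import "google/api/annotations.proto";

message Topic {
  string id = 1;
  string name = 2;
}

message CreateTopicRequest {
  Topic topic = 1;
}

message CreateTopicResponse {
  Topic topic = 1;
}

service TopicService {
  // Expose CreateTopic over REST as POST /v1/topics, with the Topic supplied in the request body
  rpc CreateTopic(CreateTopicRequest) returns (CreateTopicResponse) {
    option (google.api.http) = {
      post: "/v1/topics"
      body: "*"
    };
  }
}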
A key part of our workflow so far was:
- Using custom plugins (grpc-gateway)
- Adding a call to the custom plugins in our Makefile (e.g., protoc --<pluginname>_out=...; a representative target is sketched after this list)
- Copying vendor-specific .proto files for custom annotations into our service specs
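For reference, a typical target in that Makefile looked roughly like the one below. The exact paths and flags are representative of the setup from the earlier parts of this series, not copied verbatim:

generate:
	protoc -I ./protos -I ./vendor \
		--go_out=./gen/go --go_opt=paths=source_relative \
		--go-grpc_out=./gen/go --go-grpc_opt=paths=source_relative \
		--grpc-gateway_out=./gen/go --grpc-gateway_opt=paths=source_relative \
		--openapiv2_out=./gen/openapiv2 \
		protos/onehub/v1/*.proto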
To our dismay, we also discovered that protoc did not offer much in the way of package management, so when it was time to use Google's HTTP annotations, we had to manually copy a bunch of third-party/vendor .proto files to access them. Clearly, this is a burden, and it only worsens as codebases become larger. It is even worse when those codebases have their own dependencies to resolve (either from other parts of the codebase or from third-party vendors); managing them becomes a daunting and error-prone task for developers. Another area we have been carefully avoiding (but which only gets harder with larger codebases) is that of testing, validation, and collaboration between developers when using plugins in our gRPC services.
What is needed are tools to:
- Manage dependencies between proto packages within a project
- Manage dependencies between several plugin/annotation providers
- Generate artifacts and manage output directories and paths automatically, so developers need not worry about them
- Validate/enforce coding conventions and standards (via linting tools)
- Improve collaboration between teams and organizations
Buf.build is one such tool.
In this article, we will provide a brief overview of Buf.build and its capabilities, and guide you through the process of migrating our canonical running example - the Onehub - to use Buf.build for generating all required artifacts. We will also remove any copied annotations and instead use Buf's package management facilities and repositories.
What Is Buf.build?
Buf.build is an open-source tool for managing Protobuf files and plugins. With Buf.build, linting, building, and testing Protobuf files, as well as generating code from them, are all simplified. Buf.build makes it easier for developers (especially those new to Protobufs) to work with Protobufs by unifying several aspects of the related workflows under one roof.
Some of the key features of Buf.build are:
- Repository management for plugins for easy discovery and accessibility
- Linting for checking syntax errors and style violations in Protobuf files
- Testing of Protobuf files to ensure expected behavior and catch regressions
- Building Protobuf files into binary and JSON representations to be used in several environments and contexts (gRPC, REST, etc.)
- Code generation for a variety of programming languages by invoking and managing plugins (without placing the burden of installing and maintaining these plugins on the developer). This is especially valuable as a lot of repetitive tasks are automated, saving valuable developer time.
There are several benefits to using Buf.build:
- By enforcing best practices via linting and testing, potential issues can be identified early on in the development cycle. Syntax errors and style violations are just some of the issues that can be caught before they make it into production. This results in maintainable and clean code.
- Builds are also sped up as only the changed and necessary parts need to be rebuilt. This speeds up the inner loop of development (code, build, test, commit), improving time-to-market.
- Since plugins and annotations can now be managed in a single source (or set of repositories), sharing and collaboration between developers (within and across organizations) is easier and less error/conflict-prone. No longer would developers have to copy third-party protos into their workspace and risk versions going out of sync.
- Growing adoption across organizations managing complex codebases also provides a feedback loop of more frequent updates, features, and support for the wider community.
Getting Started
Now that we have given a brief overview of Buf.build, let us migrate our service to use it. Detailed instructions are available in the Buf.build documentation and should be consulted for the latest updates. This article focuses specifically on updating an existing service (e.g., our Onehub) to use Buf.build.
Installation
First, install the Buf.build CLI for your platform by following the instructions on the Buf.build installation page.
For example, for OSX:
brew install bufbuild/buf/buf
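You can confirm the CLI is available on your PATH with:

buf --version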
Configure buf.yaml
In your protos root folder (Onehub/protos):
buf mod init
This creates a buf.yaml file. Buf uses this file to determine and resolve import paths for all protos within this directory tree. The protos can be tested by running buf build.
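On a fresh module, the generated buf.yaml is minimal - typically just a version along with default breaking and lint settings, along the lines of:

version: v1
breaking:
  use:
    - FILE
lint:
  use:
    - DEFAULT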
On a new project, buf build would be error-free. However, since we are already using external dependencies (google.api.*), you may see the following error:
onehub/v1/topics.proto:8:8:google/api/annotations.proto: does not exist
This rightly indicates that an external dependency (the google.api extensions) is being used but has not been declared. To fix this, add a deps section to the buf.yaml file:
buf.yaml:
version: v1
deps:
  - buf.build/googleapis/googleapis
breaking:
  use:
    - FILE
lint:
  use:
    - DEFAULT
Then run buf mod update to synchronize and lock the dependencies. Note: buf mod update must be run every time buf.yaml is modified (for example, if we add buf.build/grpc-ecosystem/grpc-gateway to the list of deps).
Running buf build now (from the protos folder) will result in no errors.
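buf mod update pins the resolved dependency versions in a buf.lock file next to buf.yaml. Its exact contents depend on your buf version and the commits resolved from the registry; it looks roughly like this (the commit value is a placeholder):

# Generated by buf. DO NOT EDIT.
version: v1
deps:
  - remote: buf.build
    owner: googleapis
    repository: googleapis
    commit: <commit-id-from-the-registry>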
Configure buf.gen.yaml
The buf.yaml file created earlier marks the root of the Protobuf files of a buf module (as you can guess, our module will eventually be called onehub). However, we have still not marked/identified our actual module (or its root). This module config is buf.gen.yaml, a file placed in the root directory of your project.
Let us create this file (in the onehub project root directory):
# cd onehub
touch buf.gen.yaml
This YAML file contains information about the module, such as its version and the plugins being used, as well as other required bits of information (like the go_package_prefix). Our buf.gen.yaml file looks like this:
buf.gen.yaml
version: v1
managed:
  enabled: true
plugins:
  - plugin: go
    out: gen/go
    opt: paths=source_relative
  - plugin: go-grpc
    out: gen/go
    opt: paths=source_relative,require_unimplemented_servers=false
  - plugin: grpc-gateway
    out: gen/go
    opt: paths=source_relative
  # Python
  - plugin: buf.build/protocolbuffers/python
    out: gen/python
  - plugin: buf.build/grpc/python
    out: gen/python
  # Plain JS
  - plugin: buf.build/bufbuild/es
    out: gen/js
  - plugin: buf.build/bufbuild/connect-web
    out: gen/js
  # generate openapi documentation for api
  - plugin: buf.build/grpc-ecosystem/openapiv2:v2.16.0
    out: gen/openapiv2
    opt: allow_merge=true,merge_file_name=services
Notice how adding a new plugin (for a new generator) is as simple as adding another "plugin" entry in the plugins section of the buf.gen.yaml file. Just for fun, we are also generating Python and JS artifacts, too!
As you may have observed, some plugins have a plain name (e.g., go, go-grpc) while others have something that looks like a URL (buf.build/protocolbuffers/python). As mentioned earlier, Buf.build has a powerful plugin repository that can host and serve plugins. Buf.build can also work with local plugins: in a previous tutorial, we had manually installed the protoc-gen-go and protoc-gen-go-grpc plugins into the $GOBIN folder. If a plugin entry does not point to a URL (hosting the plugin), Buf.build simply invokes a local version of the plugin. The options (via the "opt" attribute) are passed to the plugin as flags, just as protoc would pass them (e.g., --go_opt=paths=source_relative).
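If those local plugins are not already in your $GOBIN, they can be installed as in the earlier parts of this series; the standard module paths are:

go install google.golang.org/protobuf/cmd/protoc-gen-go@latest
go install google.golang.org/grpc/cmd/protoc-gen-go-grpc@latest
go install github.com/grpc-ecosystem/grpc-gateway/v2/protoc-gen-grpc-gateway@latest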
Now isn't that much simpler and more intuitive than installing each plugin locally and then adding commands to a Makefile? (Sorry, oh mighty Makefile. We shall not forget your sacrifice!)
To generate all the artifacts, simply run buf generate from the root of your module (onehub). But before we do this, one more config file must be created.
Configure buf.work.yaml
If we had run the above command at this point, we would see the following error:
protos/onehub/v1/topics.proto:7:8:onehub/v1/models.proto: does not exist
Our protos folder had a single "module" in it. Buf.build enables developers to work with multiple modules in the same workspace. This is done by marking Onehub as a "workspace". Create buf.work.yaml at the project root (onehub) with the following contents:
version: v1
directories:
  - protos
This tells Buf.build that all Protobuf definitions can be rooted in any of the directories given in the directories attribute of the workspace definition file. Now we can generate our artifacts:
rm -Rf gen # just to convince ourselves
buf generate
This generates all the artifacts in the gen folder - but without needing a Makefile and the various manual protoc commands.
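Given the out directories configured in buf.gen.yaml, the generated tree should look roughly like the following (exact file names depend on your protos):

gen
├── go
│   └── onehub
│       └── v1
│           ├── models.pb.go
│           ├── topics.pb.go
│           ├── topics_grpc.pb.go
│           └── topics.pb.gw.go
├── js
├── openapiv2
│   └── services.swagger.json
└── python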
buf.work.yaml allows us to have multiple (local) modules within the same workspace, enabling more complicated structures. For example, if our buf.work.yaml file's directories attribute was:
version: v1
directories:
  - protos/stores_api
  - protos/radiostations_api
then the corresponding directory layout would be:
onehub
├── buf.gen.yaml
├── buf.work.yaml
└── protos
    ├── radiostations_api
    │   └── v2
    │       ├── channels.proto
    │       └── showhosts.proto
    └── stores_api
        └── v1
            ├── orders.proto
            └── shoppingcart.proto
For an in-depth guide on how workspaces work, check out the documentation.
Other Features
Linting
Buf.build also comes with an out-of-the-box style guide for your Protobuf definitions, which can be used for consistency across teams. Let us try it.
buf lint
We got lucky! But despite best intentions, stylistic and structural errors can creep in, and Buf.build does a great job of detecting them and recommending fixes. A full list of lint rules can be found here.
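If the default rule set ever proves too strict (or too lenient) for your team, the lint section of buf.yaml can be tuned, for example by excluding specific rules or ignoring certain paths. The rule and path below are purely illustrative:

lint:
  use:
    - DEFAULT
  except:
    - PACKAGE_VERSION_SUFFIX
  ignore:
    - protos/vendor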
Advanced Features
We have only scratched the surface of Buf.build. Buf.build has several advanced features:
- Configurable linting rules: Allowing developers to customize linting rules, severity levels, etc. to suit the needs of teams
- Breaking change detection: An important part of service definitions is ensuring that evolving them does not result in breaking changes (for older clients). Buf.build provides checks for forward and backward compatibility of your service definitions (an example command follows this list).
- Buf Schema Registry: Buf.build offers a powerful schema registry and repositories to host your own custom plugins and annotations.
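For example, breaking-change detection can be run locally from the workspace root against the protos on your main branch (assuming your module lives in a git repository whose default branch is main):

buf breaking --against '.git#branch=main'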
See Buf.build documentation for a full and in-depth list of all features and capabilities.
Conclusion
Buf.build vastly simplifies managing Protobuf files and all related code. Its toolchain includes a fast and easy-to-use linter, builder, and generator for Protobuf files, targeting a variety of languages, protocols, and platforms. This has made it an ideal tool for managing complex codebases. As we will see in future posts in this series, adding new models, services, and validations becomes that much easier.
Give it a go!