Published on
Mon May 29, 2023

Part 3 - Managing gRPC plugins with Buf


We concluded the last part by generating a gRPC gateway service to act as a proxy in front of our gRPC chat service. This gateway translates familiar REST/HTTP requests from clients into gRPC requests for our services (and converts the responses back). We also briefly discussed how the protoc utility orchestrates various plugins to generate various artifacts - communicating with them solely via stdin and stdout.

In the same vein, our REST/HTTP gateway was generated using the grpc-gateway plugin, by annotating our rpc methods with HTTP methods, bindings and parameter details (for further customization). We also generated OpenAPI specs for our generated gateway service so it could be consumed via SwaggerUI too.
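As a quick refresher, such an annotation looks roughly like the sketch below. The service and method names here are illustrative stand-ins, not necessarily the exact ones in onehub:

```protobuf
syntax = "proto3";

package onehub.v1;

import "google/api/annotations.proto";

service TopicService {
  // The grpc-gateway plugin uses this option to expose the rpc
  // as GET /v1/topics over plain HTTP.
  rpc ListTopics(ListTopicsRequest) returns (ListTopicsResponse) {
    option (google.api.http) = {
      get: "/v1/topics"
    };
  }
}

message ListTopicsRequest {}
message ListTopicsResponse {}
```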

A key part of our workflow so far was:

  • using custom plugins (grpc-gateway)
  • adding a call to the custom plugins in our Makefile (eg protoc --<pluginname>_out=....)
  • copying vendor specific .proto files for custom annotations in our service specs.
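Concretely, the Makefile target from the previous parts looked something like the sketch below - the paths and options are representative of that workflow rather than an exact copy:

```make
# Representative protoc-driven workflow (illustrative, not our exact Makefile).
# Note the manually vendored googleapis protos on the import path.
protos:
	protoc -I ./protos -I ./vendor/googleapis \
		--go_out=gen/go --go_opt=paths=source_relative \
		--go-grpc_out=gen/go --go-grpc_opt=paths=source_relative \
		--grpc-gateway_out=gen/go --grpc-gateway_opt=paths=source_relative \
		protos/onehub/v1/*.proto
```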

To our dismay we also discovered that protoc does not offer much in the way of package management, so when it was time to use Google's http annotations we had to manually copy a bunch of third-party/vendor proto files into our tree. Clearly this is a burden, and it only worsens as codebases grow larger. Worse still, when codebases have dependencies to resolve (either from other parts of the codebase or from third-party vendors), managing this becomes a daunting and error-prone task for developers. Another area we have been carefully avoiding (which only gets harder with larger codebases) is that of testing, validation and collaboration between developers when using plugins in our gRPC services.

Ideally our tool of choice should:

  • manage dependencies between proto packages within a project
  • manage dependencies between several plugin/annotation providers
  • generate artifacts and manage output directories inherently, instead of leaving developers to worry about output paths
  • validate/enforce coding conventions and standards (via linting tools)
  • improve collaboration between teams and organizations.

Buf is one such tool.

In this article we will provide a brief overview of Buf and its capabilities, and guide you through the process of migrating our canonical running example - the onehub service - to use Buf for generating all required artifacts. We will also remove any copied annotations and instead use Buf's package management facilities and repositories.

What is Buf

Buf is an open source tool for managing protobuf files and plugins. With Buf, linting, building and testing protobuf files, as well as code generation, are simplified. Buf makes it easier for developers (especially those new to protobufs) to work with protobufs by unifying several aspects of the related workflows under one roof.

Some of the key features of Buf are:

  • Repository management for plugins for easy discovery and accessibility
  • Linting for checking syntax errors and style violations in protobuf files
  • Testing for protobuf files to verify behaviours and catch any regressions
  • Building protobuf files into binary and JSON representations to be used in several environments and contexts (gRPC, REST etc)
  • Code generation for a variety of programming languages by invoking and managing plugins (without placing the burden of installing and maintaining these plugins on the developer). This is especially valuable as a lot of repetitive tasks are automated saving valuable developer time.
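As a small taste of the build feature, buf can emit a self-contained image of your compiled protos in either binary or JSON form - the output file names below are our own choice, with the format inferred from the extension:

```shell
buf build -o image.bin    # binary image of the compiled protos
buf build -o image.json   # the same image, serialized as JSON
```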

There are several benefits to using Buf:

  • By enforcing best practices via linting and testing, potential issues can be identified early on in the development cycle. Syntax errors, style violations are just some of the issues that can be caught before they make it into production. This results in maintainable and clean code.
  • Builds are also sped up, as only the changed and necessary parts are rebuilt. This speeds up the (code, build, test, commit) inner loop of development, improving time-to-market.
  • Since plugins and annotations can now be managed in a single source (or set of repositories), sharing and collaboration between developers (within and across organizations) is easier and less error/conflict prone. No longer do developers have to copy third-party protos into their workspace and risk versions going out of sync.
  • Increasing adoption across organizations to manage complex codebases also provides a feedback loop of more frequent updates, features and support for the wider community.

Getting Started

Now that we have given a brief overview of Buf, let us migrate our service to using it. Detailed instructions are provided at the Buf documentation pages and can be used for any updates. This article is specifically focused on updating an existing service (eg our onehub) to use Buf.


Install the buf CLI

First install the buf cli for your platform by following the instructions at Buf's installation page.

eg, For OSX:

brew install bufbuild/buf/buf

Configure buf.yaml

In your protos root folder (onehub/protos):

buf mod init

This creates a buf.yaml file. Buf uses this file to determine and resolve import paths for all protos within this directory tree. The protos can now be compiled by running buf build.
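A freshly initialized buf.yaml typically looks like this (the defaults may vary slightly across buf versions):

```yaml
version: v1
breaking:
  use:
    - FILE
lint:
  use:
    - DEFAULT
```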

On a new project this would be error free. However, since we are already using external dependencies (google.api.*), you may see the following error:

onehub/v1/topics.proto:8:8:google/api/annotations.proto: does not exist

Rightly, this indicates an external dependency (the google.api extensions) that has not yet been specified. To fix this, add the dependency in the buf.yaml file:

In the buf.yaml file, add a deps section:


version: v1
deps:
  - buf.build/googleapis/googleapis
breaking:
  use:
    - FILE
lint:
  use:
    - DEFAULT

Then run buf mod update to synchronize and lock the dependencies. Note: buf mod update must be run every time buf.yaml is modified (for example, when adding to the list of deps).
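buf mod update records the resolved versions in a buf.lock file next to buf.yaml, along these lines (the commit value below is a placeholder for whatever buf pins):

```yaml
# buf.lock - generated by "buf mod update"; not edited by hand
version: v1
deps:
  - remote: buf.build
    owner: googleapis
    repository: googleapis
    commit: <pinned-commit-id>
```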

Running buf build now (from the protos folder) will result in no errors.

Configure buf.gen.yaml

The buf.yaml file created earlier marks the root of the protobuf files of a buf module (as you can guess, our module will eventually be called onehub). However we have still not described what artifacts to generate from this module. This generation config is buf.gen.yaml, and it is placed in the root directory of your project.

Let us create this file (in the onehub project root directory):

# cd onehub
touch buf.gen.yaml

This yaml contains information such as the config version, the plugins being used, as well as other required bits of information (like the go_package_prefix). Our buf.gen.yaml file looks like:

version: v1
managed:
  enabled: true
plugins:
  - plugin: go
    out: gen/go
    opt: paths=source_relative
  - plugin: go-grpc
    out: gen/go
    opt: paths=source_relative,require_unimplemented_servers=false
  - plugin: grpc-gateway
    out: gen/go
    opt: paths=source_relative
  # Python
  - plugin: buf.build/protocolbuffers/python
    out: gen/python
  - plugin: buf.build/grpc/python
    out: gen/python
  # Plain JS
  - plugin: buf.build/protocolbuffers/js
    out: gen/js
  - plugin: buf.build/grpc/web
    out: gen/js
  # generate openapi documentation for api
  - plugin: buf.build/grpc-ecosystem/openapiv2
    out: gen/openapiv2
    opt: allow_merge=true,merge_file_name=services

Notice how adding a new plugin (for a new generator) is as simple as adding another "plugin" entry in the plugins section of the buf.gen.yaml file. Just for fun we are also generating python and JS artifacts too!

As you may have observed, some plugins have a plain name (eg "go", "go-grpc") while others are referenced by what looks like a URL. Buf, as mentioned earlier, has a powerful plugin repository (the Buf Schema Registry) which can host and serve plugins for use. Buf can also work with local plugins. In a previous tutorial we had manually installed the protoc-gen-go and protoc-gen-go-grpc plugins into the $GOBIN folder. If a plugin entry does not point to a URL (hosting the plugin), buf simply invokes a local version of the plugin. The options (via the "opt" attributes) are passed along just as protoc would pass them via plugin flags (eg --go_opt=paths=source_relative).
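To make the distinction concrete, here are the two flavours side by side (the remote plugin path shown is one of the publicly hosted BSR plugins):

```yaml
plugins:
  # local: buf looks for an executable named protoc-gen-go on your PATH/$GOBIN
  - plugin: go
    out: gen/go
    opt: paths=source_relative   # equivalent to --go_opt=paths=source_relative
  # remote: buf runs the plugin hosted on the Buf Schema Registry
  - plugin: buf.build/protocolbuffers/python
    out: gen/python
```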

Now isn't that much simpler and more intuitive than installing each plugin locally and then adding commands to a Makefile? (Sorry, oh mighty Makefile. We shall not forget your sacrifice!)

To generate all the artifacts simply run buf generate from the root of your module (onehub). But before we do this, one more config file must be created.


Configure buf.work.yaml

If we had run the above command at this point we would have seen the following error:

protos/onehub/v1/topics.proto:7:8:onehub/v1/models.proto: does not exist

Our protos folder has a single "module" in it. Buf's workspaces enable developers to work with multiple modules side by side. This is done by marking onehub as a "workspace". Create a buf.work.yaml file at the project root (onehub) with the following contents:

version: v1
directories:
  - protos

This tells Buf that protobuf definitions can be rooted in any of the directories given in the 'directories' attribute of the workspace definition file. Now we can generate our artifacts:

rm -Rf gen        # just to convince ourselves
buf generate
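If all goes well, the generated artifacts land in per-language folders matching the "out" paths in our buf.gen.yaml, roughly like this (exact contents depend on your protos):

```text
gen
├── go
├── js
├── openapiv2
└── python
```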

This generates all the artifacts in the gen folder - but without needing a Makefile and the various manual protoc commands. Buf allows us to have multiple (local) modules within the same workspace, enabling more complicated structures. For example, our buf.work.yaml's 'directories' attribute could be:

version: v1
directories:
  - protos/stores_api
  - protos/radiostations_api

for a project layout such as:

├── buf.gen.yaml
├── buf.work.yaml
├── protos
│   ├── radiostations_api
│   │   └── v2
│   │       ├── channels.proto
│   │       └── showhosts.proto
│   └── stores_api
│       └── v1
│           ├── orders.proto
│           └── shoppingcart.proto

For an in-depth guide on how workspaces work, check out the Buf documentation.

Other Features

Linting

Buf also comes with an out of the box style guide for your protobuf definitions which can be used for consistency across teams. Let us try it:

buf lint

We got lucky! But despite the best intentions, several stylistic and structural errors can creep in, which Buf does a great job of detecting and recommending fixes for. A full list of lint rules can be found here.
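For instance, buf's default rule set includes FIELD_LOWER_SNAKE_CASE; a field declared like the hypothetical one below would be flagged:

```protobuf
message Topic {
  string id = 1;
  // buf lint flags this: field names should be lower_snake_case
  string topicName = 2;  // should be: topic_name
}
```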

Advanced features

We have only scratched the surface of Buf. It has several advanced features:

  • Configurable linting rules: allowing developers to customize linting rules, severity levels etc to suit the needs of their teams.
  • Breaking change detection: an important part of evolving service definitions is ensuring changes do not break older clients. Buf provides checks for forward and backward compatibility of your service definitions.
  • Buf Schema Registry: Buf offers a powerful schema registry and repositories to host your own custom plugins and annotations.
  • and many more.
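As a taste of breaking-change detection, buf can compare your current definitions against another version of the module - for example the protos committed on your main branch:

```shell
# Compare the local protos against the version on the main branch
buf breaking --against '.git#branch=main'
```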

See the Buf documentation for a full and in-depth list of all features and capabilities.

Conclusion

Buf vastly simplifies managing protobuf files and all related code. Among its toolchain are a fast and easy to use linter, builder and generator for protobuf files, targeting a variety of languages, protocols and platforms. This has made it an ideal tool for managing complex codebases.

Give it a go!