Duplicate the protobuf files across projects - this may seem counterintuitive, but since they are a protocol description and gRPC services can be evolved in a backwards-compatible way, it can work fine. The tricky part is keeping track of which version of the service a given copy corresponds to; the good part is that it does not tie you to any particular language's package management if your services are polyglot.
Package/publish the protobuf files in a jar and use a “protobuf” dependency to pull them into downstream projects - this way you release versions, and downstream projects know exactly which version they depend on/support, while still allowing completely different languages, JDK versions, or Akka versions (or even some other gRPC client lib). A sketch of this is shown after the note below.
Package both the protobuf files and the generated classes in a client jar and depend on the native Java/Scala classes for interacting with the service - this can be convenient, but it creates the tightest coupling between the consuming services and the upstream service and can make it harder to evolve them independently, which is an important aspect of microservices.
Note that even with a good strategy for sharing the protos, it is best to avoid using the upstream service's message types for anything other than interacting with that service, see: API Best Practices | Protocol Buffers Documentation
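As an illustration of the second option, here is a minimal sketch of what the downstream side could look like with the Akka gRPC sbt plugin (which builds on sbt-protoc). The artifact coordinates (`com.example %% my-service-protos`) and the version are hypothetical, and the `"protobuf-src"` configuration is my assumption of how you would unpack and compile protos from a dependency jar - check the Akka gRPC / sbt-protoc docs for your versions:

```scala
// build.sbt of the downstream project (sketch, artifact names are hypothetical)
enablePlugins(AkkaGrpcPlugin)

libraryDependencies ++= Seq(
  // Pull in the published proto-only jar; the "protobuf-src" configuration is
  // intended to unpack its .proto files and generate client sources from them.
  "com.example" %% "my-service-protos" % "1.2.0" % "protobuf-src"
)
```

The upstream project would then publish a jar that contains only the `.proto` files, so each consumer generates its own client code against exactly the proto version it declares a dependency on.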
Hi @zlaja, I am implementing Akka gRPC calls to other microservices.
These microservices are on the client's side, not our end. We only have the multiple protobuf files.
So I am confused about how to make two gRPC calls, one after another. If you are available, please let me know a time and I will set up a call to discuss further. karthikalogs@gmail.com
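For the sequencing part of the question, a minimal sketch with Akka gRPC generated Scala clients might look like the following. The service clients (`FirstServiceClient`, `SecondServiceClient`) and the request/reply types are hypothetical placeholders for whatever your proto files define; the point is only that each generated client method returns a `Future`, so the second call can be made from the result of the first via `flatMap` (here written as a for-comprehension):

```scala
import akka.actor.ActorSystem
import akka.grpc.GrpcClientSettings
import scala.concurrent.Future

// Hypothetical clients and messages generated from your .proto files:
// import com.example.first.{ FirstServiceClient, FirstRequest, FirstReply }
// import com.example.second.{ SecondServiceClient, SecondRequest, SecondReply }

object TwoCallsInSequence {
  def run()(implicit system: ActorSystem): Future[SecondReply] = {
    import system.dispatcher // ExecutionContext for the Future combinators

    // Client settings resolved from configuration entries named
    // "first-service" / "second-service" (names are assumptions)
    val firstClient  = FirstServiceClient(GrpcClientSettings.fromConfig("first-service"))
    val secondClient = SecondServiceClient(GrpcClientSettings.fromConfig("second-service"))

    for {
      firstReply  <- firstClient.doFirst(FirstRequest("some input"))          // first gRPC call
      secondReply <- secondClient.doSecond(SecondRequest(firstReply.result))  // second call, using the first reply
    } yield secondReply
  }
}
```

The second call only starts once the first `Future` completes successfully; if either call fails, the resulting `Future` fails, which you can handle with the usual `recover`/`recoverWith` combinators.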