Sometimes we need to create a consumer for a 3rd party Kafka topic that should also produce records to other topics (or even the same topic).
Lagom 1.4.x provides the ability to create service descriptors for consuming 3rd party topics, as described here. However, since Lagom lacks an API for producing topic records without an event stream from a PersistentEntity,
we need to use the lower-level akka-stream-kafka
API to achieve this. It seems OK until the tests run :). The problem is that in a test run the Lagom consumer API relies on an in-memory implementation of the broker API, while akka-stream-kafka
(which is used in the producing part of the consumer code) expects a real Kafka broker. It is possible to use an embedded Kafka such as kafka-junit for the producers that use akka-stream-kafka.
But altogether it looks tricky. Does anyone know a better approach to testing these "publishing while consuming" cases? Is there a way to force the Lagom topic consumer API to use a real (e.g. embedded) Kafka broker in test runs? Or is there something else?
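For context, the setup in question can be sketched roughly like this, following the Lagom 1.4 scaladsl docs for third-party topics. The service name, topic id, and message type here are illustrative, not from the original post:

```scala
// Hedged sketch of a descriptor for an externally-owned topic (Lagom 1.4 scaladsl).
// "third-party-service" and "external-topic" are placeholder names.
import com.lightbend.lagom.scaladsl.api.{Descriptor, Service}
import com.lightbend.lagom.scaladsl.api.broker.Topic

trait ThirdPartyService extends Service {
  def externalTopic: Topic[String]

  override def descriptor: Descriptor = {
    import Service._
    named("third-party-service")
      .withTopics(topic("external-topic", externalTopic))
  }
}
```

A consumer would then subscribe with `externalTopic.subscribe.atLeastOnce(flow)`, and it is inside that flow that the post wants to produce records via akka-stream-kafka, since Lagom's own publishing API is tied to a PersistentEntity event stream.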
When I write complicated logic in my topic subscriber, I break it out into its own Akka Streams graph, then test the graph in isolation from Kafka. This approach might not give the best test coverage, however.
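One way to make that isolation cheap is to keep the per-record logic as a pure function and only wrap it in a `Flow` at the edge, so the business rules can be unit-tested with no broker and no materializer. This is a minimal sketch with made-up domain types, not the poster's actual code:

```scala
// Hypothetical domain types for illustration only.
final case class OrderEvent(id: String, amount: BigDecimal)
final case class Notification(orderId: String, message: String)

object OrderProcessing {
  // Pure per-record logic: in the real subscriber this would be wrapped as
  // Flow[OrderEvent].map(OrderProcessing.process) and plugged into the graph,
  // but it can be asserted on directly in tests.
  def process(event: OrderEvent): Notification =
    Notification(event.id, s"Order ${event.id} totals ${event.amount}")
}
```

The Kafka-facing pieces (the subscriber wiring and the producing branch) then stay thin enough that losing test coverage on them matters less.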
I would also be interested to know whether starting Kafka during tests is possible. I understand it's already possible to configure my test/resources/application.conf to use a local server.
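If I understand the Lagom Kafka client settings correctly, pointing tests at a locally running (or embedded) broker would be a config fragment along these lines; the broker address is an assumption for illustration:

```hocon
# test/resources/application.conf (sketch)
lagom.broker.kafka {
  # An empty service-name makes the client use the brokers list directly
  # instead of looking Kafka up via service location.
  service-name = ""
  brokers = "localhost:9092"
}
```

Whether the in-memory test broker still takes precedence over this in `sbt test` is exactly the open question of this thread.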
I haven’t worked with a 3rd party Kafka topic before; this does sound like an interesting scenario.
- I also have a processing graph, with a branch (a flow) in it that is just for producing, i.e. it invokes `... .via(Producer.flow(producerSettings)) ...`
- Although Lagom is able to run Kafka in the dev environment (`sbt runAll`), it doesn’t start it in the test environment (`sbt test`). It seems there is an alternative: kafka-junit.
- “3rd party Kafka topic” means that not only Lagom services can produce records to it, but any Kafka client.
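The producing branch mentioned above could look roughly like this, assuming an akka-stream-kafka (alpakka-kafka 0.x) version where `Producer.flow` takes `ProducerMessage.Message` envelopes; the target topic name is a placeholder:

```scala
// Hedged sketch of a flow branch that publishes each element and passes
// the producer result downstream. Not the poster's actual code.
import akka.NotUsed
import akka.kafka.{ProducerMessage, ProducerSettings}
import akka.kafka.scaladsl.Producer
import akka.stream.scaladsl.Flow
import org.apache.kafka.clients.producer.ProducerRecord

def producingBranch(
    producerSettings: ProducerSettings[String, String]
): Flow[String, ProducerMessage.Result[String, String, NotUsed], NotUsed] =
  Flow[String]
    .map(value =>
      // Wrap each element in the envelope Producer.flow expects.
      ProducerMessage.Message(
        new ProducerRecord[String, String]("target-topic", value),
        NotUsed
      )
    )
    .via(Producer.flow(producerSettings))
```

It is this `Producer.flow` stage that needs a real broker in tests, which is why the Lagom in-memory broker and this branch don't line up.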