I’m evaluating Lagom, but it keeps producing errors and warnings. I’ve been trying to find the cause for several days without making progress. I hope this forum is the right place to ask. It’s my last resort…
At first, the application seems to start fine:
sbt:file-heaven> runAll
[info] Updating lagom-internal-meta-project-kafka...
[info] Done updating.
[info] Starting Kafka
[info] Updating lagom-internal-meta-project-cassandra...
[info] Done updating.
[info] Starting Cassandra
....................................................
[info] Cassandra server running at 127.0.0.1:4000
[info] Updating lagom-internal-meta-project-service-locator...
[info] Done updating.
[warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible:
[warn] * io.netty:netty-handler:4.1.29.Final is selected over 4.1.13.Final
[warn] +- com.typesafe.netty:netty-reactive-streams:2.0.0 (depends on 4.1.13.Final)
[warn] +- com.lightbend.lagom:lagom-client_2.12:1.4.9 (depends on 4.1.13.Final)
[warn] * org.scala-lang.modules:scala-parser-combinators_2.12:1.1.0 is selected over {1.0.4, 1.0.6, 1.1.1}
[warn] +- com.lightbend.lagom:lagom-logback_2.12:1.4.9 (depends on 1.1.1)
[warn] +- com.lightbend.lagom:lagom-api_2.12:1.4.9 (depends on 1.0.4)
[warn] +- com.typesafe.play:cachecontrol_2.12:1.1.3 (depends on 1.0.4)
[warn] +- com.typesafe:ssl-config-core_2.12:0.3.6 (depends on 1.0.4)
[warn] +- com.typesafe.play:play_2.12:2.6.20 () (depends on 1.0.4)
[warn] * com.google.guava:guava:22.0 is selected over {16.0, 19.0, 20.0}
[warn] +- com.lightbend.lagom:lagom-api_2.12:1.4.9 (depends on 22.0)
[warn] +- com.typesafe.play:play_2.12:2.6.20 () (depends on 22.0)
[warn] +- org.reflections:reflections:0.9.11 (depends on 20.0)
[warn] +- com.google.inject:guice:4.1.0 (depends on 19.0)
[warn] +- com.fasterxml.jackson.datatype:jackson-datatype-guava:2.8.11 (depends on 16.0)
[warn] * com.typesafe.akka:akka-stream_2.12:2.5.18 is selected over {2.4.20, 2.5.9, 2.5.17}
[warn] +- com.lightbend.lagom:lagom-logback_2.12:1.4.9 (depends on 2.5.18)
[warn] +- com.lightbend.lagom:lagom-api_2.12:1.4.9 (depends on 2.5.18)
[warn] +- com.typesafe.play:play-streams_2.12:2.6.20 () (depends on 2.5.17)
[warn] +- com.typesafe.play:play-ws-standalone_2.12:1.1.10 (depends on 2.5.9)
[warn] +- com.typesafe.akka:akka-http-core_2.12:10.0.14 () (depends on 2.4.20)
[warn] * com.typesafe:ssl-config-core_2.12:0.3.6 is selected over 0.2.2
[warn] +- com.typesafe.akka:akka-stream_2.12:2.5.18 () (depends on 0.3.6)
[warn] +- com.typesafe.play:play-ws-standalone_2.12:1.1.10 (depends on 0.2.2)
[warn] * com.typesafe.akka:akka-actor_2.12:2.5.18 is selected over {2.4.20, 2.5.17}
[warn] +- com.lightbend.lagom:lagom-logback_2.12:1.4.9 (depends on 2.5.18)
[warn] +- com.typesafe.akka:akka-stream_2.12:2.5.18 () (depends on 2.5.18)
[warn] +- com.lightbend.lagom:lagom-api_2.12:1.4.9 (depends on 2.5.18)
[warn] +- com.typesafe.akka:akka-slf4j_2.12:2.5.18 () (depends on 2.5.18)
[warn] +- com.typesafe.play:play_2.12:2.6.20 () (depends on 2.5.17)
[warn] +- com.typesafe.akka:akka-parsing_2.12:10.0.14 () (depends on 2.4.20)
[warn] Run 'evicted' to see detailed eviction warnings
2018-12-04T22:45:48.322Z [info] akka.event.slf4j.Slf4jLogger [] - Slf4jLogger started
2018-12-04T22:45:51.315Z [info] com.lightbend.lagom.discovery.ServiceLocatorServer [] - Service locator can be reached at http://localhost:9008
2018-12-04T22:45:51.316Z [info] com.lightbend.lagom.discovery.ServiceLocatorServer [] - Service gateway can be reached at http://localhost:9000
[info] Service locator is running at http://localhost:9008
[info] Service gateway is running at http://localhost:9000
[info] Updating shapeless-json...
[info] Done updating.
[info] Updating auth-api...
[info] Done updating.
[warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible:
[warn] * org.scala-lang.modules:scala-parser-combinators_2.12:1.0.4 is selected over {1.0.6, 1.1.1}
[warn] +- com.lightbend.lagom:lagom-api_2.12:1.4.9 (depends on 1.0.4)
[warn] +- com.typesafe:ssl-config-core_2.12:0.3.6 (depends on 1.1.1)
[warn] +- com.typesafe.play:play_2.12:2.6.20 () (depends on 1.0.6)
[warn] Run 'evicted' to see detailed eviction warnings
[info] Updating auth-impl...
[info] Done updating.
[warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible:
[warn] * com.github.jnr:jnr-constants:0.9.9 is selected over 0.9.0
[warn] +- org.lmdbjava:lmdbjava:0.6.1 (depends on 0.9.9)
[warn] +- com.github.jnr:jnr-posix:3.0.27 (depends on 0.9.0)
[warn] * io.netty:netty:3.10.6.Final is selected over 3.10.5.Final
[warn] +- com.typesafe.akka:akka-remote_2.12:2.5.18 () (depends on 3.10.6.Final)
[warn] +- org.apache.zookeeper:zookeeper:3.4.10 (depends on 3.10.5.Final)
[warn] * io.netty:netty-handler:4.1.29.Final is selected over {4.0.44.Final, 4.1.13.Final}
[warn] +- com.lightbend.lagom:lagom-persistence-cassandra-core_2.12:1.4.9 (depends on 4.1.29.Final)
[warn] +- com.typesafe.netty:netty-reactive-streams:2.0.0 (depends on 4.1.13.Final)
[warn] +- com.lightbend.lagom:lagom-client_2.12:1.4.9 (depends on 4.1.13.Final)
[warn] +- com.datastax.cassandra:cassandra-driver-core:3.2.0 (depends on 4.0.44.Final)
[warn] * org.scala-lang.modules:scala-parser-combinators_2.12:1.1.0 is selected over {1.0.4, 1.0.6, 1.1.1}
[warn] +- com.typesafe.play:cachecontrol_2.12:1.1.3 (depends on 1.1.0)
[warn] +- com.lightbend.lagom:lagom-cluster-core_2.12:1.4.9 (depends on 1.1.1)
[warn] +- com.lightbend.lagom:lagom-logback_2.12:1.4.9 (depends on 1.0.4)
[warn] +- org.apache.kafka:kafka_2.12:0.11.0.1 (depends on 1.0.4)
[warn] +- com.lightbend.lagom:lagom-api_2.12:1.4.9 (depends on 1.0.4)
[warn] +- com.typesafe:ssl-config-core_2.12:0.3.6 (depends on 1.0.4)
[warn] +- com.typesafe.play:play_2.12:2.6.20 () (depends on 1.0.4)
[warn] * com.google.guava:guava:22.0 is selected over {16.0.1, 19.0, 16.0}
[warn] +- com.lightbend.lagom:lagom-scaladsl-cluster_2.12:1.4.9 (depends on 22.0)
[warn] +- com.lightbend.lagom:lagom-api_2.12:1.4.9 (depends on 22.0)
[warn] +- com.lightbend.lagom:lagom-scaladsl-play-json_2.12:1.4.9 (depends on 22.0)
[warn] +- com.typesafe.play:play_2.12:2.6.20 () (depends on 22.0)
[warn] +- com.fasterxml.jackson.datatype:jackson-datatype-guava:2.8.11 (depends on 16.0)
[warn] +- com.datastax.cassandra:cassandra-driver-core:3.2.0 (depends on 19.0)
[warn] +- org.apache.curator:curator-test:2.10.0 (depends on 16.0.1)
[warn] +- org.apache.curator:curator-client:2.10.0 (depends on 16.0.1)
[warn] * com.typesafe.akka:akka-stream_2.12:2.5.18 is selected over {2.5.7, 2.4.20, 2.5.9, 2.5.17}
[warn] +- com.lightbend.lagom:lagom-logback_2.12:1.4.9 (depends on 2.5.18)
[warn] +- com.lightbend.lagom:lagom-api_2.12:1.4.9 (depends on 2.5.18)
[warn] +- com.typesafe.akka:akka-remote_2.12:2.5.18 () (depends on 2.5.18)
[warn] +- com.typesafe.akka:akka-persistence-query_2.12:2.5.18 () (depends on 2.5.18)
[warn] +- com.typesafe.play:play-streams_2.12:2.6.20 () (depends on 2.5.17)
[warn] +- com.typesafe.play:play-ws-standalone_2.12:1.1.10 (depends on 2.5.9)
[warn] +- com.typesafe.akka:akka-http-core_2.12:10.0.14 () (depends on 2.4.20)
[warn] +- com.typesafe.akka:akka-stream-kafka_2.12:0.18 (depends on 2.5.7)
[warn] * com.typesafe:ssl-config-core_2.12:0.3.6 is selected over 0.2.2
[warn] +- com.typesafe.akka:akka-stream_2.12:2.5.18 () (depends on 0.3.6)
[warn] +- com.typesafe.play:play-ws-standalone_2.12:1.1.10 (depends on 0.2.2)
[warn] * com.typesafe.akka:akka-actor_2.12:2.5.18 is selected over {2.4.20, 2.5.17}
[warn] +- com.lightbend.lagom:lagom-logback_2.12:1.4.9 (depends on 2.5.18)
[warn] +- com.typesafe.akka:akka-stream_2.12:2.5.18 () (depends on 2.5.18)
[warn] +- com.typesafe.akka:akka-persistence_2.12:2.5.18 () (depends on 2.5.18)
[warn] +- com.lightbend.lagom:lagom-api_2.12:1.4.9 (depends on 2.5.18)
[warn] +- com.lightbend.lagom:lagom-scaladsl-play-json_2.12:1.4.9 (depends on 2.5.18)
[warn] +- com.typesafe.akka:akka-slf4j_2.12:2.5.18 () (depends on 2.5.18)
[warn] +- com.typesafe.akka:akka-remote_2.12:2.5.18 () (depends on 2.5.18)
[warn] +- com.typesafe.akka:akka-testkit_2.12:2.5.18 () (depends on 2.5.18)
[warn] +- com.typesafe.play:play_2.12:2.6.20 () (depends on 2.5.17)
[warn] +- com.typesafe.akka:akka-parsing_2.12:10.0.14 () (depends on 2.4.20)
[warn] Run 'evicted' to see detailed eviction warnings
23:46:05.189 [info] play.core.server.AkkaHttpServer [] - Listening for HTTP on /0:0:0:0:0:0:0:0:61221
[info] Compiling 3 Scala sources to /home/user/development/scala/lagom/file-heaven2/shapeless-json/target/scala-2.12/classes ...
[info] Done compiling.
[info] Compiling 1 Scala source to /home/user/development/scala/lagom/file-heaven2/auth-api/target/scala-2.12/classes ...
[info] Done compiling.
[info] Compiling 5 Scala sources to /home/user/development/scala/lagom/file-heaven2/auth-impl/target/scala-2.12/classes ...
[info] Done compiling.
23:46:54.344 [info] akka.event.slf4j.Slf4jLogger [] - Slf4jLogger started
23:46:54.370 [info] akka.remote.Remoting [sourceThread=ForkJoinPool-2-worker-1, akkaSource=akka.remote.Remoting, sourceActorSystem=auth-impl-application, akkaTimestamp=22:46:54.369UTC] - Starting remoting
23:46:54.650 [info] akka.remote.Remoting [sourceThread=ForkJoinPool-2-worker-1, akkaTimestamp=22:46:54.648UTC, akkaSource=akka.remote.Remoting, sourceActorSystem=auth-impl-application] - Remoting started; listening on addresses :[akka.tcp://auth-impl-application@127.0.0.1:44025]
23:46:54.653 [info] akka.remote.Remoting [sourceThread=ForkJoinPool-2-worker-1, akkaTimestamp=22:46:54.651UTC, akkaSource=akka.remote.Remoting, sourceActorSystem=auth-impl-application] - Remoting now listens on addresses: [akka.tcp://auth-impl-application@127.0.0.1:44025]
23:46:54.715 [info] akka.cluster.Cluster(akka://auth-impl-application) [sourceThread=ForkJoinPool-2-worker-1, akkaTimestamp=22:46:54.715UTC, akkaSource=akka.cluster.Cluster(akka://auth-impl-application), sourceActorSystem=auth-impl-application] - Cluster Node [akka.tcp://auth-impl-application@127.0.0.1:44025] - Starting up, Akka version [2.5.18] ...
23:46:54.818 [info] akka.cluster.Cluster(akka://auth-impl-application) [sourceThread=ForkJoinPool-2-worker-1, akkaSource=akka.cluster.Cluster(akka://auth-impl-application), sourceActorSystem=auth-impl-application, akkaTimestamp=22:46:54.817UTC] - Cluster Node [akka.tcp://auth-impl-application@127.0.0.1:44025] - Registered cluster JMX MBean [akka:type=Cluster,port=44025]
23:46:54.818 [info] akka.cluster.Cluster(akka://auth-impl-application) [sourceThread=ForkJoinPool-2-worker-1, akkaTimestamp=22:46:54.817UTC, akkaSource=akka.cluster.Cluster(akka://auth-impl-application), sourceActorSystem=auth-impl-application] - Cluster Node [akka.tcp://auth-impl-application@127.0.0.1:44025] - Started up successfully
23:46:54.903 [info] akka.cluster.Cluster(akka://auth-impl-application) [sourceThread=auth-impl-application-akka.actor.default-dispatcher-4, akkaTimestamp=22:46:54.891UTC, akkaSource=akka.cluster.Cluster(akka://auth-impl-application), sourceActorSystem=auth-impl-application] - Cluster Node [akka.tcp://auth-impl-application@127.0.0.1:44025] - No seed-nodes configured, manual cluster join required
23:46:55.152 [info] akka.cluster.Cluster(akka://auth-impl-application) [sourceThread=auth-impl-application-akka.actor.default-dispatcher-4, akkaSource=akka.cluster.Cluster(akka://auth-impl-application), sourceActorSystem=auth-impl-application, akkaTimestamp=22:46:55.151UTC] - Cluster Node [akka.tcp://auth-impl-application@127.0.0.1:44025] - Node [akka.tcp://auth-impl-application@127.0.0.1:44025] is JOINING itself (with roles [dc-default]) and forming new cluster
23:46:55.155 [info] akka.cluster.Cluster(akka://auth-impl-application) [sourceThread=auth-impl-application-akka.actor.default-dispatcher-4, akkaTimestamp=22:46:55.154UTC, akkaSource=akka.cluster.Cluster(akka://auth-impl-application), sourceActorSystem=auth-impl-application] - Cluster Node [akka.tcp://auth-impl-application@127.0.0.1:44025] - Cluster Node [akka.tcp://auth-impl-application@127.0.0.1:44025] dc [default] is the new leader
23:46:55.167 [info] akka.cluster.Cluster(akka://auth-impl-application) [sourceThread=auth-impl-application-akka.actor.default-dispatcher-4, akkaTimestamp=22:46:55.166UTC, akkaSource=akka.cluster.Cluster(akka://auth-impl-application), sourceActorSystem=auth-impl-application] - Cluster Node [akka.tcp://auth-impl-application@127.0.0.1:44025] - Leader is moving node [akka.tcp://auth-impl-application@127.0.0.1:44025] to [Up]
23:46:55.819 [info] akka.cluster.singleton.ClusterSingletonManager [sourceThread=auth-impl-application-akka.actor.default-dispatcher-17, akkaSource=akka.tcp://auth-impl-application@127.0.0.1:44025/user/cassandraOffsetStorePrepare-singleton, sourceActorSystem=auth-impl-application, akkaTimestamp=22:46:55.818UTC] - Singleton manager starting singleton actor [akka://auth-impl-application/user/cassandraOffsetStorePrepare-singleton/singleton]
23:46:55.843 [info] akka.cluster.singleton.ClusterSingletonManager [sourceThread=auth-impl-application-akka.actor.default-dispatcher-17, akkaSource=akka.tcp://auth-impl-application@127.0.0.1:44025/user/cassandraOffsetStorePrepare-singleton, sourceActorSystem=auth-impl-application, akkaTimestamp=22:46:55.843UTC] - ClusterSingletonManager state change [Start -> Oldest]
23:46:55.844 [info] com.lightbend.lagom.internal.persistence.cluster.ClusterStartupTaskActor [sourceThread=auth-impl-application-akka.actor.default-dispatcher-5, akkaTimestamp=22:46:55.843UTC, akkaSource=akka.tcp://auth-impl-application@127.0.0.1:44025/user/cassandraOffsetStorePrepare-singleton/singleton/cassandraOffsetStorePrepare, sourceActorSystem=auth-impl-application] - Executing cluster start task cassandraOffsetStorePrepare.
23:46:55.968 [info] akka.cluster.singleton.ClusterSingletonManager [sourceThread=auth-impl-application-akka.actor.default-dispatcher-7, akkaTimestamp=22:46:55.967UTC, akkaSource=akka.tcp://auth-impl-application@127.0.0.1:44025/system/sharding/AuthEntityCoordinator, sourceActorSystem=auth-impl-application] - Singleton manager starting singleton actor [akka://auth-impl-application/system/sharding/AuthEntityCoordinator/singleton]
23:46:55.969 [info] akka.cluster.singleton.ClusterSingletonManager [sourceThread=auth-impl-application-akka.actor.default-dispatcher-7, akkaTimestamp=22:46:55.967UTC, akkaSource=akka.tcp://auth-impl-application@127.0.0.1:44025/system/sharding/AuthEntityCoordinator, sourceActorSystem=auth-impl-application] - ClusterSingletonManager state change [Start -> Oldest]
23:46:56.064 [info] akka.cluster.singleton.ClusterSingletonManager [sourceThread=auth-impl-application-akka.actor.default-dispatcher-17, akkaTimestamp=22:46:56.063UTC, akkaSource=akka.tcp://auth-impl-application@127.0.0.1:44025/system/sharding/kafkaProducer-token-topicCoordinator, sourceActorSystem=auth-impl-application] - Singleton manager starting singleton actor [akka://auth-impl-application/system/sharding/kafkaProducer-token-topicCoordinator/singleton]
23:46:56.064 [info] akka.cluster.singleton.ClusterSingletonManager [sourceThread=auth-impl-application-akka.actor.default-dispatcher-17, akkaTimestamp=22:46:56.063UTC, akkaSource=akka.tcp://auth-impl-application@127.0.0.1:44025/system/sharding/kafkaProducer-token-topicCoordinator, sourceActorSystem=auth-impl-application] - ClusterSingletonManager state change [Start -> Oldest]
23:46:56.267 [info] akka.cluster.singleton.ClusterSingletonManager [sourceThread=auth-impl-application-akka.actor.default-dispatcher-4, akkaSource=akka.tcp://auth-impl-application@127.0.0.1:44025/user/readSideGlobalPrepare-UserIndexReadSideProcessor-singleton, sourceActorSystem=auth-impl-application, akkaTimestamp=22:46:56.266UTC] - Singleton manager starting singleton actor [akka://auth-impl-application/user/readSideGlobalPrepare-UserIndexReadSideProcessor-singleton/singleton]
23:46:56.267 [info] akka.cluster.singleton.ClusterSingletonManager [sourceThread=auth-impl-application-akka.actor.default-dispatcher-4, akkaTimestamp=22:46:56.266UTC, akkaSource=akka.tcp://auth-impl-application@127.0.0.1:44025/user/readSideGlobalPrepare-UserIndexReadSideProcessor-singleton, sourceActorSystem=auth-impl-application] - ClusterSingletonManager state change [Start -> Oldest]
23:46:56.278 [info] com.lightbend.lagom.internal.persistence.cluster.ClusterStartupTaskActor [sourceThread=auth-impl-application-akka.actor.default-dispatcher-6, akkaTimestamp=22:46:56.278UTC, akkaSource=akka.tcp://auth-impl-application@127.0.0.1:44025/user/readSideGlobalPrepare-UserIndexReadSideProcessor-singleton/singleton/readSideGlobalPrepare-UserIndexReadSideProcessor, sourceActorSystem=auth-impl-application] - Executing cluster start task readSideGlobalPrepare-UserIndexReadSideProcessor.
23:46:56.339 [info] akka.cluster.singleton.ClusterSingletonManager [sourceThread=auth-impl-application-akka.actor.default-dispatcher-18, akkaSource=akka.tcp://auth-impl-application@127.0.0.1:44025/system/sharding/UserIndexReadSideProcessorCoordinator, sourceActorSystem=auth-impl-application, akkaTimestamp=22:46:56.338UTC] - Singleton manager starting singleton actor [akka://auth-impl-application/system/sharding/UserIndexReadSideProcessorCoordinator/singleton]
23:46:56.341 [info] akka.cluster.singleton.ClusterSingletonManager [sourceThread=auth-impl-application-akka.actor.default-dispatcher-18, akkaTimestamp=22:46:56.338UTC, akkaSource=akka.tcp://auth-impl-application@127.0.0.1:44025/system/sharding/UserIndexReadSideProcessorCoordinator, sourceActorSystem=auth-impl-application] - ClusterSingletonManager state change [Start -> Oldest]
23:46:56.776 [info] akka.cluster.singleton.ClusterSingletonProxy [sourceThread=auth-impl-application-akka.actor.default-dispatcher-6, akkaTimestamp=22:46:56.775UTC, akkaSource=akka.tcp://auth-impl-application@127.0.0.1:44025/user/cassandraOffsetStorePrepare-singletonProxy, sourceActorSystem=auth-impl-application] - Singleton identified at [akka://auth-impl-application/user/cassandraOffsetStorePrepare-singleton/singleton]
23:46:57.229 [info] play.api.Play [] - Application started (Dev)
[info] Service auth-impl listening for HTTP on 0:0:0:0:0:0:0:0:61221
[info] (Service started, press enter to stop and go back to the console...)
After ~5-10 seconds of silence, lots of messages of this kind start to appear:
23:47:06.360 [warn] akka.cluster.sharding.ShardRegion [sourceThread=auth-impl-application-akka.actor.default-dispatcher-2, akkaTimestamp=22:47:06.360UTC, akkaSource=akka.tcp://auth-impl-application@127.0.0.1:44025/system/sharding/UserIndexReadSideProcessor, sourceActorSystem=auth-impl-application] - Trying to register to coordinator at [ActorSelection[Anchor(akka://auth-impl-application/), Path(/system/sharding/UserIndexReadSideProcessorCoordinator/singleton/coordinator)]], but no acknowledgement. Total [20] buffered messages. [Coordinator [Member(address = akka.tcp://auth-impl-application@127.0.0.1:44025, status = Up)] is reachable.]
23:47:16.451 [error] akka.cluster.sharding.PersistentShardCoordinator [sourceThread=auth-impl-application-akka.actor.default-dispatcher-7, akkaTimestamp=22:47:16.451UTC, akkaSource=akka.tcp://auth-impl-application@127.0.0.1:44025/system/sharding/AuthEntityCoordinator/singleton/coordinator, sourceActorSystem=auth-impl-application] - Persistence failure when replaying events for persistenceId [/sharding/AuthEntityCoordinator]. Last known sequence number [0]
I also get these errors:
12/04 23:39:48 ERROR[Native-Transport-Requests-3] o.a.c.t.m.ErrorMessage - Unexpected exception during request
java.lang.RuntimeException: java.util.concurrent.ExecutionException: org.apache.cassandra.exceptions.ConfigurationException: Column family ID mismatch (found 74bcebe0-f815-11e8-b6f5-a9db75e01e2a; expected 6d47bde0-f815-11e8-b6f5-a9db75e01e2a)
at org.apache.cassandra.utils.FBUtilities.waitOnFuture(FBUtilities.java:385) ~[cassandra-bundle.jar:0.59-SNAPSHOT]
at org.apache.cassandra.service.MigrationManager.announce(MigrationManager.java:570) ~[cassandra-bundle.jar:0.59-SNAPSHOT]
at org.apache.cassandra.service.MigrationManager.announceNewColumnFamily(MigrationManager.java:377) ~[cassandra-bundle.jar:0.59-SNAPSHOT]
at org.apache.cassandra.service.MigrationManager.announceNewColumnFamily(MigrationManager.java:362) ~[cassandra-bundle.jar:0.59-SNAPSHOT]
at org.apache.cassandra.service.MigrationManager.announceNewColumnFamily(MigrationManager.java:342) ~[cassandra-bundle.jar:0.59-SNAPSHOT]
at org.apache.cassandra.cql3.statements.CreateTableStatement.announceMigration(CreateTableStatement.java:89) ~[cassandra-bundle.jar:0.59-SNAPSHOT]
at org.apache.cassandra.cql3.statements.SchemaAlteringStatement.execute(SchemaAlteringStatement.java:123) ~[cassandra-bundle.jar:0.59-SNAPSHOT]
at org.apache.cassandra.cql3.QueryProcessor.processStatement(QueryProcessor.java:224) ~[cassandra-bundle.jar:0.59-SNAPSHOT]
at org.apache.cassandra.cql3.QueryProcessor.process(QueryProcessor.java:255) ~[cassandra-bundle.jar:0.59-SNAPSHOT]
at org.apache.cassandra.cql3.QueryProcessor.process(QueryProcessor.java:240) ~[cassandra-bundle.jar:0.59-SNAPSHOT]
at org.apache.cassandra.transport.messages.QueryMessage.execute(QueryMessage.java:116) ~[cassandra-bundle.jar:0.59-SNAPSHOT]
at org.apache.cassandra.transport.Message$Dispatcher.channelRead0(Message.java:517) [cassandra-bundle.jar:0.59-SNAPSHOT]
at org.apache.cassandra.transport.Message$Dispatcher.channelRead0(Message.java:410) [cassandra-bundle.jar:0.59-SNAPSHOT]
at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105) [cassandra-bundle.jar:0.59-SNAPSHOT]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357) [cassandra-bundle.jar:0.59-SNAPSHOT]
at io.netty.channel.AbstractChannelHandlerContext.access$600(AbstractChannelHandlerContext.java:35) [cassandra-bundle.jar:0.59-SNAPSHOT]
at io.netty.channel.AbstractChannelHandlerContext$7.run(AbstractChannelHandlerContext.java:348) [cassandra-bundle.jar:0.59-SNAPSHOT]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_192]
at org.apache.cassandra.concurrent.AbstractLocalAwareExecutorService$FutureTask.run(AbstractLocalAwareExecutorService.java:162) [cassandra-bundle.jar:0.59-SNAPSHOT]
at org.apache.cassandra.concurrent.SEPWorker.run(SEPWorker.java:109) [cassandra-bundle.jar:0.59-SNAPSHOT]
at java.lang.Thread.run(Thread.java:748) [na:1.8.0_192]
Caused by: java.util.concurrent.ExecutionException: org.apache.cassandra.exceptions.ConfigurationException: Column family ID mismatch (found 74bcebe0-f815-11e8-b6f5-a9db75e01e2a; expected 6d47bde0-f815-11e8-b6f5-a9db75e01e2a)
at java.util.concurrent.FutureTask.report(FutureTask.java:122) ~[na:1.8.0_192]
at java.util.concurrent.FutureTask.get(FutureTask.java:192) ~[na:1.8.0_192]
at org.apache.cassandra.utils.FBUtilities.waitOnFuture(FBUtilities.java:381) ~[cassandra-bundle.jar:0.59-SNAPSHOT]
... 20 common frames omitted
Caused by: org.apache.cassandra.exceptions.ConfigurationException: Column family ID mismatch (found 74bcebe0-f815-11e8-b6f5-a9db75e01e2a; expected 6d47bde0-f815-11e8-b6f5-a9db75e01e2a)
at org.apache.cassandra.config.CFMetaData.validateCompatibility(CFMetaData.java:941) ~[cassandra-bundle.jar:0.59-SNAPSHOT]
at org.apache.cassandra.config.CFMetaData.apply(CFMetaData.java:895) ~[cassandra-bundle.jar:0.59-SNAPSHOT]
at org.apache.cassandra.config.Schema.updateTable(Schema.java:687) ~[cassandra-bundle.jar:0.59-SNAPSHOT]
at org.apache.cassandra.schema.SchemaKeyspace.updateKeyspace(SchemaKeyspace.java:1464) ~[cassandra-bundle.jar:0.59-SNAPSHOT]
at org.apache.cassandra.schema.SchemaKeyspace.mergeSchema(SchemaKeyspace.java:1420) ~[cassandra-bundle.jar:0.59-SNAPSHOT]
at org.apache.cassandra.schema.SchemaKeyspace.mergeSchema(SchemaKeyspace.java:1389) ~[cassandra-bundle.jar:0.59-SNAPSHOT]
at org.apache.cassandra.schema.SchemaKeyspace.mergeSchemaAndAnnounceVersion(SchemaKeyspace.java:1366) ~[cassandra-bundle.jar:0.59-SNAPSHOT]
at org.apache.cassandra.service.MigrationManager$1.runMayThrow(MigrationManager.java:588) ~[cassandra-bundle.jar:0.59-SNAPSHOT]
at org.apache.cassandra.utils.WrappedRunnable.run(WrappedRunnable.java:28) ~[cassandra-bundle.jar:0.59-SNAPSHOT]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_192]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[na:1.8.0_192]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[na:1.8.0_192]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[na:1.8.0_192]
at org.apache.cassandra.concurrent.NamedThreadFactory.lambda$threadLocalDeallocator$0(NamedThreadFactory.java:81) ~[cassandra-bundle.jar:0.59-SNAPSHOT]
... 1 common frames omitted
Whenever I interact with my service, I get these warnings:
00:07:16.921 [warn] org.apache.kafka.clients.NetworkClient [] - Error while fetching metadata with correlation id 1 : {token-topic=LEADER_NOT_AVAILABLE}
00:07:17.102 [warn] org.apache.kafka.clients.NetworkClient [] - Error while fetching metadata with correlation id 3 : {token-topic=LEADER_NOT_AVAILABLE}
00:07:17.363 [warn] org.apache.kafka.clients.NetworkClient [] - Error while fetching metadata with correlation id 4 : {token-topic=LEADER_NOT_AVAILABLE}
00:07:17.515 [warn] org.apache.kafka.clients.NetworkClient [] - Error while fetching metadata with correlation id 5 : {token-topic=LEADER_NOT_AVAILABLE}
00:07:17.843 [warn] org.apache.kafka.clients.NetworkClient [] - Error while fetching metadata with correlation id 6 : {token-topic=LEADER_NOT_AVAILABLE}
Here is an exemplary startup log:
My code can be found here.
- I’ve implemented a simple authentication service where users can register and log in. The registered users are persistent entities.
- A Cassandra read side builds a table in which each row contains the UUID, name, and email of a user. I use it to look up user UUIDs by name or email.
- Whenever a user logs in, the service responds with a randomly generated token. This token is used to authenticate with any microservice, so it gets published to a message broker topic, allowing other services to build their own store of valid tokens.
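To make the setup above concrete, here is a rough sketch of what the service interface looks like. All names (`AuthService`, `TokenIssued`, the call paths) are guesses based on the description, not the actual code; the topic name `token-topic` matches the one in the Kafka warnings:

```scala
import akka.{Done, NotUsed}
import com.lightbend.lagom.scaladsl.api.{Descriptor, Service, ServiceCall}
import com.lightbend.lagom.scaladsl.api.broker.Topic
import play.api.libs.json.{Format, Json}

// Hypothetical message published whenever a login issues a new token.
case class TokenIssued(userId: String, token: String)
object TokenIssued {
  implicit val format: Format[TokenIssued] = Json.format
}

trait AuthService extends Service {
  def register: ServiceCall[NotUsed, Done]
  def login: ServiceCall[NotUsed, String]

  // Topic that other services subscribe to in order to build
  // their own store of valid tokens.
  def tokenTopic: Topic[TokenIssued]

  override final def descriptor: Descriptor = {
    import Service._
    named("auth")
      .withCalls(
        pathCall("/api/register", register),
        pathCall("/api/login", login)
      )
      .withTopics(
        topic("token-topic", tokenTopic)
      )
      .withAutoAcl(true)
  }
}
```

The implementation would back `tokenTopic` with a `TopicProducer` reading the persistent entity's event journal, which is where the Cassandra and Kafka pieces from the logs come into play.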