I am using the ByteBuffer-based serialization mechanism suggested for Akka Artery remoting. Snapshots are failing as the data to be persisted grows. The problem is that the data is very dynamic, so I can't measure or predict in advance the largest buffer size a persistent actor's state will need.
How can I acquire a suitably sized ByteBuffer from a BufferPool, as referred to in the code section of https://doc.akka.io/docs/akka/current/remoting-artery.html#serialization?
import java.nio.ByteBuffer;

import akka.serialization.ByteBufferSerializer;
import akka.serialization.SerializerWithStringManifest;

class ExampleByteBufSerializer extends SerializerWithStringManifest
    implements ByteBufferSerializer {

  @Override
  public int identifier() {
    return 1337;
  }

  @Override
  public String manifest(Object o) {
    return "serialized-" + o.getClass().getSimpleName();
  }

  @Override
  public byte[] toBinary(Object o) {
    // in production code, acquire this from a BufferPool
    final ByteBuffer buf = ByteBuffer.allocate(256);

    toBinary(o, buf);
    buf.flip();
    final byte[] bytes = new byte[buf.remaining()];
    buf.get(bytes);
    return bytes;
  }

  @Override
  public Object fromBinary(byte[] bytes, String manifest) {
    return fromBinary(ByteBuffer.wrap(bytes), manifest);
  }

  @Override
  public void toBinary(Object o, ByteBuffer buf) {
    // Implement actual serialization here
  }

  @Override
  public Object fromBinary(ByteBuffer buf, String manifest) {
    // Implement actual deserialization here
    return null;
  }
}
NB: the part in question is this comment:

  public byte[] toBinary(Object o) {
    // in production code, acquire this from a BufferPool
    final ByteBuffer buf = ByteBuffer.allocate(256);
    …
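
Is something along these lines what the docs intend? Here is a rough sketch of what I have in mind: a trivial pool of direct buffers plus a retry with a larger temporary heap buffer when the pooled one overflows. SimpleBufferPool, the 256 KiB default size, and the doubling strategy are my own guesses, not anything from the Akka documentation.

import java.nio.BufferOverflowException;
import java.nio.ByteBuffer;
import java.util.concurrent.ConcurrentLinkedQueue;

// Hypothetical pool, not an Akka class: hands out direct buffers of a fixed
// default size and recycles them after use.
class SimpleBufferPool {
  private final ConcurrentLinkedQueue<ByteBuffer> free = new ConcurrentLinkedQueue<>();
  private final int defaultBufferSize;

  SimpleBufferPool(int defaultBufferSize) {
    this.defaultBufferSize = defaultBufferSize;
  }

  ByteBuffer acquire() {
    ByteBuffer buf = free.poll();
    return buf != null ? buf : ByteBuffer.allocateDirect(defaultBufferSize);
  }

  void release(ByteBuffer buf) {
    buf.clear();
    free.offer(buf);
  }
}

Then, inside the serializer above, toBinary(Object) would acquire from that pool and fall back to a larger temporary heap buffer whenever the object does not fit (this assumes the ByteBuffer variant of toBinary signals lack of space with a BufferOverflowException):

  private final SimpleBufferPool pool = new SimpleBufferPool(256 * 1024); // guessed default

  @Override
  public byte[] toBinary(Object o) {
    final ByteBuffer pooled = pool.acquire();
    try {
      return writeToBytes(o, pooled);
    } finally {
      pool.release(pooled);
    }
  }

  private byte[] writeToBytes(Object o, ByteBuffer first) {
    ByteBuffer buf = first;
    while (true) {
      try {
        buf.clear();
        toBinary(o, buf); // the ByteBufferSerializer variant
        buf.flip();
        final byte[] bytes = new byte[buf.remaining()];
        buf.get(bytes);
        return bytes;
      } catch (BufferOverflowException e) {
        // Pooled buffer too small for this object: retry with a bigger
        // temporary heap buffer (doubling is my own choice, not from the docs).
        buf = ByteBuffer.allocate(buf.capacity() * 2);
      }
    }
  }

Is that the intended pattern, or does Akka ship a BufferPool implementation that I am supposed to use here instead of rolling my own?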