Hi Team,
We have an issue with high latency on the GemFire API (get/put). After debugging, we found that it is caused by serialization/deserialization taking too much time (an additional 200-300 ms) when data is loaded into the client's local cache.
Our data model contains two maps of objects, and the issue appears when the map size exceeds roughly 3000 entries.
What is the best practice to solve this? We have thought it over and see the options below; please advise.
- Change the data model and reduce its size -- this is not achievable in the short term for business reasons.
- Set copy-on-read to false on the local cache, which seems to disable deserialization copies on local reads -- however, this does not work for us with multiple threads (see the configuration sketch after this list).
- Or is there a more efficient serialization mechanism we could use to get better performance? (A PDX-based sketch follows the code at the end.)
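For the second option, this is roughly what we mean; a minimal sketch assuming a programmatic ClientCache setup (the locator host/port are placeholders, and the same flag can also be set via copy-on-read="false" in cache.xml):
==========================================================
import org.apache.geode.cache.client.ClientCache;
import org.apache.geode.cache.client.ClientCacheFactory;

public class CacheConfigSketch {
    public static void main(String[] args) {
        // Placeholder locator coordinates -- replace with the real ones.
        ClientCache cache = new ClientCacheFactory()
                .addPoolLocator("locator-host", 10334)
                .create();

        // With copy-on-read=false, region.get(...) can hand back the cached
        // object itself instead of a deserialized copy on every read. The
        // trade-off is that callers must treat the returned object as
        // read-only, which is why this is problematic with multiple threads.
        cache.setCopyOnRead(false);
    }
}
==========================================================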
Below is how we implement GemFire serialization in Java:
==========================================================
import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;
import java.util.Map;

import org.apache.geode.DataSerializable;
import org.apache.geode.DataSerializer;

public class A implements DataSerializable {

    private Map<String, Matcher> matcherMap;
    private Map<String, Matchee> matcheeMap;

    @Override
    public void fromData(DataInput in) throws IOException, ClassNotFoundException {
        // Both maps are deserialized eagerly, entry by entry, whenever the
        // value is materialized in the client local cache.
        this.matcherMap = DataSerializer.readHashMap(in);
        this.matcheeMap = DataSerializer.readHashMap(in);
    }

    @Override
    public void toData(DataOutput out) throws IOException {
        // Both maps are serialized eagerly, entry by entry.
        DataSerializer.writeHashMap(matcherMap, out);
        DataSerializer.writeHashMap(matcheeMap, out);
    }
}
=====================================================
public class Matcher implements DataSerializable {
    ...

    @Override
    public void fromData(DataInput in) throws IOException, ClassNotFoundException {
        ...
    }

    @Override
    public void toData(DataOutput out) throws IOException {
        ...
    }
}
=====================================================
public class Matchee implements DataSerializable {
    ...

    @Override
    public void fromData(DataInput in) throws IOException, ClassNotFoundException {
        ...
    }

    @Override
    public void toData(DataOutput out) throws IOException {
        ...
    }
}
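For the third option, one alternative we are evaluating is Geode's PDX serialization, which (with setPdxReadSerialized(true) on the client) can defer full deserialization and return PdxInstance objects on reads. Below is only a rough sketch of how class A might look as PdxSerializable, under the assumption that Matcher and Matchee would also be PDX-serializable; we have not measured this yet:
==========================================================
import java.util.Map;

import org.apache.geode.pdx.PdxReader;
import org.apache.geode.pdx.PdxSerializable;
import org.apache.geode.pdx.PdxWriter;

public class A implements PdxSerializable {

    private Map<String, Matcher> matcherMap;
    private Map<String, Matchee> matcheeMap;

    @Override
    public void toData(PdxWriter writer) {
        // Each map is written as a named PDX field.
        writer.writeObject("matcherMap", matcherMap);
        writer.writeObject("matcheeMap", matcheeMap);
    }

    @SuppressWarnings("unchecked")
    @Override
    public void fromData(PdxReader reader) {
        this.matcherMap = (Map<String, Matcher>) reader.readObject("matcherMap");
        this.matcheeMap = (Map<String, Matchee>) reader.readObject("matcheeMap");
    }
}
==========================================================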