Does having a large number of properties in an entity affect datastore read/write performance?
I have a couple of entities with 40-50 properties each. All of these properties are unindexed. The entities are part of a larger entity-group tree structure and are always retrieved by key. None of the properties (except the key) are indexed. I am using Objectify to work with entities on BigTable.
I want to know whether there is any performance impact in reading or writing an entity with a large number of properties from/to BigTable.
Since these large entities are only fetched by their keys and never participate in any query, I was wondering whether I should serialize the entity POJO and store it as a blob. This is straightforward to do in Objectify using the @Serialized annotation. I understand that by serializing my entity and storing it as a blob, I render the blob totally opaque to any other program or non-Java code, but this is not a concern.
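For context, Objectify's @Serialized annotation essentially runs the field through standard Java serialization and stores the resulting bytes as a single blob property. A minimal, self-contained sketch of that round-trip (the `Profile` class and its fields are illustrative, not from the question):

```java
import java.io.*;

public class BlobDemo {
    // Illustrative POJO standing in for an entity with many unindexed fields.
    // Java serialization requires Serializable, same as Objectify's @Serialized.
    static class Profile implements Serializable {
        private static final long serialVersionUID = 1L;
        String name;
        int score;
        Profile(String name, int score) { this.name = name; this.score = score; }
    }

    // Serialize an object to a byte[] blob, as the datastore would store it.
    static byte[] toBlob(Object o) {
        try {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
                oos.writeObject(o);
            }
            return bos.toByteArray();
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

    // Deserialize the blob back into the POJO.
    static Object fromBlob(byte[] blob) {
        try (ObjectInputStream ois =
                 new ObjectInputStream(new ByteArrayInputStream(blob))) {
            return ois.readObject();
        } catch (IOException | ClassNotFoundException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        Profile p = new Profile("alice", 42);
        Profile back = (Profile) fromBlob(toBlob(p));
        System.out.println(back.name + " " + back.score); // prints "alice 42"
    }
}
```

Note the trade-off mentioned above: the blob is opaque, so the individual fields are invisible to the datastore viewer and to any non-Java consumer.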
I have yet to benchmark the performance difference, but before doing so I want to know whether anybody has tried this before or has any advice or opinions to share.
There is always some overhead per property, and serializing won't help much, since it just moves the processing cost from one place to another.
I have entities with up to 25 properties, and I fetch them by key on almost every request. The performance difference is negligible for me, hardly ±1 ms. Performance problems normally occur in queries. The number of unindexed properties won't count for much, while indexed properties can significantly delay a put because the indexes have to be updated.
If you must, you can break the properties up into multiple entities if you are not going to need them all at once.
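The split suggested above can be sketched without any datastore dependency. Assuming (hypothetically) that the entity's fields divide into a small, frequently-read core and a large, rarely-read remainder, the two parts live under the same key and the big part is loaded only on demand; plain maps stand in for the two entity kinds here:

```java
import java.util.*;

public class SplitDemo {
    // Two stores sharing the same key, simulating two datastore kinds:
    // a small "core" record read on every request, and a bulky "details"
    // record fetched only when actually needed. Field names are illustrative.
    static Map<Long, String> core = new HashMap<>();
    static Map<Long, String> details = new HashMap<>();

    static void put(long id, String name, String description) {
        core.put(id, name);
        details.put(id, description);
    }

    // The hot path touches only the small record.
    static String getName(long id) { return core.get(id); }

    // The large record is a separate, explicit fetch.
    static String getDescription(long id) { return details.get(id); }

    public static void main(String[] args) {
        put(1L, "widget", "a very long, rarely needed description");
        System.out.println(getName(1L)); // prints "widget"
    }
}
```

In the real datastore the second kind would typically be a child entity (or an entity with the same numeric id), so both remain gettable by key.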
Going purely by what little I know of how it works, I'd say having a bunch of unindexed properties wouldn't be any different from having the whole thing serialized.