I am +1 to implementing this as a compressing serializer decorator.
Regarding the concern that it's not worth compressing small items and that a decision is needed:
- The serializer can do:
public function encode($data) {
  $data = $this->serializer->encode($data);
  // strlen(), not sizeof(): we need the byte length of the serialized string.
  if (strlen($data) > $this->threshold) {
    $data = '1' . gzcompress($data, $this->compression_level);
  }
  else {
    $data = '0' . $data;
  }
  return $data;
}

public function decode($data) {
  // The first byte is a flag telling us whether the payload is compressed.
  $compressed = substr($data, 0, 1);
  $data = substr($data, 1);
  if ($compressed === '1') {
    $data = gzuncompress($data);
  }
  return $this->serializer->decode($data);
}
and that's it.
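For illustration, a minimal sketch of how the decorator could be wired up. The class name CompressedSerializer, its constructor signature, and $inner_serializer are assumptions for this sketch, not part of the proposal above:

// Hypothetical wiring; the class and constructor are illustrative only.
$serializer = new CompressedSerializer(
  $inner_serializer,  // any object exposing encode()/decode()
  2048,               // threshold in bytes: only compress larger payloads
  6                   // compression level handed to gzcompress()
);

$blob = $serializer->encode($large_cache_item);  // '1' . gzcompress(...) above the threshold
$item = $serializer->decode($blob);              // transparently undoes the compression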
Of course, as a next step we could also abstract out a CompressorInterface, but that's not needed for a simple first iteration.
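If we did go that route later, it could look roughly like this. The method names and the GzipCompressor class are assumptions for illustration, not an agreed API:

// Hypothetical abstraction; names are illustrative only.
interface CompressorInterface {
  public function compress($data);
  public function uncompress($data);
}

class GzipCompressor implements CompressorInterface {
  protected $level;

  public function __construct($level = 6) {
    $this->level = $level;
  }

  public function compress($data) {
    return gzcompress($data, $this->level);
  }

  public function uncompress($data) {
    return gzuncompress($data);
  }
}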
----
Originally I got this idea when I ran into a shared hosting situation where max_allowed_packet was set to around 2MB and the serialized admin menu turned out to be just a bit larger than 2MB. The host refused to change this value, so I had to come up with a different solution: compress the data before storing it.
I did some tests on my local machine and it turns out that compression is also faster for almost every size of cache data: writing to the cache becomes faster somewhere around 100 to 500KB, and reading the cache is about the same speed for (uncompressed) sizes of 5KB and faster for larger sizes. This was a setup where the database is on the same machine as the webserver. I did not test a setup where the database is on a different server, but I expect the gain to be even bigger there.
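For anyone who wants to reproduce this, here is a rough sketch of the kind of micro-benchmark I mean (not my exact script; the payload sizes and iteration count are arbitrary). Note that this only measures the CPU cost of compression itself; the real win comes from the smaller database writes, which you would measure through the cache backend:

// Rough sketch: compress/uncompress round-trip cost at various payload sizes.
foreach ([5000, 100000, 500000, 2000000] as $size) {
  // Build a payload of roughly $size bytes (an md5 hash is 32 chars long).
  $payload = str_repeat(md5((string) $size), (int) ($size / 32));

  $start = microtime(TRUE);
  for ($i = 0; $i < 100; $i++) {
    $blob = gzcompress($payload, 6);
    gzuncompress($blob);
  }
  $elapsed = microtime(TRUE) - $start;

  printf("~%dKB: %d -> %d bytes, %.4fs for 100 round trips\n",
    $size / 1000, strlen($payload), strlen($blob), $elapsed);
}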
Advantages:
- Performance.
- Less database storage.
- Less network traffic.
- Less chance of running into the dreaded "MySQL Max Allowed Packet Size Exceeded" message.
Disadvantages:
- Slightly more complex cache internals.
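As a side note on the max_allowed_packet point: the server limit can be read at runtime, so a site could sanity-check its threshold against it. A sketch using plain PDO with made-up credentials; in Drupal you would go through the database API instead:

// Sketch: read MySQL's max_allowed_packet and sanity-check the threshold.
$pdo = new PDO('mysql:host=localhost;dbname=drupal', 'user', 'password');
$row = $pdo->query("SHOW VARIABLES LIKE 'max_allowed_packet'")->fetch(PDO::FETCH_ASSOC);
$max_packet = (int) $row['Value'];

$threshold = 2048;  // hypothetical compression threshold in bytes
// Items below the threshold are stored uncompressed, so the threshold
// should stay well under the packet limit.
if ($threshold > $max_packet) {
  trigger_error('Compression threshold exceeds max_allowed_packet.', E_USER_WARNING);
}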