Data compression is the reduction of the number of bits needed to store or transmit information. Compressed data requires much less disk space than the original, so more content can be kept in the same amount of space. Different compression algorithms work in different ways: some remove only redundant bits, so when the data is uncompressed there is no loss of quality (lossless compression), while others discard less important bits, so uncompressing the data afterwards yields lower quality than the original (lossy compression). Compressing and uncompressing content consumes a significant amount of system resources, particularly CPU time, so any hosting platform that employs compression in real time must have enough processing power to support the feature. A simple example of how data can be compressed is replacing a binary sequence such as 111111 with 6x1, that is, recording how many consecutive 1s or 0s there are instead of storing the sequence itself.
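The "6x1" idea described above is known as run-length encoding. A minimal sketch in Python (purely illustrative; the function names are ours and this is not the algorithm any particular file system uses) shows why it is lossless: the runs contain everything needed to rebuild the original exactly.

```python
# Run-length encoding sketch: store each run of repeated bits as a
# (count, bit) pair instead of the raw sequence.

def rle_encode(bits: str) -> list[tuple[int, str]]:
    """Compress a bit string like '111111' into runs like [(6, '1')]."""
    runs = []
    for bit in bits:
        if runs and runs[-1][1] == bit:
            # Extend the current run of identical bits.
            runs[-1] = (runs[-1][0] + 1, bit)
        else:
            # Start a new run.
            runs.append((1, bit))
    return runs

def rle_decode(runs: list[tuple[int, str]]) -> str:
    """Expand runs back into the original bit string (lossless)."""
    return "".join(bit * count for count, bit in runs)

print(rle_encode("111111"))    # [(6, '1')]
print(rle_decode([(6, '1')]))  # 111111
```

Real algorithms such as LZ4 are far more sophisticated, but the principle is the same: describe the data more compactly and reconstruct it on demand.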
Data Compression in Shared Web Hosting
The compression algorithm we use on the cloud hosting platform where your new shared web hosting account will be created is called LZ4, and it is built into the advanced ZFS file system that powers the platform. LZ4 outperforms the algorithms used by other file systems: it achieves a higher compression ratio and processes data considerably faster. The speed advantage is most noticeable during decompression, which happens faster than the data could be read from a hard drive, so LZ4 actually improves the performance of every website hosted on a server that uses it. We take advantage of LZ4 in one more way: its speed and compression ratio make it possible for us to generate multiple daily backups of the entire content of all accounts and keep them for thirty days. Not only do these backups take up less space, but generating them does not slow the servers down, as can often happen with other file systems.
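For readers curious how this looks in practice, the commands below sketch how LZ4 compression is typically enabled and inspected on a ZFS dataset. The dataset name `tank/www` is hypothetical, and this is a general ZFS illustration, not a description of our platform's actual configuration.

```shell
# Enable LZ4 compression on a dataset; blocks written from now on
# are compressed transparently.
zfs set compression=lz4 tank/www

# Inspect the compression ratio achieved so far on that dataset.
zfs get compressratio tank/www

# Snapshots on ZFS are near-instant and space-efficient, which is
# what makes frequent full-content backups practical.
zfs snapshot tank/www@daily-backup
```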