Data compression is the process of reducing the number of bits needed to store or transmit information. Compressed data requires considerably less disk space than the original, so more content can be stored in the same amount of space. There are various compression algorithms that work in different ways: with many of them only redundant bits are removed, so once the data is uncompressed there is no loss of quality, while others discard bits deemed unnecessary, and uncompressing such data later results in lower quality compared with the original. Compressing and uncompressing content requires a significant amount of system resources, particularly CPU time, so any web hosting platform that employs real-time compression should have enough processing power to support the feature. One example of how data can be compressed is to substitute a binary sequence such as 111111 with 6x1, i.e. to "remember" how many consecutive 1s or 0s there are instead of storing the entire sequence.
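
That last example is a simple form of run-length encoding. As a rough illustration of the idea, here is a minimal sketch in Python (the function names are hypothetical, written only for this example):

    from itertools import groupby

    def rle_encode(bits: str) -> list[tuple[int, str]]:
        # Collapse each run of repeated characters into a (count, character) pair.
        return [(len(list(group)), char) for char, group in groupby(bits)]

    def rle_decode(pairs: list[tuple[int, str]]) -> str:
        # Expand the (count, character) pairs back into the original string.
        return "".join(char * count for count, char in pairs)

    data = "111111000011"
    encoded = rle_encode(data)          # [(6, '1'), (4, '0'), (2, '1')]
    assert rle_decode(encoded) == data  # the input is fully recovered

Because the pairs contain everything needed to rebuild the input, this is a lossless scheme: no quality is lost when the data is uncompressed.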

Data Compression in Shared Web Hosting

The ZFS file system that runs on our cloud web hosting platform uses a compression algorithm called LZ4. It is considerably faster than most widely used algorithms, particularly for compressing and uncompressing non-binary data, i.e. web content. LZ4 can even uncompress data faster than it can be read from a hard disk, which improves the overall performance of websites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so quickly, we are able to generate several backups of all the content stored in the shared web hosting accounts on our servers every day. Both your content and its backups take up less space, and since both ZFS and LZ4 work very fast, generating the backups does not affect the performance of the servers where your content is stored.
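
On ZFS itself the compression is transparent, but the behaviour of LZ4 is easy to try out directly. Below is a minimal sketch using the third-party python-lz4 package, which is an assumption made for illustration (installable with pip install lz4) and not something our platform requires you to use:

    import lz4.frame  # third-party package: pip install lz4

    # A repetitive payload, typical of HTML/CSS web content.
    page = b"<div class='item'>placeholder</div>\n" * 500

    compressed = lz4.frame.compress(page)
    restored = lz4.frame.decompress(compressed)

    assert restored == page  # LZ4 is lossless: the original bytes come back exactly
    print(f"original:   {len(page)} bytes")
    print(f"compressed: {len(compressed)} bytes")

Highly repetitive text like this compresses extremely well, which is one reason LZ4 suits web content. On a ZFS dataset, the same algorithm is enabled with the file system property compression=lz4, so files are compressed and uncompressed transparently as they are written and read.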