Data compression is the encoding of information using fewer bits than the original representation. The compressed data takes up less disk space than the original, so much more content can be stored in the same amount of space. There are many compression algorithms that work in different ways. With some of them, only redundant bits are removed, so once the data is uncompressed there is no loss of quality (lossless compression). Others discard bits deemed unnecessary, and uncompressing the data afterwards results in lower quality than the original (lossy compression). Compressing and uncompressing content takes a significant amount of system resources, in particular CPU time, so any hosting platform that employs real-time compression must have enough processing power to support that feature. A simple example of how information can be compressed is to replace a binary sequence such as 111111 with 6x1, i.e. "remembering" how many consecutive 1s or 0s there are instead of storing the whole sequence.
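The "6x1" idea described above is known as run-length encoding, and it can be sketched in a few lines of Python. This is a toy illustration of the principle, not a production codec; the function names are my own:

```python
def rle_encode(bits: str) -> str:
    """Compress a bit string by storing run lengths, e.g. '111111' -> '6x1'."""
    if not bits:
        return ""
    runs = []
    current, count = bits[0], 1
    for b in bits[1:]:
        if b == current:
            count += 1
        else:
            runs.append(f"{count}x{current}")
            current, count = b, 1
    runs.append(f"{count}x{current}")
    return ",".join(runs)


def rle_decode(encoded: str) -> str:
    """Reverse the encoding: '6x1,3x0' -> '111111000'."""
    if not encoded:
        return ""
    return "".join(int(n) * ch for n, ch in (run.split("x") for run in encoded.split(",")))
```

Note that run-length encoding is lossless: decoding always reproduces the exact original input, which is why no quality is lost with this family of algorithms.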
Data Compression in Cloud Website Hosting
The ZFS file system which runs on our cloud Internet hosting platform uses a compression algorithm named LZ4. It is considerably faster than most comparable algorithms, particularly for compressing and uncompressing non-binary data, i.e. web content. LZ4 can even uncompress data faster than it can be read from a hard drive, which improves the performance of sites hosted on ZFS-based platforms. Because the algorithm compresses data effectively and does so very quickly, we are able to generate several backup copies of all the content stored in the cloud website hosting accounts on our servers every day. Both your content and its backups take up less space, and since both ZFS and LZ4 work very fast, the backup generation does not affect the performance of the servers where your content is kept.
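For readers curious how this works at the file-system level, LZ4 compression is enabled per dataset with standard ZFS administration commands. This is a generic sketch, not our platform's actual configuration, and the pool/dataset name tank/web is hypothetical:

```shell
# Enable LZ4 compression on a (hypothetical) dataset named tank/web
zfs set compression=lz4 tank/web

# Verify which compression algorithm is active
zfs get compression tank/web

# After data has been written, check the achieved compression ratio
zfs get compressratio tank/web
```

Because compression is a dataset property, ZFS applies it transparently to every write, and the compressratio property reports how much space is actually being saved.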