Data compression is the reduction of the number of bits used to store or transmit a given piece of information. Compressed data needs considerably less disk space than the original, so more content can be stored in the same amount of space. There are various compression algorithms that work in different ways. With some of them, only redundant bits are removed, so when the data is uncompressed there is no loss of quality; these are called lossless algorithms. Lossy algorithms discard additional bits, so uncompressing the data later results in lower quality than the original. Compressing and uncompressing content takes a considerable amount of system resources, especially CPU time, so any web hosting platform that employs real-time compression must have ample processing power to support the feature. A simple example of how data can be compressed is to replace a binary sequence such as 111111 with 6x1, i.e. to record how many consecutive 1s or 0s there are instead of saving the entire sequence; this technique is known as run-length encoding.
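The run-length idea described above can be sketched in a few lines of Python. The "6x1" notation follows the example in the text; this is an illustration of the principle, not the scheme any particular file system actually uses:

```python
def rle_encode(bits: str) -> str:
    # Collapse each run of identical characters into "<count>x<char>",
    # e.g. "111111" -> "6x1" and "110001111" -> "2x1,3x0,4x1".
    out = []
    i = 0
    while i < len(bits):
        j = i
        while j < len(bits) and bits[j] == bits[i]:
            j += 1
        out.append(f"{j - i}x{bits[i]}")
        i = j
    return ",".join(out)


def rle_decode(encoded: str) -> str:
    # Reverse the transformation: "6x1" -> "111111".
    return "".join(int(count) * char
                   for count, char in (part.split("x")
                                       for part in encoded.split(",")))


print(rle_encode("111111"))  # -> 6x1
```

Because decoding restores every bit exactly, this kind of scheme is lossless: the round trip `rle_decode(rle_encode(data))` always returns the original data.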
Data Compression in Shared Web Hosting
Our cloud web hosting platform runs on the ZFS file system, which uses a compression algorithm called LZ4. LZ4 can improve the performance of any website hosted in a shared web hosting account with us: not only does it compress data more effectively than the algorithms used by other file systems, but it also uncompresses that data faster than a hard disk drive can read it. This comes at the cost of a great deal of CPU time, which is not a problem for our platform, since it uses clusters of powerful servers working together. A further advantage of LZ4 is that it allows us to generate backups faster and store them in less disk space, so we can keep a couple of daily backups of your files and databases without their creation affecting server performance. This way, we can always restore any content that you may have deleted by accident.
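On ZFS, compression is a per-dataset property that an administrator enables with the standard `zfs` command-line tools; the sketch below shows how LZ4 is typically turned on and checked. The pool and dataset name `tank/web` is a placeholder for illustration, not a description of our actual configuration:

```shell
# Enable LZ4 compression on a dataset; from this point on, new writes
# are compressed transparently (placeholder dataset name "tank/web").
zfs set compression=lz4 tank/web

# Verify the setting and inspect the achieved compression ratio.
zfs get compression,compressratio tank/web
```

Because the property applies transparently at the file-system level, applications and websites need no changes to benefit from it.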