The term data compression refers to reducing the number of bits needed to store or transmit information. Compression can be either lossless or lossy: lossless compression removes only redundant data, so when the data is decompressed the information and its quality are exactly the same, while lossy compression also discards data deemed unneeded, so some quality is lost permanently. Different compression algorithms are more effective for different types of data. Compressing and decompressing data usually takes considerable processing time, so the server performing the operation must have adequate resources to process the data quickly enough. A simple example of compression is run-length encoding: instead of storing each individual 1 and 0 of a binary sequence, you store how many consecutive positions hold a 1 and how many hold a 0.
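The run-length idea described above can be sketched in a few lines of Python. This is a minimal illustration, not a production codec; the function names are made up for the example:

```python
def rle_encode(bits):
    """Run-length encode a bit string: store (bit, run length) pairs
    instead of the individual 1s and 0s."""
    runs = []
    prev, count = bits[0], 1
    for b in bits[1:]:
        if b == prev:
            count += 1
        else:
            runs.append((prev, count))
            prev, count = b, 1
    runs.append((prev, count))
    return runs

def rle_decode(runs):
    """Reverse the encoding: expand each (bit, count) pair back out."""
    return "".join(bit * count for bit, count in runs)

data = "0000000011111100"
encoded = rle_encode(data)
print(encoded)  # [('0', 8), ('1', 6), ('0', 2)]
assert rle_decode(encoded) == data  # lossless: the original is restored exactly
```

Because decoding restores the input bit for bit, this is a lossless scheme; it pays off only when the data actually contains long runs.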
Data Compression in Web Hosting
The ZFS file system that runs on our cloud hosting platform uses a compression algorithm called LZ4. LZ4 is one of the fastest compression algorithms available, and it is particularly effective on non-binary data such as web content. LZ4 can even decompress data faster than it can be read from a hard disk, which improves the overall performance of websites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so very quickly, we are able to generate several daily backups of all the content kept in the web hosting accounts on our servers. Both your content and its backups take up less space, and since both ZFS and LZ4 work very fast, generating the backups does not affect the performance of the web hosting servers where your content is kept.
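The lossless round trip that ZFS performs transparently can be demonstrated in Python. LZ4 itself requires a third-party package, so this sketch uses zlib from the standard library as a stand-in; the principle, that repetitive text-like web content shrinks substantially and decompresses back to an identical copy, is the same:

```python
import zlib

# Hypothetical sample of repetitive, text-like web content.
html = b"<html><body>" + b"<p>Hello, world!</p>" * 200 + b"</body></html>"

compressed = zlib.compress(html)
restored = zlib.decompress(compressed)

# Lossless: decompression restores the data byte for byte.
assert restored == html

# Repetitive markup compresses to a small fraction of its size.
print(len(html), "->", len(compressed))
```

On a real ZFS dataset this happens at the block level with no application changes; the file system compresses on write and decompresses on read.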