The term data compression means reducing the number of bits needed to store or transmit information. It can be done with or without loss of information: lossless compression removes only redundant data, while lossy compression also discards data considered less important. When the data is later uncompressed, in the first case the content and its quality are identical to the original, while in the second case the quality is lower. Different compression algorithms are more effective for different kinds of data. Compressing and uncompressing data takes considerable processing time, so the server performing the operation needs sufficient resources to handle it quickly. One simple example of how information can be compressed is run-length encoding: instead of storing every individual 1 and 0 of a binary sequence, you store how many consecutive positions contain a 1 and how many contain a 0.
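To make that last idea concrete, here is a minimal Python sketch of run-length encoding; the function names are illustrative only and do not belong to any particular product or library:

```python
def run_length_encode(bits: str) -> list[tuple[str, int]]:
    """Collapse runs of identical bits into (bit, count) pairs."""
    runs: list[tuple[str, int]] = []
    for bit in bits:
        if runs and runs[-1][0] == bit:
            runs[-1] = (bit, runs[-1][1] + 1)  # extend the current run
        else:
            runs.append((bit, 1))              # start a new run
    return runs

def run_length_decode(runs: list[tuple[str, int]]) -> str:
    """Expand (bit, count) pairs back into the original bit string."""
    return "".join(bit * count for bit, count in runs)

bits = "1111100000000111"
runs = run_length_encode(bits)
print(runs)  # [('1', 5), ('0', 8), ('1', 3)]
assert run_length_decode(runs) == bits  # lossless: nothing is lost in the round trip
```

Because only run lengths are stored, long stretches of identical bits shrink dramatically, and decoding reproduces the input exactly, which is what makes this a lossless scheme.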
Data Compression in Shared Hosting
The compression algorithm used by the ZFS file system that runs on our cloud hosting platform is called LZ4. It can boost the performance of any website hosted in a shared hosting account on our end: not only does it compress data more efficiently than the algorithms used by other file systems, but it also uncompresses data faster than a hard drive can read it. This comes at the cost of extra CPU time, which is not a problem for our platform because it runs on clusters of powerful servers working together. Another advantage of LZ4 is that it lets us generate backups faster and store them on less disk space, so we can keep multiple daily backups of your files and databases without affecting server performance. That way, we can always restore any content that you may have deleted by mistake.
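For a feel of what an LZ4 round trip looks like, here is a small Python sketch using the third-party lz4 package (installed with pip install lz4). This is an assumption for illustration: on our platform, LZ4 is applied transparently at the file-system level by ZFS, not through this library:

```python
# Requires the third-party "lz4" package: pip install lz4
import lz4.frame

# Repetitive data compresses very well; real website files vary.
original = b"example data " * 1000

compressed = lz4.frame.compress(original)
restored = lz4.frame.decompress(compressed)

assert restored == original  # lossless: identical content after the round trip
print(f"{len(original)} bytes -> {len(compressed)} bytes")
```

The assertion shows the lossless property described above: the uncompressed data matches the original byte for byte, while the stored copy can be considerably smaller.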