The term data compression refers to reducing the number of bits needed to store or transmit information. Compression can be lossless or lossy. Lossless compression removes only redundant data, so when the data is decompressed later on it is restored exactly as it was, while lossy compression also discards less important data, so the restored copy is of lower quality. Different compression algorithms suit different kinds of data. Compressing and decompressing data normally takes a lot of processing time, so the server carrying out the task needs sufficient resources to process the data quickly enough. One simple example of compression is run-length encoding of binary data: instead of storing the actual 1s and 0s, store how many consecutive positions hold a 1 and how many hold a 0.
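The run-length idea described above can be sketched in a few lines of Python. This is a minimal illustration for readability, not a production codec; the function names are just for this example.

```python
# Run-length encoding (RLE) sketch: instead of storing a long run of
# identical bits, store the bit value and the length of the run.

def rle_encode(bits: str) -> list[tuple[str, int]]:
    """Collapse consecutive identical bits into (bit, count) pairs."""
    runs: list[tuple[str, int]] = []
    for bit in bits:
        if runs and runs[-1][0] == bit:
            runs[-1] = (bit, runs[-1][1] + 1)
        else:
            runs.append((bit, 1))
    return runs

def rle_decode(runs: list[tuple[str, int]]) -> str:
    """Expand (bit, count) pairs back into the original bit string."""
    return "".join(bit * count for bit, count in runs)

data = "0000000011111111110000"
encoded = rle_encode(data)
print(encoded)  # [('0', 8), ('1', 10), ('0', 4)]

# Lossless: decoding restores the original exactly.
assert rle_decode(encoded) == data
```

Storing three (bit, count) pairs instead of 22 individual bits is where the saving comes from; the longer the runs, the better the ratio.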
Data Compression in Cloud Hosting
The ZFS file system that runs on our cloud Internet hosting platform uses a compression algorithm called LZ4. LZ4 is considerably faster than most alternatives, particularly at compressing and decompressing non-binary data such as web content. It can even decompress data faster than the data can be read from a hard disk drive, which improves the performance of websites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so quickly, we are able to generate several backup copies of all the content stored in the cloud hosting accounts on our servers every day. Both your content and its backups take up less space, and since ZFS and LZ4 both work very fast, generating the backups does not affect the performance of the servers where your content is kept.
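The space saving described above is easy to see on repetitive data such as HTML. Python's standard library has no LZ4 bindings, so the sketch below uses zlib purely as a stand-in lossless codec; the principle (smaller stored size, exact restoration) is the same one LZ4 provides on ZFS.

```python
import zlib

# Lossless compression of repetitive web content. zlib stands in
# here for LZ4, which is not available in the standard library.
html = ("<div class='item'><span>entry</span></div>\n" * 200).encode()

compressed = zlib.compress(html)
print(len(html), "->", len(compressed), "bytes")

# Lossless: decompressing restores the original byte for byte.
assert zlib.decompress(compressed) == html
```

Markup like this compresses very well because the same tags repeat over and over, which is why text-heavy website content benefits so much from filesystem-level compression.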
Data Compression in Semi-dedicated Servers
The ZFS file system that runs on the cloud platform where your semi-dedicated server account will be created uses a powerful compression algorithm called LZ4. It is among the most efficient algorithms for compressing and decompressing web content: its compression ratio is high, and it can decompress data faster than the same data could be read from a hard disk drive uncompressed. Using LZ4 therefore speeds up any website that runs on a platform where the algorithm is enabled. This performance requires a great deal of CPU processing time, which is provided by the large number of clusters working together as part of our platform. In addition, LZ4 makes it possible for us to generate several backups of your content every day and keep them for one month, as they take up less space than regular backups and are created much more quickly without loading the servers.