Data compression reduces the number of bits that need to be stored or transmitted, and it is quite important in the web hosting field, since data recorded on hard drives is usually compressed so that it takes up less space. There are various algorithms for compressing information, and their efficiency depends on the content. Some of them remove only redundant bits, so no data is lost (lossless compression), while others discard bits deemed unneeded, which leads to lower quality when the data is uncompressed (lossy compression). Compression uses a great deal of processing time, so a web hosting server has to be powerful enough to compress and uncompress data on the fly. One illustration of how binary code may be compressed is to "remember" that there are five consecutive 1s, for example, instead of storing all five 1s.

Data Compression in Shared Hosting

The ZFS file system that runs on our cloud Internet hosting platform uses a compression algorithm named LZ4. It is considerably faster than most other algorithms, particularly when compressing and uncompressing non-binary data, i.e. web content. LZ4 even uncompresses data quicker than it can be read from a hard drive, which improves the performance of sites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so quickly, we are able to generate several backups of all the content kept in the shared hosting accounts on our servers every day. Both your content and its backups take up less space, and since both ZFS and LZ4 work very quickly, the backup generation does not affect the performance of the servers where your content is stored.
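The two properties described above — web content shrinks substantially, and decompression restores it exactly — can be demonstrated with Python's standard zlib module. LZ4 itself requires a third-party package, so zlib stands in here purely to illustrate the principle, not the specific algorithm the platform uses:

```python
# Lossless compression of text-heavy web content, sketched with zlib
# (a stand-in for LZ4, which is not in Python's standard library).
import zlib

html = b"<li>item</li>" * 1000            # highly repetitive web content
compressed = zlib.compress(html)

print(len(html), len(compressed))          # the compressed copy is far smaller
assert zlib.decompress(compressed) == html  # lossless: restored bit-for-bit
```

Markup-heavy files like HTML and CSS are full of repetition, which is why they compress so well compared with already-compressed binaries such as images.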

Data Compression in Semi-dedicated Hosting

If you host your Internet sites in a semi-dedicated hosting account from our company, you can take advantage of LZ4, the compression algorithm employed by the ZFS file system behind our advanced cloud hosting platform. What distinguishes LZ4 from most other algorithms is that it combines a high compression ratio with very high speed, particularly when uncompressing website content. It does this even quicker than uncompressed information can be read from a hard drive, so your websites will perform faster. The higher speed comes at the expense of a great deal of CPU processing time, which is not an issue for our platform, because it consists of multiple clusters working together. Along with the better performance, you will also have multiple daily backups at your disposal, so you can recover any deleted content with just a couple of clicks. The backups are available for an entire month, and we can afford to store them because they take up considerably less space than standard backups.
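The trade-off mentioned above, spending CPU time to gain a smaller result, can be sketched with zlib's compression levels (again a stand-in, since LZ4 is not in Python's standard library). Level 1 uses little CPU effort and compresses less thoroughly; level 9 spends more effort for a smaller output. Exact sizes vary with the input, so only the ordering matters here:

```python
# Speed/ratio trade-off illustrated via zlib compression levels.
import zlib

data = (b"backup of site content " * 500) + bytes(range(256)) * 20
fast = zlib.compress(data, level=1)       # minimal CPU effort
thorough = zlib.compress(data, level=9)   # maximal CPU effort

print(len(data), len(fast), len(thorough))
# More CPU effort yields output no larger than the fast pass:
assert len(thorough) <= len(fast) <= len(data)
```

An algorithm like LZ4 sits at the fast end of this spectrum by design, which is why decompression can outpace a hard drive read.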