Data compression is the process of reducing the number of bits needed to store or transmit data. Compressed information requires substantially less disk space than the original, so additional content can be stored in the same amount of space. There are many compression algorithms that work in different ways. With a lot of them, only redundant bits are removed, so once the data is decompressed, there is no loss of quality; this is known as lossless compression. Others discard bits considered unneeded, so decompressing the data later yields reduced quality compared with the original; this is lossy compression. Compressing and decompressing content requires a significant amount of system resources, in particular CPU processing time, so any hosting platform that uses compression in real time must have adequate power to support that feature. A simple example of how information can be compressed is to replace a binary sequence such as 111111 with 6x1, i.e. "remembering" how many consecutive 1s or 0s there are instead of storing the whole sequence.
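The 6x1 idea described above is known as run-length encoding. A minimal sketch in Python (a toy illustration of the concept, not a production codec):

```python
def rle_encode(bits: str) -> list[tuple[int, str]]:
    """Collapse runs of identical characters into (count, char) pairs."""
    runs = []
    for ch in bits:
        if runs and runs[-1][1] == ch:
            # Same character as the previous one: extend the current run.
            runs[-1] = (runs[-1][0] + 1, ch)
        else:
            # New character: start a fresh run of length 1.
            runs.append((1, ch))
    return runs

def rle_decode(runs: list[tuple[int, str]]) -> str:
    """Expand (count, char) pairs back into the original string."""
    return "".join(ch * count for count, ch in runs)

# "111111" is stored as a single run: six 1s, just like "6x1" in the text.
assert rle_encode("111111") == [(6, "1")]
# Decoding reverses encoding exactly, so no information is lost.
assert rle_decode(rle_encode("1110001")) == "1110001"
```

Because decoding restores the input bit for bit, this is a lossless scheme: it pays off on long runs of repeated symbols and saves nothing on data with no repetition.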
Data Compression in Shared Hosting
The ZFS file system which runs on our cloud web hosting platform employs a compression algorithm named LZ4. It is considerably faster than most alternatives, especially when compressing and decompressing non-binary data, i.e. web content. LZ4 even decompresses data faster than it can be read from a hard drive, which improves the performance of websites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so quickly, we are able to generate several backups of all the content stored in the shared hosting accounts on our servers every day. Both your content and its backups take up less space, and since both ZFS and LZ4 work very quickly, the backup generation will not affect the performance of the servers where your content is stored.
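The key property described above, that a lossless compressor shrinks repetitive web content yet restores it bit for bit, can be demonstrated with a short sketch. LZ4 itself requires a third-party package in Python, so this example uses zlib from the standard library as a stand-in lossless compressor; the platform's actual compression is ZFS's built-in LZ4:

```python
import zlib

# Repetitive, web-like content: HTML markup compresses very well because
# tags and attribute names repeat constantly.
html = b"<p>Hello, world!</p>" * 500

compressed = zlib.compress(html)
restored = zlib.decompress(compressed)

# Lossless: the decompressed data is identical to the original.
assert restored == html
# The compressed form is much smaller than the input on repetitive text.
assert len(compressed) < len(html)
```

The same trade-off applies regardless of the algorithm: CPU time is spent on every compress/decompress cycle in exchange for less disk space and less I/O, which is why a platform doing this in real time needs ample processing power.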
Data Compression in Semi-dedicated Hosting
The semi-dedicated hosting plans which we offer are built on a powerful cloud platform that runs on the ZFS file system. ZFS works with a compression algorithm known as LZ4, which outperforms other widely used algorithms in both speed and compression ratio when processing web content. This is especially true for decompression: LZ4 decompresses data faster than uncompressed data can be read from a hard disk, so sites running on a platform where LZ4 is present will load faster. Although real-time compression requires a great deal of CPU processing time, we can take advantage of the feature because our platform uses a large number of powerful servers working together, and we never create accounts on just a single machine the way many companies do. Using LZ4 brings a further benefit: since it compresses data very well and does so very quickly, we can also generate several daily backups of all accounts without affecting server performance, and keep them for 30 days. This way, you will always be able to recover any content that you delete by accident.