Google Shares Zopfli
Google encourages its engineers to work on personal projects as part of their “20 percent time”, and occasionally some of that work is made public and open sourced for third-party developers. That is why the firm has open sourced its Zopfli compression algorithm, claiming it produces files three to eight percent smaller than zlib's output.
Google’s Zopfli algorithm is based on the Deflate algorithm but has been optimised to produce smaller files at the expense of compression speed. The firm said the compression library, written in C, is based on iterative entropy modelling and a shortest path algorithm, adding that its output is bit-stream compatible with Deflate, meaning it can be decompressed by existing decoders and used with gzip, ZIP and, most importantly, HTTP requests.
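Zopfli itself ships as a C library and is not part of Python's standard library, but what bit-stream compatibility means can be sketched with stdlib zlib: a Deflate stream decompresses the same way no matter which encoder produced it. The `wbits` values below are standard zlib parameters for raw Deflate and gzip framing, not anything Zopfli-specific.

```python
import zlib

data = b"the quick brown fox jumps over the lazy dog " * 100

# A raw Deflate stream (wbits=-15) - the format a Zopfli-style encoder
# would emit for ZIP entries.
raw = zlib.compressobj(9, zlib.DEFLATED, -15)
deflate_stream = raw.compress(data) + raw.flush()

# Any compliant decoder recovers the original bytes; a Zopfli-produced
# stream would be decoded by this exact call, just a few percent smaller.
assert zlib.decompress(deflate_stream, -15) == data

# The same payload with a gzip wrapper (wbits=31) works with gzip tools
# and HTTP's "Content-Encoding: gzip".
gz = zlib.compressobj(9, zlib.DEFLATED, 31)
gzip_stream = gz.compress(data) + gz.flush()
assert zlib.decompress(gzip_stream, 31) == data
```

Because the decoder side is unchanged, a server can swap in a stronger encoder and every existing browser keeps working.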
Lode Vandevenne, the software engineer on Google’s Compression Team who implemented Zopfli, said: “Due to the amount of CPU time required – two to three orders of magnitude more than zlib at maximum quality – Zopfli is best suited for applications where data is compressed once and sent over a network many times.”
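The trade-off Vandevenne describes can be illustrated with zlib's own effort dial – Zopfli simply pushes much further along the same curve. This sketch uses stdlib zlib levels 1 and 9 as a stand-in, since Zopfli is not available from the standard library: more CPU at compression time buys a smaller stream, while the decompressor's work is unchanged.

```python
import zlib

# Repetitive payload, standing in for a web asset served many times.
data = b"<li class=\"item\">Example entry</li>\n" * 2000

fast = zlib.compress(data, 1)   # low effort, quick
best = zlib.compress(data, 9)   # high effort, slower but smaller

print(f"level 1: {len(fast)} bytes")
print(f"level 9: {len(best)} bytes")

# Both streams decompress with the same call - the format is identical,
# so spending extra CPU once costs nothing on the receiving end.
assert zlib.decompress(fast) == data
assert zlib.decompress(best) == data
assert len(best) <= len(fast)
```

For an asset compressed once and downloaded millions of times, the one-off CPU bill is amortised across every transfer saved.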
Ultimately Vandevenne’s algorithm might be costly in CPU cycles during compression – he claims there is no performance hit in decompression – but CPU cycles are significantly cheaper than network bandwidth. Companies such as Opera have worked hard on web compression to speed up webpage rendering in markets where 3G connectivity is patchy or non-existent.