
Google will use Zopfli as its new data compression algorithm to speed up web downloads

Publication date: 04.03.2013

In his spare time, one of Google's engineers created a new data compression algorithm. Interesting how these people spend their free time.

Here is the news:

Google publishes Zopfli as open-source compression algorithm to speed up Web downloads

Compression is about 100 times slower than conventional methods but compresses about 5% better, Google said

Google is open-sourcing a new general purpose data compression library called Zopfli that can be used to speed up Web downloads.

The Zopfli Compression Algorithm, which got its name from a Swiss bread recipe, is an implementation of the Deflate compression algorithm that creates a smaller output size compared to previous techniques, wrote Lode Vandevenne, a software engineer with Google's Compression Team, on the Google Open Source Blog on Thursday.

"The smaller compressed size allows for better space utilization, faster data transmission, and lower Web page load latencies. Furthermore, the smaller compressed size has additional benefits in mobile use, such as lower data transfer fees and reduced battery use," Vandevenne wrote.

The more exhaustive compression techniques used achieve higher data density but also make the compression a lot slower. This does not affect the decompression speed though, Vandevenne wrote.

Zopfli is a compression-only library and existing software can be used to decompress the data, he said. Zopfli is compatible with Zip, PNG, gzip and HTTP requests among others, Vandevenne added.
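That compatibility claim is the key practical point: a Zopfli stream is ordinary Deflate, so the standard decompressors already deployed everywhere can read it. A minimal sketch of that round trip, using Python's stdlib `zlib` as a stand-in compressor (Zopfli itself is a separate library and is not used here):

```python
import zlib

data = b"static web content " * 500  # sample payload

# Compress with standard zlib at its maximum level. Per the article, Zopfli
# emits a valid Deflate stream too, so the same zlib.decompress() call
# would restore its output just as well -- no new decompressor needed.
compressed = zlib.compress(data, 9)
restored = zlib.decompress(compressed)

assert restored == data
print(f"{len(data)} bytes -> {len(compressed)} bytes")
```

The same property is what lets Zopfli plug into gzip, Zip, PNG and HTTP content encoding: only the encoder changes, every existing decoder keeps working.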

Zopfli's output is generally 3% to 8% smaller compared to zlib, another compression library based on the Deflate compression algorithm, according to Vandevenne. "We believe that Zopfli represents the state of the art in Deflate-compatible compression," he said.

"This compressor takes more time (~100x slower), but compresses around 5% better than zlib and better than any other zlib-compatible compressor we have found," Google said on Zopfli's Google Code page. The code is available under Apache License 2.0.

The new compression library however requires 2 to 3 orders of magnitude more CPU time than zlib at maximum quality. Therefore, it is best suited for applications where data is compressed once and sent over the network many times, such as static content for the Web, Vandevenne said.
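A back-of-the-envelope illustration of that trade-off. All concrete numbers below (asset size, zlib output size, download count) are invented for illustration; only the ~5% size reduction comes from the article:

```python
# Illustrative numbers only: a 100 KB static asset and an assumed zlib
# output size. The ~100x extra CPU time is paid once at compression time,
# while the ~5% byte saving is collected on every single transfer.
original_bytes = 100_000
zlib_out = 30_000                    # assumed zlib-compressed size
zopfli_out = int(zlib_out * 0.95)    # ~5% smaller, per the article

saved_per_download = zlib_out - zopfli_out   # bytes saved on each transfer
downloads = 1_000_000                        # static content served many times
total_saved = saved_per_download * downloads

print(f"{saved_per_download} bytes saved per download, "
      f"{total_saved / 1e9:.1f} GB over {downloads:,} downloads")
```

The one-time CPU cost is fixed, while the bandwidth saving grows linearly with the number of downloads, which is why the authors target static, frequently served content rather than on-the-fly compression.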

Vandevenne and his colleague Jyrki Alakuijala, a Google software engineer who also worked on the project, recommend in their research paper to use Zopfli "for compression of static content and other content where data transfer or storage costs are more significant than the increase in CPU time."

"By open sourcing Zopfli, thus allowing webmasters to better optimize the size of frequently accessed static content, we hope to make the Internet a bit faster for all of us," Vandevenne said.

One thing is unclear, though: the text says it compresses about a hundred times slower. Slower is bad, isn't it... Or does it not matter here?


From the editors: if you have something to share with colleagues in the industry, we invite you to contribute.
