HTTP Compression Speeds up the Web
A browser that is capable of receiving compressed content indicates this in all of its requests for documents by supplying an Accept-Encoding request header field when it asks for something.
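The negotiation is simple enough to sketch in a few lines. This is an illustrative model, not any real server's API (the function name and dict-based headers are invented for the example): if the client's Accept-Encoding request header lists gzip, the server may compress the body and label it with a Content-Encoding response header.

```python
import gzip

def handle_request(headers, body):
    # Illustrative server-side content negotiation: if the client's
    # Accept-Encoding request header lists gzip, return a compressed
    # body labelled with a Content-Encoding response header.
    if "gzip" in headers.get("Accept-Encoding", ""):
        return {"Content-Encoding": "gzip"}, gzip.compress(body)
    return {}, body

# A gzip-capable browser announces itself like this:
resp_headers, payload = handle_request(
    {"Accept-Encoding": "gzip, deflate"}, b"<html>Hello</html>")
```

A client that omits the header simply gets the uncompressed body back, which is what keeps the scheme backward compatible.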
Most popular Web servers are still unable to perform this final step:
- The Apache Web Server, which holds 61 percent of the Web server market, is still incapable of providing any real-time compression of requested documents, even though all modern browsers have been requesting compressed content, and have been capable of receiving it, for more than two years.
- Microsoft's Internet Information Server is equally deficient. If it finds a pre-compressed version of a requested document it may send it, but it has no real-time compression capability.
IIS 5.0 uses an ISAPI filter to support GZIP compression. It works as follows: when a user requests a page, the server sends the page uncompressed and then stores a compressed copy of it in a temporary folder. The next time a user requests the page, the server sends the copy from the temporary directory. The server then continually checks that the pages in the temporary directory are current; if one is not, it fetches the current page and compresses it again.
- IBM's WebSphere Server has some limited support for real-time compression, but the feature has "appeared" and "disappeared" across various release versions of WebSphere.
- The very popular Squid proxy server from NLANR also has no dynamic compression capabilities, even though it is the de facto standard proxy-caching software used just about everywhere on the Internet.
The original designers of the HTTP protocol did not foresee today's reality, in which the protocol is used by so many people that every single byte counts. The heavy use of pre-compressed graphics formats such as .GIF, and the difficulty of reducing that graphics content any further, make it all the more important that every other exchange format be optimized as much as possible. Nor did those designers foresee that most HTTP content from major online vendors would be generated dynamically, so there is often no chance of a "static" compressed version of the requested document(s) ever existing. Public IETF Content-Encoding is still not a "complete" specification for the reduction of Internet content, but it does work, and the performance benefits achieved by using it are both obvious and dramatic.
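How dramatic those benefits are is easy to see on the kind of repetitive, tag-heavy markup that dynamically generated pages emit. A quick sketch using Python's standard gzip module (the sample markup is invented for illustration):

```python
import gzip

# Repetitive, tag-heavy markup of the kind dynamic pages produce --
# exactly the text that gzip reduces dramatically.
html = b"<tr><td>item</td><td>0.00</td></tr>\n" * 500

compressed = gzip.compress(html)
print(len(html), "bytes uncompressed")
print(len(compressed), "bytes compressed")
```

On markup this repetitive the compressed payload is a small fraction of the original; real pages compress less extremely, but HTML routinely shrinks by well over half.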
What is GZIP?