IIS6 and IIS7 gzip compression for static files

First, let me explain why I am writing this article and how I got tangled up in this "small problem." Gzip compression of static files is very useful for improving site speed: it reduces the time-taken value when spiders crawl static pages, and, unlike compression of dynamic files, it does not trigger 200 0 64 crawl records from the Baidu spider. On one hand, a fast site improves the user experience; on the other, the Google Webmaster blog has stated clearly that site speed is now one of the ranking factors, and for a Chinese-language site optimized for Baidu but hosted overseas, poor time-taken values lead to fewer Baidu spider crawls. Guo Ping mentioned in a blog post how the loading speed of articles affects SEO: the total time a spider spends crawling a site is roughly fixed for a given period, so the faster pages are served, the more pages get crawled, and vice versa.

In the end, the reason my two hosts returned different gzip results turned out to be the IIS version, not the cache-folder settings I had guessed at.

In fact, IIS7 made a big change to static compression compared with IIS6. In IIS6, static compression is done on a separate thread: after receiving an HTTP request, the first response sent to the browser is uncompressed, while IIS6 starts compressing the file on another thread and stores the compressed version long-term in the compressed-files cache folder. From then on, for every subsequent HTTP request for that static file, IIS6 fetches the compressed version directly from the cache folder and returns it to the browser.

In IIS7, however, compression is done on the main thread, and to save the cost of compressing, IIS7 does not compress and cache every static file for every HTTP request — only files that users access frequently. That is why my first request returned an uncompressed version, a second request shortly afterwards returned a compressed version, but a request a few minutes later returned an uncompressed version again. My understanding is that IIS7 does not actually persist the compressed version to the cache folder: it either keeps it only in server memory, or writes it to the cache folder temporarily and deletes it after a while.

How does IIS7 decide which files are "frequently accessed" and therefore worth compressing? It uses two attributes in system.webServer/serverRuntime: frequentHitThreshold and frequentHitTimePeriod. If IIS7 receives more requests for a static file than the frequentHitThreshold threshold within the frequentHitTimePeriod window, it compresses the file just as IIS6 would and stores the compressed version long-term in the compressed-files cache folder. If a compressed version of the requested file already exists in the cache folder, IIS7 skips the frequentHitThreshold check entirely and returns the compressed version directly to the browser.
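For reference, the documented IIS7 defaults are frequentHitThreshold = 2 and frequentHitTimePeriod = 00:00:10, i.e. a file must be requested at least twice within a 10-second window before IIS7 compresses it and caches the result. A minimal sketch of the corresponding section, written out with those default values:

```xml
<!-- serverRuntime with its documented default values: a static file
     must be hit frequentHitThreshold (2) times within
     frequentHitTimePeriod (10 seconds) before IIS7 compresses it
     and caches the compressed copy on disk. -->
<system.webServer>
  <serverRuntime enabled="true"
                 frequentHitThreshold="2"
                 frequentHitTimePeriod="00:00:10" />
</system.webServer>
```

With these defaults, a file that is visited only once every few minutes never crosses the threshold, which matches the on-again, off-again compression behavior described above.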

This design is really painful, but Microsoft's official response is that it improves server performance. So if you want IIS7 to compress the way IIS6 does, there are two approaches; both work by modifying the frequentHitThreshold and frequentHitTimePeriod values:

The first is to add the following to web.config, setting frequentHitThreshold to 1 and frequentHitTimePeriod to 10 minutes:

<system.webServer>
  <serverRuntime enabled="true"
                 frequentHitThreshold="1"
                 frequentHitTimePeriod="00:10:00" />
</system.webServer>

The second method is to use Appcmd.exe (found in %windir%\system32\inetsrv\) from an elevated command prompt. Run the following command and press Enter:

%windir%\system32\inetsrv\appcmd.exe set config -section:system.webServer/serverRuntime -frequentHitThreshold:1
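The same tool can set frequentHitTimePeriod as well. A sketch of the companion command (the 00:10:00 value is my assumption, chosen to match the 10-minute web.config example above):

```shell
:: Sets frequentHitTimePeriod to 10 minutes so that, combined with
:: frequentHitThreshold:1, IIS7 compresses and caches a static file
:: after its first hit within any 10-minute window.
%windir%\system32\inetsrv\appcmd.exe set config -section:system.webServer/serverRuntime -frequentHitTimePeriod:00:10:00
```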

Microsoft's official recommendation is a less radical approach: rather than lowering frequentHitThreshold, increase frequentHitTimePeriod, which is gentler on server performance. One thing worth mentioning: if you have a VPS, I recommend setting this yourself; whether a shared-hosting user can change it depends on the provider. Sadly, I can't change mine. Give it a try.
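A sketch of that more moderate approach, under the assumption that you keep the default threshold of 2 but stretch the window to 10 minutes, so that the second hit almost always lands inside it:

```xml
<!-- Moderate tuning (assumed values): leave frequentHitThreshold at
     its default of 2, but widen frequentHitTimePeriod from 10 seconds
     to 10 minutes so moderately-trafficked files still qualify for
     compression without compressing every one-off request. -->
<system.webServer>
  <serverRuntime enabled="true"
                 frequentHitThreshold="2"
                 frequentHitTimePeriod="00:10:00" />
</system.webServer>
```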
