How to Avoid the Trouble Caused by Changing Your Website's Server

  

First, let's look at the following two problems:

1. DNS resolution. When we move to a new server, the IP changes, so the domain name has to be re-resolved to the new server's IP. The catch is that a DNS change generally takes up to 24 hours to take effect globally (in some cases as long as 72 hours). During that window, pinging the domain from different locations can return different IPs, because the change takes effect at different times in different places: for you the new record may start working within 5 minutes, while a friend in another province who pings the domain still gets the old IP. If the old site cannot be reached during this period, or its data has already been deleted, the website suffers, because the search engine will hit dead links when it comes back to crawl (an unstable space can get your rankings lowered). Anyone who regularly deals with domains and hosting will be familiar with this.
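A quick way to see how far the change has spread is to ask several public DNS resolvers directly and compare their answers. The following is a minimal Python sketch using the dnspython package (version 2.x); the domain, the "new" IP, and the resolver list are placeholders, not values from this article.

```python
import dns.resolver

DOMAIN = "example.com"      # placeholder: your own domain
NEW_IP = "203.0.113.10"     # placeholder: the new server's IP
RESOLVERS = {
    "Google": "8.8.8.8",
    "Cloudflare": "1.1.1.1",
    "114DNS": "114.114.114.114",
    "AliDNS": "223.5.5.5",
}

for name, server in RESOLVERS.items():
    r = dns.resolver.Resolver(configure=False)
    r.nameservers = [server]           # query this one resolver only
    try:
        ips = sorted(rr.address for rr in r.resolve(DOMAIN, "A"))
        status = "updated" if NEW_IP in ips else "still the old IP"
        print(f"{name:10s} -> {', '.join(ips)} ({status})")
    except Exception as exc:
        print(f"{name:10s} -> lookup failed: {exc}")
```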

2. Spider DNS caching. Search engine spiders cache the resolved IP address of a domain (the default TTL is usually 3600 seconds). To speed up access and save server response time, the spider keeps cached DNS data for each domain; how long that cache is kept varies and also depends on how the domain's resolution (its TTL) is configured. Once we switch to a new IP, spiders that are still using the cached old IP find that our website cannot be accessed, which produces dead links. As we all know, dead links and an unreachable site lead to drops in keyword rankings, site weight, and the site's trust rating.
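Because the spider keeps using the cached IP until the TTL expires, it helps to know, and ideally lower before the move, the TTL configured on your A record. Here is a rough dnspython sketch that asks one of the domain's authoritative nameservers directly, so the TTL shown is the configured value rather than a cached remainder; it assumes the domain is the zone apex, and example.com is a placeholder.

```python
import dns.resolver

DOMAIN = "example.com"  # placeholder: your own domain (zone apex)

# Find an authoritative nameserver for the zone, then query it directly.
ns_host = str(dns.resolver.resolve(DOMAIN, "NS")[0].target)
ns_ip = dns.resolver.resolve(ns_host, "A")[0].address

auth = dns.resolver.Resolver(configure=False)
auth.nameservers = [ns_ip]
answer = auth.resolve(DOMAIN, "A")

# This is how long spiders and resolvers will cache the record.
print(f"A record TTL at {ns_host}: {answer.rrset.ttl} seconds")
```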

So what should we do to solve the problems above?

1. When transferring the server, try to pick the time period when spiders visit least, or make preparations in advance before the move.

How do we know which time period has the fewest spider visits? Very simple: set IIS to generate one log file per hour, 24 logs a day, then run them through an IIS log analysis tool in batch. You can then see which hour has the least spider crawling and do the switch during that hour.
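If no batch analysis tool is at hand, a small script can do the same count. The sketch below is a Python example, not a specific tool from the article; it assumes the default W3C log format with the date/time and cs(User-Agent) fields enabled, and the default log folder C:\inetpub\logs\LogFiles\W3SVC1 (adjust to your site). Note that IIS logs times in UTC by default.

```python
import glob
from collections import Counter

LOG_GLOB = r"C:\inetpub\logs\LogFiles\W3SVC1\*.log"  # assumed default IIS log path
SPIDER = "baiduspider"                               # substring to match in the User-Agent

hits = Counter()
for path in glob.glob(LOG_GLOB):
    fields = []
    with open(path, encoding="utf-8", errors="ignore") as fh:
        for line in fh:
            if line.startswith("#Fields:"):
                # Column layout for this log file (IIS writes it in the header).
                fields = line.split()[1:]
                time_i = fields.index("time")
                ua_i = fields.index("cs(User-Agent)")
                continue
            if line.startswith("#") or not fields:
                continue
            cols = line.split()
            if len(cols) <= max(time_i, ua_i):
                continue
            if SPIDER in cols[ua_i].lower():
                hits[cols[time_i][:2]] += 1          # "HH" from "HH:MM:SS" (UTC)

for hour in sorted(hits):
    print(f"{hour}:00  {hits[hour]:6d} spider requests")
# Hours that never appear had no spider hits at all; those are the quietest.
if hits:
    print("Quietest hour with any hits:", min(hits, key=hits.get))
```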

2. When replacing the server, make sure the original server and space can still be accessed (this is the "preparing in advance" mentioned above).

That is to say, the data needs to be updated on both servers at the same time; syncing takes time, of course, but the point is that the original site must keep working. When we point the domain to the new IP address, it takes a while for the change to take effect, and that time differs by region, so we have to keep the original IP serving for at least 24 hours, until the new IP has fully taken effect everywhere. Of course, if you are not running a foreign-trade website, you only need to care about China. In fact, if you study the IIS logs carefully, you will find that Baidu has both Unicom spiders and regular Baidu spiders, because Baidu has many servers spread across the country, and each of them can crawl websites.
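To know when it is actually safe to retire the old server, you can keep polling a few resolvers (domestic ones, plus foreign ones if your audience is abroad) until all of them return the new IP. A minimal sketch, again with dnspython; the domain, new IP, resolver list, and 30-minute interval are all assumptions to adjust.

```python
import time
import dns.resolver

DOMAIN = "example.com"        # placeholder
NEW_IP = "203.0.113.10"       # placeholder
RESOLVERS = ["8.8.8.8", "1.1.1.1", "114.114.114.114", "223.5.5.5"]

def all_updated() -> bool:
    """Return True only if every checked resolver already returns the new IP."""
    for server in RESOLVERS:
        r = dns.resolver.Resolver(configure=False)
        r.nameservers = [server]
        try:
            ips = {rr.address for rr in r.resolve(DOMAIN, "A")}
        except Exception:
            return False
        if NEW_IP not in ips:
            return False
    return True

while not all_updated():
    print("Propagation not complete yet; keeping the old server online...")
    time.sleep(1800)  # check again in 30 minutes
print("All checked resolvers return the new IP; the old server can be retired.")
```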

3. You need to keep updates to the two copies of the program synchronized.

4. If your original space can still be accessed directly by its IP address, it needs to be kept synchronized as well.
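On a Windows/IIS setup, one simple way to keep the two copies of the site files in step during the transition is to mirror the web root to the new server on a schedule. The sketch below shells out to robocopy and assumes the new server exposes its web root as a writable network share (\\NEW-SERVER\wwwroot is a placeholder); database content needs its own replication and is not covered here.

```python
import subprocess

SRC = r"C:\inetpub\wwwroot"        # placeholder: local web root on the old server
DST = r"\\NEW-SERVER\wwwroot"      # placeholder: writable share on the new server

# /MIR mirrors SRC to DST (copies new/changed files and deletes extras);
# /R:2 /W:5 limit retries so a dropped connection does not hang the job.
result = subprocess.run(
    ["robocopy", SRC, DST, "/MIR", "/R:2", "/W:5"],
    capture_output=True, text=True,
)
print(result.stdout)

# robocopy exit codes 0-7 mean success (with or without files copied);
# 8 or higher indicates a failure.
if result.returncode >= 8:
    raise SystemExit(f"robocopy failed with exit code {result.returncode}")
```

You could run such a job from Task Scheduler every hour or two until the DNS change has fully propagated and the old server is retired.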

PS: Because of the domestic ICP filing requirements, it is recommended not to change providers frequently, so as to avoid unnecessary trouble.
