Wordpress blogs being crawled to death
Posted by glace, 06-08-2011, 09:43 PM |
A very common issue (affecting roughly 1 in 50 users) is that some customers use WordPress plugins that pull remote content into their own blog. This creates around 20,000-30,000 pages. The problem is that these blogs are then crawled to death by search engine bots. Each access is one PHP and MySQL execution, and this happens 20,000-30,000 times within a few minutes.
Does anyone have an idea how to handle this as a host? Simply suspending the site seems unfair. After all, the customer has done nothing illegal, and he has not overloaded the server either. Rather, the crawlers are doing a ****** job by massively overloading the server. Is there some way to limit the crawlers' access rate, or something along those lines?
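On the crawl-rate question, a sketch of one partial option: most non-Google crawlers at the time (Bing, Yahoo Slurp, Yandex) honour a Crawl-delay directive in robots.txt, while Googlebot ignores Crawl-delay and its rate has to be lowered in Google Webmaster Tools instead. The 10-second value below is purely illustrative:

# robots.txt (the delay value is an example only; Googlebot does not honour Crawl-delay)
User-agent: *
Crawl-delay: 10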
|
Posted by mellow-h, 06-09-2011, 05:42 AM |
Advise your client to add the WP Super Cache plugin. It would create static HTML pages for all these posts. As these pages would never change, this is a good solution to reduce the CPU and MySQL usage.
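For what it's worth, WP Super Cache serves the cached copies as static files and needs caching switched on via a constant in wp-config.php (the plugin normally adds this itself, but it is worth checking). The line below is that standard constant, nothing more:

// in wp-config.php: required by WP Super Cache to enable its caching engine
define( 'WP_CACHE', true );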
|
Posted by glace, 06-09-2011, 08:53 AM |
Awesome! Thanks a lot for the help!
|
Posted by brianoz, 06-10-2011, 07:54 AM |
A sitemap plugin might help too, as Google could check that rather than blindly crawling every page ...
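If a sitemap plugin is in place, its output can also be advertised in robots.txt so crawlers find it without guessing. The URL below is a placeholder; the real path depends on the plugin:

# robots.txt (sitemap URL is a placeholder)
Sitemap: http://example.com/sitemap.xml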
|