Search engine crawling errors
sreerag
Joined: 2010-01-17
Posts: 11
Posted: Sun, 2012-08-19 15:02
The Google search engine is crawling unwanted URLs on my site and creating a large number of server errors and 404 errors. Examples of such URLs are ?g2_view=rss.SimpleRender, ?g2_view=panorama.Panorama&g2_itemId=, and ?g2_view=webdav.WebDavMount&g2_itemId=. How do I block these from being crawled by search engines?
Posts: 1642
Put them into your robots.txt.
Here is a selection from mine:
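(A sketch of the kind of entries that would block those Gallery2 view URLs, assuming Googlebot-style wildcard matching; the exact patterns below are illustrative, not the poster's actual file.)

# Block unwanted Gallery2 view URLs from crawlers
User-agent: *
Disallow: /*g2_view=rss.SimpleRender
Disallow: /*g2_view=panorama.Panorama
Disallow: /*g2_view=webdav.WebDavMount

You can check that the patterns match the offending URLs with the robots.txt testing tool in Google Webmaster Tools before relying on them.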
--
dakanji.com
Posts: 11
Thanks a lot, sir. Which robots.txt file do I need to update?
I have three robots.txt files: one is in the root directory (home) and the other two are under public_html:
http://cinespot.net/robots.txt
http://cinespot.net/gallery/robots.txt
Posts: 1642
public_html
--
dakanji.com
Posts: 1
Is there any way to do this other than blocking them from robots.txt?