About 10 days ago my site went down due to a credit card problem and a mistake on the part of my web hosting company. As soon as I noticed there was an issue, I phoned them and got the site back up. By that point the site had already been down for 2 days and had been dropped from all the major search engines.
I have checked every day for my site to be re-indexed and, for the most part, it has - Yahoo and Microsoft have fully re-indexed the site, but Google (where we were getting 80% of our click-throughs and most of our work) has only partially re-indexed my site. Out of 33 URLs, Webmaster Tools tells me it has only indexed 21 of them - what's more, my home page is one of the pages that isn't indexed.
My home page not being in Google has caused traffic to my site to drop by 60% in the last week alone - January, which is the busiest time of the year for us, has been silent and we have had no new work at all - very distressing.
How can I get the pages re-indexed by Google? If and when this happens, can I expect the same SERP results as before? Should I sit tight and hope that Google re-indexes my home page (and other pages), or should I fill out a site re-inclusion request? Will that have a negative effect? Can anyone offer any advice to help me out?! :(
PM me if you need the site URL...
Thanks in advance
I do use AdWords, but have always thought that organic ranking is better than paid listings. My web stats tell me that the bounce rate is higher for users who come to my site via AdWords than via organic search, so I really want to get my site back up organically.
I have a few questions - in my Google Webmaster Tools I have a load of crawl errors, all from when my hosting company took my site down - at the moment it's something like 400, but it was over 1000 last week. It seems they used a robots.txt file to restrict access to bots. Will this slow down re-indexing? Do Google and other search engines 'remember' that a robots.txt was once used and what pages it was used on?
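From what I can gather, the file they put in place was a blanket block along these lines (I'm assuming the standard full-disallow robots.txt here, since I never saw the actual file):

User-agent: *
Disallow: /

If that's what Googlebot saw for those 2 days, I guess every URL it tried would have shown up as blocked, which might explain the pile of crawl errors.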
The main reason for asking is that it has been 10 days or so since my site came back up and G still hasn't indexed 10 key pages of my site - my index.html is still not back and it's confusing me why this would be. Could the fact that a robots.txt was used be the answer - is there a 'cool off' period before Google comes back and crawls the pages that were refused?
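For anyone wanting to rule out the same thing, a quick check like this (Python's standard urllib.robotparser; example.com is just a placeholder for the real domain) can confirm whether the live robots.txt still blocks Googlebot from the key pages:

import urllib.robotparser

# Point the parser at the live robots.txt (example.com is a placeholder domain).
rp = urllib.robotparser.RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()

# Check whether Googlebot may fetch the home page and other key URLs.
for path in ["/", "/index.html"]:
    print(path, rp.can_fetch("Googlebot", "https://www.example.com" + path))

Given that Yahoo and Microsoft have re-indexed everything, I'm fairly sure the block itself is long gone - I just don't know whether Google needs extra time before it comes back to those URLs.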
Thanks again :)