| 12:28 am on Jul 18, 2014 (gmt 0)|
I'm surprised that it would completely disappear from the results in such a short time. Unless perhaps googlebot saw the Godaddy "Expired Domain" page and knows what it means. But even then, that's a quick reaction.
Anyway, the site should come back pretty quickly. But I would like to ask a few questions:
-- How big is the site?
-- How old is the site?
-- How frequently does googlebot crawl the site?
-- Did it disappear from Bing's results too?
| 12:30 am on Jul 18, 2014 (gmt 0)|
Haven't done one of these in a while, but all things being equal, it should come back pretty quickly. Maybe a few days to a week?
| 12:41 am on Jul 18, 2014 (gmt 0)|
|Maybe a few days to a week? |
I've got my "been there; done that" card for this one, and within a week of renewing, the site was right back on top where it had been previously.
|I'm surprised that it would completely disappear from the results in such a short time. |
I'm not surprised -- Tried to have a meaningful discussion about Google's speed of reaction a little while ago, but unfortunately that didn't pan out -- Based on what I've seen and read: Quick down, slow up is G's current MO.
| 12:53 am on Jul 18, 2014 (gmt 0)|
The reason I'm surprised is because I've seen several cases where a server was down for 2-3 days, but the site stayed in the SERPs and even kept its rankings.
| 12:57 am on Jul 18, 2014 (gmt 0)|
Here's what I know as I've made some classic blunders in the past.
When you accidentally knock your site offline, or remove a directory via robots.txt, etc., Google may quickly stop displaying it in the index, but the whole site, including rankings, is still there, cached, and waiting to be restored for a finite number of days.
Google has this built in to protect the index against sites being accidentally lost from the index when they're moved to a new server, server down time, domain failures, etc.
My experience with this kind of thing was that all the important pages returned very quickly, so it may be a situation where each page has to be crawled once before returning to the index. I'm not sure, but that would explain how things appear to return: each page is validated first. Obviously, not all pages have the same importance to the crawler, so the least-crawled pages may take some time to recover.
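As an aside on the robots.txt blunder mentioned above: one stray Disallow can hide an entire directory from crawlers. This is just an illustrative example, not anyone's actual file:

```
User-agent: *
Disallow: /products/
```

A bot that honors robots.txt will stop fetching everything under that path the next time it reads the file, which is why this kind of mistake can take effect quickly.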
| 1:15 am on Jul 18, 2014 (gmt 0)|
|The reason I'm surprised is because I've seen several cases where a server was down for 2-3 days, but the site stayed in the SERPs and even kept its rankings. |
When a server goes down, there's no response code served in the header to a bot [or person] -- or a different one, if the server is functioning at all. Generally it's "can't find the site" with no response code in some cases, or "503 unavailable" in others.
When a site is not renewed, the response served by the registrar for the page(s) accessed isn't "503 unavailable" or "can't find it, no clue" [much like a 404], so it's entirely possible the algo responds differently in each situation. It's much the same way it can take longer for a 404ed page to disappear from the results than a 410ed page: a 404 is basically "can't find it, dunno" [it could mean the page was removed, but it could also mean an FTP client stalled after deleting the remote copy and before the new one was uploaded and the page will be back in 30 seconds or less, or there was a glitch on the server, or --insert a bunch of other things here--], while a 410 is explicit and says, "Hey, this page was removed on purpose, and an effort was made to tell you to stop sending people by returning a 410."
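To make the distinction above concrete, here's a minimal sketch of how those codes differ in meaning to a crawler. The groupings are my paraphrase of the discussion, not Google's documented behavior:

```python
# Rough map of how the status codes discussed above differ in meaning
# to a crawler. The wording is an interpretation, not Google's spec.

def classify_status(code):
    """Return a rough crawler-facing interpretation of an HTTP status code."""
    if code == 404:
        return "missing, reason unknown -- may come back, drop slowly"
    if code == 410:
        return "gone on purpose -- safe to drop quickly"
    if code == 503:
        return "temporarily unavailable -- retry later, keep indexed"
    if 200 <= code < 300:
        return "ok -- index normally"
    return "other"

print(classify_status(404))
print(classify_status(410))
print(classify_status(503))
```

The practical takeaway: check what status code your registrar's placeholder actually returns (e.g. with `curl -I`). A placeholder served with a 200 looks to a bot like your site changed content, not like it went away.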
| 12:07 pm on Jul 18, 2014 (gmt 0)|
have you checked GWT for clues?
"fetch as googlebot"?
| 11:24 pm on Jul 21, 2014 (gmt 0)|
Sorry for the long delay in replying.
I'm still not showing up for my normal searches and my traffic has dropped about 50% :( - I'm starting to get very worried that a 3-hour slip-up is going to have very long-lasting effects.
I checked GWT but I'm not sure what I'm supposed to be looking for. Can you please explain what to do when I'm there, or what clues will help?
Also open to any other suggestions on how to be proactive in fixing this, if there's anything I can do.
| 11:41 pm on Jul 21, 2014 (gmt 0)|
try "fetch as googlebot".
check everything related to crawling.
crawl stats, crawl errors, everything...
check your web server access log to see what googlebot is requesting.
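Scanning the access log for that can be sketched like this -- a minimal example assuming a combined-format Apache/Nginx log; the sample lines and regex are illustrative, not anyone's real log:

```python
# Scan an Apache/Nginx combined-format access log for Googlebot requests
# and tally the status codes served to it. Log format is an assumption.
import re
from collections import Counter

LINE_RE = re.compile(r'"\S+ (?P<path>\S+) [^"]*" (?P<status>\d{3}) ')

def googlebot_hits(lines):
    """Yield (path, status) for each request whose UA mentions Googlebot."""
    for line in lines:
        if "Googlebot" not in line:
            continue
        m = LINE_RE.search(line)
        if m:
            yield m.group("path"), int(m.group("status"))

# Illustrative sample lines, not a real log.
sample = [
    '1.2.3.4 - - [18/Jul/2014:00:00:01 +0000] "GET / HTTP/1.1" 200 512 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '5.6.7.8 - - [18/Jul/2014:00:00:02 +0000] "GET /page HTTP/1.1" 404 0 "-" '
    '"SomeOtherBot"',
]
statuses = Counter(status for _, status in googlebot_hits(sample))
print(statuses)
```

If Googlebot was served the registrar's placeholder during the lapse, the status codes and paths in the log from that window are where it would show.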
| 9:31 pm on Jul 22, 2014 (gmt 0)|
Thanks phranque, I fetched as googlebot and the only thing that stood out to me as possibly different, is it says "Partial" listed under "Status". Is that normal?
I do see my pages are still indexed when I do a search for "site:mysite.com", but they just aren't ranking at all any more. (I couldn't find any results in the first 10 pages for my site's normal terms, which I usually ranked around 2nd or 3rd for.)
Everything else appears to be normal in terms of crawl stats, crawl errors, etc.
I actually went through all the pages of GWT in relation to this site, (not just crawl stuff) and nothing stood out really except that 1 thing: "Partial".
Googlebot is still coming to my site daily according to server logs.
Any suggestions or ideas?
| 5:15 pm on Aug 6, 2014 (gmt 0)|
Well, it's now been over 3 weeks since this happened and my site still hasn't returned to its previous rankings.
Traffic has dropped about 40% - as well as income.
Make sure your credit card on file is up to date, people! Learn from my mistake.
| 3:09 pm on Aug 15, 2014 (gmt 0)|
Just in case anyone is wondering - I just now today returned to the Google results and traffic is back to normal!
It took almost exactly 1 month to the day, from when the domain expired, to finally return to the results as normal.
| 5:08 pm on Aug 15, 2014 (gmt 0)|
+1 thanks for the update
| 2:08 am on Aug 16, 2014 (gmt 0)|
Definitely, thanks for the update and glad you came back.
I guess it used to happen faster, but it's good to know it still happens, because more than one of us has forgotten to update the CC with a registrar before. In my case, I only had one domain registered there [I thought I had everything consolidated into one account], and I didn't remember the account login or have a clue the domain was there and needed to be renewed. I had also changed my main email address, so I didn't get any of the notices they likely sent either.
| 1:47 pm on Aug 16, 2014 (gmt 0)|
Uh, you don't 'not let your credit card expire'. You renew for the maximum. You did renew for 10 years, right?
| 9:05 pm on Aug 16, 2014 (gmt 0)|
Still surprised it took you so long to get back in the SERPs. I had a similar problem 4 weeks ago - the site was offline for 2 days. When Google realised, I dropped right out of the SERPs, although by then the site was back up and running. However, the next day I was back in the SERPs with traffic at 90% of what it had been previously.
| 4:20 pm on Aug 18, 2014 (gmt 0)|
|Uh, you don't 'not let your credit card expire'. You renew for the maximum. You did renew for 10 years, right? |
You read my mind.
This pattern of behavior for money making domains continues to baffle me as I have my primary cash cows 10 and 15 years out.
FWIW, I've had domains bounce before, for a variety of reasons, and they all came right back. However, the last time this happened to me the pages were restored to their previous spot AFTER Googlebot crawled each page to make sure it was what it had been.
I've been told by a well known Google engineer that we all know, love and distrust, that Google holds domains in cache for a period of time.
If your server is temporarily down, your domain name expired, your site crashed, or even a software glitch spat out a bunch of bad pages, then as long as it's restored in a reasonable time, they reset it back to where it was.
But, like I stated above, it restored the pages as it crawled, so my important pages were put back in a hurry, and the rest took weeks to restore -- the crawler didn't seem to get all excited and recrawl the whole site just because I messed up.
Also, if you changed any of your domain contact information during this fiasco, that might make them think the site changed hands and is a potential fake.
Just a thought.
| 4:55 pm on Aug 19, 2014 (gmt 0)|
I have purchased over 2000 expired domains at Godaddy auctions and never had a reset on any.
Back before Google notified through WMT I did get a few with manual reviews but those were easy to fix as the old websites were toast!
I would think something else happened at the same time.
That said, if Googlebot crawled at the right time, the registrar's placeholder could be what gets indexed as the website, which changes everything about what you had.