| 9:59 am on Oct 7, 2006 (gmt 0)|
Really bad site: results for me... Over 2000 pages of unique content, site: shows only 8... not even 2 pages of results...
So not so great news for me...
Is it logical that the NYT, the BBC and other authority sites trust my site(s) and link to my pages, while Google won't even rank us in the top 400 for our keyword, which is included in the domain?
This is ridiculous... They have so many test datacentres to try their tricks... I just can't believe this happened accidentally...
What about Google looking to recoup some of their AdWords losses?
And their continued silence on these issues is very weird indeed. Matt is trying to confuse SEOs (in my humble opinion) by talking about everything apart from his specialty, which is spam.
We have made Google what it is, and now it seems we're dearly paying for it. The big boys seem to be OK... It is us who have to get the snake out of the hole (as an old Greek saying suggests) without knowing where the hole is.
If Google were as nice as we used to know them 3-4 years ago, they would have addressed some of the issues raised here instead of addressing the Conservatives in Britain. You see, that's where the money is...
| 10:18 am on Oct 7, 2006 (gmt 0)|
... someone mentioned spidering like mad...
Happening with me. The strange thing is, I just saw that googlebot is requesting pages with http:/www.sitename.com/page.asp etc. A single slash!
What in the world is that!
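Requests like that can at least be normalized before analyzing your logs. A minimal sketch in Python; the helper name and regex are illustrative, not anything from Googlebot or the original post:

```python
import re

def normalize_scheme(url: str) -> str:
    """Repair URLs where the scheme is followed by a single slash,
    e.g. 'http:/www.sitename.com/page.asp' -> 'http://www.sitename.com/page.asp'."""
    # Match 'http:/' or 'https:/' NOT already followed by a second slash,
    # and insert the missing slash after the scheme.
    return re.sub(r'^(https?):/(?!/)', r'\1://', url)
```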
| 10:43 am on Oct 7, 2006 (gmt 0)|
It could be that you are hitting different datacenters each time.
Why not try the same on the Google Datacenters Watch Tool and see whether different DC groups are showing different results?
| 12:24 pm on Oct 7, 2006 (gmt 0)|
Regional results [ UK & AU ] are better than yesterday, but still not back to normal. Lots of supps and old data.
We're in Sydney.
| 2:01 pm on Oct 7, 2006 (gmt 0)|
For me and as of Saturday, October 7 ...
The site:www.mysite.com command is perfect. It lists every single page of my site correctly. No pages left out, no supplementals and nothing extra which shouldn't be there.
I believe the link command is worthless anyway, so never worry about it.
The allinurl: command also works well. It shows several pages which should not be affiliated with my site, and those are marked as supplemental. All the rest are mine and listed properly.
Personally, I see nothing wrong at all! All is as it should be for my site.
| 2:13 pm on Oct 7, 2006 (gmt 0)|
I checked several of the DCs that had totally broken results yesterday (and for the last week or more), and today they appear to be back to normal.
| 2:42 pm on Oct 7, 2006 (gmt 0)|
It looks like everything has been fixed to me. Of course this could be temporary...
| 7:20 am on Oct 8, 2006 (gmt 0)|
System: The following message was spliced on to this thread from: http://www.webmasterworld.com/google/3112830.htm [webmasterworld.com] by tedster - 11:37 am on Oct. 8, 2006 (EDT -4)
I have been in a bit of a panic for the last day while patiently waiting to post a message, so here it goes. I have been reading a lot about a Google update being in progress, and from what I can see that does appear to be the case, but my site's issue is with backlinks. When searching for backlinks in Google there are ZERO results, and I wanted to check whether anyone else is having this issue.
The odd thing about the zero results is that when I check backlinks with the many SEO tools out there, most of them list my backlink count (27,500), which makes me wonder if this is an isolated incident. But I have tried from multiple IPs (in different cities) in Google directly and continue to get the same result. Please help! Any information would be much appreciated.
Speaking of the subdomains: when checking backlinks for them, out of a total of 3 subdomains only 2 have backlinks, with one each, and that sole link comes from one of the subdomains on the site.
Meanwhile, all the other search engines (MSN, Yahoo, etc.) show backlinks to the site.
Some background information that may help. We recently changed the domain over to site.com/home and have since been building backlinks to the new URL via press releases, directory submissions, social bookmarking sites, and link exchanges with the creation of a links page (unfortunately :) ), while the old domain (site.com) is set up with a 301 redirect. Our site gets crawled roughly every 4-5 days, so the redirect has clearly been noted by Google, and I doubt the redirect is the issue.
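The mapping that 301 redirect would have to compute, old-layout URL to new sub-folder URL, can be sketched as follows. This is an illustrative sketch only: `OLD_HOST` and `NEW_PREFIX` are assumptions based on the post's placeholder names, not the poster's actual configuration.

```python
from urllib.parse import urlsplit, urlunsplit

OLD_HOST = "site.com"    # assumed old domain (placeholder from the post)
NEW_PREFIX = "/home"     # assumed new sub-folder (placeholder from the post)

def redirect_target(url: str) -> str:
    """Compute the 301 Location for a request in the old layout,
    mapping site.com/<path> -> site.com/home/<path>."""
    parts = urlsplit(url)
    if parts.netloc == OLD_HOST and not parts.path.startswith(NEW_PREFIX):
        return urlunsplit((parts.scheme, parts.netloc,
                           NEW_PREFIX + parts.path, parts.query, parts.fragment))
    return url  # already on the new layout; no redirect needed
```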
However, we recently went live with a good old Google Sitemap for the main domain. We had sitemaps for the subdomains submitted to Google for about a month or so. Since Google Sitemaps does not currently support sitemaps in sub-folders like ours, we figured creating one for the main domain (site.com), given the 301 redirect, would be okay. Could this have created anything negative in the eyes of the Google gods?
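For reference, a sitemap of the kind being submitted here is just an XML file of `<url>` entries per the sitemaps.org protocol. A minimal generation sketch; the URL list is illustrative, not the poster's actual pages:

```python
from xml.etree.ElementTree import Element, SubElement, tostring

def build_sitemap(urls):
    """Build a minimal sitemap.xml string per the sitemaps.org 0.9 protocol."""
    urlset = Element("urlset",
                     xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for u in urls:
        url_el = SubElement(urlset, "url")
        SubElement(url_el, "loc").text = u  # only <loc> is required per entry
    return tostring(urlset, encoding="unicode")
```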
If you're still reading now, I am sure you are somewhat intrigued and hopefully have some sort of an answer (wishful thinking, I know). Also, if this is not caused by the G update, does anyone have any other ideas, as well as the best method to contact Google about this issue?
Thank you very much!
| 5:25 pm on Oct 8, 2006 (gmt 0)|
Google never lists more than about 5% to 10% of all of the links that they actually know about.
They also look at link quality, so most of the links that you do have are probably not even helping you in any way.