Forum Moderators: Robert Charlton & goodroi
I should explain 'mostly' above. My site had thousands of pages indexed.
Now when I do site: Google does show an accurate page count (in the thousands, which it used to have), but when I try to scroll through the results, Google only shows me about 150 pages in the index.
These lost pages are not actually in the index (despite the accurate site: count), because when I search for snippets that are unique to those pages, Google returns no results.
My homepage is still in the index along with those 150 pages. My homepage still ranks for a very competitive term. It is second for a query with 240K results (last week this same query showed 2.4 million results).
My referrals from Google are now less than 1% of what they were prior to 7pm EST.
I am hoping this is an issue of Google temporarily losing a site.
I really don't think this is a penalty issue, just a lost site issue. Would like to know if anyone else is experiencing the same thing.
Thanks
gomer
Now when I do site: Google does show an accurate page count (in the thousands, which it used to have), but when I try to scroll through the results, Google only shows me about 150 pages in the index.
"In order to show you the most relevant results, we have omitted some entries very similar to those already displayed"
Are you seeing the omitted-results link at the bottom of those 150 results?
Vimes.
What are the page totals for your site on these DCs?
72.14.207.107
64.233.189.107
The first shows correct figures for me; the second is bloated.
Vimes.
These 4 sites are all listed in my webmaster account and still show pagerank, if that is any consolation.
I will investigate more and report back if I find anything.
Atomic, did you lose the site you mentioned around the same time? When you lost it, did all traffic vanish? It is now 6am EST on December 11th, and there is no sign of the tens of thousands of pages returning.
Vimes, it is all DCs, not just the ones you mentioned in your post above.
We're still second on a search term that has 240K results (which used to have 2.4 million).
site:example.com still shows the correct page count, even though the pages are not actually in the index.
site:example.com/* shows the correct number in the main (non-supplemental) index as well, but these pages are not there either.
Have been reading at other forums and this has happened to other sites as well.
My clients are freaking out but there isn't much I can do. We have been very white hat and this has really thrown us for a loop. I see no reason for the sites to have been dropped completely. I hope this is a glitch and the sites "pop back in" in the next 48 hours. If not, I guess I better brace for that "Google coal" in my stocking this holiday season.
I did nothing fishy with the site; I followed a white-hat approach throughout. The site also hit the homepage of Digg a number of times.
I don't know what's going on with Google here. The PR is still intact, though, so I am hoping it's a glitch, not a ban.
inurl:example.com proxy
If I am missing something on how to find proxy hijacks, please let me know. If I got it right, since I have only one page hijacked, I am thinking that is not the issue for me.
This is a first for me. I have taken the following steps and would suggest those of you who have suffered this problem do the same:
1. Add this inside the <head> section of all of your pages:
<base href="http://www.yoursite.com/" />
and if you see an attempted hijack...
2. Block referrals from the proxy via .htaccess (note that a RewriteCond does nothing by itself; it needs a RewriteRule to act on):
RewriteEngine On
RewriteCond %{HTTP_REFERER} yourproblemproxy\.com [NC]
RewriteRule .* - [F]
3. Block the IP address of the proxy
order allow,deny
deny from 11.22.33.44
allow from all
4. Do your research and file a spam report with Google.
[google.com...]
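Put together, a minimal sketch of the .htaccess portion of the steps above might look like this (the proxy domain and IP address are placeholders, not real hijackers):

```apache
# Step 2: refuse requests referred through the hijacking proxy (placeholder domain)
RewriteEngine On
RewriteCond %{HTTP_REFERER} yourproblemproxy\.com [NC]
RewriteRule .* - [F]

# Step 3: block the proxy's IP outright (Apache 2.2 syntax; placeholder address)
order allow,deny
deny from 11.22.33.44
allow from all
```

The referer condition only catches visitors who click through the proxy; the IP deny is what actually stops the proxy server itself from fetching your pages.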
[edited by: tedster at 5:41 am (utc) on Dec. 12, 2007]
[edit reason] removed specifics [/edit]
Proxy Server URLs Can Hijack Your Google Ranking - how to defend? [webmasterworld.com]
Tedster or anyone, is there a query I can do to tell if my pages have been hit by a proxy hijack?
I tried:
inurl:example.com -site:example.com
Can I just look through those results and look for proxy jacks like that?
If I am just seeing one page hijacked, do I have anything to worry about? We will block that domain.
I don't feel this is the cause of my site dropping out of the index but I do need to look into this fully to make sure.
Thanks.
Also, if I search for my domain name, there are no results found, unlike with the proxy hijacks.
Commands that still work are link:, inurl:, and related:.
Everything else returns 0 results.
Is there any other way of checking for proxy hijacks?
I have an idea that this is related to advertising we took part in within the last 6 months.
"keyword-1" before 740,000,000, now 77,000,000
"keyword-2" before 24,300,00, now 172,000
"keyword-3" before 86,600,000, now 12,200,000
"keyword-4" before 107,000,000, now 12,000,000
I could continue all day with this and it would show the same thing, 90% of the supposedly indexed pages are gone.
One of our sites lost a bit more than 90%, but we are still being crawled at our normal rate, which is 2,000-3,000 pages a day. Our domain still ranks for pretty competitive terms, and the pages we still have in the index pass weight to others.
#*$! is going on?
[edited by: tedster at 5:38 pm (utc) on Dec. 12, 2007]
[edit reason] no specific searches, please [/edit]
Stiphen, yes, the query counts are also down sharply, but this seems to have been happening even a few weeks before the sites disappeared.
Unlike some others on here, site: still works for my site, the pages are just not in the index.
site:mysite.com
url:mysite.com
When I do inurl:www.mysite.com I get a proxy site.
When I do link:mysite.com it shows my backlinks.
Yet I still have pagerank across all the datacenters. In my webmaster account it says:
No pages from your site are currently included in Google's index. Indexing can take time. You may find it helpful to review our information for webmasters and webmaster guidelines.
Our plan is to wait a while and just see if this situation resolves itself.
For sites that I know are highly penalized or very weak, I still see their pages in the index, they just don't rank for anything.
Since I am not cloaking, using hidden text etc, I expect these pages to at least come back in the index at some point.
I would say check Googlebot activity in your server logs and see if there is any. A proxy URL becomes a problem when the real Googlebot indexes your content through a proxy server, so the request arrives from the proxy's IP address. This is why doing the reverse/forward DNS check defends against the problem.
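The reverse/forward DNS check mentioned here can be sketched with nothing but Python's standard library; the function name and the sample IP are my own illustration, not anything from Google:

```python
import socket

def is_real_googlebot(ip):
    """Reverse/forward DNS check for a visitor claiming to be Googlebot.

    Step 1 (reverse): the IP's PTR record must end in googlebot.com
    (or google.com). Step 2 (forward): resolving that hostname must
    map back to the same IP, so a proxy cannot spoof the name alone.
    """
    try:
        host = socket.gethostbyaddr(ip)[0]
    except OSError:
        return False  # no reverse DNS record at all
    if not host.endswith((".googlebot.com", ".google.com")):
        return False  # resolves somewhere other than Google
    try:
        forward_ips = socket.gethostbyname_ex(host)[2]
    except OSError:
        return False
    return ip in forward_ips  # must round-trip to the same address

# A proxy's own address fails the check, so a "Googlebot" request
# arriving from it can safely be refused (private IP as a stand-in):
print(is_real_googlebot("10.0.0.1"))  # prints: False
```

A request that sends a Googlebot user-agent but fails this check is exactly the proxy-hijack case described above, and can be blocked or served a 403.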