Forum Moderators: Robert Charlton & goodroi
The following article may help you understand how Google updates its index regularly:
[#*$!.com...]
If I enter my unique website name, my site turns up in about 80th place out of 20,000 results, all of which refer to my site.
If I enter very competitive financial search terms related to my site, I pop up around 30th out of 6,000,000 results, when I have always been on the 1st page (for around 3 years, on and off).
Personally I think my site has received a penalty, as there have been many attempts to hijack my site, and there are several sites out there that have copied my site word for word, graphic for graphic.
Although these sites have been banned in Google, I think they may have been around just long enough to give me a penalty for 6 weeks. Well, I'm at 7 weeks and counting now, but I'm expecting my site to come back soon.
sabine7777:
Try selecting a "unique phrase from your site, in quotes," and search in Google, MSN and Yahoo to see if anyone has been ripping you off.
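For example, with yoursite.com standing in for your actual domain, a quoted-phrase search like this shows copies of your text while excluding your own pages from the results:

"a distinctive sentence copied from one of your pages" -site:yoursite.com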
Why would this help restore a listing? How would someone "messing" with my site cause a drop in listings?
Check out some of these threads [google.com]
Oh, and welcome, Gavolar!
P.S.
The script I added to my pages is the following:
<script type="text/javascript">
// Frame-breaker: if this page is loaded inside a frameset,
// point the parent (framing) window at this page's own URL.
if (parent.frames.length > 0) {
    parent.location.href = self.document.location.href;
}
</script>
Inserted between the head tags.
Five-year-old site based on a form of artistic writing (as usual, clean as a whistle) dropped from Google... pages are still in the index, but nothing shows up no matter what the search is.
Must be a bug... I hope....
We still have a PR of 6 on index.html and 4 or 5 on all the others. Then, on 8/23, we were gone. Poof, just like that.
We had changed nothing up to that time, but now we are not in the top 100 SERPs (we stopped looking after that).
A search for site:http://www.mycompany.com shows even more pages than previously, and our backlinks have slightly increased as well.
Here is the worst part: aside from one of my competitors, the SERPs (for a U.S.-site-related term) go like this:
1. Australian company
2. A competitor here in the States
3. Australian company
4. UK company
5. Maltese company
6. French company
7. UK company
8. UK company
9. Peruvian company
10. Chinese company
it goes on and on....
sigh.
"<script type="text/javascript">
if (parent.frames.length > 0) {
parent.location.href = self.document.location
}
</script> "if google views the page through a frame, could this possibly be seen as a javascript redirect eg spam?
I was wondering the very same thing yesterday. I've used a similar thing on my sites for years and recently have been taking a beating in Google.
<body onload="if (self != top) top.location = self.location;">

Could be part of the problem, but then again it could be something totally unrelated... I have no idea.
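Just as a point of comparison (a sketch, not a known fix), here is a frame-breaker variant that only fires when the page really is framed, and uses replace() so the framing page is not left in the browser history. Whether Googlebot treats any of these variants differently is pure speculation:

<script type="text/javascript">
// Fire only when this window is not the topmost window,
// i.e. the page is actually being displayed inside a frame.
if (self != top) {
    // replace() swaps the top-level URL without adding a
    // history entry for the framing page.
    top.location.replace(self.location.href);
}
</script>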
I think what's going on is an update or algo tweak.
I can't see them doing it for advertising reasons.
Right. These "updates" come near the end of the quarters to match the season and fashion trends.
US$200,000,000.00 in advertising profits has absolutely NOTHING to do with it.
That's why we give Google plenty of free advice here, because they're out there to help us and be our friends.
Every example I've seen points the blame at Google re-adding millions of Supplemental pages, and/or penalizing pages that have too many Supplemental copies. Also, the sites showing ridiculously inflated page counts are the ones affected, while I haven't seen any sites with normal page counts affected (though that might be a coincidence).
Nothing to do with an algo here. Lots of sites are being penalized because their content was stolen sometime in the past two years, and Google insists on keeping the copies in its database forever, even though in most cases the pages don't really exist anymore. (Other things can be going on too, of course.)
1. Certain pages seem stuck in time. The dates Nov. 1, 2004 - Dec. 2, 2004, and Feb. 05, 2005 are the key dates for our site. All the Supplementals list these dates as the cache dates.
2. Why are there inflated page counts? Not just inflated, but wildly inflated. If a site has 10,000 pages, the "site:" command will list 100,000. What are the other 90,000 pages, links, etc. made up of? There must be a database record for these 90,000 elements. And if you submit a sitemap to Google with the exact 10,000 pages on the site, why does the count not adjust? You are telling Google exactly which pages exist on the site.
3. It seems common sense that there should be no difference between www and non-www (www.mysite.#*$! and mysite.#*$!). Why have a difference between the two?
Google has tons of computing power. I would suggest the following activities for them:
- Clean up the Supplemental index. Go out and find out if these pages still exist. Get the junk out of the index. Do not worry about competing on size; worry about relevance and accuracy.
- Do not treat non-www and www as two separate sites (a rough sketch of the redirect follows below).
- Get the inflated page counts back under control. Why 100,000 for a 10,000-page site? Why are some sites completely accurate, while others are way out of sync?
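On the site owner's side, the usual fix for that second point is a server-side 301 redirect to a single hostname. Purely as an illustration of the same idea, here is a client-side JavaScript sketch, with example.com standing in for a real domain:

<script type="text/javascript">
// example.com is a placeholder: treat www.example.com as the
// canonical hostname and bounce the bare domain over to it,
// preserving the path and query string.
if (self.location.hostname == "example.com") {
    self.location.replace("http://www.example.com" +
        self.location.pathname + self.location.search);
}
</script>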
Just my thoughts.
This could be correct, Steve. We added an entire library to a section of the site a couple of months ago, and now it has been hit.
The problem is that the library is very, very popular, so we are not about to delete it, whatever happens with Google.
The situation is a real mess.
I'm still holding up though, which is always good. I haven't gotten, or asked for, a link in 3+ months. Not even from those directories that Google says are OK to add our sites to. I'm playing it extremely safe till I reach a certain amount in the bank.