P.S. I won't be posting as often (gotta work, ya know :), but I will be checking this post and chiming in when there's something I can add.
<snip>took out what I thought was a problem with Google but was a problem with me</snip>
[edited by: ogletree at 8:10 pm (utc) on June 16, 2003]
After using your suggestion of &filter=0 I found my site!
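For anyone else who wants to try this, &filter=0 is just an extra parameter tacked onto a normal Google results URL; it asks Google to show results it would otherwise collapse as duplicates. A minimal sketch of building such a URL (the parameter's behavior here is as commonly reported by webmasters, not official documentation):

```python
from urllib.parse import urlencode

def google_url(query, unfiltered=False):
    """Build a Google results URL; unfiltered=True appends &filter=0,
    which reportedly shows results normally suppressed as duplicates."""
    params = {"q": query}
    if unfiltered:
        params["filter"] = "0"
    return "http://www.google.com/search?" + urlencode(params)

print(google_url("widget reviews", unfiltered=True))
# → http://www.google.com/search?q=widget+reviews&filter=0
```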
I was upset to find that another web site has copied my content almost word for word, including page titles.
Can you shed some light on how this filter works?
If it finds two sites with very similar or duplicate content does it just drop them both? Or does it make any judgement on which site copied from the other and penalize accordingly?
This filter has killed me for having original content that was copied by someone else.
[edited by: mrbrad at 7:54 pm (utc) on June 16, 2003]
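No one outside Google knows exactly how the duplicate filter compares pages, but a common textbook approach is to break each page into overlapping word "shingles" and measure the overlap between the two sets. Sketched below purely as an illustration; the shingle size is an arbitrary assumption, not Google's actual method:

```python
def shingles(text, k=4):
    """Break text into overlapping k-word shingles (tuples of words)."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(max(1, len(words) - k + 1))}

def similarity(a, b):
    """Jaccard overlap of the two pages' shingle sets: 1.0 = identical."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa or sb else 0.0

page1 = "original widget reviews written from scratch by the site owner"
page2 = "original widget reviews written from scratch by a site owner"
print(similarity(page1, page2))  # near-duplicates score close to 1.0
```

Note that a measure like this says nothing about which page is the original, which is exactly the worry raised above: the filter can tell that two pages match, but not who copied whom.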
Is everyone's site in this new Google update? My two sites don't seem to be appearing for my chosen keywords.
For months my sites have been on Google for the keyword "xxxxxx", but for some reason I have been completely taken off. Not even on the last page. What's funny is that one of my inner pages (mysite.com/site), not the front page, is in the index, ranking around 280. My two sites used to rank 3 and 5.
Google has indexed my pages but seems to have forgotten to include my front pages. That's what I think.
Is this Google update finished? Are we just waiting for the -fi datacenter to go live soon?
No one here knows for sure but it has been my experience that things stay pretty close to the same before going live.
They do tend to add in some spam filters but that's about it. Maybe this time will be different and the index will change around before going live, but I would not hold your breath.
(Joking of course - this does raise an interesting problem for search engines trying to cut out domains with duplicate content - I thought that the one with the higher number of backlinks/PR etc. tended to win? - if someone managed to copy your site and gained more backlinks then I suppose you have a problem)
These are showing up fine using the "link:" command on www2 and www3, but there appears to be no boost at all in my rankings. Is that because PR hasn't yet been recalculated and factored in, or is it just wishful thinking ;)
Anyone else seeing this?
Any input would be appreciated :-)
Also watched some of the biggest players of widgets go from 3500 down to 320.
So I guess the right way to read the situation is that things are becoming more equal?
Leveling the playing field and, to some extent, the relevance.
And it's only a matter of time before everyone figures out how to crack the system "again" and Google will run off to do the dance "again".
Same here. I was really jazzed that I got several PR7 links to my site that are directly related to my topic. Before Dominic, I was PR3. Then after Dominic, I was PR0, which didn't concern me (too much) as the site was still fairly new, and GG told us not to worry about PR for a while. So during the long wait between Dominic and Esmerelda, I got quite a few backlinks, including the PR7 links, and was anxious to see what the new dance would bring. Now, checking my backlinks and PR on the -fi datacenter, the backlinks are showing up (including the really good ones!), and the site is no longer a PR0 (hooray), but it's back to being a PR3 (darn!). All that work for nothing. Sigh...
This is a terrible frustration. I feel like my site has been unfairly punished.
This filter makes it easy for a dishonest person to destroy their competition.
Unfortunately, it is wishful thinking. SERPS may still shift a bit before the index goes live on WWW, but the links you see are certainly factored into the new index.
It is a frustrating update, as a lot of hard work and link gathering seems to be going unnoticed in the rankings.
- more new stuff in (as announced around Dominic), but not really up-to-date until now
- link count diminished, external and internal, fewer duplicates counted (as announced around Dominic)
- specific key phrases often better than more general key words (as announced around Dominic)
- sub pages sometimes performing better than index pages (as announced around Dominic), sometimes with very curious results though
- over-optimization doing worse (in my small environment; as announced around Dominic)
Sure it's waaaay too early ...
So what are the criteria that Google is using to select sites for this 'special treatment'? Still no answer to that - the evidence is actually much better at illustrating which criteria AREN'T being used.
I have the details of a variety of sites of all shapes and sizes. Granted, some look a little close to the wire on a few things, but some are very boring and appear to be whiter than white.
I've looked pretty closely at the latter in particular, as 'over-optimization' has been suggested. No evidence at all to support that theory.
For one, the keyword in question has a pretty low density, and although the keyword appears in a fairly high percentage of incoming links, it certainly isn't overwhelming.
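For what it's worth, "keyword density" here just means occurrences of the phrase as a fraction of the total words on the page. A quick way to measure it (a hypothetical helper for illustration, not any tool mentioned in this thread):

```python
import re

def keyword_density(text, phrase):
    """Fraction of the page's words accounted for by occurrences of phrase."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    target = phrase.lower().split()
    n = len(target)
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == target)
    return hits * n / len(words) if words else 0.0

print(keyword_density("blue widgets are the best blue widgets", "blue widgets"))
# → 0.571... (the 2-word phrase appears twice, so 4 of 7 words)
```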
Some of the sub-pages doing well at their expense are more than peculiar. Not just because they are contents/links/etc. pages, but because for some I can see no links (internal or external) that include the keyword. Why Google is selecting these for elevation is anybody's guess.
It could be the sample size, but none of the sites I have been passed are older than, say, 15 months. I can't really see how that is a factor, but it is worth noting anyway.
Maybe a good night's sleep will help clarify a few things: my head is spinning with links and anchor text.