I am very excited. On fe I have all my new content and URL system. I have more than two backlinks for the first time. I actually saw the new results when I went to www.google.com, but they are gone now. Google is now using some of the descriptions that I put at the top of my pages. I should be getting some great results when this is done. I can't wait.
<snip>took out what I thought was a problem with Google but was a problem with me</snip>
[edited by: ogletree at 8:10 pm (utc) on June 16, 2003]
After using your suggestion of &filter=0, I found my site!
I was upset to find that another web site has copied my content almost word for word, including page titles.
Can you shed some light on how this filter works?
If it finds two sites with very similar or duplicate content does it just drop them both? Or does it make any judgement on which site copied from the other and penalize accordingly?
This filter has killed me for having original content that was copied by someone else.
[edited by: mrbrad at 7:54 pm (utc) on June 16, 2003]
|I have no idea how Google got these pages |
Just try a search on -fi for 'google sucks' and see what you come up with... and again with the old index... funny lil' twists!
Google caches mysite.com and www.mysite.com separately if it finds them separately. mysite.com shows a cache from May 30, while www.mysite.com shows one from a couple of days ago. I can't use a 301 because I use FrontPage, and the extensions don't allow 301s or mod_rewrite, so I figure I'm screwed.
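For anyone on a host that does allow .htaccess overrides (which FrontPage extensions apparently don't), the usual fix for the www/non-www split is a permanent redirect. A minimal sketch, assuming Apache with mod_rewrite enabled and mysite.com as a placeholder domain:

```apache
# .htaccess: 301-redirect mysite.com to www.mysite.com so Google
# only ever sees (and caches) one version of each page.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^mysite\.com$ [NC]
RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]
```

The R=301 flag marks the redirect as permanent, which is the signal that tells a crawler to drop the old URL in favor of the new one.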
Is everyone's site in this new Google update? My two sites don't seem to be appearing for my chosen keywords.
For months my sites have been on Google for the "xxxxxx" keyword, but for some reason I have been completely taken off. Not even on the last page. What's funny is that one of the pages (mysite.com/site), not the front page, is in the index, ranking around 280. My two sites used to rank 3 and 5.
Google has indexed my pages but seems to have forgotten to include my front pages. That's what I think.
Is this Google update finished? Are we just waiting for the fi centre to go live soon?
[edited by: ogletree at 8:09 pm (utc) on June 16, 2003]
<<Is this Google update finished? Are we just waiting for the fi centre to go live soon?>>
No one here knows for sure but it has been my experience that things stay pretty close to the same before going live.
They do tend to add in some spam filters but that's about it. Maybe this time will be different and the index will change around before going live, but I would not hold your breath.
Try adding &filter=0
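For those asking about the syntax further down: &filter=0 is just an extra query-string parameter tacked onto the Google results URL. A minimal sketch of how the URL is put together (the keyword and domain here are placeholders, not from anyone's actual search):

```python
from urllib.parse import urlencode

# Build a Google search URL with the duplicate-content filter disabled
# via the filter=0 parameter. Keyword and site are placeholder values.
params = {"q": "blue widgets site:example.com", "filter": "0"}
url = "http://www.google.com/search?" + urlencode(params)
print(url)
```

Equivalently, you can just append &filter=0 by hand to the address bar after running a normal search.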
You may have fallen victim to the duplicate content filter as I have because some other lazy spammer out there copied my site word for word.
My sites have a lot more links than some of the sites in the top ten for that particular keyword; this Google update is whack.
If you could shed some light on what's happening, maybe I have done something wrong.
Sorry, how does this &filter=0 thing work? I can't seem to use it.
I know we have a 1 in 9 chance of seeing the new update on WWW, but it seems every time I check, it's on WWW. Does anybody else think the Esmeralda results are staying on WWW for quite a while?
mrbrad, if someone copied your web pages, I would contact them and ask them to remove your content. After that, you could assert your copyright or something like that. When Google sees duplicate pages, we do our best to break ties and determine which is the official/correct page, but the most reliable way is to make sure that no one has copied your content without your permission.
Could you post a search string exactly as you searched it, with &filter=0?
I can't get it to work either.
Thanks in advance.
Is it really that easy to get a site banned in Google? OK, I am off to create some duplicates of my competitors :)
(Joking of course. This does raise an interesting problem for search engines trying to weed out domains with duplicate content. I thought that the one with the higher number of backlinks/PR etc. tended to win? If someone managed to copy your site and gained more backlinks, then I suppose you have a problem.)
During the last couple of months, I have acquired quite a number of good quality, clean, backlinks, some of which had very good PR.
These are showing up fine using the "links:" command on www2 and www3, but there appears to be no boost at all in my rankings. Is that because PR hasn't yet been recalculated and factored in, or am I just wishful thinking ;)
Anyone else seeing this?
Any input would be appreciated :-)
Hi guys, there is another thread going regarding disabling the filter with &filter=0 - look there for those questions.
A mod or Bret may decide to merge these later though.
Try searching with link: followed by the domain rather than links: - links: is just a regular search (i.e. pages that contain the word links: and your domain), while link: is a search for links to the domain.
I think the changes are here to stay. I am starting to smile again. Not because my links just dropped another 50%, but because I noticed some of my competitors drop from 98000 links to about 2000.
Also watched some of the biggest players of widgets go from 3500 down to 320.
So I guess the right approach for the situation would be to say that things are becoming equal?
Leveling the playing field, and somewhat the relevance.
And it's only a matter of time before everyone figures out how to crack the system "again" and Google runs off to do the dance "again".
Same here. I was really jazzed that I got several PR7 links to my site that are directly related to my topic. Before Dominic, I was PR3. Then after Dominic, I was PR0, which didn't concern me (too much) as the site was still fairly new, and GG told us not to worry about PR for a while. So during the long wait between Dominic and Esmeralda, I got quite a few backlinks, including the PR7 links, and was anxious to see what the new dance would bring. Now, checking my backlinks and PR on the -fi datacenter, the backlinks are showing up (including the really good ones!), and it is no longer a PR0 (hooray), but it is back to being a PR3 (darn!). All that work for nothing. Sigh...
Thank you for your response. I have contacted these web sites, but I'm afraid it may be easier to just change the content on my site around. The sites that copied me are hosted in other countries, and I can't sit around and wait for them to change their sites, if they ever do.
If I change my page titles and text within the page would it be possible that freshbot will pick it up soon and let me pass through this filter?
This is a terrible frustration. I feel like my site has been unfairly punished.
This filter makes it easy for a dishonest person to destroy their competition.
<<Is that because PR hasn't yet been recalculated and factored in, or am I just wishful thinking ;) >>
Unfortunately, it is wishful thinking. SERPs may still shift a bit before the index goes live on WWW, but the links you see are certainly factored into the new index.
It is a frustrating update; a lot of hard work and link gathering seems to be going unnoticed in the rankings.
I meant "link:" command. As I say, the links are showing up, but I just can't believe that they have been factored into the new index.
If they have been taken into account, then wow, PR must have much less importance in the algo this time around.....
@wbienek: welcome to WebmasterWorld, great posting!
>This filter makes it easy for a dishonest person to destroy their competition.
Interesting that Google requires this to be done by snail mail.
Always happens when I'm offline :(
- more new stuff in (as announced around Dominic), but not really up-to-date until now
- link count diminished, external and internal, less doubles (as announced around Dominic)
- specific key phrases often better than more general key words (as announced around Dominic)
- sub pages sometimes performing better than index pages (as announced around Dominic), sometimes with very curious results though
- over-optimization doing worse (in my small environment; as announced around Dominic)
Sure it's waaaay too early ...
this update seems very complete page-wise from what I am watching, listing basically all my new sites and all their pages :-)
If backlinks are complete, then I am really impressed - -fi now lists a very small fraction of the links I collected, but they are the really valuable ones, in the sense that people intended them as a reference. How any algorithm could figure that out is beyond my imagination, though. I am not talking about guestbook links. It has nothing to do with PageRank or obvious link farms either.
So I really think, backlinks are not all in...
Karsten, Try this search and see what happens:
allinurl:domain.de -ahshdfh (make these last letters whatever you want as long as they're not in the site).
>> sub pages sometimes performing better than index pages <<
So what are the criteria that Google is using to select sites for this 'special treatment'? Still no answer to that - the evidence is actually much better at illustrating which criteria are NOT being used.
I have the details of a variety of sites of all shapes and sizes. Granted, some look a little close to the wire on a few things, but some are very boring and appear to be whiter than white.
I've looked pretty closely at the latter in particular, as 'over-optimization' has been suggested. No evidence at all to support that theory.
For one, the keyword in question has a pretty low density, and although the keyword appears in a fairly high percentage of incoming links, it certainly isn't overwhelming.
Some of the sub-pages doing well at their expense are more than peculiar. Not just because they are contents/links/etc. pages, but because for some I can see no links (internal or external) that include the keyword. Why Google is selecting these for elevation is anybody's guess.
It could be the sample size, but none of the sites I have been passed are older than, say, 15 months. I can't really see how that is a factor, but it is worth noting anyway.
Maybe a good night's sleep will help clarify a few things: my head is spinning with links and anchor text.
Again, only the index page plus one page "product_01".
And searching with "www" in the searchterm, the index page plus one page "product_02".
|but the most reliable way is to make sure that no one has copied your content without your permission. |
GG, even with permission, Google has no way of determining which is the original and which is the copy...
The best policy I'm afraid is only to give permission to a trusted source who agrees to block google in robots.txt.
Or just deny permission.
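If a trusted source does agree to block Google before republishing your content, the entry in their robots.txt would look something like this (a minimal sketch; Googlebot is the user-agent Google's crawler announces itself with):

```
User-agent: Googlebot
Disallow: /
```

That keeps their copy out of Google's index entirely, so only your original is ever seen by the duplicate-content filter.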