P.S. I won't be posting as often (gotta work, ya know :), but I will be checking this post and chiming in when there's something I can add.
The better question might be, "Does PR play a significant role in the ranking of pages anymore"?
Seriously, though. GoogleGuy is not going to answer questions about how your pages are scored. There really is no reason to ask why your page isn't doing well in the new index when it is a question he simply cannot answer.
GG, since you are commenting on weird SERPS, any thoughts on [www3.google.com...]
Even the "horse links search engine" ranks above you guys now. :)
The example seems silly, but in reality it is a symptom of a MUCH larger problem with how Google is ranking sites now...
[edited by: mfishy at 5:49 pm (utc) on June 16, 2003]
Read this again and again. It's correct, after watching the new algorithm and comparing over 60 clients of mine at various stages...
There are LOTS of examples like this where Google returns poor results. The reason is that Google is starting to IGNORE what a page says about itself if it says it too much (what you'd call spamming or over-optimization), and if the links pointing to a page all say the same thing, e.g. <A href='www.yourdomain.com'>What You Say</a>, they're either ignored or not counted. This is their solution to 'link spamming': they figure a link text that appears only once or twice is 'original', while identical links are just part of a link campaign.
I still think that this is a fairly big problem for Google, not being able to rank them the same when they are obviously the same page on every site!
I'm sure I'm not the only one with a site like that, and it could explain a lot of why people moan that their site isn't listed as well as they think it should be for certain words.
I wonder if perhaps GG could answer, or ask the tech guys at the 'plex?
If not, it's my question for GoogleGuy at the next Q & A!
[edited by: dazz at 5:57 pm (utc) on June 16, 2003]
I wonder if we are going to see the new data set move slowly across the datacentres like in the Dominic period - pre-Dom I would have expected some of the other datacentres to be showing the new data by now.
I would also be interested in whether we can compare PR after Dominic to PR before, or is it like backlinks - comparing apples to oranges?
For example, before Dominic a site was showing PR4, during Dominic the PR went to zero (due to being a new site, I guess?), and now the PR looks like it is going to be a 3. Can we say that this site has gone down a rank over a three-month period, or would we say that the site has gone up 3 ranks and not worry about PR before Dominic?
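For what it's worth, the usual assumption is that the toolbar digit is just a logarithmic bucket of some underlying raw score, which would make digits from different updates hard to compare directly. A toy sketch of the idea (the log base and the raw numbers here are pure invention on my part, not anything Google has confirmed):

```python
# Toy illustration of why toolbar PR may not be comparable across updates:
# if the toolbar digit is a logarithmic bucket of a raw score, then any
# rescaling of the raw scores between updates shifts every bucket.
import math

def toolbar_pr(raw_score, base=8.0):
    """Map a raw score (>= 1) onto a 0-10 toolbar-style bucket."""
    return min(10, int(math.log(raw_score, base)))

raw = 5000.0
print(toolbar_pr(raw))          # bucket before a hypothetical rescale
print(toolbar_pr(raw / 10.0))   # same page if raw scores shrank 10x
```

On this model the site "dropping" from PR4 to PR3 could just as easily be the whole scale moving under it as the site itself losing links.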
1. The hidden text filters still aren't in play...I wonder how much longer before these puppies are let out?
I just found a major regional newspaper with 20K+ of hidden text/links on every single page - nothing fancy just the text/link color set to white on a white background for all sorts of irrelevant stuff.
2. DeepFreshBot doesn't seem to crawl as deep as DeepBot. Maybe DeepFreshBot needs more steroids or a better map to find its way around?
3. A lot of minor search engines and search facilities are getting indexed, with dynamic URLs generating true backlinks that use the title text of the site as anchor text. At the moment this only seems to be having a minor effect on SERPs, but as more pages of search results get indexed we will have another form of spamming to compete with/be forced to take advantage of.
Yes, it may work in theory, but it poses two major problems. Suppose Jim's Widget World is linked to a LOT with that exact text, since it is the name of the site... so the site would not come up in a search for Jim's Widget World? If CNN, AT&T, AOL, etc. are all linked to 90% of the time with their name in the link text, then under the above theory they wouldn't come up for their own names. Not really a great solution.
What if a site sells widget covers? Chances are, that phrase will be in the link text a lot... there are way too many ways this would DECREASE the quality of the SERPs.
In addition, it leaves the door WIDE open for sabotage. All I gotta do is make a ridiculous number of links to a competitor's site with the target keyphrase in them, and I get them penalized for that search term? Again, not a super solution.
If this is what Google has done, they seriously need to rethink it, or at least find some way for this to count only on reciprocal links, since we can't control what link text other sites use when linking to us.
[edited by: chrisnrae at 6:16 pm (utc) on June 16, 2003]
Am I actually, unknowingly, asking Google to ignore my index page? I just changed servers and this was in there automatically.
I also think, speaking of penalties, that the NEW game includes 'mini' penalties. Don't think of a PENALTY as one big slap where you're at the very bottom or out of the index. It seems that various PHRASES attached to a SITE get the 'PENALTY'.
(Again, this is just a theory - one that I have observed on over 60 client sites on various subjects)
You have a site about neckties.
You go to Overture and find out that when it comes to neckties, the phrase "silk neck tie" does pretty well, and since you sell this type of tie you decide to OPTIMIZE for it.
You title your site : Silk Neck Tie
You don't spam your metas, but only use the phrase 'Silk Neck Tie Sales'
You include 'SILK NECK TIE' in your body text and alts and the whole nine yards, to a density of over 20-40% according to http://www.keyworddensity.com
You then contact hundreds of sites with good PR and get them to link to you with :
<A href='www.yoursite.com'>Silk Neck Tie</a>
Bam... goofy sites barely about silk neck ties come up before you. But strangely enough you come up great for other things, like:
wholesale neck tie (mentioned once or twice, with maybe 1 or 2 links like <a href=www.yourdomain.com>wholesale neck tie</a>)
(You wanted to come up a little for that, so you asked a few sites for that link and mentioned it a few times on your homepage.)
See what's happening now?
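If you want to check the kind of density figure a tool like keyworddensity.com would report for a phrase, a rough sketch is easy to write (the page text and the counting rules here are my own invention, not the tool's actual method):

```python
# Rough keyword-phrase density check: percentage of the page's words that
# belong to occurrences of the phrase. Sample text is made up.
import re

def phrase_density(text, phrase):
    """Return the percentage of words in `text` covered by
    non-overlapping occurrences of `phrase` (case-insensitive)."""
    words = re.findall(r"[a-z']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    hits = 0
    i = 0
    while i <= len(words) - n:
        if words[i:i + n] == phrase_words:
            hits += 1
            i += n          # skip past the matched phrase
        else:
            i += 1
    return 100.0 * hits * n / len(words) if words else 0.0

page = ("Silk Neck Tie Sales - buy a silk neck tie today. "
        "Our silk neck tie store has every silk neck tie style.")
print(round(phrase_density(page, "silk neck tie"), 1))  # well over 40%
```

A page written like the necktie example above blows straight past the 20-40% range, which is exactly the territory the theory says gets you into trouble.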
There are LOTS of examples like this where Google returns poor results. The reason is that Google is starting to IGNORE what a page says about itself if it says it too much (what you'd call spamming or over-optimization), and if the links pointing to a page all say the same thing, e.g. <A href='www.yourdomain.com'>What You Say</a>, they're either ignored or not counted. This is their solution to 'link spamming': they figure a link text that appears only once or twice is 'original', while identical links are just part of a link campaign.
You seem pretty sure of that. I have pages that are #1 for their search term and every single incoming link (they're all internal) uses the same anchor text (which is exactly the keywords they are listed under). They're still #1 in the new index.
For any site in general the title of the site will be the most common link text, so just about every site would be penalized under your theory.
Best come up with a new one to explain your problems.
[edited by: bnc929 at 6:29 pm (utc) on June 16, 2003]
Uh, I'm no old pro, but in the time I have been paying attention, the datacenters have never all gone live at once with the new index.
The 'dance' is the spreading of the new information across the 9 datacenters.
Although there are 9 datacenters, two of them are either inactive or simply point at the remaining seven.
Is your site super-optimized for only a few (or one) phrase? The connection requires high optimization of a phrase PLUS the corresponding phrase in the links.
Show me some sites that this is the case for and I'll admit the theory is wrong..
20-40% overall optimization.
The same link phrase seems to be fine as long as the site's optimization for that phrase isn't too high.
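To make the theory concrete, here's a toy model of the claimed filter: a phrase only gets the 'mini penalty' when BOTH the on-page density AND the share of identical inbound anchor texts are high. The thresholds, function names, and data are entirely made up for illustration:

```python
# Toy model of the two-signal theory: flag a phrase only when on-page
# density AND identical-anchor-text ratio are both above a threshold.
# Thresholds and sample anchors are invented, not Google's.

def flagged(on_page_density, anchors, phrase,
            density_limit=20.0, anchor_limit=0.8):
    """Return True if the phrase trips both over-optimization signals."""
    same = sum(1 for a in anchors if a.lower() == phrase.lower())
    anchor_ratio = same / len(anchors) if anchors else 0.0
    return on_page_density > density_limit and anchor_ratio > anchor_limit

anchors = ["Silk Neck Tie"] * 9 + ["great tie shop"]
print(flagged(35.0, anchors, "silk neck tie"))      # both signals high
print(flagged(5.0, anchors, "wholesale neck tie"))  # density low
```

Under a two-signal rule like this, the BrandX objection loses some force: a brand name in every backlink is harmless unless the page is also stuffed with it.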
The problem is too widespread to be caused by that, and according to the cache, hardly any of the sites I have been sent so far have changed in that area.
I'm still trying to find a pattern. I have to say though that so far, none of the sites I have been sent are old, really established sites. That's the only correlation thus far, but it's early days in my analysis.
I think all you can do at present is sit tight and see what unfolds (I'm far from being the only one exploring and trying to find a cause). Certainly, I wouldn't start making big changes to your site yet.
Dominic was the first update where Google started to make it difficult for optimizers, especially newer ones, who were finding it increasingly easy to raise their rankings by following a few simple "rules". "Optimizers" were hit as well as spammers. They tweaked their algo to make it more complex (somewhat similar to what wibeinek is talking about) and are now tweaking it to make it more relevant after the roadworks.
This accounts for most of the complaining happening at WebmasterWorld, where many SEOs, especially part-time ones (like me!) hang out, but there's much less evidence that Joe User found the results less useful. It may also account for index pages going missing in the rankings, as the index page is the one most people optimize for. It also accounts for the strange results people reported, especially in highly competitive areas, since the algo change was significant. And it may account for the fact that most people who were complaining had done a lot of SEO work recently, added pages, or had new sites using tips discussed here. That's a generalization of course, but it seemed to be the case in many posts.
Just a theory, and I don't hold too dearly to it, but hey, this is a fun thing to discuss. Please rip it apart.
Also, again, the issue of the site title and featured products comes up. Suppose a site sells oranges... you can guarantee almost all of its link text will include that word... under the theory, it wouldn't come up for oranges.
If a site sells widget cases and NO other products, and is called Widget Case World, you can bet "widget case" will be in 100% of their backlinks... and Google will penalize them on the "widget case" phrase, when that is what their site is all about and the only thing they sell?
I honestly can't see Google being blind enough to miss the major problems that would arise from the above theory. But then again, I never expected Google to be drunk for yet another month either ;).
Is your site super-optimized for only a few (or one) phrase? The connection requires high optimization of a phrase PLUS the corresponding phrase in the links.
They're individual pages, not entire sites. The individual pages have PRs of 5 or 6, all incoming links are internal, and all incoming links use the same very competitive two-word phrase. The on-page content is heavily optimized for this same phrase, with the keywords in the title, multiple header tags, body content, and meta tags (not that meta tags matter with Google).
There are around 30 such pages, each optimized for its own two-word phrase. All the pages that have been around for more than a couple of weeks (I just added about 20 of them) are highly ranked on their phrase (which is very competitive); many are #1.
If there ever comes a time when internal links are devalued, or links with duplicate anchor text are devalued, or PageRank/anchor text in general is devalued, then these pages should take a ranking hit. As long as they do not it is my conclusion that there has not been any major change to any of the above.
Correct. Google is not going to penalize for that. I have always maintained: look at Google itself to learn how to optimize. For example, check out their services page (/options) linked from the index; Google has a keyword density of 38% there, and I feel sure that most links to it have "Google" in their anchor text.
I certainly won't do that, because it has merit. However, in the back of my mind I keep thinking about BrandX.
No I've not gone mad... bear with me.
BrandX is a product (fiction - substitute what you want). The point is that it is widely known by this name. Almost exclusively known by it.
It is obviously going to have BrandX plastered all over the site itself, but also 99% of links to the site will have BrandX as the anchor text.
Given that, how can Google reasonably decide that it is somehow 'over optimized'? How does it set a threshold without netting major products which share this scenario?
I just can't see it.
Maybe I will if it turns out to be the only answer though!
The drop in backlinks has been significant across a couple of my sites - but the links still exist, Google just doesn't count them anymore.
Yet, checking some other 'dubious' sites, backlinks have gone up on -fi - page after page of guestbook links.
I really doubt, for the reasons stated above (BrandX etc) that this would ever be considered a viable spam filter.
However what you may run into is a case of diminishing returns.
Possible on-page content scores might resemble a logarithmic curve, with keyword density on the x axis and score/rank on the y axis. The marginal benefit you obtain from each additional keyword instance then grows smaller and smaller. This would be one way to make sure that keyword density doesn't outmuscle other ranking factors.
Just a hypothesis.
I looked at a couple dozen blogging sites belonging to A-list bloggers, and it looks like most of them have lost up to half of their backlinks with this new update.
Typically these sites have a blogroll on the home page pointing to other A-list bloggers.
Also, since the popular blogging software is hosted in only a few places, many bloggers don't have their own domain, but rather an ID number that becomes their directory at some central blogging site.
Trying to separate the physical domain space from the logical domain space might be difficult in these cases. If I were Google and trying to reduce blog noise, I'd specifically identify those centralized blogging sites in a config file. But Google's whiz kids probably thought they could do it by algo instead of by config file, and it doesn't work as well as they thought it would.
(This theory is, of course, pure speculation -- but so is almost everything else in this thread.)
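A sketch of the config-file approach I'm speculating about: down-weight any link whose host sits on a hand-maintained list of centralized blog hosts. The host names and weights below are examples only, not anything Google is known to use:

```python
# Speculative "config file" approach to blog noise: links from known
# centralized blog hosts get a reduced weight. Hosts/weights are examples.
from urllib.parse import urlparse

BLOG_HOSTS = {"blogspot.com", "livejournal.com", "typepad.com"}

def link_weight(url, normal=1.0, blog=0.2):
    """Return the weight a link from `url` would carry."""
    host = urlparse(url).netloc.lower()
    # Match the listed host itself or any subdomain of it.
    if any(host == h or host.endswith("." + h) for h in BLOG_HOSTS):
        return blog
    return normal

print(link_weight("http://someuser.blogspot.com/2003/06/post.html"))
print(link_weight("http://www.example.com/page.html"))
```

This would explain the pattern of A-list bloggers losing half their backlinks while ordinary sites keep theirs, but again, it's guesswork stacked on guesswork.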
An example of where the theory is wrong is this:
A page over-optimized (or highly optimized, depending on your manner of SEO) for a phrase, without backlinks, might be considered spam... but it does have backlinks... Google just hasn't calculated backlinks yet... so when it does, the pages will no longer be 'penalized' for that phrase.
It's too early to really be drawing conclusions. Even conclusions at the end of the dance will be flawed, because as you guys know, you need MANY months before you can find true patterns!
I would LOVE to say Google has devalued the TITLE tag... but I honestly can't, because it's TOO EARLY!
I would love to say Google is giving more value to META tags if they're not OVERLOADED (too many words), but it's too early.
I would love to say Google is watching for the same link text appearing too often, especially if you used it in 'link farms' or 'guestbooks', but it's too early.
I would love to say that Google is using a new algorithm that looks for cross-relevance, i.e. a kind of thesaurus for words (basketball is related to sports, especially when looking at 'themes' for site linking), but that will probably never be known.
I would love to say that I'm full of crap, and that's probably the most accurate thing in this post! ;)
One GOOD thing I've noticed is that they're adding new sites again, and that's positive. It seems the logjam is ending. And regardless of the changes we have to make to play the game, it's a good thing that new sites have a chance of getting in now!