Forum Moderators: phranque
Page A is about 'mens hair gel' and is optimised to the max for the term 'mens hair gel' (but without many incoming/outgoing links), and ranks in the top 30 for that term.
Page B is from a different domain on the topic of music (without many outgoing/incoming links) and has one link to page A (as well as about 4 other links) of the form <a href="http://www.blah.com/hair/gel.html">mens hair gel</a>
Now, when I look in Google for 'mens hair gel', Page B ranks higher than Page A! I think this particular piece of evidence suggests that Google is giving much more relevancy weight to pages which link *to* pages on the theme, rather than to pages that are linked from other pages with an appropriate link. The strange thing is that there is absolutely *no* way either of these pages could be classified as a hub since, as I said, they don't have many links (typically no more than 5 per page).
So, it's not a hub and it's not an authority ... what is it? Maybe we should call it a 'mini-hub'
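For anyone unfamiliar with where the hub/authority terminology comes from (Kleinberg's HITS algorithm), here is a minimal sketch of how the two scores reinforce each other. The tiny link graph is invented for illustration and loosely mirrors the Page A / Page B situation above:

```python
# Minimal HITS sketch: a hub scores well by linking to good authorities;
# an authority scores well by being linked from good hubs.
# The toy link graph below is invented for illustration.
links = {
    "music_page": ["hair_gel_page", "band_a", "band_b"],  # like "Page B"
    "hair_gel_page": [],                                  # like "Page A"
    "band_a": ["hair_gel_page"],
    "band_b": [],
}

hub = {p: 1.0 for p in links}
auth = {p: 1.0 for p in links}

for _ in range(20):  # power iteration until the scores settle
    # authority score = sum of hub scores of the pages linking in
    auth = {p: sum(hub[q] for q in links if p in links[q]) for p in links}
    # hub score = sum of authority scores of the pages linked out to
    hub = {p: sum(auth[t] for t in links[p]) for p in links}
    # normalise so the scores don't blow up
    a_norm = sum(v * v for v in auth.values()) ** 0.5 or 1.0
    h_norm = sum(v * v for v in hub.values()) ** 0.5 or 1.0
    auth = {p: v / a_norm for p, v in auth.items()}
    hub = {p: v / h_norm for p, v in hub.items()}

print(max(auth, key=auth.get))  # the most linked-to page wins authority
print(max(hub, key=hub.get))    # the page linking to it wins hub
```

Note that in pure HITS the hub score is a separate quantity from the authority score; a page with only 5 outgoing links can still be the best hub in a small neighbourhood, which is roughly what the 'mini-hub' observation amounts to.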
I have seen other threads talking about similar ideas http://www.webmasterworld.com/forum3/996.htm [webmasterworld.com]
Anyone else see something like this?
Now, I wouldn't put it past a SE to say one thing and do the other, for the mere point of throwing people like us off.
I think that to conclude something like this, you need at least a few more examples of it happening.
I agree with you here, but in my case I have a lot of evidence. This is happening across the board for several of my domains. I'm not jumping to any conclusions yet, but I always like to assume that what I think is happening is true and then try to prove or disprove myself.
BTW, this topic is also being discussed in:
"when i type my site name it will show the home page link on the 6-7 page and rank sites linking to me higher" [webmasterworld.com]
>Which one has the most PageRank?
Will check the page rank of the different pages tomorrow and let you know.
From a seemingly unrelated post: Sponsored listing, good results? [webmasterworld.com]
"Just the last reindex hit me hard, but I am still hoping that was some technical difficulties at Google. (I do not see a logical reason to rank my site in 21st position for a keyword which is a combination of three very specific words that are my business name. Most of the 20 sites before me are sites which exchanged links with me, and that business name appears somewhere on their links pages.)"
Again we see the sites linking to another site ranking higher than the actual site. Perhaps someone at Google made the mistake of giving hubs way too much importance ... i.e. any site linking to a site on a theme is considered a hub and very important. Maybe.
My ranks are holding on many competitive kw's, I haven't seen any appreciable "slip", but all of my inbound links are more or less legit.
I don't think it would be too hard to come up with a baseline for inbound links and penalize those who far exceed it.
This could certainly be the case... but how? On what criteria could Google determine which are spam links and which are legitimate?
OK, link farms are fairly obvious, and there is usually a reasonably clear pattern to grab hold of. But the sites that I have seen suffer recently just have a random selection of links from other sites (albeit sites owned by me as well). I have been careful not to have exactly the same links on every site, and have even thrown in 'wobblers'... links to DMOZ and similar.
What you say has been a worry for some time, so it wouldn't come as a shock if this was happening... but I just can't get my head around the criteria that Google would use. Any further ideas on this?
The next index is going to be fascinating! We might have a different word for it afterwards of course.
I don't think it would be that hard to just use simple averages. Let's say, for instance, that the average site linked to from DMOZ is also linked to from 50 other sites, and you have a handful of sites with over 3,000 inbound links; you could draw some fairly fast and accurate conclusions. I'm sure they have applied a bit more logic to the equation, but that looks like the basis.
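The "simple averages" idea could be sketched like this. The link counts and the 10x threshold are invented for illustration; a real engine would obviously segment by topic and use far more signals:

```python
from statistics import median

# Hypothetical baseline check, in the spirit of the post: compare each
# site's inbound-link count to what is typical for its peers and flag
# extreme outliers. All numbers here are invented for illustration.
inbound_counts = {
    "site_a": 40,    # ordinary sites, roughly the "DMOZ average"
    "site_b": 55,
    "site_c": 62,
    "site_d": 3000,  # the kind of outlier the post describes
}

# the median is a robust "typical" value: unlike the mean,
# one huge outlier cannot drag it upward
baseline = median(inbound_counts.values())

# flag anything more than, say, 10x the typical count
suspicious = [site for site, n in inbound_counts.items() if n > 10 * baseline]
print(suspicious)  # ['site_d']
```

Using a median rather than a plain mean matters here: with a 3,000-link site in the sample, the mean itself gets dragged up so far that the outlier no longer looks unusual.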
I once contemplated creating a plethora of "related" sites that would point to my main sites for the purpose of link pop, but decided that in my categories, I would probably stick out like a sore thumb and eventually end up with one. Besides, I don't really think that building sites with "find and replace" is very much fun :).
I think it helps to open the circle and get links coming in that you don't link back to. And of course those directory links are really helpful. Even cloaked doorway domains can get links from vortals, if you submit a page that provides the content the vortal is looking for.
I see no way any algo could detect anything excessive here. And I was hit sooo hard. For some of us these changes happened with the previous update (late June). Did the last Google changes have any effect on those pages? Andrew, others?
starec, I noticed the changes right when Google started the index update in late June.
This is good [webmasterworld.com...]
#1 - These search engines don't work! It all sounds good on paper, but when you go to use them their results are inconsistent at best. It's a lot like Microsoft Windows in that regard...
#2 - The problem they are trying to solve is not well defined, and so their algorithms are very 'fuzzy'--by that I mean prone to constant tweaking. Like others said, perhaps the one stuffed to the gills with 'hair gel' was being penalized for trying too hard. Perhaps some outright bug made it rank the music site higher. Who can say for sure...
I never said that sites are being penalized for excessive popularity; they are being penalized for excessive links (there is a difference).
I could point 5 to 10 thousand inbound links to my sites, but I really don't think that would make any of them more "popular". It would probably just look like spam to the goog.
What is your strategy now, wait for google to hire the right engineer? :)
I don't think they made any errors; this all fits with the "hilltop" algo, which not only counts inbound links but weighs the quality of them. Weighing sheer numbers of inbound links without weighing their quality was a loophole that is now closing; it was never a "policy". It was exploited, it skews results, and it is being addressed.
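The difference between counting inbound links and weighing their quality can be sketched like this. The sites, weights, and quality signals (on-theme, directory-listed) are all invented for illustration; they stand in for whatever expert-document test a hilltop-style scheme would actually use:

```python
# Sketch: raw link popularity vs quality-weighted popularity.
# An on-theme link from a directory-listed site counts for far more
# than a random off-theme link. All data and weights are invented.
inbound_links = {
    # each link is (source_is_on_theme, source_is_directory_listed)
    "honest_site": [(True, True), (True, False), (True, True)],
    "link_farmed_site": [(False, False)] * 200,  # 200 junk links
}

def raw_score(links):
    # sheer numbers of inbound links: the loophole being closed
    return len(links)

def weighted_score(links):
    score = 0.0
    for on_theme, listed in links:
        w = 3.0 if on_theme else 0.05  # off-theme links nearly worthless
        w *= 2.0 if listed else 1.0    # directory listing as a quality proxy
        score += w
    return score

for site, links in inbound_links.items():
    print(site, raw_score(links), weighted_score(links))
```

Under the raw count the link farm wins 200 to 3; under the quality weighting the three legitimate links outscore the 200 junk ones, which is the shift the post is describing.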
This is why we start with our directory links. The site should be good enough to pass a human editor. I consider the ODP, which is not influenced by $ because the listing is free, as my best gauge. Then, try for edu and gov links before moving on to the industry and vortal links. This shows quality above quantity, although I wouldn't dismiss any on-theme incoming links. I think those show popularity in their sheer variety, and they mostly mimic what natural linking would produce, given time. Unfortunately, we all want a quicker fix than would come naturally and have to force the linking the best way we can. It's not easy and often not fair, but it is reality.
If I had a chance to point 5 to 10 thousand inbound links to my site I would do it without hesitation (assuming that all of them are legitimate links from sites which have the same theme, have good PageRank and are listed in the major directories: Y!, L$, ODP...). In my opinion it would never constitute spam to Google or any other engine.
My strategy for now is to wait patiently for next reindex without making any changes to my site.
I hope you are right about "hilltop" algo and I hope that Google is starting to utilize it. When I check my rankings in [teoma.com...] they are the same as on Google before that last strange reindex. I am #1 or #2 for all my keywords and that is why I am really lost with all the speculations about new Google algo.
Just to touch on the topic of too many or excessive links:
I have over 100 subdomains from one site that has been in Google for nine months, approx. 9k pages in all, with every keyword term possible; basically a big hallway-page city. When updated for the first time, these sites stayed in positions 11-20 for the keywords they were optimized for. Now, this is the kicker: each page, and I mean each page, has all the links of that subdomain, say 120 links on every page. With this new update, all my other sites that have enjoyed first-page results were pushed back to the 2nd and 3rd page results, and my overly excessive link farm that was on the 2nd and 3rd page results hit the top ten results, and mostly the top five results: "every page has 120 links on it pointing back to itself".
Go figure... my other real sites, perfectly optimized, are on the 2nd-3rd page results, and those sites are in Yahoo, DMOZ, etc. The link farm is listed nowhere and does not have a link from any other site I own; it is a stand-alone farm on its own.
So in closing, the excessive-links theory is out the window.
In my opinion, from my Google experience, all our sites which have been there a very long time, say since before Christmas, have gained ultimate popularity; even sites that were not optimized properly gained higher rank. But the new sites which were doing great in the top ten and top five took a hit. Why, I don't know; I think it is a soup of things going wrong at Google, affecting everyone differently. Also to note: new sites just crawled and updated in the recent update hit the top ten right off the bat. I myself am confused, because when you read these forums everyone has been affected in different ways, and perhaps the sites with small page counts did not really change at all.
OK nube, let's slow down and help me out with this one. When you got to the kicker, I got lost somewhere down on the farm. Help me get clear on this, step by step:
>each page, and I mean each page, has all the links of that subdomain, say 120 links on every page,
So this means that there are 120 pages on the subdomain, and they all link to each other from each and every page - all internal linking within a given subdomain. Is that right?
>with this new update all my other sites that have enjoyed first-page results were pushed back to the 2nd and 3rd page results
All your other sites that had good listings got pushed back. Do I have that right?
>and my overly excessive link farm that was on the 2nd and 3rd page results hit the top ten results, and mostly the top five results: "every page has 120 links on it pointing back to itself"
This one, the one that has all 120 pages, all within the same subdomain, with all the pages linked to each other, moved up to top ten, mostly top five. Do I have that right?
If so, that means that the other sites moved down, but the one with the internal links moved way up? Right?
That's not a link farm with a kazillion unrelated sites all over the web linking to each other, that is one site with heavy internal linking. That's how I interpret what you're saying.
On this most recent update, I put a site at #1 and #2 for several keywords. When I was looking over the site prior to the update, my best projection for it was to come in around 10-20, based on an analysis of the competitors' inbound links (they had way more). I was pleasantly surprised by the high ranks and attribute it to the "hilltop effect", where quantity and QUALITY are part of the equation. I believe you can be penalized for excessive low-quality inbound links. At least that is what I see. :)
The only site that outranked this site is Amazon; I wonder if they have some sort of immunity?
Maybe that is Google's plan: let the sharks eat each other :) .
Seriously, I think the term "penalize" may be incorrect. I think what we are seeing is that the goog is now differentiating between what I would call a "shill" page and a true authority.
To illustrate what I mean:
A shill page would look something like this:
An authority page would look something like this:
I'm not sure a "penalty" is being assessed for the shills, but I don't believe they are being rewarded either.
My guess is you have a lot of "shills" and they no longer carry any weight.