Forum Moderators: Robert Charlton & goodroi
From my experience, I lost nearly half my traffic in a single day, with several good keywords dropping 20-25+ positions in the SERPs.
Reading the messages in this thread, it would seem the cause is over-linking in the internal site structure. I tried putting a 'popular pages' menu on all my pages: 10 links on every page, which I thought could be a good way to help my 10 best-converting pages. I guess posting so many links raises two problems: the pages all look alike, with many identical or similar links, and it also creates a spam-like site structure.
I don't know if you would all agree, but it seems that a clean structure (Homepage > subdir > pages) is clearly the way to go as far as linking is concerned. I mean that pages should not all point to a series of other same-level pages?
I guess it's no longer a good idea to put a couple of popular links on every page to help them, because one third-level page should not point to another third-level page? Or is the problem more that I pointed to 10 pages from all 300 of my pages?
Could it also be that I put up too many links in the same crawl?
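For what it's worth, the 'clean' hierarchy I'm describing can be sketched in a few lines (the page and category names below are made up, just to illustrate the linking pattern):

```python
# Hypothetical sketch of the Homepage > subdir > pages structure
# discussed above; page names are invented for illustration only.

def build_clean_links(categories):
    """Each page links only to the homepage and its own category."""
    links = {"home": list(categories)}
    for cat, pages in categories.items():
        links[cat] = ["home"] + pages
        for page in pages:
            links[page] = ["home", cat]
    return links

site = {
    "widgets": ["widgets/red", "widgets/blue"],
    "gadgets": ["gadgets/mini", "gadgets/maxi"],
}

links = build_clean_links(site)
print(links["widgets/red"])  # ['home', 'widgets'] -> no same-level cross-links
```

Compare that with my 'popular pages' menu, which instead stamped the same 10 cross-links onto every one of those leaf pages.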
Thanks for the answers, and I hope the results are not yet stable, so I can regain a couple of places in the SERPs!
Have a nice day (quite ironic in my case but....)
Alex
I'll take the traffic loss instead. Again, I was in first place for my keyword combination, and I've completely dropped from sight.
I designed my site for my visitors. If Google can't tell a good site from a spammer network, it's Google's problem, not mine.
SEO is a necessary evil, but there is a limit to the amount of time I'm willing to invest in re-designing my site for Google. Especially since I have no idea if I'm moving in the right direction. Also, today's right direction is tomorrow's wrong turn.
I just don't know what else I can do.
I totally agree with you. With this "minor shuffling", I didn't get hit all that bad... definitely not as bad as you. However, what burns me is the amount of time required to maintain a website if you want to rank well on Google. Besides all the guesswork involved... if you optimize for Google, you sacrifice rank on Yahoo and MSN.
Google is like that professor in college for whom you could never get the essay or term paper right.
Never been in the so-called sandbox, but I have just about decided I am going to "sandbox" Google and quit worrying about them... way too much instability. Content and ease of navigation that converts into sales is what I am after... not a #1 position on Google with almost zero conversion.
I feel these recent updates are serving to penalise a lot of normal websites.
It seems that some of their assumptions about content (especially duplicate filters) are poor - but after all this is simply a mathematical algorithm that will always be inferior to human judgement on a case by case basis.
But that's Google...
It was a sad day when Yahoo et al decided to put a minor emphasis on their human reviewed core business directory.
What are PhDs for anyway....?
It is discouraging to say the least, particularly when you must be above the fold on page one to get noticed by very many..... out of a gazillion SERPs. Keeping up to date with the guesswork is very time consuming, and while Brett's white hat rules are 90% of it, the extra (white hat) work moves me from page 2-4 to page 1-2. With luck. For a little while. Until the next minor adjustment....
Two things astound me:
That I EVER get on page one with a 100-page Mom and Pop product.
That the SERPs are not 99% garbage.
Without minor adjustments neither of the above could happen because the first pages would be total trash, permanently. I don't know what else Google could do to keep the slime from floating to the top.
Then again, not being an AdSense Master, it would be nice to make a living selling the widget while keeping my fingernails!
<By reading the messages on this post, it would seem that it is caused by over-linking in the internal site structure.>
Here in this thread, as in several others, we are mostly assuming and guessing, and as far as internal linking is concerned, I doubt it could trigger any spam filter in Google.
We could assume many things about Google, but there is no logic in assuming that Google can't differentiate between inbound, outbound and internal links. Let's not underestimate the intelligence of Google engineers :-)
On most of my pages I have a menu with more than 100 internal links, and some of those pages hold top positions in the SERPs for competitive keyphrases. And none of my site's pages appear as supplemental results (duplicates) on Google at present.
I add the internal links menu to make my site user-friendly, enabling visitors to navigate easily and reach target pages within no more than two clicks. And there is logic in all that. Don't you agree?
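That two-click claim is easy to check mechanically. Here is a minimal sketch (hypothetical page names) that measures click depth from the homepage with a breadth-first search:

```python
from collections import deque

def click_depth(links, start="home"):
    """Breadth-first search: minimum clicks from the homepage to each page."""
    depth = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

# Hypothetical site: home links to two category menus, which link to pages.
links = {
    "home": ["products", "support"],
    "products": ["products/a", "products/b"],
    "support": ["support/faq"],
}
depths = click_depth(links)
print(max(depths.values()))  # 2 -> every page within two clicks of home
```

If the maximum depth ever exceeds two, some page needs a link from the homepage or a category menu.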
Let's not allow assumptions and guesses to win over logic :-)
reseller
"I add the internal links menu to make my site user friendly and enable visitors to navigate my site easily and to reach target pages within no more than two clicks. And there is logic in all that. Don't you agree"
I completely agree and vote for it.
I understand the navigational concepts you bring up, and I tend to agree that it can be done in a practical way. But I also think there are limits to Google's tolerance for internal linking strategies. As for your argument about not creating pages for Google, I would expect all the search engines to eventually become more sophisticated and picky in their link-weighting algos.
Over the years I've tried many link configurations for my 600-page site. So far, the best long-term results have come from a fairly small internal linking structure (every page points to all the MAIN CATS and the HOME PAGE).
I've tried several times to 'promote' pages on all of my content pages in the form of menus. But I think the combination of those menus with intense on-page optimization of the target pages might lead to over-optimization triggers.
Of course we should all care a lot about anchor text and link relevancy, but I think too much works about as badly as not enough.
The real question is finding a realistic way to promote popular pages without crossing a limit that the search engines, or the user, would consider too much.
I can't say I know what the right balance is, but like I said, maybe we can draw some conclusions from my site's experience.
I added 300 pages in the last month, then added a system of menus that placed 10 internal links, with 6 different anchor-text sets, pointing to my 10 best pages site-wide.
So that's more than 800 links to those 10 pages, added in a single update? Maybe the problem is not the quantity of links or the structure, but too many links added at the same time?
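The scale of that change can be put in rough numbers. To be clear, the page count below is my own assumption, chosen only to be consistent with the 'more than 800 links' figure, not something stated exactly:

```python
# Rough arithmetic on the menu change described above. The number of
# pages carrying the menu is an assumed figure, not one from the post.
menu_links_per_page = 10          # links in the 'popular pages' menu
pages_carrying_menu = 80          # assumption, consistent with 800+ links

total_new_links = menu_links_per_page * pages_carrying_menu
new_inbound_per_target = pages_carrying_menu  # each target appears once per page

print(total_new_links)         # 800 link instances added in one update
print(new_inbound_per_target)  # each of the 10 pages gains ~80 internal inlinks
```

Whatever the exact figures, each target page gained dozens of new internal inlinks between one crawl and the next, which is the kind of sudden change I suspect.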
One last thing about your message: you said that Google can differentiate between internal, inbound and outbound links... I agree, but how does that rule out the possibility that Google can detect too many internal links and over-optimization techniques?
What do you think?