This 174-message thread spans 6 pages.
Minor Shuffling - Incremental Indexing
Not enough changes to be an update.
Hi to all of you Google watchers.
I see huge changes in the Google SERPs across all the Google datacenters in Europe and Asia, but the US SERPs are the same...
A few of my sites went up and a few went down.
Did you see the same changes?
Odd, my site is showing the PageRank from before the last update on two datacenters. Anyone else noticing some PR fluctuation?
Yes, I just did some checking and there are definitely PR differences between datacenters.
In my experience, I lost nearly half my traffic in a single day, with several good keywords dropping 20-25+ positions in the SERPs.
Reading the messages in this thread, it would seem the cause is over-linking in the internal site structure. I tried putting a 'popular pages' menu on all my pages: 10 links on every page, which I thought could be a good way to help my 10 best-converting pages. I guess posting so many links raises two problems: the pages all look alike, with a lot of links that are the same or similar, and it also makes the site structure look spam-like.
I don't know if all of you would agree, but it seems that a clean structure, Homepage > subdir > pages, is clearly the way to go as far as linking is concerned. I mean that pages should not all point to a series of other same-level pages?
I guess it's no longer good to put a couple of popular links on all the pages to help them, because one third-level page should not point to another third-level page? Or is the problem more that I pointed to 10 pages from all 300 of my pages?
Could it also be that I put up too many links in the same crawl?
Thanks for the answers, and I hope the results are not yet stable, so I can regain a couple of places in the SERPs!
Have a nice day (quite ironic in my case but....)
I'm not going to re-do my whole site to please Google.
I'll take the traffic loss instead. Again, I was in first place for my keyword combination, and I've completely dropped from sight.
I designed my site for my visitors. If Google can't tell a good site from a spammer network, it's Google's problem, not mine.
SEO is a necessary evil, but there is a limit to the amount of time I'm willing to invest in re-designing my site for Google. Especially since I have no idea if I'm moving in the right direction. Also, today's right direction is tomorrow's wrong turn.
I just don't know what else I can do.
I totally agree with you. With this "minor shuffling", I didn't get hit all that bad... definitely not as bad as you. However, what burns me is the amount of time required to maintain a website if you want to rank well on Google. Besides all the guesswork involved... if you optimize for Google, you sacrifice rank on Yahoo and MSN.
Google is like that college professor for whom you could never get the essay or term paper right.
Never been in the so-called sandbox, but I have about decided I am going to "sandbox" google and quit worrying about them....way too much instability. Content and ease of navigation that converts into sales is what I am after...not a #1 position on google with almost zero conversion.
helleborine, I totally understand your frustration. In this recent "minor" update I have had sites drop and sites go up, and I haven't done anything: no new links, no new content.
I feel these recent updates are serving to penalise a lot of normal websites.
It seems that some of their assumptions about content (especially duplicate filters) are poor - but after all this is simply a mathematical algorithm that will always be inferior to human judgement on a case by case basis.
But that's Google...
It was a sad day when Yahoo et al. decided to put only minor emphasis on their human-reviewed core directory business.
<<Google is like that college professor for whom you could never get the essay or term paper right.>>
What are PhDs for, anyway?
It is discouraging to say the least, particularly when you must be above the fold on page one to get noticed by very many... out of a gazillion SERPs. Keeping up to date with the guesswork is very time consuming, and while Brett's white-hat rules are 90% of it, the extra (white hat) work moves me from page 2-4 to page 1-2. With luck. For a little while. Until the next minor adjustment...
Two things astound me:
That I EVER get on page one with a 100 page Mom and Pop product.
That the SERPs are not 99% garbage.
Without minor adjustments neither of the above could happen because the first pages would be total trash, permanently. I don't know what else Google could do to keep the slime from floating to the top.
Then again, not being an AdSense Master, it would be nice to make a living selling the widget while keeping my finger nails!
<By reading the messages on this post, it would seem that it is caused by over-linking in the internal site structure.>
Here in this thread and in several other threads we are mostly assuming and guessing, and as far as internal linking is concerned, I doubt it could trigger any spam filter in Google.
We could assume many things about Google, but there is no logic in assuming that Google can't differentiate between inbound, outbound and internal links. Let's not underestimate the intelligence of Google's engineers :-)
On most of my pages I have a menu with no fewer than 100 internal links, and some of those pages hold top positions in the SERPs for competitive keyphrases. And none of my site's pages appear as supplemental results (duplicates) on Google at present.
I added the internal-links menu to make my site user-friendly and enable visitors to navigate it easily and reach target pages within no more than two clicks. And there is logic in all that. Don't you agree?
Let's not allow assumptions and guesses to win over logic :-)
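The "no more than two clicks" claim is something you can check mechanically rather than argue about. A minimal sketch, assuming the site's internal link structure is available as a dict of page → outgoing internal links (the page names here are hypothetical):

```python
from collections import deque

def click_depths(links, start="home"):
    """Breadth-first search from the start page; returns the minimum
    number of clicks needed to reach each internal page."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Hypothetical mini-site: home links to two category pages,
# each category page links to its content pages.
site = {
    "home": ["widgets", "gadgets"],
    "widgets": ["red-widget", "blue-widget"],
    "gadgets": ["gizmo"],
}
depths = click_depths(site)
# In this structure every page is reachable within two clicks.
assert max(depths.values()) <= 2
```

Run against a real crawl of your own site, any page with a depth above 2 is one your menu isn't actually helping visitors reach.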
"the first pages would be total trash"
I disagree. Google's first-page results are the best of all the search engines; that is why Google is the best SE in the world today. I have thousands of example searches to give you; here I will give just one. If I want to find the cast of a film, or information about a film or an actor from any era, I write the title of the film or the name of the actor in the search box, and 100% of the time I find what I want on the first page. Try "Gone with the Wind the film" or "Oliver Hardy biography". You must know what you are searching for. Joe User is getting more IT-educated today and mostly runs multi-word queries, not just "red widgets"; he would rather search for "widgets less than $10" or, to make it clearer, "wines for sale year 1966-1970" instead of "cheap red wine".
"I add the internal links menu to make my site user friendly and enable visitors to navigate my site easily and to reach target pages within no more than two clicks. And there is logic in all that. Don't you agree"
I completely agree and vote for it.
I understand the navigational concepts you bring up, and I tend to agree that it can be done in a practical way. But I also think there are limits to Google's tolerance for internal linking strategies. As for your point about not creating pages for Google, I tend to think that all the search engines will eventually become more sophisticated and picky with their link-weighting algos.
Over the years I've tried many link configurations for my 600-page site. Up until now I can say that the best long-term results have been obtained with a quite small internal linking structure (every page points to all the MAIN CATS and the HOME PAGE).
I've tried several times to 'promote' pages on all of my content pages in the form of menus. But I think the combination of those menus with intense on-page optimisation of the target page might trip over-optimization triggers.
Of course everyone should care a lot about anchor text and link relevancy, but I think too much, like not enough, does not work so well.
The question is more about finding a realistic way to promote popular pages without reaching a limit that would be considered too much by the search engines, and by the user.
I can't say I know what the right balance is, but like I said, we can maybe draw some conclusions from my site's experience.
I added 300 pages in the last month, then added a system of menus which placed 10 internal links, with 6 different anchor-text sets, pointing to my 10 best pages site-wide.
So that makes more than 800 links to those 10 pages, added in a single update? Maybe the problem is not the quantity of links or the structure, but too many links added at the same time?
One last thing about your message: you said that Google can differentiate between internal, inbound and outbound links... I agree, but in what way does that rule out the possibility that Google can detect too many internal links and over-optimization techniques?
What do you think?
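The link arithmetic above is worth making explicit, because a site-wide menu multiplies fast. A rough sketch; the page counts in the thread are inconsistent (300 new pages, a 600-page site, 800 pages are all mentioned), so the numbers here are purely illustrative:

```python
# A site-wide "popular pages" menu adds links multiplicatively:
# every page carrying the menu contributes one new internal link
# to each of the promoted target pages.
pages_with_menu = 80   # illustrative count of pages carrying the menu
menu_links = 10        # links in the popular-pages menu

new_links = pages_with_menu * menu_links   # total links added in one crawl
inbound_per_target = pages_with_menu       # new inbound links per promoted page

print(new_links)           # 800
print(inbound_per_target)  # 80
```

Whatever the exact figures, the point stands: a single template change can introduce hundreds of identical-anchor links to the same handful of pages between one crawl and the next.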
I did a lot of internal linking all at once. Perhaps that is a reason the site still sits in the sandbox. Funny thing is that my readership took advantage of the links: I saw page views per visit rise accordingly after the linking.
The Googlemeister might not like it, but peeps do.
Thanks for sharing. Very interesting post!
<But I think the junction of those menus, with an intense on page optimisation on the target page, might lead to over-optimization triggers.>
Agree. However, I operate with a certain factor (# or %) of keyword/keyphrase density on each page, taking into account the content (anchor text of internal links) of the menu.
<So it makes more than 800 links for those 10 pages added in a single update? Maybe the problem is not the quantity of links or structure, but a problem of too much added links at the same time?>
The most important question is still: how do Google and other SEs look at the number of internal links on a page? Personally I believe that internal links are considered part of the content of a site. Accordingly, one should ask: does adding too much content at the same time trigger a filter at the SEs?
<A last thing about your message: you said that google can differentiate between internals, inbounds and outbound... I agree, but in what way does that cancel the possibility that google can detect too much internal links and over-optimization techniques?
What do you think? >
I read once (maybe on Google or in a post by GG) that Google suggests around 100 outbound links per page, but I haven't seen any specific number regarding internal links.
So I guess that the only drawback of having too many internal links on the same page is in connection with over-optimization, i.e., Google takes the anchor text into account when evaluating whether a page is over-optimized.
Please excuse my ignorance, but are we saying that a website with 10 pages cannot be interlinked? In other words, Google will penalize me if I have a navigation menu on every page with links to the other 9?
|I read once (maybe on Google or in a post by GG) that Google suggests around 100 outbound links per page, but I haven't seen any specific number regarding internal links. |
Reseller is correct, but just so there's no confusion, they suggest using fewer than 100 links per page. Not sure if that covered only outbound links, or inbound and outbound, etc., however.
As for having a menu of inbounds to certain parts of the site on every page, this should be ok. Yahoo, MSN, and Google all have it...
I saw a decline in my site's PR. Anybody else notice a change in PR on the Toolbar?
|I read once (maybe on Google or in a post by GG) that Google suggests around 100 outbound links per page, but I haven't seen any specific number regarding internal links. |
As I understand it, that was because Google didn't follow more than 100 links per page, not because Google penalized or filtered pages with more than 100 links.
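Whichever interpretation is right, auditing your own pages against that ~100-link figure is straightforward. A minimal sketch using Python's standard-library HTML parser; the sample markup and the threshold are illustrative:

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Counts <a href=...> tags encountered on a page."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

def count_links(html):
    parser = LinkCounter()
    parser.feed(html)
    return parser.count

# Illustrative page fragment with a 3-link menu:
page = ('<ul><li><a href="/a">A</a></li>'
        '<li><a href="/b">B</a></li>'
        '<li><a href="/c">C</a></li></ul>')
n = count_links(page)
if n > 100:
    print(f"warning: {n} links on one page")
```

Feed each page of the site through `count_links` and you'll know quickly whether any template pushes past the guideline, without guessing.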
BTW, people need to think twice before making major changes to their sites in hopes of recovering from a Google "minor shuffle" or update--unless, of course, they know they've done something shady and want to clean up their acts.
"people need to think twice before making major changes to their sites in hopes of recovering from a Google "minor shuffle" or update--unless, of course, they know they've done something shady and want to clean up their acts. "
Hi and thanks for the comment. I agree totally; the fact is I did a site-wide experiment, and I think Google may have seen it as an attempt to game it. The way I saw it, the intention was more to promote links to my best-selling pages from all the pages. I'm not sure if I will remove the menu I just made, but let's say maybe...
I tried to add those 10 links from the best third-level content pages. It seems to me it changed the way Google sees my site, which had ranked in the top 3 for years on tens of competitive keywords.
I've already tried other forms of structure, like I said, and I've come to question the excess of internal links that repeat themselves on all the pages. Not that I thought I was exaggerating or ruining the user experience; I consider my site useful and relevant. But it seems you have better chances on most engines if you limit the links and deploy them in a well-planned and optimised manner.
I still think the best method is to point to subcats from the home page, and have the subcats point to content pages. Content pages can point to each other AS LONG AS they are in the same subcat. If content pages point to content pages in another subcat, I think it can be looked on less favourably (at least when done in a template-based, industrial manner).
This way the pages are still two links away. A content page can go to any SUBCAT index, and on the SUBCAT index you put a link to all the content in that subcat. *not sure I'm clear enough but...*
You can link to all the subcats from every page as a legitimate, ordinary menu.
For the questions about the 10 pages: it's 800 pages that point to 6 MAIN CATS (legitimate menu), and then I added 10 links on every page as a popular-pages menu.
UPDATE: as expected, a couple of pages started climbing more than 15+ positions, so the little refresh is not over (not in Europe or whatever datacenter I check) ;) There's still hope for a regular shake-up and not total destruction...
One last thing for reseller: it is an interesting point that you manage on-page density in relation to the linking... I guess it's obvious, but I saw that more as a bonus and never really thought about doing too much in concentration.
For the rest, it is not the first time I've read that maybe too many links or too much content at the same time can cause unexpected results. Is that really possible? I thought I had control over my site's development. I guess I could have uploaded over a month's period. But hey, I just didn't.
More thoughts on too many internal links..
Those of us affected by this 'minor shuffle' are looking for patterns - what's different between pages that went url-only and those that didn't.
Here are some interesting patterns:
My site is laid out in subdirectory fashion. Each subdirectory (~25 of them) has an index.htm page.
The left-side border contains links to all subdirectories. The right-side border contains links to each page in the subdirectory, anywhere from 4 to 25 links. Some of the detailed subdirectory pages have more content than the index page, some less.
Here's the pattern: No index.htm pages have gone url-only. Only the detail pages went url-only, sometimes 2 or 3 and sometimes all of them.
Also sitemap.htm is ok but sitemap2.htm sitemap3.htm are url-only.
Could google be applying different rules based on the name of the page?
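Spotting the pattern described above is a simple grouping exercise once you have a list of pages and their status. A sketch, with made-up page names and statuses standing in for real crawl observations:

```python
from collections import defaultdict

# Hypothetical observations: (page name, whether it went url-only)
observations = [
    ("index.htm", False),
    ("widgets/index.htm", False),
    ("widgets/red.htm", True),
    ("widgets/blue.htm", True),
    ("sitemap.htm", False),
    ("sitemap2.htm", True),
]

by_kind = defaultdict(list)
for page, url_only in observations:
    kind = "index" if page.endswith("index.htm") else "detail"
    by_kind[(kind, url_only)].append(page)

# If the hypothesis holds, no index pages appear in the url-only bucket.
assert not by_kind[("index", True)]
```

With real data you could extend the `kind` rule (by filename, by directory depth, by link count) and see which split best separates the url-only pages, rather than eyeballing it.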
" In other words, Google will penalize me if I have a navigation menu on every page with links to the other 9?"
Only if the Earth is flat.
Hmm, I was wondering whether a menu appearing on 800 pages on the web at the same time would be too fast a growth and cause a little drop in search engine rankings. As for the flat earth...
<<No index.htm pages have gone url-only. Only the detail pages went url-only, sometimes 2 or 3 and sometimes all of them.>>
Interesting... I find that no pages on my main left-hand menu bar or my "directory" pages (internal/external link lists) have gone URL-only, while sub-pages (single-topic pages linked from the main pages) have gone. A few were left-over doorway-style pages (but return-linked and with a nav bar), but most were not.
Perhaps the linking plays a part, but it's still not too clear.
Still some instability across datacenters, with 3 different sets of results.
Using the narrowest set of keywords that should logically bring my site in first position, the Google Directory page where my site is listed still comes up first in all three sets.
I'm getting more traffic than usual from the Google Directory page, because people looking for my site can't find it anymore.
My site isn't completely dropped; it still comes up with "allinanchor". But in checking this I found something bizarre...
I googled "allinanchor: helleborine's free widget plans," and 4 other sites come up.
These 4 other sites sell books, electronics and other krapp.
Why do they come up in a search for "allinanchor: helleborine's free widget plans?"
Naturally, I checked the source code.
Here's what I found:
|These terms only appear in links pointing to this page: <B>allinanchor helleborine free widget plans </B> |
Is this EVIL? What is this? Is it common?
Could this be the cause of my drop in rankings?
Sticky me if you must.
There is some movement in my sector again. SERPs seem to be settling back to what they were prior to last weekend.
Continued here: [webmasterworld.com...]