Forum Moderators: Robert Charlton & goodroi


Internal Link Loop Help


ErrlyBird

3:43 pm on Nov 7, 2022 (gmt 0)

Top Contributors Of The Month



I have a way of overcomplicating things, and hopefully someone here can help guide me. Google search is not helping me find the info I'm looking for.

I'm working on internal link strategy and don't know the best way to go about it. From what I can tell from research, internal links should form something like a tree: the main category links down to its subcategories, those subcategories link down again, and related categories link between each other.

How I have it right now is that my main category links to its subcategories. Those subcategories link another level down. And at the bottom level of the tree, I point back to the first category, which I assume creates a loop. Is this bad practice? I implement the loop as a way of saying "if you don't like what you've found, go back to the top level and try a different path."

I am not the greatest at explaining my issues, so if there is another way I should explain this, or anything I can clear up, please let me know.

robzilla

4:06 pm on Nov 7, 2022 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



You have "loops" everywhere and that's normal, e.g. your homepage links to pages that all link back to the homepage (hopefully).

What are you aiming for with your internal link strategy?

ErrlyBird

4:31 pm on Nov 7, 2022 (gmt 0)

Top Contributors Of The Month



My goal is just to have a solid structure in place for my current internal links as well as future ones.

Currently I am just applying internal links to things that correlate to the page I am optimizing, and then a final internal link at the dead ends that says something like "Not interested, or didn't find what you're looking for? Check out our entire selection here."

I guess I just don't know how it works with the crawlers. I don't want to waste crawl budget or get any kind of negative side effects from having a loop. Do the crawlers know not to go in a loop? If I have category 1 linking to category 2 and category 3, then category 3 links down to category 4, and category 4 links back to category 1, does the crawler know that it's already been there, and is that bad practice?

not2easy

5:00 pm on Nov 7, 2022 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Best practice means considering users more than crawlers. If the site is easy to use and useful for visitors, the crawlers will notice that more than how considerate you have been of their time.

ErrlyBird

5:52 pm on Nov 7, 2022 (gmt 0)

Top Contributors Of The Month



@not2easy Yeah, that was kind of my feeling on it as well. A co-worker suggested that it has to be a tree style with dead ends, but I just didn't like that as an option. I want to be helpful to users, but since SEO is my job (still fairly new to it, though) I wanted to make sure I also follow best practices.

MrSnuts

6:06 pm on Nov 7, 2022 (gmt 0)

5+ Year Member Top Contributors Of The Month



Think of it this way: if loops in navigation were a problem for crawlers, breadcrumb links would be a no-go, right? They are not :)
Breadcrumb links are highly recommended for letting your users get back to any higher-level category page, which is even better than having to go all the way back to the top level, IMO.
So, no need to fear navigation loops of that kind, I'd say.

not2easy

6:13 pm on Nov 7, 2022 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



From your description above, the way you are handling the 'dead ends' sounds better than leaving actual dead ends. Offering a way back to see other sections is better than nothing and could help visitors find what they want. A dead end leaves visitors with little option: either buy this or go away. That pleases neither the user nor the business.

There is no winning formula that must be followed, but offering visitors a way to navigate from any page to any other page is good practice - so long as it is clear for the visitor.

If the store has 60 categories and 410 subcategories leading to 2089 products, you do not want all those options in the navigation but once they select a category they should be able to locate the subcategory and then product selections. Each level should lead to the next without the confusion of unrelated options. It sounds like that is what you are trying to do.
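To picture that kind of structure, here is a toy sketch in Python. The category and product names are made up for illustration, not from anyone's actual site: each page links down only to its own children, plus a breadcrumb trail back up, never to unrelated branches.

```python
# Toy model of a category > subcategory > product hierarchy.
# All names are hypothetical examples.

SITE = {
    "home": ["garden", "kitchen"],    # top-level categories
    "garden": ["hoses", "planters"],  # subcategories
    "kitchen": ["cookware"],
    "hoses": ["hose-50ft"],           # products
    "planters": [],
    "cookware": [],
    "hose-50ft": [],
}

# Invert the tree so each page knows its parent.
PARENT = {child: parent for parent, kids in SITE.items() for child in kids}

def nav_links(page):
    """Links shown on a page: its own children, plus the trail back to 'home'."""
    down = SITE.get(page, [])
    up = []
    node = page
    while node in PARENT:   # breadcrumb trail: the "loop" back up the tree
        node = PARENT[node]
        up.append(node)
    return {"down": down, "up": up}
```

So a subcategory page like "hoses" only offers its own products going down, and "garden" then "home" going up; the visitor never sees the unrelated "kitchen" branch until they climb back to the top.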

Sgt_Kickaxe

8:23 pm on Nov 7, 2022 (gmt 0)



The type of link you're talking about sounds navigational, so, as mentioned above, keep it simple for your users and think of them first.

As for internal links within the content it's best to link to pages that are related and can help the visitor with their initial intent.

ErrlyBird

8:33 pm on Nov 7, 2022 (gmt 0)

Top Contributors Of The Month



I am just so appreciative of the feedback I'm getting. Thank you guys for helping out a noobie SEO Specialist. I feel like I am on the right track. Again, thanks for all the responses, it helps me out tremendously.

robzilla

9:05 pm on Nov 7, 2022 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I guess I just don't know how it works with the crawlers. I don't want to waste crawl budget or get any kind of negative side effects from having a loop.

That's the responsibility of the crawler developers. Crawlers will adapt to how the web is built, not the other way around -- although as webmasters we unfortunately do need to take their behavior and limitations into account, this is less of an issue these days. It's good to realize that a modern crawler will not crawl a site sequentially like, say, a link-checking tool might. Instead, it will crawl and parse a single URL, then add any unknown URLs linked from that document to its database, and usually also to its crawl list. Each URL is probably assigned some sort of value or priority, and the crawler may come back to crawl those later; if, when and in what order is up to it. So it won't race through your site following all links straight away, for many reasons, one of which is to avoid crashing your server.
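That "known URLs" bookkeeping is exactly why loops are harmless to a crawler. A minimal sketch in Python (the link graph is made up, but the bottom level pointing back to the homepage mirrors the setup described earlier in the thread):

```python
# Minimal sketch of frontier-based crawling: each URL is fetched once,
# and links back to already-seen pages (loops) are simply skipped.
# The link graph below is a hypothetical example.

from collections import deque

LINKS = {
    "/": ["/cat1", "/cat2"],
    "/cat1": ["/cat3", "/"],   # links back to the homepage: a "loop"
    "/cat2": ["/"],
    "/cat3": ["/cat4"],
    "/cat4": ["/cat1", "/"],   # bottom level pointing back up
}

def crawl(start="/"):
    seen = {start}              # known URLs: never queued twice
    frontier = deque([start])   # crawl list (real crawlers prioritize this)
    order = []
    while frontier:
        url = frontier.popleft()
        order.append(url)       # "fetch and parse" the page
        for link in LINKS.get(url, []):
            if link not in seen:    # loops terminate here
                seen.add(link)
                frontier.append(link)
    return order
```

Every page is visited exactly once, no matter how many links point back to the top; the loop simply never enters the crawl list a second time.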

(You might like to know that Google will also render your pages but that's a different process and doesn't happen at time of "visit".)

As for crawl budget, that's not something most webmasters need to worry about, but read the Google docs if you're interested: [developers.google.com...]