Example Google Optimised Themed Structure
The example site, widget-world.com, is about widgets. The home page receives a lot of good links (and PageRank) because people like the site. Most of these links use the word 'widget' or 'widgets'. We title the home page "Widget World, we have Widgets" to capitalise on those links and their link text; we also use the words widget and widgets in the page. We reinforce this by linking back to the home page from every other page in the site, using the link text "Widget World, we have Widgets".
The next level has the colours of widgets, e.g. widget-world.com/blue/ (I don't worry about keywords in URLs for Google, but other engines like them). It is titled "Blue Widgets from Widget World"; the body text mentions blue widget and blue widgets; the link text from the home page uses the link text "Blue Widgets" or "Blue Widget Collection"; all other pages in the blue widget section link back with that link text; and if we have a good range, other webmasters may link from their pages about blue widgets.
The next level has types of widgets. The widget-world.com/blue/fuzzy/ page is titled "Fuzzy Blue Widgets from Widget World"; the body text mentions blue fuzzy widget and blue fuzzy widgets; the link text from the home page uses the link text "Blue Fuzzy Widgets" or "Blue Fuzzy Widget Collection"; all other pages in the blue fuzzy widget section link back with that link text; and if we have a good collection, other webmasters may link from their pages about blue fuzzy widgets.
Below this level could be sizes, or whatever depending on the size of the site.
Rationale: As the terms widget and widgets have the most competition in the area, we are targeting the title, body text and inbound link text on those words in the highest PR page. As the colours of widgets have significant competition in the area, we are targeting the title, body text and inbound link text on those words in the next highest PR pages. As the fuzziness of widgets has a little competition in the area, we are targeting the title, body text and inbound link text on those words in the medium PR pages. As the sizes of widgets have almost no competition, we are targeting the title, body text and inbound link text on those words in the lowest PR pages. If we don't rank well enough, we need more inbound links.
Keep Important Phrases High in the Pyramid. The pyramid structure dilutes PageRank with each link, but we get to pick which pages get more than their share of PageRank. The decision to use /blue/fuzzy/large/, /blue/large/fuzzy/, /large/blue/fuzzy/, /large/fuzzy/blue/, /fuzzy/blue/large/ or /fuzzy/large/blue/ depends on the competition for the words (do more people search for blue widgets, large widgets or fuzzy widgets?) and on the mindset of potential customers (hopefully both considerations lead to the same choices). Also, Google likes matching phrases, so if more people search for "fuzzy blue widgets" than "fuzzy blue widget" then we should use "Fuzzy Blue Widgets from Widget World", not "Fuzzy Blue Widget Collection from World of Widgets".
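The dilution in a pyramid can be sketched with the simplified PageRank formula from the original paper, PR(p) = (1 - d) + d * sum(PR(q) / outlinks(q)), with damping d = 0.85. The site graph below is a hypothetical, cut-down widget-world.com (three colours, two types per colour, every page linking back home, as described above); the page names and resulting numbers are illustrative only:

```python
D = 0.85  # damping factor from the original PageRank paper

# Hypothetical widget-world.com pyramid: each page lists its outbound links.
links = {
    "home": ["blue", "red", "green"],
    "blue": ["home", "blue/fuzzy", "blue/smooth"],
    "red": ["home", "red/fuzzy", "red/smooth"],
    "green": ["home", "green/fuzzy", "green/smooth"],
    "blue/fuzzy": ["home", "blue"],
    "blue/smooth": ["home", "blue"],
    "red/fuzzy": ["home", "red"],
    "red/smooth": ["home", "red"],
    "green/fuzzy": ["home", "green"],
    "green/smooth": ["home", "green"],
}

# Start every page at PR 1 and iterate the formula to convergence.
pr = {page: 1.0 for page in links}
for _ in range(50):
    new = {}
    for page in links:
        inbound = sum(pr[q] / len(links[q]) for q in links if page in links[q])
        new[page] = (1 - D) + D * inbound
    pr = new
```

Run to convergence, the home page collects the most PageRank, the colour pages the next most, and the type pages the least: the pages higher in the pyramid get "more than their share", exactly the effect described above.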
Some people would put the second level (blue in this example) on separate domains or sub domains. Unless this can get us more directory listings or other links than we would get on one domain, it will not help in Google. If the linking is just hierarchical (the colour pages link to and are linked from the central domain, but not each other) then it shouldn't matter (hub and spoke relationships seem OK), but if the domains link to each other then there's potential for a penalty (do a site-search for PR0 for more information).
Customising the Pyramid: There are times when the pyramid needs to be adjusted. For example, a highly popular (or high margin) product might want to be linked from the home page as a 'featured product'. This will hopefully help more people to find it easily once they’re in the site, but it will also boost those pages in link based engines such as Google.
Dilemma: We can end up with repetitive phrases in the navigation. It is worth spending time discussing options for naming the navbar/home page links; compromises are generally necessary to make links descriptive enough for robots to understand, while not being overly repetitive for human visitors.
Summary: We don't have to choose between themes and Google optimisation. Why not have our cake and eat it?
This is a good update on the theme pyramid. I agree very much with the point about keeping your important phrases high in the pyramid. Not only does this make the most use of the page rank flowing and recirculating through your site, but it will make sense to your users. The high value words are the words that your users are thinking of when looking for sites like yours and considering buying stuff like yours.
Theme design takes some thought and planning, but when done right it makes for a very navigable and intuitive site.
On the use of subdomains in theming, they can be very helpful in separating the content. One reason to use them is to get links into each of the domains. Another reason to use them is to manage large chunks of information, while keeping the branding consistent across multiple domains.
There is no risk to using subdomains per se. There are risks in creating your own cross-linked web of sites when done excessively and with little thought to the specific relevancy of the linking pages. And if you are not linking relevant pages, then you are not theme-linking your site.
As for repetitive phrases, yes this can be a challenge. With careful planning this can be avoided or minimized as I have found in my own theme planning efforts. It is worth taking the time to minimize repetition as that can create a confusing navigation structure for your users.
Themes are quite compatible with Google. They can help with PageRank, and when Google gets to the point of combining PageRank with themes, those who are ready will do well.
Example: www.fakesite.com/fake_page.htm was PR3 while www.fakesite.com/fake_dir/fake_page.htm was PR1. Exact same HTML (except for the links back to other content on the site), but the page received a diminished PR.
When I moved a page from the root to a directory, it dropped 1 PR.
Example: www.mydomain/widget.htm = 4
and www.mydomain/widgets_dir/index.htm = 3
The identical page moved.
The link from the homepage to the new page pointed at the directory itself, i.e. it defaulted to the new index in the sub-directory.
Marcia, is your question suggesting that the following link would be better?
or have I missed something (as usual)...
This used to be true on my site, but after the last update all those lower directory pages moved up one PR. :) I did change some of the linking, but not drastically. I wish I knew exactly what happened so I could bottle it...
Also, if this linking is homepage to homepage, would this lessen the feedback-loop effect? Is the loss of PR for the outbound link smaller than when not reciprocating directly back to the specific page that linked to you?
Last: it is always good to submit the page you have linked to, since there isn't any guarantee that Googlebot will hit every page and link in one update. Which reminds me, a site that has no PR (grey toolbar) is quite possibly not indexed, and therefore may in fact have a relatively higher PR based on future (unknown) PR performance.
Google's database continues to grow rapidly, but PR maxes out at 10.
If a web site has, say, PR7 today but does not grow its link structure in step with the growth of the database, would it not depreciate over time in PR value (PR6, PR5 and so on), since PR is a ranking against the total archive?
wesinator, bran, Axacta and fathom: might you be describing the Toolbar 'guess' that is used when a page is not in Google's index? If so, then it's not real PageRank and it doesn't help rankings. When Google spiders the page, it gives real PageRank based on links.
The "unreal" PageRank, although perhaps a fleeting, temporary assignment, may still have significance. Google's crawl is queued by PageRank. If it discovers a page for the first time on a crawl (i.e., the page has no inbound links and is only linked from your higher-level page, meaning that it's "discovered" anew on every crawl), then this temporary, "fake" PR value may determine where it's placed in the current crawling queue.
Why is this important? Well, if your site is small, it may not be important. If it's large, it could be a major factor in determining whether Google gets to that page before the crawl is over.
If there are no incoming links, it's also possible (don't know for sure, but it seems reasonable), that the "fake" PR value could play a role in the ranking. I don't have any evidence one way or the other on this.
Why am I making this point? Well, lots and lots of internal pages on a site don't have inbound links from other sites. It would be nice to figure out how Google treats these pages. Most other sites link only to your home page. Most Google referrals go right to an internal page. The difference between these two is a measure of why this issue is significant.
After several months of close examination of my site and the pages within it, I started to notice that all pages that are featured on my navigation bar are ranked high (all page rank 5). Now this sounds good, but if the pages have no real "key words" they do not get much traffic. I also noticed that several pages had very high page rank simply because they are linked from my "home page" or index page. A good example of this is my "link to me" page. Now this page serves an important function on my site, I want people to be able to find nice graphics to help promote my site, but do I need a page rank 5 on that? I have received no search referrals from that page; seems like a waste of page rank.
So here is what I have done. I have used:
<meta name="ROBOTS" content="NOINDEX">
On several of these pages located on my navigation bar; my thinking is this should send more page rank to the other pages on my site, like my "cooking recipes" and "cooking tips" pages. Am I on the right track? Again, I am not worried that several of my pages are not getting indexed.
Thanks for the help.
I hope ciml didn't rip someone else's hard work. I'll post that page when I come across it. Ciml, do you remember the page I'm talking about?
don't worry, all is well!
typically, widgets is a generic example used by people across the web, even outside here. ie blah.com sells widgets ;)
anyways, i like to sum up themes as "logical use of directory structure and navigation", which does go hand in hand with PR for the most part, if not all (someone split a hair).
As brother pointed out, widgets is a standard. widgets has been the standard generic terminology for "products" for as far back as we can remember. It was used in accounting classes for examples of setting up accounting and bookkeeping systems, and by marketing people talking about selling product, using generic terminology, long, long before there was even an internet. Further back than I can remember, or even care to remember.
547,000 occurrences of the word widgets with a Google search [google.com], in addition to being in my 20-year-old accounting textbook.
Brian, I've often wondered whether PageRank link dilution includes REP excluded (or dead) links. I can't think of a reliable check without making test pages, due to the low Toolbar resolution and inability to check all backlinks reliably.
My thought was that each keyword in a PPC campaign should be linked to the related theme page; so the page is very obviously relevant to the visitor, hopefully increasing conversions.
Also, does anyone else wonder if, long term, the combined use of theme and link popularity algorithms (increasing meta-tag based SERP relevancy) will mean that PPC click-throughs fall?
I gotta write that down: May 7th, accused of plagiarizing ourselves.
It's spread to other forums now and was recently included in a mainstream story (not gonna quote it).
We also coined, "serps".
Nice post Calum!
My approach is to put important phrases in those pages' title and body (link text is pushing the idea too far, IMO), but this can end up with someone looking for 'fuzzy widget accessories' ending up in the 'link to me' page. Not ideal.
If anyone has tested whether robots exclusion of a page causes it no longer to be counted in the number of links from the linking pages then they've been keeping it quiet. Also, it's not the kind of information that Google are likely to give us.
It makes sense, Brian, but my uneducated gut feeling is that those links would still hold back some PageRank from the other pages. It would be nice to know, though.
whether robots exclusion of a page causes it no longer to be counted in the number of links from the linking pages
Good point Ciml,
from the original standpoint Google could still count linking to non-indexed pages as "voting" - therefore dilution.
I would guess Google counts them as links because Google also still counts or shows the back-links of pages with <meta name="ROBOTS" content="NOINDEX">.
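If that guess is right and links to noindexed pages are still counted as votes, the per-link arithmetic under the simplified PageRank model would look like this. The PR value and link counts here are made-up numbers, purely for illustration:

```python
# Hypothetical page: PR 5.0 with 10 outbound links, 2 of which point
# at noindexed pages. Under simplified PageRank each link passes on
# d * PR / total_links, regardless of whether its target is indexed.
d = 0.85  # damping factor
page_pr, total_links, noindexed = 5.0, 10, 2

per_link = d * page_pr / total_links             # passed by every link
wasted = per_link * noindexed                    # "spent" on noindexed targets
to_indexed = per_link * (total_links - noindexed)

# If Google instead ignored links to noindexed pages, each remaining
# link would carry more PageRank:
per_link_if_ignored = d * page_pr / (total_links - noindexed)
```

Under the "still counted" assumption, each link carries 0.425 and a total of 0.85 of passable PageRank goes to the noindexed targets; under the alternative, the eight indexed targets would get more per link. That gap is exactly the dilution being discussed.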
When I say "penalized" I mean that Google just drops the pages that got crawled, because they also exist in the main site that has a higher PR. The reason I now disallow it is because I'm nervous that Google will retaliate against my main site if I don't.
This semi-mirror site is a PR6 site. When I disallow it completely (Disallow: /) it still maintains its PR6 status, presumably due to the 84 external links that point to it.
So a site that isn't indexed will apparently maintain its PageRank status due to external links.
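For reference, the site-wide disallow mentioned above is written in robots.txt using standard Robots Exclusion Protocol syntax:

```
User-agent: *
Disallow: /
```

This stops compliant spiders from fetching any page, but, as observed above, it does not stop external links from pointing at the site.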
I've never done it because it could be described as cloaking and/or artificial PageRank manipulation (I think I may be over sensitive to such things).
For larger sites, I think Google(bot) much prefers that you limit the number of links on your page and use e.g. a JavaScript pull-down menu for navigation to the rest of your site (links which it does not follow), rather than clogging every page of your site with real links to every other page on the site (which I consider ugly for the user anyway). Just imagine being the Googlebot and having to digest nearly all the same links again on every page of the site it visits.
Also to come back to the origin of your thread, using this type of navigation from your page towards the other-themes of your site, concentrates the "real linking" to a theme - for the spider and for the surfer.
I believe that eventually, when theming becomes more important in search engines, the concentration of relevant linking from theme-relevant pages (also from site-external pages) to each other will weigh heavier than whether certain pages follow a directory-structure linking or are all in the same (sub)directory or not.
As for the pagerank, that can be link-spread out carefully - just as theming can, irrespective of in which directory the pages are placed.
I have all my site's English pages in just one directory - because I am lazy and find it easier to manage.
The benefits I see for using sub-directories per theme are:
1. Certain search engines may give an x-percentage ranking benefit for the possible keywords or keyphrases in the directory's url name.
2. Some directories/portals list links with the original complete (keywords-containing-subdirectory) url. I believe some urls benefit in Google's ranking because the keywords are then contained in the anchortext. If I am right, Google should make some kind of discount in their algo to the effect of: if Anchortext = full or parts of Url = keywords in search query, do not count as "allinanchor" in the ranking.
3. It is easier to leave bread-crumb type of tracking on your page - which can be helpful to the surfer.
I wish I knew how to leave breadcrumbs of the last three visited pages - irrespective of which directory hopping took place.
Suppose my page with the meta noindex tag had a link to an "external-example" page. If you did a Google search for pages linking to the "external-example" page, would the SERP list show my page with the meta noindex tag? If so, then we have to be pretty concerned about the privacy aspects. Can anyone verify whether this does occur?
I am no robot meta tag specialist, however are you referring to "noindex,nofollow" compared to "noindex,follow"?
The example I was thinking of was salon.com
(originally a PR9 index page).
They put in the <META NAME="robots" CONTENT="noindex, follow"> which gives them a greyed-out PR bar. However, Google still shows backlinks (only 34,000 though ;))
That is, your noindex page can be excluded from the Google index, yet Google will still show incoming links.