| This 68 message thread spans 3 pages |
|brainstorm: How might Google measure the site, and not just a page?|
| 8:02 am on Mar 25, 2007 (gmt 0)|
One thing we tend not to discuss here very often - and that I have heard Google reps mention over the past year or so - is that Google is devising more ways of looking at the entire domain, and not only assessing relevance url by url.
This is my conjecture, but what if Google is now working with domain-wide semantic factors a bit more strongly than in the past?
Some are seeing a strange trend toward SERPs that rank higher level pages that are actually one click away from the real "meat" - now what's that all about?
Also I've looked at enough troubled rankings recently to realize that some of the domains involved have developed into a kind of crazy-quilt of topics. As long as the scoring was very strong on just the individual url, these sites were doing great. But just maybe the disconnectedness of their "theme" is now being detected and seen as a negative.
I'm talking here about sites that throw up lots of varied pages to catch different kinds of keyword traffic, you know? They usually have "real" content, not scraped, but it's either MFA or (dare I coin another acronym?) MFO, made for organic. What Google says they want is MFV, made for the visitor.
Now obviously general news sites are also a crazy quilt of a kind, so it shouldn't just be any wide ranging diversity of topics that is problematic - that's not precise enough. But Google probably knows that their end user is often happier when the SERP sends them to a domain filled with relevant information, and not just a one-off page or even a small section.
Something about this feels like it's lurking in the back of my brain somewhere trying to break through. I am thinking more about domain-wide positive relevance signals here, rather than penalties.
Have my babblings triggered anyone's brain cells?
| 8:15 pm on Mar 31, 2007 (gmt 0)|
I agree with what you are looking at as well, Tedster, and am in the midst of a bunch of experiments on some of my websites to try to validate it to some extent.
I had one website that had been penalized heavily in Google over the last year - essentially I had lost all Google traffic to the site.
The IA for this website was a mesh pattern: each subcategory linked to every other subcategory and to home, and each article below that (second level) also linked to all categories as well as to the other articles.
About two months ago I changed to the following:
'Siloed' each category, so that all second level pages in a category link only to the upper level category that page belongs to, not to all the other categories, using a breadcrumb format.
Limited the amount of cross-silo linking, especially links from pages to main subcategory themes that are not closely related.
I still link across to second level pages when it makes sense for the visitor, but I try to link to other articles within the same silo and not extensively to other silos.
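To make the contrast concrete, here is a toy sketch of the two linking patterns described above as simple link graphs. The category and article names are invented for illustration, and this is of course only a model of the structure, not anyone's actual site or Google's view of it:

```python
# Contrasting the "mesh" pattern with the breadcrumb-style "silo" pattern.
# Each link is a (source_page, target_page) pair; names are hypothetical.
from itertools import permutations

CATS = ["cat_a", "cat_b", "cat_c"]
ARTICLES = {c: [f"{c}_art{i}" for i in (1, 2)] for c in CATS}

def mesh_links():
    """Mesh: every category links to every other category and home;
    every article links to all categories and all other articles."""
    links = set()
    for a, b in permutations(CATS, 2):       # category <-> category
        links.add((a, b))
    for c in CATS:
        links.add((c, "home"))
    all_articles = [a for arts in ARTICLES.values() for a in arts]
    for art in all_articles:
        for c in CATS:                       # article -> every category
            links.add((art, c))
        for other in all_articles:           # article -> every other article
            if other != art:
                links.add((art, other))
    return links

def silo_links():
    """Silo: each article links only up to its own category (breadcrumb),
    and each category links only to home and down to its own articles."""
    links = set()
    for c in CATS:
        links.add((c, "home"))
        for art in ARTICLES[c]:
            links.add((art, c))              # breadcrumb up
            links.add((c, art))              # contents page down
    return links

print(len(mesh_links()), len(silo_links()))  # 57 15
```

Even at this tiny scale the mesh produces nearly four times as many internal links, most of them crossing topical boundaries - which is exactly the "disconnectedness" the silo change removes.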
Within 3 weeks of the change, my website started showing again as number one for its own domain name, number one for unique searches on its own homepage content and business name.
Nothing else had been changed during this time and up until now on the website. No reinclusion had been asked for.
I am seeing websites in some sectors that excessively link home using their keyword, often sitewide and more than once per page, taking hits. Is this related?
My sense reflects what others are seeing and what you are referring to, Ted - that internal linking themes are playing a bigger role in determining the 'threshold' on any given website for terms.
I believe now much of what people are seeing in the way of penalties is internal linking. Whether or not Google is 'getting it right' and is accidentally penalizing websites based on this is another topic for debate for sure.
| 8:28 pm on Mar 31, 2007 (gmt 0)|
|Limited the amount of cross silo linking especially linking from pages to the main subcategory themes that are not entirely related. |
I think "related" might be the keyword here. As I said above, I find crosslinking strongly related pages from different silos to be highly effective in having them play supporting roles for each other. I would agree that indiscriminate linking amongst silos - especially in the footer, as Tedster observed, or possibly in secondary nav - is probably detrimental.
| 9:38 pm on Mar 31, 2007 (gmt 0)|
Not to rain on anyone’s parade, but this has once again drifted from the original topic. I am now seeing nothing but theory on internal linking strategies, instead of "How might Google measure the site, and not just a page".
But since we've strayed, I will rain a little on the internal linking strategy issue: take a look at wiki - ain't no linking strategy there. It might be the most heavily crosslinked site around.
Edited to add: every site with html based menus has links to every page, from every page, & there are many well established sites that use these types of menus that also do well with G. They spider well & distribute PR well.
I still think that the site as a "whole", will have to be given some sort of grade, or weighting based on popularity of the website "voted" on by the visitors, not by easily manipulated inbound links.
Back to lurking,
Edited again to add.
After thinking a bit, the issue of how the site is interlinked could flag a site as a whole as spam, and it could be penalized as such, so perhaps it is kinda on topic.
[edited by: WW_Watcher at 10:10 pm (utc) on Mar. 31, 2007]
| 9:42 pm on Mar 31, 2007 (gmt 0)|
I think I’m getting a little more sense of how Information Architecture would work.
Subsection contents page would link home and to the pages in that section but would not link to other subtopic content pages.
The pages in each subsection would link back to the contents page and may or may not link to each other.
|I am seeing websites in some sectors that excessively linked home using their keyword, often site wide and more than once per page taking hits. Is this related? |
Here is where I’m confused. Wouldn’t you still have all pages linked to the home page? What makes it excessive?
| 10:42 pm on Mar 31, 2007 (gmt 0)|
|I am now seeing nothing but theory on internal linking strategies, instead of "How might Google measure the site, and not just a page". |
And you don't think this plays a part in how Google measures a site, or page - by the way it is linked to? Google measures a page, imho, by who links to it and how, as well as by on-page verbiage and semantics.
How Google views sitewide and pagewide internal linking has everything to do with how Google measures you and that is right on topic. :P
Annej - what I am referring to is websites that use their keyword on every page of the website more than once to link home. I am seeing websites in my sectors that were doing this more than once (2-3 times per page) appearing to be penalized. Of course it's impossible to confirm, but it sure fits in well with the theory of more weight being given to internal linking and how you link between pages.
| 10:58 pm on Mar 31, 2007 (gmt 0)|
I may be misreading Tedster's opening post, but I think that
what is being theorised is Google's attitude to a site's content as a whole, not the internal link strategy per se.
Internal linking would signal the webmaster's intent about the content of internal pages, but this would still be part of the overall content of the site.
| 1:07 am on Apr 1, 2007 (gmt 0)|
I am looking at the title as how G might measure the quality of a site, both good and bad - not just looking at a site for the purpose of determining if it is a spam site to be dumped from the SERPs. We already know, or highly suspect, that G can see keyword stuffed sites; I did not believe that was what was being discussed. If you wish to reduce this thread to nothing but another penalty thread, I guess that's what it will be.
Open your mind grasshopper.
Back to lurking, end of my interruptions to just another penalty thread.
[edited by: WW_Watcher at 1:13 am (utc) on April 1, 2007]
| 1:13 am on Apr 1, 2007 (gmt 0)|
I don't think we are trying to solve the penalty problems here but sometimes a penalty can reveal a bit about how Google works. Some of us think it is revealing that Google might well be looking at the site as a whole.
On the question as to whether internal linking is part of how Google measures a site, I'd say yes. I am sure they are looking at the density of keywords in the anchor text of internal links. I'm not sure if the exact linking pattern comes into play, but it's certainly worth exploring.
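Nobody outside Google knows what such a signal actually looks like, but as a toy approximation of the idea above - the share of a site's internal links whose anchor text contains a given keyword - one might compute something like this (the link list and function are purely illustrative):

```python
# Speculative toy metric, NOT Google's actual signal: fraction of internal
# links whose anchor text contains the keyword. Links are invented examples.

def anchor_keyword_density(internal_links, keyword):
    """internal_links: list of (source_page, target_page, anchor_text)."""
    if not internal_links:
        return 0.0
    hits = sum(1 for _, _, anchor in internal_links
               if keyword.lower() in anchor.lower())
    return hits / len(internal_links)

links = [
    ("page1", "home", "blue widgets"),   # keyword-anchored link home
    ("page2", "home", "blue widgets"),
    ("page3", "home", "blue widgets"),
    ("page1", "page2", "care guide"),    # ordinary editorial link
]
print(anchor_keyword_density(links, "blue widgets"))  # 0.75
```

On a site that links home with the same keyword 2-3 times per page, a ratio like this would sit very high sitewide - which is consistent with the "excessive keyword linking home" pattern reported earlier in the thread, if Google were measuring anything along these lines.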