Forum Moderators: open
"This thread is ONLY for serious, generic discussion of changes that you are observing with the new algo in this update. As in things like 'Looks to me like PR is less important this month, and anchor text of inbound links counts more,' etc. How your site is doing has no relevance here unless you can explain why you think so in terms of a general algo analysis."
This thread is NOT for y'all to say how much you think Google sucks, or alternatively how great the new SERPs are. The idea is to pick apart how Google is working, and not to criticize their quality.
Some of my sites with a full mesh style, i.e. the homepage links internally to all pages and every page links to every other by keyword anchor text, have suffered a great loss of ranking for secondary keywords, while the primary ones are still all right. This means the internal pages that used to get a great boost from internal anchor text on every page in the site don't seem to get it anymore. This looks like the exact opposite of pre-Austin.
Content is irrelevant.
The questions are:
1. Are there any links to my site?
2. What anchor text do these links carry?
3. Where do the links come from?
Look with allinanchor: or Googlviewer, and what will you see? Which links are showing for the "filtered" websites?! The one and only question is: where does this link come from...
That's what I think and see...
Tom
Logic:
1) Google wants to give relevant results and hence defines key relevance factors like PR, keywords in the title, H1, anchor text of links and so on.
2) These relevance factors are picked up by SEOs and used to rank client websites at the top, which by SOME and AT TIMES are pushed too far, resulting in irrelevant results in the top SERPs.
3) Google then specifically analyzes the tricks used by SEOs and further tweaks the relevance factors to rank pages that have a normal flow of keywords.
Normal flow: reduce the importance of the EXACT phrase in the title, alt tags and page names, and increase the importance of phrase variations throughout the page.
Observations are:
1) Relevance has been decreased for the title tag, which used to be a key factor for SEO. So if 10 points used to be given for it, it's 7 now. And relevance has been increased for the body.
2) In the title, rather than repeating a specific phrase, variations of the phrase used alongside the specific phrase are working.
3) Lots of importance is being given to content built around the service/product
(as shared by others about directories being pushed ahead of commercial sites).
4) Text links are being given more importance.
(I have seen sites that link to inner pages being top ranked, and sites that link to external sites also being top ranked.)
A few doubts have come up. If anyone can shed light on them, that would be great:
1) Could the Google toolbar be used to track SEOs and the type of searches they make?
(I know it would be tough to tell an SEO optimizing a client site from a university student who makes the same searches again and again.)
2) Could a different algo be applied to keywords with high competition?
(Post-Brandy, the same site, same theme and same optimization work for a few keywords that are top ranked, while others aren't.)
3) Could different algos be applied to different pages and positions?
(As in: the top 10-20 results are important and seen by everyone, so algo 1 is applied;
results 20-50 are for searchers who aren't changing their search phrase but digging further, so algo 2 is applied.)
-michael
In general:
Pages under ~10K, or index pages using framesets, are being pushed down
Pages over 10K and under 35K are getting a boost (or are just lifted by other sites going down)
Pages over 35K are getting pushed down
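The size bands above can be sketched as a tiny scoring rule. To be clear, this is purely illustrative: the function, the 10K/35K thresholds and the boost/penalty magnitudes are assumptions drawn from this poster's observations, not anything Google has confirmed.

```python
# Hypothetical sketch of the observed size-based adjustment.
# Thresholds and magnitudes are illustrative guesses, not known algo values.

def size_adjustment(page_bytes: int, uses_frameset: bool = False) -> float:
    """Return a relative ranking adjustment for a page of the given size."""
    if uses_frameset:
        return -1.0          # framed index pages observed being pushed down
    if page_bytes < 10_000:
        return -1.0          # very small pages pushed down
    if page_bytes < 35_000:
        return +1.0          # mid-sized pages apparently boosted
    return -1.0              # very large pages pushed down

print(size_adjustment(8_000))    # small page: pushed down
print(size_adjustment(20_000))   # mid-sized page: boosted
print(size_adjustment(50_000))   # large page: pushed down
```

If the observation holds at all, it would explain the next poster's experience with sub-10K CSS pages slipping.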
Interesting. One of my sites is constructed with beautiful CSS - loads like a rocket. Each page has an image or two and about 500-600 words, but are pretty much all under 10k. I've seen a gentle fall off in rankings and traffic.
Maybe I'll fatten them up a little
One thing isn't everything. Your question is a non sequitur.
Whatever Steve... you have your opinion and I have mine, but I guess you just like everybody to listen to yours, and everyone else's question is a non sequitur.
Since I started the thread, I'd rather step back for a bit and let others post, so as not to influence the direction of the discussion. And, other members here tend to look at SERPs I don't normally, and I'd like to see if they have spotted things I haven't.
In general:
Pages under ~10K, or index pages using framesets, are being pushed down
Pages over 10K and under 35K are getting a boost (or are just lifted by other sites going down)
Pages over 35K are getting pushed down
1) Relevance has been decreased for the title tag, which used to be a key factor for SEO. So if 10 points used to be given for it, it's 7 now. And relevance has been increased for the body. 2) In the title, rather than repeating a specific phrase, variations of the phrase used alongside the specific phrase are working.
I'm seeing interior pages do well because of title tags, so I'm not in agreement on the decrease in title tag value.
All in all it looks like pre-Florida minus a tad more spam. Florida and Austin were shakedown cruises for a not-ready-for-prime-time semantic algo that relied far less on anchor text. My guess is the next algo roll-out will be more like Austin than Brandy, so I wouldn't bet the farm on the present SERPs.
What amazes me is the number of webmasters/SEOs chasing the dog that's chasing its own tail.
- The value of TITLE and Hx tags was drastically reduced in Austin. That might have been largely responsible for the abundance of directory pages in the SERPS. The tags have been made more relevant again in Brandy but are still too low (for me as a searcher).
- The concept of SITE as opposed to mere page now matters where under basic PageRank it did not.
- Site theming is being done. As a result, "broad" sites with lots of subsections are being hit. Example: I have an information site all about widgets. All pages have previously ranked well for their targeted keywords. The majority of the content is about installing and maintaining widgets, and these still rank well. But the set of pages about the invention and history of widgets has lost out badly. The perceived site theme appears to be overwhelming on-page factors.
- I still believe that Google has attempted to implement LSI "on the cheap" and that the whole idea of doing so is bound to lead to problems.
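For anyone unfamiliar with what "LSI on the cheap" might look like: real latent semantic indexing applies an SVD to a term-document matrix, but even a stripped-down vector-space check, like the cosine similarity sketch below, can flag pages whose vocabulary doesn't match a site's dominant theme. The theme and page texts here are invented for illustration; this is a guess at the kind of shortcut being described, not Google's actual method.

```python
import math
from collections import Counter

def term_vector(text: str) -> Counter:
    """Raw term counts for a piece of text (no stemming, no SVD)."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# A site "theme" built from its dominant content, per the widget example above.
theme = term_vector("widget installation maintenance repair widget parts")
on_theme = term_vector("how to install and repair your widget parts")
off_theme = term_vector("history of the invention of early widgets")

# The on-theme page scores higher against the theme than the history page.
print(cosine(theme, on_theme) > cosine(theme, off_theme))  # True
```

Note how crude this is: "widgets" and "widget" don't even match without stemming, which is exactly the sort of corner-cutting that could hit a site's history pages while its maintenance pages survive.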
- Googlebot crawl patterns have changed. It's too early to be sure, but it appears that some sites are being crawled a lot more and some a lot less, suggesting Google is concentrating resources on what it sees as the "best" sites.
Have been wondering about this. One of our 10,000 page sites (Site X) is linked to from an ok PR8 with lots of other outbound links.
Site X links in turn to (10,000 page) Site Y.
(Both are on same topic with different content & GEO-targeting is different)
Site X (PR6) was last deep-crawled 3 weeks ago.
Site Y (PR5) is deep-crawled weekly.
Surely Site X, linked to heavily from the PR8 site, should be crawled more than Site Y, which has less PR?
Site Y is also doing better across the SERPs.
Incidentally, just by posting this, it's occurred to me that the reason Y might be doing better in the SERPs than X is because the site linking to Y is the same THEME, whereas the PR8 site linking to X is a DIFFERENT THEME...
Not necessarily. It seems logical to me that Google would crawl even a very high PR site less frequently if Googlebot recognized that the content very rarely changes. Going where fresh content is likely makes more sense.
Not necessarily. It seems logical to me that Google would crawl even a very high PR site less frequently if Googlebot recognized that the content very rarely changes. Going where fresh content is likely makes more sense.
Definitely. johnser et al -- don't make the mistake of putting all your eggs in the PR basket. It is but one factor determining how Google works.
"Going where fresh content is likely makes more sense" is absolutely logical, but what about when both sites are updated simultaneously?
All other things being equal (and they are), the only differences between the two sites are:
1 - The seldom-crawled one has a higher PR value and is linked to from an off-theme site
2 - The heavily-crawled, lower-PR site is linked to from an on-topic THEMED site.
Apart from the themes point, this is not logical GBot behaviour.
If we include the themes concept, it's entirely logical...
Don't quite get it.
<edit to clarify>
When searching for products or services by: product name, city and state, I see nothing but trash.
I'm not changing a thing on my sites.
Right now I'm doing great with Yahoo. Their algo seems to be where Google was 6-9 months ago.
[edited by: Marcia at 10:27 pm (utc) on Feb. 24, 2004]
[edit reason] Edited off-topic comments. [/edit]
One thing being everything would be a dangerous place for a search engine to be. Even one thing being entirely too obvious to too many people isn't a good thing.
What are you people seeing with the weight of anchor text on pages now and for the past couple of updates?
And how about not necessarily keyword density but number of occurrences both on-page and sitewide, and about words being used in phrasing?
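Since the question distinguishes raw occurrence counts (on-page and sitewide) from density, here is a minimal sketch of what such counting could look like. The pages, phrases, and the idea of summing per-page counts into a sitewide tally are all my own illustration of the question, not a known Google measure.

```python
import re
from collections import Counter

def occurrences(text: str, phrase: str) -> int:
    """Count non-overlapping, case-insensitive occurrences of a phrase."""
    return len(re.findall(re.escape(phrase), text, re.IGNORECASE))

# Invented example pages for a hypothetical "blue widgets" site.
pages = [
    "Blue widgets for sale. Our blue widgets are cheap.",
    "Widget care guide: keeping blue widgets clean.",
]

# Tally both the exact phrase and a shorter variant across the whole site.
sitewide = Counter()
for page in pages:
    for phrase in ("blue widgets", "widget"):
        sitewide[phrase] += occurrences(page, phrase)

print(sitewide["blue widgets"])  # exact-phrase count across the site
print(sitewide["widget"])        # variant count (also matches inside "widgets")
```

Whether Google weighs anything like these raw tallies, versus how words are used in phrasing, is exactly the open question here.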
What should be clear to everyone by now, but obviously isn't, is that there are many things at play simultaneously here. One thing is not everything, even the most important thing. Just because you do one thing right, even the most important thing, you may do some other things horribly, which diminish greatly the impact of the most-important thing.
I'm not seeing much evidence of whole-site theming, at least for the SERPs I have been checking. However, I am seeing increasing evidence that Google can pick up on signs that keywords are appearing on the page in a way that isn't natural language, and that this is becoming an algo factor. Google did buy Applied Semantics, and they may have started using that technology to spot SEOed keyword stuffing. There's enough evidence that it seems reasonable to assume this is the case when doing SEO.
However, I COULD BE WRONG. ;)
...there are many things at play simultaneously here.
I agree with you 100%, and my post was in reply to somebody posting that 'internal (anchor) links' have more weight than they used to, which I disagree with, citing that 30,000-page interlinking example to back up my disagreement; there could be more factors involved than just internal links.
But to dismiss somebody's question as a non sequitur... doesn't that just reek of arrogance? I bet you don't even know what it means :D