Forum Moderators: open
Widgets: am no. 180, down from no. 130
texas widgets: 2-3
red juicy widgits: 1 - 3
I am following all the "rules":
widgets appears in 90% of the anchor text of inbound links
widgets appears in the title
widgets appears on the page, in fact all over the site
widgets is clearly the theme of the site.
my overall site PR moved up a point this update and all my 2,3,4 word combinations are doing very nicely. However, am losing rank for the main word. I was no. 60 in Nov. and am down every month and am now no. 180!
How? Now my PR is better than all the sites above me!
This month I have seen many perplexing result positions and the only reason I can attribute to them is that Google seems to be giving a lot of weight to older websites. To me this is unfair and unreasonable, but it helps explain many odd results.
hmmm... you're probably right -- at least on the older part. A mature site is likely more developed, more links, more content based, less garbage code, and more likely valid code.
In saying that -- one site went live in November, was indexed in December, and this update it's #1 on a few keywords & keyphrases, with many results on the 1st page (search depth 10).
Lots of other relatively new sites show the same thing, so I doubt Google has an algo calculation that determines new sites are worth less -- they just haven't been developed as long.
Is this fair -- well yes IMHO. Rarely in this world do you get rewarded... for doing less. ;)
A mature site is likely more developed, more links, more content based, less garbage code, and more likely valid code.
Generally I agree all round, but anyone with a "mature" site may not necessarily find that it has "less garbage code".
In my experience, older sites that have been through a few changes and additions can sometimes have a tendency to retain legacy code, like those chunks of DNA that no longer appear to have a function.
In other words, don't forget your spring-cleaning!
;)
:)
Seriously, the PR system is a mystery to me. My site has more traffic and more dynamic content (page refreshes twice a day), has more instances of the keyword on the site and over 17,000 unique content pages in the index but I have gone down from 13th to 32nd place. And it didn't change after the update.
I asked Google to look at the problem and gave them page view information that our competitors and friends publish, showing that they have less page views than us and other supporting stuff like Alexa rankings.
They were very nice about it, but they said that the algorithm was fine. They told me that I had to prove that the algorithm did not work by showing another site that was ranked badly. The thing is, the other sites seem to be ranked fine. We are the anomaly. So how can we prove the algorithm wrong if we are the only one?
It is a bit disappointing to say the least.
Seriously, the PR system is a mystery to me. My site has more traffic and more dynamic content (page refreshes twice a day), has more instances of the keyword on the site and over 17,000 unique content pages in the index but I have gone down from 13th to 32nd place. And it didn't change after the update.
hmmm... any chance of other site owners working on their sites to improve their rankings?
I can't believe the rest of the world stops so we can improve our own.
Fact 2: PR decreases -1 for each level under the front page
Problem: A site with a PR6 front page will have many PR 2 money pages, yet they will be worthless because there will be many more PR3 pages addressing the same topic.
Theory: Page Rank and Theme Pyramids – You win one and you will lose the other
How does Brett’s Theme Pyramid work with Page Rank?
If I follow the Theme Pyramid I will end up with a lot of worthless PR2 (or less) pages that will get beaten every time by some other site with a PR4 page
So is it Page Rank or the Theme Pyramid?
At the moment I would go with my conscience and build a theme pyramid, it will serve the visitor better, but I will get nowhere with Google
Fact 1: There are 4 levels in theme pyramid, from front page to the money pages
Agree
Fact 2: PR decreases -1 for each level under the front page
Not necessarily -- a front page that's borderline PR6 but still at PR5.99 can, depending on the total number of links on the page, leave a page a single click away still at PR5.11.
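That's consistent with the classic PageRank formula from the original Brin/Page paper. A minimal sketch -- the 0.85 damping factor is the commonly cited value, and these are raw scores rather than the logarithmic toolbar numbers, so the exact figures are illustrative only:

```python
# Rough sketch of the classic PageRank formula -- toolbar values are a
# logarithmic remapping of raw scores like these, so the numbers here
# only illustrate the shape of the effect, not real toolbar values.
DAMPING = 0.85  # commonly cited damping factor (an assumption here)

def pr_passed_to_child(parent_pr: float, outbound_links: int) -> float:
    """Raw PR a child page receives from a single parent link."""
    return (1 - DAMPING) + DAMPING * parent_pr / outbound_links

# A parent with few outbound links passes far more per child:
few = pr_passed_to_child(100.0, 5)    # 0.15 + 85/5  = 17.15
many = pr_passed_to_child(100.0, 50)  # 0.15 + 85/50 = 1.85
```

The point being: how much PR a level-down page keeps depends heavily on how many links the parent page carries, not just on its depth.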
Problem: A PR6 site will have many PR 3 money pages, yet they will be worthless because there will be many more PR4 pages addressing the same topic.
Not necessarily "worthless" -- your assumption suggests that PageRank is the only game in town, and it's not. A PR3 may indeed have difficulty beating a PR8 - PR10 page, but beating a PR7 is quite common.
In addition, your suggestion also assumes that you have only one potential keyword to target (or a few). One query per term seems quite low, but such terms are also quite low in competitiveness, so you can target many per page. I have one site that gets 27,000+ single-use terms/month -- that's still 27,000 uniques, not to mention the terms that bring 2 a day, 10, 20, 100, etc.
Theory: Page Rank and Theme Pyramids – You win one and you will lose the other
You win all round - but you simply can't do one part and not the other.
How does Brett’s Theme Pyramid work with Page Rank?
Forget about PageRank & page ranking in advance of developing the strategy. Look at this from a marketing, sales and promotion vantage point. Whatever strategy you develop, there is a risk of failure. Managing that risk and measuring outcomes is what brings results... not your 100% gut feeling that it's going to be BIG!
If I follow the Theme Pyramid I will end up with a lot of worthless PR3 (or less) pages that will get beaten every time by some other site with a PR4 page
Covered before -- but from a different perspective: ranks are not that important. Explanation: if you could be ranked at #500 for your best keyword but able to receive 10K visitors per day, would you? Or would you give up that 10K per day in favour of being ranked #1 on only your best keyword?
So is it Page Rank or the Theme Pyramid?
"PageRank" is categorically a means of measuring, "Theming" is a strategy... thus theming is the way to go (given these choices).
At the moment I would go with my conscience and build a theme pyramid, it will serve the visitor better, but I will get nowhere with Google
I would hedge the bet in that direction.
Since I have been a WebmasterWorld member, I can recall at least a dozen newbies that followed Brett's strategy verbatim and within 3 months reported back 2K+ visitors/day... that's exceptional for someone who didn't have a clue... 3 months before.
In other words, why have the boys at Google put this in? It may save on bandwidth, but seriously, how can it be used as a factor to score relevancy?
Many pages which are not "Google optimised" are large, simply because the content demands it. Breaking these up into smaller pages is done only to please Google, not because it suits their nature.
That's probably true IF you ignore one of the more important pieces of the pyramid scheme.
As Brett says, from the index "try to link to as much deep content as possible," and from topics "try to link to topics above and below the subtopic, but not to other subtopics".
If you follow that advice closely you should be able to have nice PR where you want it AND be well themed.
Has anybody noticed that a PR 0 is not a completely white bar? It actually has two green pixels in the top left. Is this significant of anything?
The pixels you're talking about replace pixels of the gray box itself, which is why they're hard to see.
The two green pixels are probably just an artifact of how the toolbar code draws the PR bar. If you look at a PR10 bar, you'll see that the bottom row and right column of green pixels are all dark green rather than light green. Their algorithm is probably something like this:
1. Draw the gray box.
2. For i = 0 to f(rank) draw columns of two light and one dark green pixel.
3. Turn the two top pixels in the last column into dark green pixels.
They just screwed up #3 by leaving out a check that there is a "last column".
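Those three steps are easy to mock up. Here's a toy sketch (the pixel dimensions, characters, and the f(rank) mapping are pure guesses on my part, not Google's actual toolbar code) showing that step 3 fires even when no columns were drawn -- which reproduces the two stray green pixels on a PR0 bar:

```python
# Toy mock-up of the hypothesized toolbar drawing steps.
# "." = gray, "g" = light green, "G" = dark green.
def draw_pr_bar(rank: int, width: int = 10) -> list[list[str]]:
    cols = rank  # stand-in for f(rank); the real mapping is unknown
    bar = [["." for _ in range(width)] for _ in range(3)]  # 1. gray box
    for i in range(cols):                                  # 2. green columns
        bar[0][i] = bar[1][i] = "g"   # two light green pixels
        bar[2][i] = "G"               # one dark green pixel
    last = max(cols - 1, 0)           # 3. darken top two pixels of last column
    bar[0][last] = bar[1][last] = "G" # runs even when cols == 0 -- the bug
    return bar
```

Call it with rank 0 and you get an all-gray bar except two dark pixels in the top-left corner -- exactly the anomaly described above.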
I have read on here numerous times that you want your textual content on the page to out-weigh your html content.
Hope that helps.
Also for the page size, is that purely code size, or does that include the images as well?
Look at any results across any keyword -- results are biased toward smaller pages, with 1 - 32K pages predominantly at the top.
If it wasn't a factor, those old 1MB, 100-printed-page-long files would surely be #1 due to the miles of content.
This is true. While on a small page of maybe 150 words one can easily achieve a KWD of 7% without it looking ridiculous to human visitors, that's basically impossible on a 10,000-word page.
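The arithmetic behind that is easy to sanity-check (a throwaway sketch; 7% is just the density figure mentioned above, not a recommendation):

```python
# Back-of-the-envelope keyword density check: how many times a keyword
# must appear on a page to hit a target density percentage.
def occurrences_needed(word_count: int, density_pct: float) -> int:
    return round(word_count * density_pct / 100)

small = occurrences_needed(150, 7)     # about 10 mentions -- easy to write naturally
large = occurrences_needed(10_000, 7)  # 700 mentions -- reads as keyword stuffing
```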
There is just no reason why Google should favour smaller page size.
Still waiting for the latest update...5 days now...Hello, Google, wake up, we got work to do!
... a small page (less code) can have lots of graphics making it slow/heavy.
This is another reason why CSS1 and CSS2 are so valuable. Image code is removed from the page, images load as part of the style sheet rather than as a separate block of markup (faster loading), and with the deprecation of alt="" on non-linked images, you have everything to gain from this design style.
Every time something changes in search engines, there are implications, some good, some bad... you need to appreciate both and reinvent your wheels.
Think of it this way, just as an added perspective: if Google generated the content themselves (instead of getting it from the web) and you typed in "dog", how much info on dogs should it display?
Somehow it will be a factor; it's a compromise for us.
//added
Re the "why's he up there and I'm not" thing, have you checked whether they link to each other? They could be the "loop of the theme", if you know what I mean, and you're sitting somewhere outside it.
I can't seem to find increased weightage being given to pages with less k. There are many 44k, 50k, 78k, 101k sites at no. 1. But moreover, there is no order "k"wise; it's all over the place.
I'm still betting that it's keyword density and anchor text which is playing the greater part in determining position...but have to wait for the update to find out.
Not really. As a user I want relevant results.
The way I see it, smaller file size enables Google to index pages more quickly enabling them to move on and find additional content quicker, and have more room to store it since there is less code bloat.
The additional pages give the index more depth (which is good for me as a user), the time and space saves Google some cash (which helps me see less ads, and hopefully helps them survive longer, which helps me too).
Then, if by chance relevancy is easier to determine on the smaller files that's good for me too. I get more relevant results that are likely to download quick.
Will relevant bloated files be missed? Sometimes, but I'd bet that for every great bloated page there is probably one of similar quality that isn't.
That being said, I don't believe file size is something you should stress over. I have many bloated pages in the top 3 of the SERPs and I'm not gonna change them. However, when creating new pages I do take the time to make sure the code is as lean as it can be. Most new pages from me are 10k and under.
I can't seem to find increased weightage being given to pages with less k. There are many 44k, 50k, 78k, 101k sites at no. 1. But moreover, there is no order "k"wise; it's all over the place.
This is a bar of averages.
If you do a set of random searches (say 100) viewing top 10 - so 1000 pages, 32K and below appear 65% - 75% of the time.
This still means that 250 - 350 pages are above 32K (thus your many).
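Spelling out that sampling arithmetic (the 65% - 75% share is the observation above, not my data):

```python
# 100 random searches x top 10 results = 1000 pages sampled.
# If 65-75% of them are 32K or smaller, 250-350 of them are larger.
total_pages = 100 * 10
over_32k_low = round(total_pages * (1 - 0.75))   # 250 pages over 32K
over_32k_high = round(total_pages * (1 - 0.65))  # 350 pages over 32K
```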
Any page over 32K in the top ten has an advantage in other factors that outweighs it. No-one ever said file size is the only factor in good ranking -- only that it is a factor, equal to most other factors.
<sarcasm> But hey - a few minutes of searching always beats years of research, observation & experience - so what do I know. :) </sarcasm>
One thing just occurred to me. If someone is looking at page sizes and which do best on the SERPs, they should keep in mind that statistically the average page size on the WWW isn't that large. If the vast majority of pages at the top are <32k, this may just reflect the fact that the vast majority of pages on the entire Internet are <32k.