
Google SEO News and Discussion Forum

Updating Question - no ranking movement despite on-page changes
MrFewkes




msg:4170592
 11:05 am on Jul 15, 2010 (gmt 0)

Hi,

I've been making changes to sites based on various sources of information. The objective of the changes is to achieve a better rank.

The changes have been made on 7 domains of mine, done separately (then waiting for a cache update) and monitored over time (about 4 weeks).

The changes are split into 3 types.

1) I changed some sites to add the keywords to them inside various HTML tags which I was informed was helpful. So for example I added an address tag with my keywords in it (just an example).

2) I added certain words (not my pink widget words) to my pages which I was advised Google required so that it could place my site in the correct "category" - a "selling" site for pink widget, rather than an "information" site for pink widget.

3) I added my pink widget keywords to *new* anchor text on my index page which links to existing (and indexed) subpages containing the keywords.

All my changes are cached now. The index page of each site, which is where all the changes took place across all the domains, is PR1 to PR3 (just to note).

I do believe that the above changes ARE ranking factors (please correct me if you think I'm wrong).

The sites are all ranking between position 4 and 80 (so not under any penalties I can see).

Other info - no affiliate links on the sites, pure osCommerce + a hand-written index page (which I'm tweaking).

The main point is - none of the sites have moved in rank - AT ALL!

I feel like I'm flogging a dead horse. I've spent weeks reading and reading and got nowhere.

Has Google perhaps not yet factored in my changes?

Cheers - any advice appreciated.

 

MrFewkes




msg:4192192
 9:08 pm on Aug 25, 2010 (gmt 0)

Hi - I'll try to reply tomorrow and let you know.
Cheers

MrFewkes




msg:4192630
 4:50 pm on Aug 26, 2010 (gmt 0)

Gouri - OK, I added that page you mentioned, as described above - it made no change at all to the ranking of my index page, which links to it in the way described above.

I can tell you that since then, I have added another page, also linked to from the index page, with the anchor on the index as "pink word1 word2" - the new target page is called "word1_word2_pink.html" (yes, pink at the end instead of the front, just to mix things up a bit). The content of the new page includes a title reading "pink word1 word2 specification", and pink word1 word2, word1 word2, word1 and word2 combinations are scattered around the new page.

I added this page a couple of weeks ago - the main index page which links to it is now cached (cache "snapshot" date 19 August) and ranks number 4 on Google for pink word1 word2 searches, just because of the anchor text to the new page (it is worth noting, perhaps, that the index page we are ranking here did not previously mention the word pink). The new page itself doesn't show for a search of "pink word1 word2" - but DOES show at number 1 for its exact-match title search of "pink word1 word2 specification".

Further information - although the snapshot date says 19 August, this was only updated today on Google - so for the past 7 days the cache snapshot date was out of date.

Note that any "" above should be removed for search descriptions - I never use "" to search - thats just for clarity here.

And the crux of all this - my index page is still in the same spot - no movement.

I will add another page tomorrow and link to it from the index page - it is required by the site rather than for SEO - and will be based around "word1 word2 newword1 newword2 instructions".

I hope this helps you guys - but it didn't help me. :(

MrFewkes




msg:4192631
 4:52 pm on Aug 26, 2010 (gmt 0)

1script - I made internal link changes from and on the index page to sub-pages, and that made the site's index page shoot up.
I'm not aware of any sub-page ranking improvements, as I'm really only chasing the index page.
Cheers

gouri




msg:4193110
 3:45 pm on Aug 27, 2010 (gmt 0)

Mr Fewkes,

I first wanted to say thank you for providing that description. It is very informative.

I am seeing some of the things that you mentioned. After adding pages to a site, the keywords that the other pages are ranking for have not really moved.

It has been about a week since I added the new pages. Maybe I have to wait longer to see what the effects are going to be on the older pages and the keywords that they rank for?

For keywords that I want the new pages to rank for, I am not seeing that particular page or the homepage rank for them. Maybe I have to wait longer? But the new pages do have a cached version.

I think I should also mention that the cache version of the homepage is showing a date before the new pages were added. Do you think the date of the cache version of the homepage should be after the new pages were added in order to have a better idea of ranking changes for the older pages and/or the new ones?

MrFewkes




msg:4193134
 4:28 pm on Aug 27, 2010 (gmt 0)

OK, some more info...

On the topic of the page I made the changes to, which went from 38 to 4: I then rolled the same change out to about 20 other domains.

I am seeing the first cache updates to one of the changed pages.

The change was made 19/Aug/2010 to the index page of a site.

At the time I made my page changes - the cache date was 02/Aug/2010

Today - 27/Aug -
1. Find my site in the SERP.
2. Click on Cached Link
3. Cache date is 02/Aug/2010 - Old page shows in cache
4. Click Back Button
5. Click on Cached Link again (same link - no refresh)
6. Cache date is 23/Aug/2010 - New page shows in cache with my changes

So - they are showing me two cached versions at the moment.

I can say at this stage that, whilst it's clear that goofle has seen my changes, I assume they are not yet factored in. I will also have to assume for now that they will not be factored into the ranking until the old cache version disappears - maybe - a couple of assumptions there, made in the HOPE that my changes will affect my rank like they did on the initial site, from 38 to 4.

I just get the feeling it's doing its thing at the moment - well, I'm hoping so - because at this point in proceedings there's NO ranking change.

You can use these dates as a milestone for your own changes going through.

Hope this helps someone - I could certainly do with some help - I think the men in white coats are coming for me. Looking at cache dates - sheesh, what's life come to?
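For anyone who wants to automate the manual cache-date check in steps 1-6 above, here is a rough Python sketch. It assumes the 2010-era webcache.googleusercontent.com "cache:" endpoint and its "as it appeared on ..." snapshot banner, both of which may change, rate-limit, or refuse automated requests; example.com is a placeholder domain.

import re
import urllib.request

# Rough sketch only: automates the manual cache-date check described above.
# Assumes the 2010-era webcache.googleusercontent.com "cache:" endpoint and its
# "as it appeared on <date>" banner text; both may change or refuse bots.
def fetch_cache_date(domain):
    url = "https://webcache.googleusercontent.com/search?q=cache:" + domain
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    with urllib.request.urlopen(req, timeout=30) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    # The banner historically read e.g. "as it appeared on 23 Aug 2010 01:23:45 GMT".
    match = re.search(r"as it appeared on ([^.<]+)", html)
    return match.group(1).strip() if match else None

print(fetch_cache_date("example.com"))  # example.com is a placeholder domain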

gouri




msg:4193146
 4:56 pm on Aug 27, 2010 (gmt 0)

MrFewkes,

I would appreciate it if you could keep us informed about when you see a ranking change from the change made to the index pages of the 20 other domains that you are working on.

I think this would help to understand the ranking process a little more.

dvduval




msg:4193180
 6:03 pm on Aug 27, 2010 (gmt 0)

I'm in the exact same spot. We have literally millions more links than our competitor and a higher PageRank, and on-page changes have zero effect even after waiting 2-3 months. This includes changing the title tag to the exact phrase we most represent, and adding text and headings to the page. Fortunately, we have a strong brand name, but I still find it strange.

MrFewkes




msg:4193198
 6:40 pm on Aug 27, 2010 (gmt 0)

gouri - yes - I will come back to this thread as the changes (if any) occur.

dvduval - you say you have more links - assuming you have some so-called better-quality links than your competitor - this points towards my current attitude to links, which is that I have moved away from chasing them and am focusing on-site/on-page.

Unfortunately, I can't yet confirm whether my changes will help (apart from that one site).

I feel like I'm chasing my tail - I'm getting bored of being bored of goofle.

It's probably my own fault for thinking search engines get you traffic in the first place. Silly me.

Watch me change my tone and call them Google instead of goofle when/if I rank higher.

Mind you, even then I know it will only be a temporary thing - it's one thing I've learned after 12 years of this goofle dross.

tedster




msg:4193284
 9:33 pm on Aug 27, 2010 (gmt 0)

Do you think the date of the cache version of the homepage should be after the new pages were added in order to have a better idea of ranking changes

The public cache date is not directly connected to when the ranking calculations are updated.

changing the title tag to the exact phrase...

I've noticed recently that exact phrase match in the title does not seem to work well. It does work if you put quotes around the query phrase, but clearly only geeks do that.

Lorel




msg:4193344
 1:01 am on Aug 28, 2010 (gmt 0)

I have a client whose home page Google was only caching every 11 days. I had her write several articles, and then I posted one about every 4 days; the caching went up to every 4-5 days for about 3 weeks, but now it's back down to 12 days. So it looks like something is definitely slowing Google down.

MrFewkes




msg:4193431
 9:39 am on Aug 28, 2010 (gmt 0)

Exact phrases in the title tag are (IMHO) a no-no for longer phrases - say 4 or 5 words - if the competition is high. That's kind of a guess from just browsing the SERPs. If you search a long string with quotes (a popular one), get a feel for how many titles show in the SERP, and then compare that to a search without quotes for the same string, you will generally not see very many of those full strings in there for high-volume searches, where a normal distribution dictates a proportionally high volume of pages as well (generally).
If, however, there's a high volume of searches but a low number of pages - quite rare - you may see the longer exact-match strings in the SERP without quotes.
I don't know the science behind this.

Lorel - your crawl rate will vary based on a number of factors. My guess at three factors would be as follows.

1. Your PageRank/TrustRank value - a higher value would cause a more frequent crawl.
2. Rate of inbound link acquisition - at all levels, from 1 link per week to 1 link per hour - new links, I suspect, and the volume thereof, would be an indicator goofle uses in determining an appropriate crawl frequency.
3. Rate of page changes - I (again) suspect that a website which is updated every day with new content would (eventually) gain an increased crawl frequency.

All of the above is easily computed in real time(ish), so variations in any or all of the above (say a drop in the rate of page changes, for one) could affect the crawl frequency.
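Purely to illustrate that guess (and nothing more - this is not Google's algorithm), here is a toy Python sketch that folds the three hypothesized signals into a single crawl-interval estimate. The weights and the 1-30 day mapping are invented for illustration.

# Toy model of the three-factor guess above - NOT Google's algorithm.
# All weights and the 1-30 day mapping are invented for illustration.
def estimated_crawl_interval_days(pagerank, links_per_week, page_changes_per_week):
    score = (
        2.0 * pagerank                     # 1. PageRank/TrustRank value
        + 0.5 * links_per_week             # 2. rate of inbound link acquisition
        + 1.0 * page_changes_per_week      # 3. rate of on-site page changes
    )
    return max(1.0, 30.0 / (1.0 + score))  # higher score -> shorter interval

# Example: a PR2 site gaining one link and adding one new page per week.
print(round(estimated_crawl_interval_days(2, 1, 1), 1))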

There will be knobs and switches for sure and many other factors probably.

I'm in a good position at the moment to check point 3 above on one site of mine, as I am adding new content each time I get a cache update. In theory, the cache updates should become more frequent - and thus, to make them even more frequent, I would add content more frequently, and so on. Eventually - again in theory - I may get a cache update daily or so. As long as I keep up!

Can anyone add to the above?

Cheers

dvduval




msg:4194874
 6:03 am on Sep 1, 2010 (gmt 0)

We add new content (videos, articles, forum posts, tutorials) all week long. We naturally receive 100s of new links daily. I'm under the impression there is some sort of freeze on some keywords. We still rank #1 for some of the most important keywords, but other keywords seem frozen.

If I were going to venture a far-out guess, it would go something like this...

1. Google builds a trust rank and page rank score.
2. They decide how many "keyword points" you get.
3. Based on competitiveness and traffic for keywords, you are allotted your share of the pie.

In this way everyone gets a keyword allotment that is "fair" in terms of the score they were given. I would expect this algorithm is only applicable to a large set of "known keywords", but does not apply to long tails that are new or rarely searched.

It then makes sense why they are deciding to give some brands keyword dominance that may not count against their allotment.

So if you have an established site, and you were doing a really good job with optimizing your pages and getting traffic through google, if your pie was algorithmically too large, the Mayday update provided you with a smaller pie.

Thanks Google!
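To make the hypothesis above concrete (it is speculation, not a documented Google mechanism), here is a toy Python sketch of the three steps: a site-wide "keyword points" budget derived from trust and PageRank, split across keywords by competitiveness. Every number and name in it is invented for illustration.

# Toy illustration of the "keyword pie" hypothesis - pure speculation.
def keyword_allotment(trust_rank, pagerank, keywords):
    # keywords: dict of keyword -> competitiveness (0..1)
    points = 100 * (trust_rank + pagerank)            # steps 1-2: site-wide budget
    total_difficulty = sum(keywords.values()) or 1.0
    return {                                          # step 3: share of the pie
        kw: points * (difficulty / total_difficulty)
        for kw, difficulty in keywords.items()
    }

site_keywords = {"pink widgets": 0.9, "blue widgets": 0.5, "widget specs": 0.1}
print(keyword_allotment(trust_rank=0.4, pagerank=3, keywords=site_keywords))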

1script




msg:4195058
 2:56 pm on Sep 1, 2010 (gmt 0)

@dvduval Re: keyword pie

Interesting idea! I am seeing something very similar that started all the way back in March 2010, so it may not be exactly a MayDay feature (or else I was volunteered to become an early adopter). Actually, if we go back to that time period, there were many reports of this kind of "freezing" of the number of keywords "allocated" to a site.

It basically looks like you may be given a different trust rank for different slices of your "keyword pie". I would not even speculate about exactly how your keywords are sliced - there does not seem to be much logic in which KW goes into which slice. However, as soon as this allocation happened (for most of my sites on March 15th, and a couple more sites contracted it a few days later), the "trusted" slice continued to rank high as if nothing had happened. The rest of the pie disappeared off the face of Google.

I have done some analysis of the KWs in the "trusted" slice and the rest of the pie, and in my opinion there is NOTHING better about those trusted ones. I have better-linked pages with more prominently featured KWs that now rank for nothing (went to a bad slice?), and seemingly unremarkable pages (no external BLs, and KWs sometimes not in the title and only once in the content) continue to rank as usual (stayed in the best slice?).

What is also remarkable is that there appear to be exactly 10 slices in this keyword pie. Only one is trusted; the other 9 are not. You can take both the number of formerly ranked keywords and the total G* referrals, divide them by exactly 10, and arrive at today's numbers. Why 10 and not, say, 12? A better-looking figure, perhaps?

Mind you, this concept does not eliminate the existence of the site's overall trust rank. I also have sites that were sliced first, ranked for 1/10th of the KWs, and then completely disappeared. Most of those were new sites that were probably going to disappear anyway as the "honeymoon" period was coming to an end.

There are also some new developments here. It looks like some sites are now finally getting some of the slices re-evaluated and "released" or "thawed", if you will. This whole concept of slicing may have actually been brought about by the need to limit the load while re-evaluating the entire set of keywords. Even with Caffeine, I imagine it would be an insurmountable task to recalculate trust rank for the entire Net, given that most large sites each receive traffic on hundreds of thousands of keywords and there are millions of sites...

Again, great concept, but there is still the need to explain why not all sites were chopped up like that. Clearly, not everybody suffered a loss, and even I had two sites that were stable during this troubled period. Could a high enough initial site trust save you from the "keyword pie" effect?

freejung




msg:4195136
 5:25 pm on Sep 1, 2010 (gmt 0)

dvduval, I think you're on to something. I've experienced things that would support your hypothesis. For example, I recently stopped using a keyword group that had lower stats than others. Traffic for that keyword group immediately dropped. Sure enough, traffic across other keyword groups lifted slightly to compensate, with no overall loss of traffic (leading to a nice revenue increase, btw, because the other groups are worth more). And it's not that one particular keyword or set of keywords benefitted from the loss of the other - there was no spike in any particular keyword or group, just an overall slight rise in traffic across a large number of keywords.

As another example, there are a lot of cases where there are several versions of a keyword that are basically synonyms. The competition for each version is exactly the same - the same pages are in the top ten. However, I consistently rank better for the lower-traffic versions, despite heavily optimizing for the higher-traffic versions.

It feels like I have a fixed traffic allotment, and Google is balancing rankings across a large number of keywords to maintain it despite any changes I make.

There have been a lot of threads suggesting that Google is throttling traffic. There have been a lot of threads about on-site changes and additional content having little effect on overall traffic and rankings. I can certainly attest that my total traffic, and my traffic from Google in particular, is remarkably stable, almost unnaturally so, and much more so than traffic from Bing/Yahoo.

The answer, then, is obviously to focus on building more domain authority, and not worry so much about particular keywords and rankings.

1script




msg:4195173
 6:17 pm on Sep 1, 2010 (gmt 0)

@freejung:
I recently stopped using a keyword group that had lower stats than others.
Any chance you can expand on this a little? Do you mean you've stopped writing new content targeting these keywords (i.e. using them in titles of new pages), or that you've removed old pages that were optimized for those KWs? I'm assuming that by optimization you mean the KWs were used in the title, the h1, and a few times in the text, as well as perhaps used as anchor text in a few links.

The additional info about your sample may provide some insight into whether new keywords are still being picked up / evaluated, since many people here seem to feel they are not.

The opposite of that would be that dropped KWs are replaced by other old KWs that were already in the index before MayDay (assuming this is a MayDay-specific feature). That would mean the index is "frozen" and will probably move in the future once some sort of immense calculation is complete.

freejung




msg:4195180
 6:36 pm on Sep 1, 2010 (gmt 0)

Do you mean you've stopped writing new content targeting these keywords (i.e. using them in titles of new pages), or that you've removed old pages that were optimized for those KWs?


Neither. I mean that I've removed these keywords from the titles (and in some cases even the body content) of pages that used to use them.

I talked about this in another thread -- in my niche, there are words (widgets, woozles, wingdings, doodads, let's say) that basically mean the same thing, but with slightly different connotation. However, use of "doodads" in a search string generally signifies a slightly different user intent (what they intend to use the doodad for), and that intent is significantly less useful to me than the other possible uses for my widgets.

So before, I might have had a page titled "Red Widgets | Red Woozles | Red Doodads" (that's an oversimplification, but it works for illustration), and I replaced the title with "Red Widgets | Red Woozles | Red Wingdings".

Now, here's the part I find interesting. I would have expected this page to decrease a lot in traffic for terms involving doodads, increase a lot for terms involving wingdings, and basically stay the same for the others, right? But what happened instead is that the page increased slightly in traffic for all of the other keyword groups - and this happened across many pages all at once. Actually, traffic for widgets (the most popular variant) increased slightly more than traffic for wingdings. The overall traffic gained pretty much exactly compensated for the traffic lost, hence the speculation that there is some sort of allotment.

However, in the course of explaining the details, another explanation just occurred to me. It could be that Google understands that widgets, woozles and wingdings are more closely related to each other, in terms of specific meaning and intent, than any of them are to doodads. So getting rid of doodads is making me more semantically relevant for all of the others at once.

That seems a more plausible explanation; however, it still seems odd to me that the overall traffic balanced out so well, particularly because the traffic numbers for the various synonyms are radically different (doodads gets a lot more traffic than wingdings or woozles, though not quite as much as widgets).

freejung




msg:4195182
 6:38 pm on Sep 1, 2010 (gmt 0)

As for the question of new keywords being picked up / evaluated, I also have the sense that they are not. New content is consistently ranking much less well than older, established content, and much less well than I would have expected based on prior experience with adding new content.

I'm mostly talking about content that already ranked pretty well across all of my synonyms for quite some time, and I'm just trying to adjust the proportion of traffic that comes from each synonym.

Previously, I was just trying to maximize traffic. I figured any traffic was better than nothing. But since it occurred to me that I might have a specific allotment of traffic, I thought it would be better to use up that allotment on higher-revenue keywords.

MrFewkes




msg:4195225
 7:51 pm on Sep 1, 2010 (gmt 0)

I added another page to my site as I said I would do - and now, we have points 7 through 9.

As of 27/Aug -
1. Find my site in the SERP.
2. Click on Cached Link
3. Cache date is 02/Aug/2010 - Old page shows in cache
4. Click Back Button
5. Click on Cached Link again (same link - no refresh)
6. Cache date is 23/Aug/2010 - New page shows in cache with my changes
7.1 Added new page 26/Aug/2010
7.2 Today (1/Sept/2010) Cache Date is 30/Aug/2010
8. Today (1/Sept/2010) Link to new page shows in cache
9. Today (1/Sept/2010) New page is not yet indexed though

I can confirm my cache update frequency has increased to every 7 days, from around every 30 days, caused almost 100% by adding new pages. There will be ZERO new links to this site.

To attain this increase I have added (so far) three new pages and linked to them from the index page only. I have waited for each new page to be cached before adding the next new page - thus causing goof to see 3 changes. For cache update frequency there are only two new pages to consider - the third is the latest one, referred to in 7, 8 and 9, so it cannot have been the cause of the 7-day update. I also changed the odd price here and there on the index page over the last few months, before any new pages were added - these may or may not have influenced the update frequency, hence the "almost 100%" above.

None of the new pages are interlinked - just straight off the index and then straight back again. No other links.

So at this point - we have 2 new pages indexed and cached - three updates to the index page cached in sequence - and one new page not cached yet but the link to it is cached on the index page.

There are ZERO movements in the serps.

I will add another new page tomorrow and link again to it from the index page - thus causing 4 updates (with new content) to the site to be spotted by goof.

I will let you know what happens of course.

If anyone can give me any tips here as well - I'd appreciate it.

Cheers

dvduval




msg:4195314
 10:30 pm on Sep 1, 2010 (gmt 0)

I still believe they are freezing people on medium- to high-competition keywords until other ranking factors improve your score. Those other factors most likely relate to what you do over a period of time (e.g. don't link to bad neighborhoods, add fresh content, etc.).

Now ranking for long tails I find still quite easy to do. I can write an article today and be in the top 20 later today, but I probably can't be in the top 3. Those are more frozen.

Google seems to have invoked a powerful penalty on everybody that puts a damper on things for a period of time, and puts limits on success based on the pie size they feel you are entitled to. You can still grow your pie, but there will forever be a mystery and a wait time.

This bodes especially badly for young people just starting their web businesses with high hopes, who are quickly introduced to the damper mechanism. It bodes especially well for the "good old boys" that already have a big presence.

I would hope that google will continue to revise and improve and not stray from their company mantra that made them so successful.

Globetrotter




msg:4195498
 10:50 am on Sep 2, 2010 (gmt 0)

I don't think it's only on popular keywords. I've got a section which is a kind of social network. When you search for someone's exact name, or for the name in combination with the site name, it's nowhere to be found. In the old days this wasn't a problem, and new users and names showed up almost immediately.

I'm situated in Europe, and I also notice a lot of foreign (English) results for particular keywords, and sites I've never seen before. I find it strange, because there are far more sites in my own language that I know of than Google is showing.

1script




msg:4195608
 3:17 pm on Sep 2, 2010 (gmt 0)

I don't think it's only on popular keywords.
I also think that this concept, if it's indeed a real thing, should work on all keywords, not just a subset of them. For one thing, the popularity of a keyword is a fleeting thing: it's popular one day and dead the next. To measure popularity by the number of AdWords ads on the SERP would reek of collusion, and I don't think they would do that (but who knows). What else is there to reliably measure keyword popularity?

So, yes, I think throttling down of Google traffic is real and applied to ALL your keywords. Exactly how it is applied (how the keyword pie is sliced or at least how many slices) probably greatly depends on your site's trust and other quality-related properties.

I also think that keywords get into their respective slices randomly, even though that sounds crazy. Looking at keywords that continued to rank vs. those that were dropped, I have yet to find any reason why they were allocated that way. I'm looking at the number of internal and external BLs, whether the KW is in the title and the h1 tag, and how many times it appears on the page. Those parameters appear to be exactly the same for KWs that still rank vs. dropped ones. Should I be looking at something else? If anyone knows, please enlighten me!
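A minimal Python sketch of that comparison, assuming you have a local copy of the page's HTML and two keyword lists (still ranking vs. dropped). It only counts the on-page signals named here - keyword in the title, in an h1, and occurrences in the raw page text; backlink counts would have to come from elsewhere (e.g. a Site Explorer export). The file name and keywords are placeholders.

import re

# Minimal sketch: count the on-page signals named above for each keyword.
def onpage_signals(html, keyword):
    text = html.lower()
    kw = keyword.lower()
    title = re.search(r"<title>(.*?)</title>", text, re.S)
    h1s = re.findall(r"<h1[^>]*>(.*?)</h1>", text, re.S)
    return {
        "in_title": bool(title and kw in title.group(1)),
        "in_h1": any(kw in h1 for h1 in h1s),
        "occurrences": text.count(kw),
    }

html = open("index.html", encoding="utf-8", errors="replace").read()  # placeholder local copy
for kw in ["still ranking keyword", "dropped keyword"]:               # placeholder keyword lists
    print(kw, onpage_signals(html, kw))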

MrFewkes




msg:4195624
 3:45 pm on Sep 2, 2010 (gmt 0)

I think it's all keywords as well - the sites I'm referring to - some of them don't even show counts in the external KW tool (well, nothing to write home about anyway - like 2,000 a month broad match).

I don't understand how throttling can work - my site is at 4, stays at 4, and the sites around it don't move either (apart from the YouTube dross and shopping sites coming in and out).

I haven't seen a SERP where I'm not number 4 for maybe two months now. I check all the time and have no cookies distorting the SERPs.

Can anyone explain to me how throttling can work? I understand the concept, in that goof could be wanting (for reasons of its own) not to give a site too much traffic, but I cannot see how they would achieve this technically without moving a site in and out of the SERPs during the day. This is something I am not seeing.

1script




msg:4195650
 4:39 pm on Sep 2, 2010 (gmt 0)

Can anyone explain to me how throttling can work? I understand the concept, in that goof could be wanting (for reasons of its own) not to give a site too much traffic, but I cannot see how they would achieve this technically without moving a site in and out of the SERPs during the day. This is something I am not seeing.

Well, that's just the thing - it is harder to move your page in and out of SERPs than to simply take your 2000 formerly ranking KWs, retain 200 and throw away (or at best freeze) the rest. Here you go: your traffic has been effectively throttled down to 1/10th of what it once was.

I admit, throttling down is not an ideal term - it implies that the throttle moves up and down (and that there is something moving it), whereas in real life the situation is VERY static, just as you described. I also have KWs that have stayed in their positions since March. They sometimes move +1 or -1 position, but very rarely. Also, the number of those keywords never changes (I mean within a 10% margin - the magnitude of random noise in my case).

So, I personally like dvduval's keyword pie idea. You sliced it - you're done; the sizes of the slices don't change. You may later put two or three slices on someone's plate, or you can take someone's last slice away (it's your party, you can do whatever you please) - I have sites in either category.

It is also a very troubling concept. You would expect that the "best search engine in the universe" would actually figure out dynamic throttling based on, umm... something - not sure what it would be, but it's definitely NOT fairness. Yet they seem to prefer the static (simpler, less CPU-intensive?) solution. Why? Too busy re-organizing the world's information, perhaps.

tedster




msg:4195687
 5:44 pm on Sep 2, 2010 (gmt 0)

The first time I heard of apparent traffic throttling, the long term day-by-day graphs were pretty much flat-lined. But the hour-by-hour graphs were amazing. Every day at a certain traffic level, the Google traffic graph looked like a buzz cut had been applied - straight to zero within a given hour and no more traffic until the next day.

That seemed pretty heavy handed to me, and I wonder if the throttle has become more subtle now. I don't work with any sites who are seeing a throttle today, so I don't have any data of my own on it.

Yes, I agree that traffic throttling seems (or at least seemed) to be a trust-related action. It was also rare enough that many SEOs doubted it was really happening, but the data I've seen says it was real.

freejung




msg:4195688
 5:47 pm on Sep 2, 2010 (gmt 0)

I think we're talking about two, possibly three different possibilities.

The "keyword pie" idea doesn't have to do with traffic directly, but rather with the number or type of keywords you are allowed to rank for. The idea, if I understand it, is that Google grades your site as a whole and decides, OK, you' can rank for ten high-popularity keywords, 500 medium popularity keywords, and 2000 long tail keywords, and that's it. Even if you add new content, you're not getting any more of the pie. Or maybe, OK, you can rank for keywords involving fuzzy widgets, but not keywords involving smooth widgets, regardless of how you optimize.

Am I understanding the concept correctly?

Traffic throttling is a slightly different concept, it's been discussed in a lot of threads. Basically, Google might decide, OK, you're going to get ~5000 referrals per day, and that's it, no more, no less. So if your ranking for one keyword goes up, your ranking for others will have to be reduced to compensate. By slight adjustments in ranking over thousands of keywords, it wouldn't be too hard to do this (at least in my niche, where the traffic numbers for each keyword are very predictable).

Then there's the "damping factor" idea, which basically says, you're not going to move very much in the SERPs. Wherever you rank now, you're going to rank similarly tomorrow no matter what you do. A sort of molasses, that makes movement in the SERPs slower and more difficult. This would have a similar effect of making your rankings and traffic very stable, but it wouldn't limit you to any specific traffic number, just make it harder to increase traffic in general.

Whatever it is, I'm pretty sure there's something new going on in the last few months (roughly). It's much harder than it was before to rank new content for new keywords, and also to improve rankings of established content. If it was just me, I might put it down to increased competition in my niche, but it sounds like other people are observing the same thing. What it really reminds me most of is the old "sandbox" effect, only applied on a page-by-page or keyword-by-keyword basis rather than to whole domains.
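A tiny toy Python sketch of the traffic-throttling idea as described above - again speculation, not a known mechanism: given the clicks each keyword would bring at its current rank, a uniform nudge across all of them is enough to hold the daily total at a fixed target. Keyword names and numbers are invented.

# Toy illustration of the throttling idea - speculation, not a known mechanism.
def throttle(expected_clicks, daily_target):
    # expected_clicks: dict of keyword -> clicks/day at current rankings
    total = sum(expected_clicks.values()) or 1.0
    scale = daily_target / total
    # A small per-keyword adjustment (here a uniform scale) keeps the overall
    # total pinned to the target even as individual keywords rise or fall.
    return {kw: clicks * scale for kw, clicks in expected_clicks.items()}

keywords = {"red widgets": 3200, "red woozles": 1500, "red doodads": 900}
print(throttle(keywords, daily_target=5000))  # shares rebalanced to total ~5000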

tedster




msg:4195693
 5:55 pm on Sep 2, 2010 (gmt 0)

This sounds like a job for some massive data crunching to pin down what's happening. Not sure when I'll find the bandwidth. But anyone who can find it and does a good job will probably gain major attention from the SEO community at large.

MrFewkes




msg:4195759
 8:01 pm on Sep 2, 2010 (gmt 0)

What would you like to crunch exactly, Tedster? I can program it and run it.

MrFewkes




msg:4197285
 7:43 pm on Sep 6, 2010 (gmt 0)

OK - today, for those following, I added another page and linked to it again from the index. There are now 5 of these little devils linked to from the index.

My cache date is now 02/Sep/2010 - from a previous date of 30/Aug/2010 - so a fairly rapid rate of visits now.

The crawler missed a few small changes to prices on the index page the other day by a few hours.

The next crawl will pick up the following changes.

1. A new price on the index page (a different version of an existing item), so two prices instead of one and two PayPal buttons instead of one.

2. Reduced all images (increased JPEG compression). Approximate reduction of around 100K bytes - so this will improve the page load speed somewhat - I know that speed is now a signal (but I read recently that only 1% of searches are affected by this implementation of a speed check). See the sketch after this list.

3. Another page added, and a link to it from the index - this link contains the keywords and a couple of other words in the anchor.

4. A page changed 2 levels in - inside the osCommerce pages - just to add the option to the osCommerce cart, in line with what I added on the index page in terms of the new PayPal button.
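As a side note on point 2, here is a minimal Python sketch of that image-compression step using the Pillow library (an assumption - any image tool would do); the folder name and quality setting are illustrative, not from the original post.

from pathlib import Path
from PIL import Image  # assumes the Pillow library is installed

# Minimal sketch of the JPEG recompression in point 2; folder and quality are illustrative.
def recompress_jpegs(folder, quality=60):
    saved = 0
    for path in Path(folder).glob("*.jpg"):
        before = path.stat().st_size
        with Image.open(path) as img:
            img.load()  # read the image fully before overwriting the same file
            img.save(path, "JPEG", quality=quality, optimize=True)
        saved += before - path.stat().st_size
    print(f"Total bytes saved: {saved}")

recompress_jpegs("images")  # hypothetical image directory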

So - quite a bit for the great goofle to chew on again - but I'm not holding my breath for any ranking changes - up or down.

I'm going to watch a WT webinar tomorrow - Tedster, you said recently I may need to get more quality links for this jobby. The webinar is supposedly going to give me tips on how to do this - I'm sceptical, to be honest, as I'm at my wit's end with this dead horse - but for some strange reason I can't give up.

It's not for the cash - I've been at position 1 for this before, about a year ago - and sales were pretty much the same then as they are at position 4, where I am now.

Incidentally, my spamming competitor at 1, 2 and 3 is starting to slow down his forum profile spamming, and many of the spam links he originally placed in profiles have now gone. My link checker shows around 30% not found now. I know this is via Yahoo Site Explorer - but the trend shows, and it must also be reflected in goofle's link profiles for his PR0 sites.

My next change will be the addition of some "attributes" (as seen in that other thread) - I already have the <title></title> ;) "element", of course - but I wouldn't mind slipping a few attribs in there for good measure.

Then I will be looking more into the fabled "intention engine" aspect.

MrFewkes




msg:4198395
 9:05 am on Sep 8, 2010 (gmt 0)

Well - we are rocking on the crawl front - cache dates are running smoothly now, as follows:

30/Aug
02/Sep
05/Sep

(Bear in mind it was around once a month before I started regular updates - i.e. at least one new page added immediately after each cache update.)

Still no movement......

Planet13




msg:4198614
 5:25 pm on Sep 8, 2010 (gmt 0)

@ dvduval

Now ranking for long tails I find still quite easy to do. I can write an article today and be in the top 20 later today, but I probably can't be in the top 3. Those are more frozen.


Do you mean write an article on your OWN web site? Or write it in an article directory and then link it to a page on your site that you want to rank well for long-tail keywords?

dvduval




msg:4198798
 8:17 pm on Sep 8, 2010 (gmt 0)

Do you mean write an article on your OWN web site? Or write it in an article directory and then link it to a page on your site that you want to rank well for long-tail keywords?


Yes, I can simply blog about something on my own website without getting external links pointing to it, and I rank quickly, but usually low in the top 10 (e.g. position 8). I have about 10 million links pointing to my site, so that probably helps a good bit.
