My observation so far: little change this month from last. Anchor text of inbound links still counts big time, and PR seems to be worth the same as before. IOW, it's the same old, same old. One aspect that isn't relevant to the SERPs I am most familiar with is "spamminess". I don't see much more spam, but then these SERPs don't tend to be the ones that spammers would be found on. Thus, the index may be more spammy and I just wouldn't see it.
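To make that observation concrete, here's a toy scoring sketch - my own illustration, definitely NOT Google's actual formula - where a page's score for a query is a weighted blend of inbound anchor-text matches and PageRank. The weights are pure guesswork.

# Toy model of the observation above -- not Google's real formula.
# Assumes we already know each page's PageRank and the anchor text
# of its inbound links; the weights are made up for illustration.

def toy_score(query, pagerank, inbound_anchors,
              w_anchor=0.7, w_pr=0.3):
    """Score a page as anchor-text matches plus PageRank."""
    q = query.lower().split()
    # Count inbound links whose anchor text contains every query word.
    matches = sum(1 for a in inbound_anchors
                  if all(word in a.lower() for word in q))
    anchor_signal = matches / len(inbound_anchors) if inbound_anchors else 0.0
    return w_anchor * anchor_signal + w_pr * pagerank

print(toy_score("blue widgets", 0.6,
                ["blue widgets", "cheap blue widgets", "click here"]))
# 0.7 * (2/3) + 0.3 * 0.6 = ~0.65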
<<I take interior pages that are there anyway and optimize them for secondary kws as I mentioned earlier. Once I have the main page built up with links and PageRank, I get my link partners to link to that interior page.>>
My development approach is very much like this, except that in some cases I find it easier to cultivate links for internal pages first. It depends on the topic.
Something I've been noticing is that in www2 the number of entries returned is down for some terms. One term I watch used to show 375,000 results; it's now showing 216,000. I've seen a reduction like this for a number of terms, but not across the board. Not sure what it means.
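For anyone else tracking this, a quick sketch of how I'd log the counts and compute the drop; the figures below are just the ones from this post.

# Log the "results found" counts you see on www2 and compute the
# change. Replace the sample numbers with your own observations.

counts = {
    "my term": (375_000, 216_000),  # (last month, this month)
}

for term, (before, after) in counts.items():
    drop = (before - after) / before * 100
    print(f"{term}: {before:,} -> {after:,} ({drop:.0f}% fewer results)")
# my term: 375,000 -> 216,000 (42% fewer results)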
New DMOZ data is showing up in my backlinks and in SERP descriptions for two sites added to DMOZ only about 4 weeks ago. The Google directory pages are still not updated with new DMOZ data, though, so it appears the Google directory refresh is still in progress.
I have always felt fortunate to have this keyword as part of the name of my site. However, the site that has been #1 in this category for several months does not.
I am now convinced that having your targeted keyword in your domain name is not as important as I once thought. The far more important thing in my view is to have it in the anchor text which you use for linking.
The #1 site is poorly optimized, with a poor title and no H1-H2 headers. They do, however, have far more links than we do, and they do use the keyword in their linking text.
If I am correct in this regard, this is great news for people who are committed to building brands. For the new sites I have planned, I will definitely focus on using memorable, brandable domains while making sure to include our best category keyword in the link text.
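If you want to verify what anchor text your link partners actually use, a quick standard-library sketch like this would do. The HTML and domain names below are made-up placeholders - feed it a real partner page instead.

# Pull out the anchor text a referring page uses when linking to
# your site. Standard library only; HTML and domains are placeholders.

from html.parser import HTMLParser

class AnchorTextFinder(HTMLParser):
    def __init__(self, target_domain):
        super().__init__()
        self.target = target_domain
        self.in_link = False
        self.anchors = []

    def handle_starttag(self, tag, attrs):
        if tag == "a" and any(k == "href" and v and self.target in v
                              for k, v in attrs):
            self.in_link = True
            self.anchors.append("")

    def handle_data(self, data):
        if self.in_link:
            self.anchors[-1] += data

    def handle_endtag(self, tag):
        if tag == "a":
            self.in_link = False

html = ('<p><a href="http://www.mysite.com/">best category keyword</a> '
        'and <a href="http://www.other.com/">click here</a></p>')
finder = AnchorTextFinder("mysite.com")
finder.feed(html)
print(finder.anchors)  # ['best category keyword']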
The other conclusion I have come to is that the length of time a site has been around is also an important factor. Over the last four months, for instance, we have moved up from #13 to #9 to #6 to #4. We definitely have a far more optimized site than our competitors, but the higher we climb, the longer it takes to edge up. Looks like we will have to wait 3-4 months to see if we have what it takes to become #1.
Just my $.02.
1) Backlinks have returned - they have been mysteriously missing for two months or so
2) The algo seems to be pulling more relevant pages from WITHIN a site. I have several pages containing the same product in different formats - department lists, individual product pages (for more detail), popular-products pages and featured-products pages. So I can have 4-5 pages with the same product on them. Previously Google often brought up surprising pages, deeming department pages more relevant than the individual product pages (which are well optimised). This time, it is pulling more of the individual product pages, which are far more relevant and will yield many more clickthroughs. It seems the same for some competitors too. I would like to report how this affects clickthroughs, but my positions have significantly changed too, so it would not be accurate to track this :( - not complaining, cos my positions are higher :)
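A toy heuristic for guessing which internal page ought to win for a given product query - my assumption, not Google's algorithm - is to score each of your own pages by how prominently the query appears in the title versus the body:

# Toy internal-relevance check: which of MY pages best matches a
# product query? The weighting (title counts 3x) is an assumption.

def page_score(query, title, body):
    q = query.lower()
    score = 3 if q in title.lower() else 0   # title match weighted heavily
    score += body.lower().count(q)           # plus raw body occurrences
    return score

pages = {
    "/widgets/": ("Widgets department",
                  "blue widget, red widget, green widget..."),
    "/widgets/blue-widget.html": ("Blue Widget - full details",
                                  "The blue widget is... blue widget specs..."),
}
for url, (title, body) in pages.items():
    print(url, page_score("blue widget", title, body))
# The individual product page should outscore the department page.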
I THINK my drop has to do with getting picked as a Yahoo Pick a couple months ago. Saw a BIG increase in BLOGGER links, and those seem to be fading now. But because of that kick in the pants, I went from 5 to 7, and to 6 last month.
The algo seems to be pulling more relevant pages from WITHIN a site
This may be true, but not for one of my sites. When the freshbot was coming by, everything was in pretty good shape. Now that the dance has started, everything is old. I know this will change. I just don't understand why freshness has so little meaning to Google. Some of the other SEs have much fresher results on the sites that I monitor. Of course, the problem is that they generate very little traffic. I wish Google could pick up the standard just a bit.
Now, I have not reviewed these sites to see if in fact they are "spammy", I am just using that word because of the domain names.
Assuming they really have cleaned up whatever had them banned, all I can say is "shucks".
Something I've been noticing is that in www2 the number of entries returned is down for some terms. One term I watch used to show 375,000 results; it's now showing 216,000.
Ditto. Virtually every term I follow has been downsized, by up to 60%. I noted that one of my more popular terms has fallen from about 800,000 results to 320,000.
First time I noticed this decrease. Hmm..
I can see no explanation for this at all. Has anyone else experienced this pattern, of a big drop last month and another this month? My site is all non-profit with no spam.
Grasping at straws: a large number of the pages are archived newsletters which have not changed for many months. They are still perfectly good and useful pages - there's just no reason to revise them. I'm wondering if there is some filter that is dropping pages that haven't been changed within a certain period of time.
If this keeps up I'll be down to 10 pages next month. :-( Any thoughts or recommendations will be *very* welcome.
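One thing worth checking - just a guess at what a crawler sees - is what Last-Modified header the server sends for those archived newsletters, since that is the freshness signal a conditional fetch would get. A quick sketch, with a placeholder URL:

# See what freshness signal the server reports for an archived page.
# The URL is a placeholder -- substitute a real newsletter URL.

from urllib.request import Request, urlopen

req = Request("http://www.example.org/newsletters/1999-06.html",
              method="HEAD")
with urlopen(req) as resp:
    print(resp.status, resp.headers.get("Last-Modified"))
# e.g. 200 Tue, 15 Jun 1999 12:00:00 GMT -- unchanged for years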
Yup... for my topics too. Subtle, but noticeable nonetheless.
It's looking quite a good re-index this time: lots of tuna caught and no dolphins to date.
We had a demonstration site up for a long time (over six months) which is just now showing up in Google in the new index. This was a pleasant surprise, mitigated by the fact that the new Marketing guy has come up with an entirely new website to replace it, which of course won't be in Google until a future index...
The reason for writing is that all the pages are dynamic; they don't use PHP or ASP or even JSP. One of the features of the program is the ability to view the data in a variety of "styles". This is template driven, so you can alter the look of the site just by changing the template. We also have a preview feature: if you want to see what a website looks like in a given style, you can preview it.
I also built a gallery to show off some sample styles. This gallery used links like this: www.demo.com/something/somethingElse&styleName
Not only did Google index this stuff this month, they also indexed every page linked to by somethingElse in each style. The gallery contained about 15 sample styles, so there may be 15 copies of the demo site in the new Google index. These copies are identical in terms of textual content.
This wasn't intentional and Google has finally succeeded in indexing the site, but it is duplicated content which could be easily construed as Spam.
A further factoid: some of the links from the style gallery contained session IDs. Google indexed them too, and they work, in that you can click on the link which includes the session ID and get the expected page from Google's search results. I don't know if the entire site has been indexed under each session.
In short, Google has indexed some more dynamic content, but it appears to have indexed the same content several times. You have to hit "show similar but omitted results" to see the full extent of this.
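To gauge how bad the duplication is, you could normalize the indexed URLs by stripping the style and session parameters and count how many collapse onto the same underlying page. A rough sketch - the parameter names here are assumptions, and the real gallery links used a slightly different URL scheme:

# Strip assumed duplicate-causing parameters and count how many
# indexed URLs collapse onto each underlying page.

from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit
from collections import Counter

DUP_PARAMS = {"style", "session"}   # assumption: substitute your real names

def normalize(url):
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k not in DUP_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), ""))

indexed = [
    "http://www.demo.com/products?style=classic",
    "http://www.demo.com/products?style=modern",
    "http://www.demo.com/products?style=classic&session=abc123",
]
copies = Counter(normalize(u) for u in indexed)
for page, n in copies.items():
    print(n, "copies of", page)
# 3 copies of http://www.demo.com/products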
Smaller database = faster searches, faster future updates (maybe more frequent ones - perhaps Freshbot will become a constant, regular update to the index once they successfully cull out the weeds), and more relevant results for users (for most of us, our own sites and pages), which means more traffic for those who are playing by the unwritten rules of the internet.
For those who are complaining that the Google update takes days - gosh - I have a website that processes a paltry 50,000 to 100,000 separate feed records per day from a variety of different sites, and my program takes about half an hour to run. I can't imagine how much processing power it must take to process billions upon billions of pages of data, where the data needs to go through a complex rendering process to calculate its priority and relevance...
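Just to put numbers on that - back-of-the-envelope only, and the 3 billion figure is my assumption for the size of Google's index:

# Back-of-the-envelope version of the point above.

records = 100_000            # records my program handles per day...
seconds = 30 * 60            # ...in about half an hour
rate = records / seconds     # ~56 records per second
pages = 3_000_000_000        # assumed index size

print(f"{rate:.0f} records/sec")
print(f"{pages / rate / 86_400:.0f} days for {pages:,} pages on one box")
# 56 records/sec -> roughly 625 days for 3,000,000,000 pages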
I will be quite happy to see Google's database decrease in size and increase in relevancy - I'm sick of seeing spammy results for sites that employ thousands and thousands of useless keyword-stuffed pages all linking to each other in a subdirectory to try to be in the top 10 for thousands of terms that don't even relate... Who needs link-farms?
I am personally amazed by Google's technology.
I guess I'll have to figure out what they did (link work I'm sure). Still don't like those spammy 6 word domain names...