Sorry if this was asked before:
Is the new PR already being used in the calculation of search results on www2 or www3, or will PR be figured in later in the update?
|My observation so far: little change this month from last. |
Yeah, I was expecting big changes this go-round due to the extreme lateness of the update. Thought for sure Google had something new in the works. Same old thing, though. Guess they are just getting tired.
Ditto that. Can't see much change in the overall.
Receptional Andy, if you'd drop a quick report our way, I'd be curious to check out that many similar sites. Then I'll duck out of this thread and let people keep discussing algorithms.
Zapatista said this:
<<I take interior pages that are there anyway and optimized them for secondary kws as i mentioned earlier. Once I have the main page built up with links and PageRank, I get my link partners to link to that interior page.>>
My development approach is very much like this, except that in some cases I find it easier to cultivate links for internal pages first. It depends on the topic.
Something I've been noticing is that in www2 the number of entries returned is down for some terms. One term I watch used to show 375,000 results; it's now showing 216,000. I've seen a reduction like this for a number of terms, but not across the board. Not sure what it means.
Algo doesn't seem to have changed from my view.
New DMOZ data is showing up in my backlinks and in SERP descriptions for two sites added to DMOZ only about 4 weeks ago. The Google directory pages are still not updated with the new DMOZ data, though. It appears the Google directory refresh is still in progress.
We have been moving up steadily for our main category keyword. From being non-existent a year ago we are now #4 in 2,500,000 results.
I have always felt fortunate to have this keyword as part of the name of my site. However, the site that has been #1 in this category for several months does not.
I am now convinced that having your targeted keyword in your domain name is not as important as I once thought. The far more important thing in my view is to have it in the anchor text which you use for linking.
The #1 site is poorly optimized, with a poor title and no H1/H2 headers. They do, however, have far more links than we do, and they use the keyword in their linking text.
If I am correct in this regard, this is great news for people who are committed to building brands. For the new sites I have planned, I will definitely focus on using memorable, brandable domains while making sure to include our best category keyword in the link text.
The other conclusion I have come to is that the length of time a site has been around is also an important factor. Over the last four months, for instance, we have moved up from #13 to #9 to #6 to #4. We definitely have a far more optimized site than our competitors, but the higher we climb, the longer it takes to edge up. Looks like we will have to wait 3-4 months to see if we have what it takes to become #1.
Just my $.02.
On my site, plus notes for this month:
1) Backlinks have returned - they have been mysteriously missing for two months or so
2) The algo seems to be pulling more relevant pages from WITHIN a site. I have several pages containing the same product in different formats: department lists, individual product pages (for more detail), popular-products pages, and featured-products pages, so I can have 4-5 pages with the same product on them. Previously Google often brought up surprising pages, deeming department pages more relevant than the individual product pages (which are optimised well). This time they are pulling more of the individual product pages, which are far more relevant and will yield many more clickthroughs. It seems the same for some competitors too. I would like to report how this affects clickthroughs, but my positions have significantly changed too, so it would not be accurate to track this :( Not complaining, cos my positions are higher :)
I saw a drop in backlinks... from 550 on www to 420 on www2/3. No change to speak of in placement or PR (steady at 6).
I THINK my drop has to do with getting picked as a Yahoo Pick a couple of months ago. I saw a BIG increase in blogger links, and those seem to be fading now. But because of that kick in the pants, I went from 5 to 7, and then to 6 last month.
|The algo seems to be pulling more relevant pages from WITHIN a site |
This may be true, but not for one of my sites. When the freshbot was coming by, everything was in pretty good shape. Now that the dance has started, everything is old. I know this will change. I just don't understand why freshness has so little meaning to Google. Some of the other SEs have much fresher results on the sites that I monitor. Of course, the problem is that they generate very little traffic. I wish Google could pick up the standard just a bit.
What do www2 and www3 have to do with anything? Is what is showing up on www2 and www3 now the content that will show up on www.google.com when the update is over? Please explain. Thank you.
grainfarmer: you've got it. The content eventually makes its way to the main site, although everything can be changing rapidly during the update. TheComte: shoot over a report or drop us an email. We want to be the freshest search engine, and any examples where we can improve are way helpful.
I see major changes on a very competitive keyphrase. But the changes (us and others dropping) are due to the addition of new "spammy type" sites in the top positions. I believe this is because Google has lifted penalties, as mentioned by GoogleGuy here:
Now, I have not reviewed these sites to see if in fact they are "spammy", I am just using that word because of the domain names.
Assuming they really have cleaned up whatever had them banned, all I can say is "shucks".
grainfarmer... I've noticed the totals have now been moved over from www2 and www3, to www on several keywords I've tried, but Google's still got some dancin to do yet, since the backlinks are still different between www and www2.
budterm, we're looking for good data on improving things. I'd appreciate it if you dropped us a report with your nick.
|Something I've been noticing is that in www2 the number of entries returned is down for some terms. One term I watch used to show 375,000 results; it's now showing 216,000. |
Ditto. Virtually every term I follow has been downsized by up to 60%. I noted that one of my more popular terms has fallen from about 800,000 results to 320,000.
First time I noticed this decrease. Hmm..
I had a major drop in the number of pages indexed last month, and this month it has dropped even more. My site has had about 300 pages in Google for a long time; last month it dropped to 199, this month to 111 - not good! (I am counting pages by means of a "sitename site:sitename.org" command.)
I can see no explanation for this at all. Has anyone else experienced this pattern, of a big drop last month and another this month? My site is all non-profit with no spam.
Grasping at straws: a large number of the pages are archived newsletters which have not changed for many months. They are still perfectly good and useful pages - there's just no reason to revise them. I'm wondering if there is some filter that is dropping pages that haven't been changed within a certain period of time.
If this keeps up I'll be down to 10 pages next month. :-( Any thoughts or recommendations will be *very* welcome.
Spam is down in the top positions for my 5 top terms. I follow these spammy companies closely, some I've reported, some not... but now the serps are really clean.
I know there is an improvement in the quality of the serps for these particular terms.
>> I know there is an improvement in the quality of the serps for these particular terms. <<
Yup... for my topics too. Subtle, but noticeable nonetheless.
It's looking quite a good re-index this time: lots of tuna caught and no dolphins to date.
About the algorithm, it seems to me that anchor text in links "rules", as always. No big changes.
Before the dance I noticed that some spamming websites lost some PR (from PR8 to PR6). But on www2/www3 their doorway pages are still in the top 5 for many keywords.
Sorry, this is kinda off topic, but I need to know if I am completely #@$*ed.
My site went online mid-dance, and Google was picking up links to me etc., but as of 45 minutes ago my server seems to be down. Will this completely destroy my ranking if the robot returns, or does it only come by once?
Thanks for your time!
Unless things have changed recently, it's whether or not your site was up at the time of the deep crawl (first half of last month) that counts. That said, it wouldn't surprise me if Google incorporated some check to make sure the site still exists.
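If you're worried about that scenario, you can at least monitor your own uptime. Here's a toy availability check in Python. This is just a sketch of the general idea; nobody outside Google knows what check, if any, the crawler actually performs:

```python
import urllib.error
import urllib.request


def is_up(url: str, timeout: float = 5.0) -> bool:
    """Best-effort check that a site responds with a non-error status.

    HEAD keeps the request cheap; note that some servers reject HEAD,
    so a more serious monitor might fall back to GET.
    """
    try:
        req = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(req, timeout=timeout):
            return True  # urlopen raises on 4xx/5xx responses
    except (urllib.error.URLError, OSError):
        return False


if __name__ == "__main__":
    # Hypothetical URL, for illustration only.
    print(is_up("http://www.example.com/"))
```

Run something like this from cron every few minutes and you'll know whether your server was down when a bot came by, instead of guessing after the fact.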
I am seeing more dynamic pages with PR while surfing with the Google Toolbar.
GoogleGuy, I just sent in the report you requested via the spam report form. Many of his pages were eliminated but I still found enough to warrant some attention.
My company has a product which the marketing people call a "website publishing application" or something. That isn't a keyword phrase you're likely to find any info under, though. Though I've repeatedly mentioned it, SEO hasn't been a high priority.
We had a demonstration site up for a long time (over six months) which is just now showing up in Google in the new index. This was a pleasant surprise, mitigated by the fact that the new marketing guy has come up with an entirely new website to replace it, which of course won't be in Google until a future index...
The reason for writing is that all the pages are dynamic; they don't use PHP or ASP or even JSP. One of the features of the program is the ability to view the data in a variety of "styles". This is template-driven, so you can alter the look of the site just by changing the template. We also have a preview feature, so if you want to see what a website looks like in a given style, you can use it.
I also built a gallery to show off some sample styles. This gallery used links like this: www.demo.com/something/somethingElse&styleName
Not only did Google index this stuff this month, they also indexed every page linked to by somethingElse in each style. The gallery contained about 15 sample styles, so there may be 15 copies of the demo site in the new Google index. These copies are identical in terms of textual content.
This wasn't intentional, and Google has finally succeeded in indexing the site, but it is duplicated content which could easily be construed as spam.
A further factoid: some of the links from the style gallery contained session IDs. Google indexed them too, and they work, in that you can click on a link which includes the session ID and get the expected page from Google's search results. I don't know if the entire site has been indexed in each session.
In short, Google has indexed some more dynamic content, but it appears to have indexed the same content several times. You have to show the similar results it omitted to see the full extent of this.
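One way to avoid handing a crawler 15 identical copies is to normalize the links you emit, so session and style choices never appear in indexable URLs. A minimal sketch in Python: the parameter names `sessionid` and `style` are hypothetical, and this assumes the variants are expressed as query-string parameters rather than the path form shown above:

```python
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

# Query parameters that only create duplicate views of the same content.
# These names are illustrative; use whatever your application actually emits.
TRACKING_PARAMS = {"sessionid", "style"}


def canonicalize(url: str) -> str:
    """Return the URL with session/style parameters stripped, so every
    rendering of a page maps back to one canonical address."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k.lower() not in TRACKING_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), ""))


print(canonicalize("http://www.demo.com/page?id=7&style=blue&sessionid=abc123"))
# -> http://www.demo.com/page?id=7
```

Emit the canonical form in the gallery links (and anywhere else a crawler can follow) and keep the style/session state for human visitors only, and the index ends up with one copy per page instead of one per style.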
I have a feeling that Google is reducing its database size by eliminating as much of the spam as it can. Everybody here knows that Google has been overly kind to spammy sites, which sometimes interfere and rank higher than our own sites and pages due to their spammy techniques.
Smaller database = faster searches, faster future updates (maybe more frequent - perhaps Freshbot will be a constant and regular update to the directory once they successfully cull out the weeds), and more relevant results (for most of us, our own sites and pages) for users, which means more traffic for those who are playing by the unwritten rules of the internet.
For those who are complaining that the Google update takes days: gosh, I have a website that processes a paltry 50,000 to 100,000 separate feed records per day from a variety of different sites, and my program takes about half an hour to run. I can't imagine how much processing power it must take to process billions upon billions of pages of data, where each page needs to go through a complex process to calculate its priority and relevance...
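For scale, here's a rough back-of-envelope using my own numbers and an assumed corpus size (the 3-billion-page figure is a guess, not anything official), just to show why a web-wide update measured in days is actually fast:

```python
# Back-of-envelope: scale one machine's feed-processing rate up to a
# web-sized corpus. All figures are illustrative assumptions.
records_per_run = 100_000        # upper end of my daily feed volume
run_seconds = 30 * 60            # my program takes about half an hour
secs_per_record = run_seconds / records_per_run  # 0.018 s per record

corpus_pages = 3_000_000_000     # assume ~3 billion pages in the index
single_machine_days = corpus_pages * secs_per_record / 86_400

print(round(single_machine_days))  # on the order of 600+ days on one machine
```

Even at that modest per-record cost, one machine would need roughly two years, so getting it done in days means spreading the work over a very large number of machines.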
I will be quite happy to see Google's database decrease in size and increase in relevancy - I'm sick of seeing spammy results for sites that employ thousands and thousands of useless keyword-stuffed pages all linking to each other in a subdirectory to try to be in the top 10 for thousands of terms that don't even relate... Who needs link-farms?
I am personally amazed by Google's technology.
Oops, I must correct my previous post: finally taking a closer look, the sites that leapfrogged us and others were down in the second ten before. (Sorry, GoogleGuy, no choice tidbits this time.) Like they say, who's behind you doesn't matter (but now they matter!)
I guess I'll have to figure out what they did (link work I'm sure). Still don't like those spammy 6 word domain names...
Jon_King, Napoleon, Xenozenith, glad it's looking better on your areas. Zapatista--got the report, thanks. Looked like a bad search to me too; we'll dig into it more over the next few days.
P.S. Napoleon, I love your tuna/dolphin thing. I'm gonna have to start using that. :)
I've seen the same thing happening myself...on more than one site. I'm hoping next month will be better...
|translation - put your url on your business cards and brochures as this will be the only way people will find it and raw internet marketing will do nothing for you. |
Probably a good idea if your customer insists on flash movie intros. I have a customer that insists on flash intros. Very irritating when I go there. I should probably add index2.html to my favorites to avoid this irritation.