I'd say add them ASAP. It may already be too late.
I'm already online....... :-)
But I was just curious: why are you all waiting for the "dance"? Is it because you just want to watch it, or because you want to make some changes on your sites to influence your listing somehow?
From my understanding, Google does deep crawls twice a month... if this is wrong, correct me please, but with that information I would say that anytime is really a 'good' time. I guess you may luck out sometimes more than others, but I don't think I'd stress myself out too much over when the 'best' time to add pages was... just my $.02
Once a month for lower PR sites, or at least in my experience this is the case. But then again, Google is continuously "changing", and therefore it might be a single crawl this month and a double crawl next month, or vice versa.
The Google Dance is, in my opinion, the latest time period for you to upload content if you wish to have it indexed in the following update. This is because Google usually crawls at its fastest pace directly after a dance. If you miss that crawl, then you're waiting another month. Then again, the content will need some backlinks if it's to be found by the bot!
|The Google Dance is, in my opinion, the latest time period for you to upload content if you wish to have it indexed in the following update. This is because Google usually crawls at its fastest pace directly after a dance. If you miss that crawl, then you're waiting another month. Then again, the content will need some backlinks if it's to be found by the bot! |
This is true for sites that don't get visited by freshbot. Those that do can have their content indexed on a daily basis (well, not today or yesterday). :( The important thing for those sites to get up before the deep crawl is any link changes you want to be indexed. Of course, currently mysite.com, which has been freshed all month, has now reverted to the cache from the previous deep crawl instead of fresh crawls. I wish I had updated it sooner.
|This is true for sites that don't get visited by freshbot. |
That's not entirely true. Whereas the freshbot might hit daily, it usually only hits the root level, so uploading NEW content should be done before the deep crawl, especially given that if it's NEW content it's most likely going to have a low PR, and therefore might not warrant a "freshbot" visit!
I've noticed that too (about the root directory).
My main content is in a subdirectory of the root, just one level down, but it doesn't get hit by the freshbot nearly as often.
I'm debating on moving it to the root too.
Any downside to this?
Should I leave the existing where it is and just add new pages to the root?
As far as adding pages, it really doesn't matter. If you post a new page, and Googlebot crawls your site within the hour, there are no guarantees that it will hit every link (or even one) to the new page.
Moving sites, or having a server down (e.g. for maintenance), is probably best done the moment the update appears on www2 or www3.
I suppose that adding new pages at this time would maximize the chance of being indexed on the update as well, but still no guarantee.
Noting: I had a page where a comment tag in the header was "cut", but the cut didn't actually remove all of it; a stray --> was left behind.
I noticed this a couple of updates back in Google and immediately cut the remainder. Two updates since, Google still has not re-visited this page.
I get hit daily at the root level too. I think it might be a PR issue: mine is a PR6 and my content changes. Maybe PR7+ sites get crawled deeper by the freshbot (guessing).
|I'm debating on moving it to the root too. |
This sounds like a good strategy to me! :)
|Whereas the freshbot might hit daily, it only usually hits the root level |
I didn't think of that aspect. My world's biggest site is <50 pages, all in the root. Every time she comes by, gbody grabs everything. I was recently looking over the root vs directory [webmasterworld.com] thread from August, and it looks like this may be a small argument for root.
I think you might be right about the root and freshbot. I have a site that had 100 pages at the start of the month and 150 by the end. All the pages are in the root. During that time googlebot has requested 2900 pages, which is pretty heavy spidering for that number of pages in only one month.
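A request count like the 2900 above can be pulled straight out of the server's access logs. A minimal sketch, assuming an Apache combined-format log; the regex and the "Googlebot" user-agent match are assumptions you may need to adjust for your own server:

```python
import re
from collections import Counter

# Matches the request path and user-agent fields of a combined-format
# log line; field positions are an assumption about the log format.
LINE_RE = re.compile(r'"(?:GET|HEAD) (\S+) [^"]*" \d{3} \S+ "[^"]*" "([^"]*)"')

def googlebot_hits(log_lines):
    """Return a Counter mapping URL -> number of Googlebot requests."""
    hits = Counter()
    for line in log_lines:
        m = LINE_RE.search(line)
        if m and "Googlebot" in m.group(2):
            hits[m.group(1)] += 1
    return hits
```

Summing the counter's values gives the total number of googlebot requests for the month; the per-URL breakdown shows which pages it favours.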
I have a PR6 site, and new pages get into the index quickly (between monthly updates) if they're linked from the home page.
Example: I write a six-page article on Elbonia and link to page 1 of the article on my home page. That page is listed in Google within days. However, the other five pages of the article don't appear until the next monthly update.
And, of course, PR isn't calculated for the new page until the next monthly update.
I also think it's already too late. The changes we make now are for the December dance (now we wait for the October dance).
I'm interested in whether the key here is being 1 link from the homepage, or root level. The determining factor for PageRank is the number of links it takes to get there from the externally linked page. I'm wondering if there is a similar relationship with freshbot.
I believe your /index.html or main root level page is your key page (money page). Link new pages off of this page to ensure that googlebot finds them properly.
For the rest of the site, I believe it has something to do with the following factors.
1.) # of internal links to the page.
2.) # of external links to the page.
3.) Last-Modified header date, etc.
4.) Date the page was first indexed by Google.
In other words, IMHO it has a lot to do with the page rather than the directory that it is in. I "constantly" get certain pages hit by freshbot that are in a level 3 directory (example: /dir2/dir3/page.html).
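On factor 3 above, the Last-Modified header is the one thing in that list you can control directly on a dynamic page. A minimal sketch of answering a conditional GET, assuming a Python-driven server; the function name and return shape are hypothetical, not any particular framework's API:

```python
from email.utils import formatdate, parsedate_to_datetime

def conditional_response(page_mtime, if_modified_since):
    """Decide between 200 and 304 for a conditional GET.

    page_mtime: the page's last-change time as a Unix timestamp.
    if_modified_since: the raw If-Modified-Since request header, or None.
    Returns (status_code, last_modified_header_value).
    """
    last_modified = formatdate(page_mtime, usegmt=True)
    if if_modified_since:
        try:
            since = parsedate_to_datetime(if_modified_since).timestamp()
        except (TypeError, ValueError):
            since = None  # unparseable header: fall through to a full 200
        if since is not None and page_mtime <= since:
            return 304, last_modified  # bot's copy is still current
    return 200, last_modified
```

The point for spidering is simply that a stable, truthful Last-Modified date gives the bot something to condition its revisits on, whereas a date that changes on every request may make the page look perpetually "fresh".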
Being too late to submit pages, and not wanting to make content changes until I see how I'm placed in the next update, I use this time to poach images from competitors' sites.
During poaching I have two windows open allowing me to check for the Google dance without losing my place.
|My main content is in a subdirectory of the root, just one level down, but it doesn't get hit by the freshbot nearly as often. |
My main content is also one directory down. It gets hit by the freshbot every couple of days, whereas the root page is rarely hit by the freshbot at all (although it is deep-crawled). I figure this is because nearly all my incoming links are to the content page. I think there may be two to the root page, if that. So, since the main page really isn't very important (PR1 or 2) and rarely updates, it gets ignored in favour of the main content page (PR5ish, updated nearly daily).
The "main" part of my site is definitely contained in the root.
Maybe it doesn't have anything to do with root v. subdir; maybe it's based on number of incoming links to x page as you suggest.
_After_ the Google dance, the deep crawl begins. You don't want to miss the deep crawl, which I have never seen happen before the dance.
So currently it is a great time to create more content. And be sure to get it posted before the end of the dance. That will give you the shortest time from creation to distribution. Who knows, the next update could be less than 4 weeks away. If you miss the crawl, you may need to wait 8 weeks.
|I figure this is because nearly all my incoming links are to the content page. I think there may be two to the root page, if that. |
I think whether it is the home page is not what's important, but rather whether it has external links, and possibly whether those links come from sites that are also visited by freshbot.
just to add my two shillings.
what to do during Google-update:
1)chew many many toothpicks
2)continuously return to google and make sure that 657,000 really still is 657,000
3)drink a lot of coffee
regarding the other topic, off topic:
I moved my whole website from www.mysite.com/web/mypages to www.mysite.com/mypages two weeks ago, and have I seen a difference!
All pages are database driven, mainly one page with 200 different ids used in the querystring.
And boy, didn't googlebot go mad when he saw that. Ever since, I have had googlebot all over my website daily, picking up 20-100 URLs. It stopped dead the day before yesterday though...
So yeah, putting all pages in root did make a big difference to the crawls.
So I guess it is understandable why I am a bit twitchy right now, not knowing what effect this is going to have in this update.
arghhhh. still 657,000
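For anyone making a similar move, the usual safeguard is a permanent redirect from the old paths, so that existing links (and anything Google already has indexed) carry over to the new location. A hedged sketch: the /web prefix matches the move described above, but the function itself is purely illustrative, not how that site actually did it:

```python
# Illustrative sketch: map old URLs under /web/ to their new root-level
# location with a 301, preserving the querystring so database-driven
# pages like /web/mypages?id=42 keep working at the new address.
def redirect_moved(path_and_query):
    if path_and_query == "/web" or path_and_query.startswith("/web/"):
        new = path_and_query[len("/web"):] or "/"
        return 301, new        # permanent redirect to the new path
    return None                # not a moved URL; serve it normally
```

In practice this would live in the server config (e.g. a rewrite rule) rather than application code, but the mapping logic is the same.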
Another "freshbot" variable is how long the pages stay in the index (and I agree with the "root" argument and especially the "link it off your entry page" argument - that has generated the best results for me).
My fresh pages never last long (and a number of people said the same on another thread), typically being in with VERY high SERPs for anywhere from hours to days, then disappearing, only to return after the next update.
Today is the 30th and the dance has not started yet... Will this month have an update? Why is the update delayed so much? The moon has completed its cycle and nothing has happened... I'm nervous.
Googleguy said: be patient, be patient.
He wouldn't have said this if there were going to be no update.
(I hope so...)
This is nothing new; it has happened a few times that Google misses the month. It will most likely mean 2 updates in Nov, i.e. the 2nd-3rd for what would be the Oct update, and then late in Nov for the Nov update.
nothing to worry about IMO :)
I am sure they are double testing their algo, just to be sure about this time. That's what seems to be taking time I guess.
Frankly, I am a bit nervous too!
"double testing their algo"
Hi everyone.. I must confess I have been snooping here for about a month, but I cannot hold back any longer.. I just have to post. I was afraid of becoming addicted to posting messages up here, so I delayed registering.
I think Google has been making major changes this month to their filters. Many of us will see big changes to our rankings and PR, with a lot of sites being dropped for cross-linking. That is why we have the delay.