Forum Moderators: open
The Google Dance is, in my opinion, the latest point at which you can upload content if you wish to have it indexed in the following update. This is because Google usually crawls at its fastest pace directly after a dance. If you miss that crawl, then you're waiting another month. Then again, the content will need some backlinks if it's to be found by the bot!
This is true for sites that don't get visited by freshbot. Those that do can have their content indexed on a daily basis (well, not today or yesterday). :( For those sites, the important thing to get up before the deep crawl is any link changes you want indexed. Of course, mysite.com, which has been freshbot-crawled all month, has currently reverted to the cache from the previous deep crawl instead of the fresh crawls. I wish I had updated it sooner.
This is true for sites that don't get visited by freshbot.
That's not entirely true. While the freshbot might hit daily, it usually only hits the root level, so uploading NEW content should be done before the deep crawl, especially given that NEW content is most likely going to have a low PR and therefore might not warrant a "freshbot" visit!
I've noticed that too (about the root directory).
My main content is in a subdirectory of the root, just one level down, but it doesn't get hit by the freshbot nearly as often.
I'm debating moving it to the root too.
Any downside to this?
Should I leave the existing where it is and just add new pages to the root?
Any opinions?
Jenny
Moving sites or taking a server down (e.g. for maintenance) is probably best done the moment the update appears on www2 or www3.
I suppose that adding new pages now would maximize the chance of being indexed in the update as well, but there's still no guarantee.
A note: I had a page where a comment tag in the header was supposed to be cut, but the cut missed part of it and left a stray --> behind. I noticed this a couple of updates back in Google and immediately removed the remainder. Two updates since, Google still has not re-visited this page.
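A leftover closer like that is easy to miss by eye. A quick scan can flag any --> that has no matching <!-- before it; this is a minimal sketch, not tied to any particular site or tool:

```python
def stray_comment_closers(html: str) -> int:
    """Count '-->' closers that have no matching '<!--' opener."""
    open_count = 0
    strays = 0
    i = 0
    while i < len(html):
        if html.startswith("<!--", i):
            open_count += 1
            i += 4
        elif html.startswith("-->", i):
            if open_count:
                open_count -= 1
            else:
                strays += 1  # closer with no opener: the leftover case above
            i += 3
        else:
            i += 1
    return strays

print(stray_comment_closers("<head><title>x</title>keywords--></head>"))  # 1
print(stray_comment_closers("<head><!-- keywords --></head>"))            # 0
```

Run it over the page source before uploading and a non-zero count tells you something was half-cut.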
Whereas the freshbot might hit daily, it only usually hits the root level
Example: I write a six-page article on Elbonia and link to page 1 of the article on my home page. That page is listed in Google within days. However, the other five pages of the article don't appear until the next monthly update.
And, of course, PR isn't calculated for the new page until the next monthly update.
I believe your /index.html or main root level page is your key page (money page). Link new pages off of this page to ensure that googlebot finds them properly.
For the rest of the site, I believe it has something to do with the following factors.
1.) # of internal links to the page.
2.) # of external links to the page.
3.) Last-Modified Header date, etc...
4.) Date the page was first indexed by Google.
In other words, IMHO it has a lot to do with the page rather than the directory that it is in. I "constantly" get certain pages hit by freshbot that are in a level 3 directory (example: /dir2/dir3/page.html).
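Factor 3 is the only one on that list you can measure yourself from a response header. As a minimal sketch (the header value and dates below are invented for illustration), turning a Last-Modified header into a page age looks like this:

```python
from datetime import datetime, timezone
from email.utils import parsedate_to_datetime

def page_age_days(last_modified: str, now: datetime) -> float:
    """Days elapsed since the Last-Modified header date."""
    modified = parsedate_to_datetime(last_modified)
    return (now - modified).total_seconds() / 86400

# Hypothetical header value and "now", purely for illustration:
now = datetime(2003, 5, 19, 12, 0, tzinfo=timezone.utc)
print(page_age_days("Fri, 09 May 2003 12:00:00 GMT", now))  # 10.0
```

A recently touched Last-Modified date is at least one of the signals a bot can read cheaply, for whatever weight Google actually gives it.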
My main content is in a subdirectory of the root, just one level down, but it doesn't get hit by the freshbot nearly as often.
My main content is also one directory down. It gets hit by the freshbot every couple of days, whereas the root page is rarely hit by the freshbot at all (although it is deepcrawled). I figure this is because nearly all my incoming links are to the content page. I think there may be two to the root page, if that. So, since the main page really isn't very important (pr1 or 2) and rarely updates, it gets ignored in favour of the main content page (pr5ish, updated nearly daily).
So currently it is a great time to create more content. Be sure to get it posted before the end of the dance; that will give you the shortest time from creation to distribution. Who knows, the next update could be less than 4 weeks away. If you miss the crawl you may need to wait 8 weeks.
I figure this is because nearly all my incoming links are to the content page. I think there may be two to the root page, if that.
I think whether or not it is the home page is not what matters; what matters is whether it has external links, and possibly whether those links come from sites that are also visited by freshbot.
What to do during a Google update:
1)chew many many toothpicks
2)continuously return to google and make sure that 657,000 really still is 657,000
3)drink a lot of coffee
Regarding the other topic (off topic):
I moved my whole website from www.mysite.com/web/mypages to www.mysite.com/mypages two weeks ago, and have I seen a difference.
All pages are database driven, mainly one page with 200 different ids used in the query string.
And boy, did googlebot go mad when it saw that. Ever since, I have had googlebot all over my website daily, picking up 20-100 URLs. It stopped dead the day before yesterday, though...
So yeah, putting all pages in root did make a big difference to the crawls.
So I guess it is understandable why I am a bit twitchy right now, not knowing what effect this is going to have in this update.
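If you want to see exactly when googlebot stopped, counting its hits per day in the access log makes the pattern obvious. A minimal sketch, assuming an Apache-style combined log format (the sample lines are invented):

```python
import re
from collections import Counter

def googlebot_hits_per_day(log_lines):
    """Count requests per day whose user-agent mentions Googlebot."""
    hits = Counter()
    # Grab the dd/Mon/yyyy part of the [17/May/2003:10:12:01 +0000] timestamp
    date_pat = re.compile(r"\[(\d{2}/\w{3}/\d{4})")
    for line in log_lines:
        if "Googlebot" not in line:
            continue
        m = date_pat.search(line)
        if m:
            hits[m.group(1)] += 1
    return hits

# Invented sample lines, just to show the shape of the input:
sample = [
    '66.249.66.1 - - [17/May/2003:10:12:01 +0000] "GET /mypages?id=42 HTTP/1.0" 200 5120 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [17/May/2003:10:12:05 +0000] "GET /mypages?id=43 HTTP/1.0" 200 4980 "-" "Googlebot/2.1"',
    '10.0.0.5 - - [17/May/2003:11:00:00 +0000] "GET / HTTP/1.0" 200 1024 "-" "Mozilla/4.0"',
]
print(googlebot_hits_per_day(sample))  # Counter({'17/May/2003': 2})
```

A day-by-day count like this shows the ramp-up after the move and the exact day the crawling stopped.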
arghhhh. still 657,000
I think Google has been making major changes to their filters this month. Many of us will see big changes to our rankings and PR, with a lot of sites being dropped for cross-linking. That is why we have the delay.