People keep arguing back and forth, but having read almost every post, all I see are rehashes of the same old rumours, respun as truths, until the origin of the belief is lost in the mists of forum threads past.
So, my questions are:
1. For those that believe, what are you basing this assumption on, apart from the lack of an update? (REMEMBER: Lack of evidence is not evidence of a lack)
2. Has anyone seen a change in the number of backlinks for their site since the last update, or seen links updated on different sites on different days? (this would be a good indicator)
3. For those that do not believe, how long before no update does equal rolling update?
4. What effect will this have on Link Rep and PageRank, and how often will these elements be updated?
I am starting my own investigations, with the following being my basis for a conclusion:
1. Backlinks changing on different sites on different days.
2. PageRank changing on different sites on different days.
Any other suggestions?
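The test described above (backlink or PR counts changing on different days for different sites) can be sketched as a small script. This is a hypothetical helper, not anything Google provides: the per-site daily counts would be collected by hand (e.g. from the link: operator) and fed in, and the site names and dates below are made up for illustration.

```python
from collections import defaultdict

def staggered_change_days(observations):
    """Given per-site daily backlink counts, return the days on which each
    site's count changed. If different sites change on different days,
    that supports a rolling (continuous) update; if they all change on
    the same day, that looks like a single batch update."""
    change_days = defaultdict(set)
    for site, daily in observations.items():
        days = sorted(daily)
        for prev, cur in zip(days, days[1:]):
            if daily[cur] != daily[prev]:
                change_days[site].add(cur)
    return dict(change_days)

# Hypothetical hand-collected counts (sites and numbers are invented).
obs = {
    "site-a.example": {"07-20": 120, "07-23": 120, "07-26": 131},
    "site-b.example": {"07-20": 45,  "07-23": 52,  "07-26": 52},
}
print(staggered_change_days(obs))
# site-a changed on 07-26, site-b on 07-23: staggered, i.e. rolling
```

If the change days line up across sites instead, that would point back to a batch update.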
Cheers.
I believe there are 3 algos that dictate the results you see. One of them is freshbot-based, which is "continuous"/dynamic, but the main algo is still deep-bot with the subsequent traditional G calculation. And I think we will have a quick step soon.
I can live with this. In the good old days the links would update; now, nothing. So come on, Google, throw the switch and update the links.
Thus either it is something between a continuous and a traditional update, or there is a continuous update but the new PR and backlinks are not (yet) shown.
The only way I see to make a decision is to wait. If the backlinks are changing, I would look at the rankings: if there are major changes (in particular for new sites/pages), then PR is updated. However, if there are just small, normal changes, then PR was already updated (continuously updated) and the new values are just being displayed.
No change to the PR or the backlinks however, but it is a new site.
Too odd...
After having a discussion on this issue, I have been checking Google's index. It has not been discussed a lot here, but some of you may know that there have always been 2 versions of pages which were crawled regularly (the fresh tag) in Google's index. One version of a page reflected Google's previous deep crawl and one version reflected the fresh crawl. (The fresh crawl version never really replaced the deep crawl version in the index. There rather were two indices and results from the fresh index replaced results from the deep index in the SERP.) Our main site has a date on it and, so, it was possible to see both versions of a page in the SERP by doing the right queries.
However, currently, at e.g. the www-ab data center there is a version from July 29 and a version from July 23 of our home page in the index. (They can be found by searching our site for "29 juli" and "23 juli" respectively.) As I have said, before Dominic the older version always reflected the last deep crawl. Now, it appears that the data gathered by the freshdeepwhateverbot has replaced the old data in the "main" index completely. IMO this is a good indicator that Google is indeed on its way to a continuous update.
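The dual-index behaviour described above (fresh-crawl results replacing deep-crawl results for the same page at serving time) can be sketched as a simple merge. This is only a toy model of what the poster describes, not Google's actual implementation, and the URLs and snapshot labels are invented:

```python
def merge_serp(deep_results, fresh_results):
    """Combine two indices for serving: the fresh-crawl snapshot of a URL
    overrides the deep-crawl snapshot, while URLs the freshbot has not
    visited keep their deep-crawl version."""
    merged = dict(deep_results)
    merged.update(fresh_results)  # fresh wins on collision
    return merged

# Invented example: the home page has a fresher snapshot, /about does not.
deep = {"example.com/": "23 juli snapshot", "example.com/about": "deep snapshot"}
fresh = {"example.com/": "29 juli snapshot"}
print(merge_serp(deep, fresh))
# the home page serves the 29 juli version, /about the deep version
```

Under a truly continuous update, the "fresh" data would instead be written straight into the main index, which is what the July 29 observation seems to show.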
My site seems to get pages added and updates reflected about every 3 days, this is nice.
My site also seems to have some pages go back and forth between new and old versions with no continuity. For everything Google gives, it seems to take away just as much at the moment.
My site relies on regular updates, and if everything works the way they are hinting, it will be great, but at the moment things are a bit fuzzy.
- I have seen no backlinks added for 2+ months
- Our site shows a Google Directory link in the SERPs, but if you click through to the directory page, we are not on it.
It's all just getting a little tired now. Come on, Google, get your act together.
I started to believe in the ongoing update about a week ago. A relatively new site (fully indexed by Esmeralda) went up in the SERP significantly. Google referrers increased from 100 to 2,500 per day. Since there have not been any changes on the site and since there have not been any major algo modifications at Google, new inbound links must have been taken into account (IMO in terms of PR and anchor text.)
I agree. I had major movement a couple of days ago on 3 keywords (from #80 to #3), and when I check allinanchor: for these keywords I come up in the top 5. A few days back I didn't even come up for the allinanchor. This is a newer website, and all the links for these keywords were added after the last update. So that is almost enough proof for me now that we may be in a continuous update.
My observations are:
- anchor text seems to be continuously updated
- no update for backlinks and PR (shown by Google)
Same here. Also, many days after Esmeralda was complete, one of my sites in a competitive category did a big jump from #93 to #5 and is fluctuating almost every day in the top 5.
Another brand-new site got picked up in Google and is on top because of its aggressive link campaign. It is showing a PR 0 though :)
I have seen these changes around the times of the fresh listing changes - in or out - with the fresh changes behaving as they always did - bumping listings higher temporarily.
Also, right after the last real update I added a new section with about 5,000 pages to my site. The main page has been spidered and cached. None of the others have been cached. It is all well linked on a PR5 site (one of the top 3 PR sites in my DMOZ cat).
I still believe there is another lever to pull to initiate the true perpetual update or another update like they used to be.
EquityMind
They are referring to this as "Incremental Indexing", and while the update is still occurring once a month, the goal is to move from batch indexing to an incremental indexing platform.
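The batch-versus-incremental distinction can be illustrated with a toy inverted index. This is a sketch of the general technique only, assuming nothing about Google's internals: batch indexing rebuilds the whole index from the full document set, while incremental indexing folds one new or changed document into the existing index.

```python
from collections import defaultdict

def build_index(docs):
    """Batch indexing: rebuild the entire inverted index from scratch.
    Complete but expensive, so it runs rarely (e.g. monthly)."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for term in text.lower().split():
            index[term].add(doc_id)
    return index

def add_doc(index, doc_id, text):
    """Incremental indexing: merge one document into the existing index
    without touching the rest, so changes show up within days."""
    for term in text.lower().split():
        index[term].add(doc_id)

# Invented toy corpus.
docs = {1: "fresh content wins", 2: "old batch content"}
idx = build_index(docs)
add_doc(idx, 3, "incremental fresh update")
print(sorted(idx["fresh"]))  # doc 3 is searchable without a full rebuild
```

The inconsistencies people are reporting would fit a system running both modes at once during the transition.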
This could explain why there seem to be so many inconsistencies in what people are observing. Did they mention how far along they were in the implementation?
One thing that does seem consistent: there seems to be more emphasis on fresh content than in the past.
In the meeting, they referred to their monthly update, and I commented that I was seeing a continuous update cycle every few days as we added content and made changes to a large ecommerce site (one of the dot-com big boys). My contact nodded in agreement and referred to the "Incremental Index"... I laughed and told him I had better write that down, all the while thinking about how I was going to post it here at WebmasterWorld. The conversation implied that they were moving to this platform for one of their product lines sometime in Q3-Q4, so the assumption is that the normal index would most likely follow around the same time, albeit anybody's guess whether it would precede or follow.
Here are the facts:
I have a PR5 page from which I link to several internal pages to 'boost the PR' of those pages. One of those pages has always ranked #1-2 for a major search term of mine.
Since it now has many other backlinks (that I do not control), about 2 weeks ago I removed the backlink I do control, to see the effect of removing that PR5 link.
No change until yesterday, but it could have happened anytime within the week (I only check rankings once a week). It dropped to #23. I looked at the backlinks, and sure enough, the controlled backlink was gone.
The receiving page has a cache date of 6 days ago. (I date the page when changes are made and that page is updated every day or two).
I want that page in the top 5, so I have now added the link back; I'll see what happens.
This last week, the 'second layer' of pages (approximately 400-500 pages), linked to only by the 'first layer' pages, all made it into the index, and traffic shot through the roof (all PR0 with no backlinks showing at all). All of this without an update? We are extremely well ranked in the SERPs this week (very competitive terms), all of the static pages we put in are in the Google database (using the site: search within Google), and it feels like we just went through a 'dance'.
I believe the 'Incremental Indexing' is on...
EquityMind
Jeremy