

Google's 2 indexes & you

Google is alternating between 2 indexes, every 3 days.

Namaste

10:31 am on Jul 5, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I am sure you all are seeing this too, but every two or three days the index changes, and if a page drops, it drops to the same position it was in 3 days back.

For example, my result for Buy Widgets is alternating between position 6 and position 11. This has been happening for 2 weeks.

To me, this shows that Google is publishing 2 indexes and is alternating between the two.

What is the reason for this?

Theory 1: SEO Neutralisation. Since Jan, we have seen steps towards this, the obvious one being reciprocal links getting neutralised. Why would Google want to do this? "It's the money, stupid": less predictable results mean more spend on Google AdWords by webmasters!

Theory 2: Continuous Update. In order to publish a new index and allow it to settle across all data centers, 3 days is required. So, DeepFreshBot scours the depths of the web continuously and a new index is published every three days. GoogleGuy allows the new index to settle and then pushes the red button, making the new index live. (Best way to test this is to make title tag modifications to deep pages and see if they appear with the new index.)

Theory 3: Both. How do you kill two birds with one red button? Simple: you undertake 2 above, but use two separate algos, one for each index (theory 1). Thus, index A is on for 3 days; meanwhile index B (with a different algo) is being prepared and published. Then index B goes live and index A goes in for updating, and so on and on. To make it greater fun, every now and then GG inserts experimental algo C for 3 days.
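The rotation described in Theories 2 and 3 is essentially an A/B index swap. A minimal sketch of the idea, where the two slots, the algo names and the 3-day period are all assumptions taken from the thread, not known facts about Google:

```python
# Toy model of the alternating-index theory (Theories 2 & 3).
# Everything here -- the two slots, the algo names, the 3-day
# period -- is an assumption for illustration only.

INDEXES = {
    "A": "algo_1",   # hypothetical scoring algo for index A
    "B": "algo_2",   # hypothetical scoring algo for index B
}

def live_index(day, period=3):
    """Return which index serves queries on a given day,
    assuming the live index is swapped every `period` days."""
    slot = (day // period) % 2
    return "A" if slot == 0 else "B"

# Over twelve days the live index alternates: A, B, A, B
schedule = [live_index(d) for d in range(0, 12, 3)]
```

Under this model, a page whose score differs between the two algos would bounce between two fixed positions with the period of the swap, which matches the 6-vs-11 alternation described at the top of the thread.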

Enjoy!

steveb

7:55 am on Jul 8, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Dear GoogleGuy,

Please push the "update PR" yellow button.

Please push the "Update Directory" pink button.

Please do not push the "put the doo-doo back in the index" brown button.

Your Pal,

GoogleLover

steveb

8:18 am on Jul 8, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



July 7th Fresh tags appear.

esllou

8:32 am on Jul 8, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



yes, fresh tags are back and I am back way up there on all relevant search terms.

Where I will stay for three days!

Aaaaagh!

tobstar

9:44 am on Jul 8, 2003 (gmt 0)

10+ Year Member



WELL DONE GOOGLE! Seriously.

Controversial I know but I thought I'd offer a slightly different angle on the new Google situation.

Firstly, I think we all agree that Google has problems with spammy/irrelevant sites getting into the top 10 for various SERPs. I think this move from Google is an attempt to rid the SERPs of spam. By changing/updating the algo every 3-6 days it leaves a much smaller window for spammers to take advantage of. Initially there will still be crap, but eventually I hope the latest "improvements?" will sort the wheat from the chaff.

Secondly, this does make the life of the SEO difficult because of these constant changes, but maybe instead of debating what Google's up to we could improve our websites, and maybe, just maybe, the internet/SERPs would get better. If they are going to fiddle every few days then what have we got to talk about... nothing, because who knows what's going to happen in a few days!

Whilst on the subject of spam, it's now got to the point where I'm thinking "if you can't beat 'em, join 'em". It's going to be more profitable to create a spam domain to cover my ass when my "law abiding site" dives out of the SERPs and the SPAMMERS reappear. This is surely not what Google intended, but if a well behaved SEO like me is thinking like this then maybe Google should bite the bullet like I think they may have, and try to address the situation.

Could I be right, or am I trying too hard to justify Google's new lifestyle?

p.s. Where is GoogleGuy in this time of need?
I hope he hasn't gone into hiding with Bin Laden and Saddam somewhere? :)

Dave_Hawley

9:55 am on Jul 8, 2003 (gmt 0)



We used to have a PR of 6/7; now, after many more links pointing to us, we have dropped to 5 and have been stuck there for months. The Google backlinks count tells us we have 119 pointing to us; that's less than half of the true figure. What gives?

I also notice that the backlinks for many sites include links from their own domain!

Dave

Tropical Island

10:02 am on Jul 8, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



It now appears that there are two new indexes that have propagated to the 9 data centres. Our missing index page is #9 on one set and #5 on the other for our best term. That's up from our secondary pages in the 70s+ with our index page MIA.

I also agree that these sets of results look a lot better than they have in a long time, not only because we are back in a high ranking but because the sites on the first page should be there. The backlinks are still not up to date and PR has not been assigned to new sites we watch.

Fearless

3:33 pm on Jul 8, 2003 (gmt 0)

10+ Year Member




Please push the "update PR" yellow button.

Please push the "Update Directory" pink button.

Couldn't have said it better myself. The PR rankings and backlinks for the sites that I manage are a fraction of reality. The directory is hopeless. I have a new site with respectable (getting better every day!) legit links, and I have only been able to get the front page indexed by manual submission. G-Guy has made cryptic references, saying vague stuff like:

"it takes time for a new site to build PR (sometimes)"
The spiders stop by, ask for "robots.txt" and move on.

It seems that the 'Plex is focused on their new shenanigans, but aren't “tending to their knitting.”

Toolbar URL tracking is involved, I'm convinced. They must figure that enough people have it installed (that's their first error) that the results constitute a legit estimate of a site's significance.

I would assert that the “sample” involved in those who have the Toolbar installed is utterly skewed and not nearly large enough. It’s spyware. I can’t/won’t install it on anything except my “dirty” PC, for instance. It’s not available for all platforms, browsers, etc.

What is it that Dilbert says “Incorrect data and wrong assumptions equal [bleep] conclusions?”

And, by the way, isn’t G-guy starting to become conspicuous in his absence (or did I miss that he’s on vacation?)

[Thus endeth today's rant. Now turn with us in your hymnals to that old favorite "Don't Worry. Be Happy." For verily, we know that the beneficent Plex is always right..... eventually.]

Enuff venting. Now back to work- adding fresh, interesting, relevant new content and scouring cyberspace for good backlinks. (;-}

Herenvardo

4:17 pm on Jul 8, 2003 (gmt 0)

10+ Year Member



After reading all your posts (even before I became a member of the forum), I've reached a hypothesis that I'll summarize here in a few points:

1. We know that Google doesn't like SEO.
2. We know that search results change radically every few days.
3. Even though most of you speak about a 3-day period, it seems not to be the same every time.
4. Some of you spoke about randomizing.

This is the key to making the results unpredictable. If Google wants to keep its results safe from our SEO techniques it can do (or is doing) the following:
1. Keep different algorithms ready to run a search
2. Generate a random integer between 1 and the total number of algorithms (even variations) available
3. Choose and apply the "lucky" algorithm from the list when a user makes a search

You might think that this does not explain the "timed" variation of results, but those among you who have some knowledge of encryption, programming or both might know, or at least have heard, that the system date and time and other variables are normally used as seeds in random number generation algorithms.
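That last point can be sketched concretely: seeding a PRNG with a coarse time bucket makes a "random" choice stable for a few days at a stretch, then lets it jump. Every name and number below is hypothetical, purely to illustrate the mechanism being proposed:

```python
import random

# Hypothetical ranking variants; the names and the 3-day window
# are assumptions for illustration, not anything Google confirmed.
ALGORITHMS = ["algo_A", "algo_B", "algo_C"]

SECONDS_PER_DAY = 86400

def pick_algorithm(now, period_days=3):
    """Pick a ranking algorithm 'randomly', seeded by which
    period_days-wide time window `now` (a Unix timestamp) falls in.
    Inside one window every call returns the same algorithm, so the
    choice looks stable for a few days and then can change."""
    window = int(now // (period_days * SECONDS_PER_DAY))
    rng = random.Random(window)  # same seed -> same choice
    return rng.choice(ALGORITHMS)
```

Note how this reconciles "random" with "timed": within one 3-day window the seed is constant, so every query sees the same algorithm; only at the window boundary can the results flip.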

Tropical Island

4:49 pm on Jul 8, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"1 We know that google doesn't like SEO"

I definitely do not agree with this statement. Google may not like over-optimization, but without some SEO how is Google going to know which sites to bring up? Put in any search and see the highlighted words. This is SEO.

They do not like blatant excess but normal optimization helps.

Brett_Tabke

5:08 pm on Jul 8, 2003 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



> Building a good quality site doesn't keep you from
> having links missed or algorithmic calculations done poorly

Absolutely it does. That's the whole point. You build a good site and you need never worry about it again. Get to the point where you don't care what they do and it doesn't affect you one way or another. Or better yet, get to the point where you can turn the bandwidth content leeches off completely ;-)

vbjaeger

6:01 pm on Jul 8, 2003 (gmt 0)

10+ Year Member



edited

[edited by: vbjaeger at 6:17 pm (utc) on July 8, 2003]

Fearless

6:02 pm on Jul 8, 2003 (gmt 0)

10+ Year Member



Brett,

Or better yet, get to the point where you can turn the bandwidth content leeches off completely ;-)

Could you amplify what you mean?

Since I've started studying our log files, I can't BELIEVE how much "junk" traffic hits our sites, oftentimes dubious searches from offshore domains, I assume searching for email addresses to spam.

Now that the googlebots have started requesting the robots.txt file, I've installed a version of the sample file on Searchengineworld (and have used the robots.txt verifier, too) [Thanks BTW], but I still see a LOT of garbage traffic. Among other things, it makes hit statistics even MORE meaningless, which is always a pain to explain to organization leaders. (You know the conversation: "Whaddya mean ya can't tell me how many people visit our site?")
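For anyone following along, the kind of file being described looks roughly like this; the bot names are purely illustrative, not the actual Searchengineworld sample:

```
# Illustrative robots.txt -- example bot names only
User-agent: EmailSiphon
Disallow: /

User-agent: WebZIP
Disallow: /

User-agent: *
Disallow: /cgi-bin/
```

Worth remembering that robots.txt is purely advisory: polite crawlers honour it, but the email harvesters being complained about here generally ignore it, so server-side blocking is the only real defence against those.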

Is that what you're talking about?

Namaste

8:10 pm on Jul 8, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



You build a good site and you need never worry about it again

What is good? Yes, we all have good sites. We have excellent products, customer service, user interface, SEO, etc. Now suddenly we find that a reliable partner, Google, is becoming unpredictable. Of course we are concerned. The previous cycle has seen junk sites beat us out of the top 50... many of us here are shocked by this... we are re-visiting our SEO notes to see where we fall short and where the junk sites score.

And, a good site is never built...it is always a work in progress.

This discussion now continues in this thread that explains what we all have been seeing:
[webmasterworld.com...]

Herenvardo

4:06 pm on Jul 9, 2003 (gmt 0)

10+ Year Member



Tropical Island:
Maybe Google doesn't punish sites unless they are over-optimized, but SEO is, by definition, against the search engines.
To be a good search engine, it must be able to find the most interesting pages, not the most optimized ones.
SEO means lying to the search engines to make our site appear among the best, even when it is not so good.
Because of this, no search engine can like SEO, but most of them tolerate it.
You can say that Google tolerates non-excessive SEO, but I can assure you that it doesn't like it.

If you wish, we can begin a new discussion about this, but this topic is not the place to do it.

This 104 message thread spans 4 pages.