For example, my result for Buy Widgets is alternating between position 6 and position 11. This has been happening for 2 weeks.
To me, this shows that Google is publishing 2 indexes and is alternating between the two.
What is the reason for this?
Theory 1: SEO Neutralisation. Since Jan we have seen steps towards this, the obvious one being reciprocal links getting neutralized. Why would Google want to do this? "It's the money, stupid": less predictable results mean more spend on Google AdWords by webmasters!
Theory 2: Continuous Update. In order to publish a new index and allow it to settle across all data centers, 3 days are required. So DeepFreshBot scours the depths of the web continuously and a new index is published every three days. GoogleGuy allows the new index to settle and then pushes the red button, making the new index live. (The best way to test this is to make title tag modifications to deep pages and see if they appear with the new index.)
Theory 3: Both. How do you kill two birds with one red button? Simple: you undertake 2 above, but use two separate algos, one for each index (theory 1). Thus, index A is live for 3 days while index B (with a different algo) is being prepared and published. Then index B goes live and index A goes in for updating, and so on and on. To make it greater fun, every now and then GG inserts experimental algo C for 3 days.
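For what it's worth, here is a toy sketch (Python, purely my own guesswork) of what that rotation would look like. The 3-day cycle, the two indexes and the occasional "experimental algo C" are just the theory above, nothing Google has confirmed, and none of the names mean anything outside this post:

import random

ROTATION_DAYS = 3  # the 3-day cycle assumed by the theory above

def live_index_for_cycle(cycle, experiment_chance=0.1):
    # Once in a while the experimental algo C gets slotted in for a whole cycle.
    if random.random() < experiment_chance:
        return "index C (experimental algo)"
    # Otherwise index A and index B, each built with its own algo, alternate.
    return "index A (algo 1)" if cycle % 2 == 0 else "index B (algo 2)"

for day in range(0, 30, ROTATION_DAYS):
    cycle = day // ROTATION_DAYS
    print("days %d-%d: serving %s" % (day, day + ROTATION_DAYS - 1, live_index_for_cycle(cycle)))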
Enjoy!
Controversial I know but I thought I'd offer a slightly different angle on the new Google situation.
Firstly, I think we all agree that Google has problems with spammy/irrelevant sites getting into the top 10 for various SERPs. I think this move from Google is an attempt to rid the SERPs of spam. By changing/updating the algo every 3-6 days, Google leaves a much smaller window for spammers to take advantage of. Initially there will still be crap, but eventually I hope the latest "improvements?" will sort the wheat from the chaff.
Secondly, this does make the life of the SEO difficult because of these constant changes, but maybe instead of debating what Google's up to we could improve our websites, and maybe, just maybe, the internet/SERPs would get better. If they are going to fiddle every few days then what have we got to talk about... nothing, because who knows what's going to happen in a few days!
Whilst on the subject of spam, it's now got to the point where I'm thinking "if you can't beat 'em, join 'em". It's going to be more profitable to create a spam domain to cover my ass when my "law-abiding site" dives out of the SERPs and the spammers reappear. This is surely not what Google intended, but if a well-behaved SEO like me is thinking like this, then maybe Google should bite the bullet, like I think they may have, and try to address the situation.
Could I be right, or am I trying too hard to justify Google's new lifestyle?
p.s. Where is GoogleGuy in this time of need?
I hope he hasn't gone into hiding with Bin Laden and Saddam somewhere? :)
I also notice that the backlinks for many sites include links from their own domain!
Dave
I also agree that these sets of results look a lot better than they have in a long time, not only because we are back to a high ranking but because the sites on the first page should be there. The backlinks are still not up to date and PR has not been assigned to new sites we watch.
Please push the "update PR" yellow button. Please push the "Update Directory" pink button.
Couldn't have said it better myself. The PR rankings and backlinks for the sites that I manage are a fraction of reality. The directory is hopeless. I have a new site with respectable (getting better every day!) legit links, and I have only been able to get the front page indexed by manual submission. G-Guy has made cryptic references, saying vague stuff like "it takes time for a new site to build PR (sometimes)". The spiders stop by, ask for "robots.txt" and move on.
It seems that the 'Plex is focused on their new shenanigans, but aren't “tending to their knitting.”
Toolbar URL tracking is involved, I'm convinced. They must figure that enough people have it installed (that's their first error) that the results constitute a legit estimate of a site's significance.
I would assert that the “sample” involved in those who have the Toolbar installed is utterly skewed and not nearly large enough. It’s spyware. I can’t/won’t install it on anything except my “dirty” PC, for instance. It’s not available for all platforms, browsers, etc.
What is it that Dilbert says? “Incorrect data and wrong assumptions equal [bleep] conclusions.”
And, by the way, isn’t G-guy starting to become conspicuous in his absence (or did I miss that he’s on vacation?)
[Thus endeth today's rant. Now turn with us in your hymnals to that old favorite "Don't Worry. Be Happy." For verily, we know that the beneficent Plex is always right..... eventually.]
Enuff venting. Now back to work- adding fresh, interesting, relevant new content and scouring cyberspace for good backlinks. (;-}
1. We know that Google doesn't like SEO.
2. We know that search results change radically every few days.
3. Even though most of you speak about a 3-day period, it seems not to be the same every time.
4. Some of you spoke about randomizing?
This is the key to making the results unpredictable. If Google wants to keep its results safe from our SEO techniques, it can do (or is doing) the following:
1. Keep different algorithms ready to run a search
2. Generate a random integer between 1 and the total number of algorithms (even variations) available
3. Choose and apply the "lucky" algorithm from the list when a user makes a search
You might think that this does not explain the "timed" variation of results, but those among you who have some knowledge of encryption, programming or both might know, or at least have heard, that the system date and time and other variables are normally used as seeds in random number generation algorithms.
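A crude sketch of how that could work, in Python just for illustration. The algorithm names, the 3-day bucket and the seeding scheme are all made up; the only point is that seeding the random pick with a time-derived value keeps the "lucky" algorithm stable for a while and then flips it, which would look exactly like the timed variation we are seeing:

import random
import time

ALGORITHMS = ["algo_A", "algo_B", "algo_C"]  # hypothetical algorithm variations

def pick_algorithm(now=None):
    # Seed with the current 3-day "bucket" so every search in the same window
    # gets the same algorithm; the pick flips when the bucket rolls over.
    now = time.time() if now is None else now
    bucket = int(now) // (3 * 24 * 3600)
    rng = random.Random(bucket)
    return rng.choice(ALGORITHMS)

print(pick_algorithm())                               # today's "lucky" algorithm
print(pick_algorithm(time.time() + 3 * 24 * 3600))    # three days later, possibly a different one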
I definitely do not agree with this statement. Google may not like over-optimization, but without some SEO how is Google going to know which sites to bring up? Put in any search and see the highlighted words. This is SEO.
They do not like blatant excess but normal optimization helps.
Absolutely it does. That's the whole point. You build a good site and you need never worry about it again. Get to the point where you don't care what they do and it doesn't affect you one way or another. Or better yet, get to the point where you can turn the bandwidth content leeches off completely ;-)
Or better yet, get to the point where you can turn the bandwidth content leeches off completely ;-)
Could you amplify what you mean?
Since I've started studying our log files, I can't BELIEVE how much "junk" traffic hits our sites, oftentimes dubious searches from offshore domains, I assume searching for email addresses to spam.
Now that the googlebots have started requesting the robots.txt file, I've installed a version of the sample file on Searchengineworld (and have used the robots.txt verifier, too) [Thanks BTW], but I still see a LOT of garbage traffic. Among other things, it makes hit statistics even MORE meaningless, which is always a pain to explain to organization leaders. (You know the conversation: "Whaddya mean ya can't tell me how many people visit our site?")
Is that what you're talking about?
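For what it's worth, here is the kind of crude filter I've been playing with to separate the junk from real visitors before counting anything. It assumes the Apache combined log format; the list of bot strings is just a handful of examples I made up (nowhere near complete), and "access_log" is a placeholder filename:

import re

BOT_HINTS = ("googlebot", "slurp", "msnbot", "crawler", "spider", "emailcollector")

# One line of an Apache "combined" format access log.
LOG_LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[[^\]]+\] "(?P<request>[^"]*)" \d+ \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def count_hits(path):
    human, junk = 0, 0
    for line in open(path):
        m = LOG_LINE.match(line)
        if not m:
            continue
        agent = m.group("agent").lower()
        request = m.group("request")
        # Anything asking for robots.txt or advertising itself as a bot
        # gets counted as junk rather than as a visitor.
        if "robots.txt" in request or any(hint in agent for hint in BOT_HINTS):
            junk += 1
        else:
            human += 1
    return human, junk

human, junk = count_hits("access_log")
print("human-looking hits: %d, junk hits: %d" % (human, junk))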
You build a good site and you need never worry about it again
And, a good site is never built...it is always a work in progress.
This discussion now continues in this thread that explains what we all have been seeing:
[webmasterworld.com...]
If you wish, we can begin a new discussion about this, but this topic is not the place to do it.