IMO it would make sense if Google determined its algo randomly, assigning different factors different weights at different times.
How else would they keep ahead of the game?
By making it almost impossible to know *exactly* what the algo is, Google's users win.
SEOs shouldn't complain about this. IMO they should just keep applying good SEO principles, and the benefits should follow.
I think they are a sensible company that knows what it is doing, and we should trust them to get on with the job.
It seems that for the last month Google has been in a constant state of flux.
It also appears that they did tweak the algo at the last update, as my site dropped from PR 6 to PR 4 despite my adding more relevant content. But PR dropped for all the sites related to the keyword, so the relative positions stayed the same.
One thing you can use to determine how important the anchor text of a link is (and that's an off-the-page factor) is the allinanchor operator.
Take a look at:
How to use Allinanchor [webmasterworld.com]
It's a very useful function.
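For instance (my own illustration, not from that thread): a query such as allinanchor:blue widgets asks Google to rank pages mainly by the anchor text of the links pointing at them, so comparing those results with a normal search for the same phrase gives a rough feel for how much anchor text is doing the work for a given page.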
allanp73, Google's results are harder to understand than they used to be. A number of factors come into this, such as Everflux and, IMO, server IP geolocation. This doesn't mean the results are worse, just less predictable.
Google's result counts have been flaky for as long as I can remember.
I think a core part of the 'problem' is that Google has multiple data centers and they are in a constant state of flux. And sure, their technology is wobbly from time to time as well.
One of my sites comes and goes for its main keyword. If I search, it is gone. If I dial in from another ISP, it is there. If I go back to the first ISP, it is there now. If I try a third ISP, it is gone again. It is constantly there on www2 and www3.
I put this down to the possibility that the index at one or more of the centers has been backed up or shrunk for some reason. Each time I try (especially from different ISPs) I get a different center.
The likelihood is that it will be back in all centers next update. That is exactly what has happened in the past for several sites that have demonstrated the same behaviour.
Maybe I'm wrong... maybe it is really something that went askew with a fresh crawl... but it does seem to be a tenable technical explanation, and I can't think of any others.
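For what it's worth, here is a rough sketch of the kind of check I mean, assuming the www2/www3 hosts still answer plain /search queries. The keyword, domain and host list are placeholders, not anything official, so substitute your own:

    # Rough sketch: see which Google hosts currently list a domain for a keyword.
    # KEYWORD, DOMAIN and HOSTS are placeholder assumptions -- change them.
    import urllib.parse
    import urllib.request

    KEYWORD = "main keyword for the site"
    DOMAIN = "www.example.com"
    HOSTS = ["www.google.com", "www2.google.com", "www3.google.com"]

    for host in HOSTS:
        url = "http://%s/search?%s" % (host, urllib.parse.urlencode({"q": KEYWORD}))
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                page = resp.read().decode("utf-8", "ignore")
            print(host, "->", "listed" if DOMAIN in page else "not listed")
        except Exception as exc:
            print(host, "->", "error:", exc)

Run it a few times over a day or two; if the same domain flips between "listed" and "not listed" depending on the host, that is the data-center flux I am describing.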
I think we have to remember that Google isn't perfect and that their physical technology is going to be stretched from time to time. When search engine technology is the core deliverable, anomalies are bound to occur.
The results Google is producing are crap.
I have not noticed this. Did you see this behavior across many sites or just a small number? It is possible that the sites in question used to have relevant content and were recently modified, but have not been updated in Google yet (i.e. Google thinks they still have the relevant content). Look at the sites again in a month (relative to the keywords in question) and I bet they are gone.
John
I agree; I can't believe the sites that are suggested when I search my keywords. These sites have not been prepared or dressed up for SEs and don't even relate to the topic I search for.
I was commenting on the permanent state of change in the SERP / link counts for a keyword. We had all come to expect some changes from search to search, but Google usually settled down a week or three after an update. Now the flux is 24/7 and seems to last right through to the next update.