Forum Moderators: open
My guess is that Google has moved, or is moving, into a perpetual update model. I might also guess that we will see a monthly update with regard to PageRank, a rescaling as they say. Maybe some form of monthly purge as well (based on applied filters). Keep in mind, these are only guesses based on observations and the current discussions taking place here and in other resources.
See related thread here started by rfgdxm1...
Is Google now just doing a continuous, rolling update? [webmasterworld.com]
Within half an hour my site is ranked No. 1, then 4, then 2, then 19.
I am no guru or senior member, but you may want to check your rank in each of the data centers.
It's likely that your rank is constant in each of the centers, but Google keeps rotating the datacenter for the main www as well as aol, yahoo, etc...
It may not be your rank that changes, just which center is giving you the data.
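The rank-per-datacenter check suggested above can be sketched as a pure comparison, assuming you have already captured a snapshot of the ranked result URLs from each datacenter (the datacenter names and the capture mechanism here are placeholders, not real Google endpoints):

```python
def rank_in(results, url):
    """Return the 1-based rank of url in a ranked result list, or None if absent."""
    try:
        return results.index(url) + 1
    except ValueError:
        return None

def ranks_across_datacenters(snapshots, url):
    """snapshots: {datacenter_name: ranked list of result URLs}.

    Returns the rank of url in each datacenter's snapshot, so you can see
    whether your position is actually constant per datacenter while the
    main www rotates between them.
    """
    return {dc: rank_in(results, url) for dc, results in snapshots.items()}
```

If the per-datacenter ranks are steady but your www results bounce around, rotation rather than re-ranking is the likelier explanation.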
If I had money to spend on advertising, I would not spend it on ambiguous single-word searches where I can't really be sure what people want; I would spend it on searches that describe exactly what I am offering.
Obviously the Google advertising system functions in parallel with the web search, for users and advertisers alike.
Well, define "content of search":
1) Is it that links from relevant pages are counted?
2) If Google sees that a site links to only one relevant site, does it discount that link?
E.g., say there's a site about the keyword "widget", and the site it's advertised on links to only that one widget site, so Google suspects the link could be paid or some kind of spam.
3) If a website links to more than one site for the same keyword, will those links carry more weight?
None of them, and all of them.
Google isn't going to use just one datacentre to serve results and have the other 8 datacentres sitting idle. All 9 datacentres serve data all of the time. When there are differences between the datacentres, your results seem to vary from search to search if you are using just www for searches. I think this will always be the case; the datacentres will always have small differences between them now.
Can you predict google moves?
Judging by the smell of the bile from the sparrow I caught I'd say that google will be moving somewhat unevenly in the near future. She will encounter a large number of tall dark strangers, and some shorter, and not so dark ones. I believe that Google will move three steps forward and two steps back within the next week, and collateral spreading will be more prevalent than in previous mission centric stances.
PageRank just helps decide whether a page is respectable; it does not help decide whether the content of the page is what the user is searching for. Especially as PageRank and the anchor text are usually (e.g. in directories) directed at the index page, which is at best an entrance describing the themes of the site (considered from the point of view of content, not the presentation of a brand).
So the index page is useless for advertising, as at that point it is not clear what the user is really after.
It will be some time before the index stabilizes, but my thinking is that it will stabilize for this month like it used to before. After that I don't know. These are just my opinions and I know they contradict what many people think. HTH :)
On most terms there are a ton of sites that could be very relevant, so a little unpredictability doesn't hurt the searcher and it allows less direct manipulation of the results. On very specific searches, I see things working much the same as always.
One of my sites offers a very popular service/product: hardly any content, lots of pictures, just five pages, chancing on a combination of keywords from anchor/referral text plus on-site text that seems to have been overlooked by everybody else on the web. Very few hits, and a really good conversion rate (i.e. people phoning up for it per hit).
Another of my sites tries to promote something everybody says is very nice but nobody seems to need: lots of content, really good listings in the present Google, and lots of hits for all kinds of specialist interest. Conversion rate: nil.
Where they actually track the performance of queries: what percentage of users click on the first result, how many users click on something from the first page, and so on.
Considering what GG have said...
Why am I talking about this? Well, Kalman filters have a knob that blends between how much you believe your model vs. how much you believe each new data point. If you tweak the knob all the way in one direction, you always trust the model and any new input just gets ignored. On the other extreme, you can ignore your current estimates about the state of the world, and only trust each new data point as it comes in. If you set the knob too far in that direction, the object you're trying to model jumps all over the place each time you see even a hint of new info.
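That blending knob can be illustrated with a minimal one-dimensional filter sketch (plain illustrative Python, not Google's actual machinery; the fixed `gain` stands in for the knob, where a full Kalman filter would compute it from the model and measurement variances):

```python
def blend(measurements, gain, initial=0.0):
    """Fold each new measurement into a running estimate.

    gain near 0.0: always trust the model; new data points are mostly ignored.
    gain near 1.0: always trust the latest data point; the estimate jumps
    around with every hint of new info.
    """
    estimate = initial
    history = []
    for m in measurements:
        # Move the estimate toward the measurement by a fraction `gain`.
        estimate = estimate + gain * (m - estimate)
        history.append(estimate)
    return history
```

With `gain=1.0` the output just echoes the measurements; with `gain=0.0` it never leaves the initial estimate, which is the two extremes described above.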
I would venture to say that, aside from the movement of data between data centers, we have unknowingly become Google guinea pigs. Who knows how many engineering teams are now perusing every click-through for every turn of the knob.
Short analysis, figuratively speaking: Google is optimizing for click-through.
I predict that the SERPs will only stabilize once Google finds just the right 'knob' setting.
Just my 2 cents
The simple fact that someone clicks on a SERP means nothing other than they liked the title.
If they were using click tracking, it would have to monitor a lot more than clicks. I still haven't figured out a way this data could be used to create better results. Maybe the duration someone stays on the page?
The amount of click-through on page 1 can be directly analyzed against the result of an algo tweak. For instance, if a lot of users are clicking results on page 2 or even page 3, that sends a signal that there's something wrong with the results. Why would the majority of users go to page 2 or 3 to find the answer to their query?
Google is known to be obsessive about the quality of its search results; that's its cutting edge, and it intends to stay on top of that game. Just as an FYI, Google does monitor click-through, just as it does with your AdWords.
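A toy version of that signal, assuming nothing more than aggregated click counts per results page (this is purely illustrative; it is not Google's actual metric or data format):

```python
def page_depth_signal(clicks_by_page):
    """Fraction of clicks that landed beyond page 1.

    clicks_by_page: {page_number: click_count} for one query.
    A high value after an algo tweak suggests page-1 results are not
    satisfying the query, since most users had to dig deeper.
    """
    total = sum(clicks_by_page.values())
    if total == 0:
        return 0.0
    beyond_page_one = sum(c for page, c in clicks_by_page.items() if page > 1)
    return beyond_page_one / total
```

Comparing this fraction before and after a tweak is the kind of simple aggregate a search engine could watch without interpreting any individual click.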
There were about a hundred threads, or more, open per day on some of the busiest days.
There were multiple threads on the same topic, often right next to each other in the list.
There were threads that were receiving 150 to 200 new messages per day, much of it noise.
Unusable, unworkable, unnecessary.
Moderators are cutting down on noise in the forums.
Check to see if there is already a thread covering your subject and use that one.
I'm not a Mod here, but I suspect I have covered the salient points.
Arguing further about it just increases the noise level; so just find a suitable thread and join it.