| 6:32 pm on Jun 24, 2003 (gmt 0)|
Sorry, but there is only one person who can help you:
And I don't think he will answer. :)
| 6:51 pm on Jun 24, 2003 (gmt 0)|
Well, I know GoogleGuy can obviously answer this, but that's exactly why I didn't mention GoogleGuy's name; I asked all the gurus and experts who have been studying Google for x number of years.
Anybody have a take on the question?
| 7:00 pm on Jun 24, 2003 (gmt 0)|
|Can you predict google moves? |
Two choices if a person answers yes:
1) Bow down in adoration before them
2) Drive a silver stake through their heart
| 7:05 pm on Jun 24, 2003 (gmt 0)|
More of a newbie than a guru, but I predict that the results Google produces will be the result of many hours of research and debate among their engineers, webheads, and marketing staff, combined with the overall response from both surfers and webmasters.
| 7:10 pm on Jun 24, 2003 (gmt 0)|
Hehehe, there would be some very wealthy people around here if we could predict Google's next move. I would equate it to predicting the stock market. ;)
My guess is that Google has moved into a perpetual update model. I might also guess that we will see a monthly update to PageRank, a rescaling as they say. Maybe some form of monthly purge too (based on applied filters). Keep in mind, these are only guesses based on observations and the current discussions taking place here and in other resources.
See related thread here started by rfgdxm1...
Is Google now just doing a continuous, rolling update? [webmasterworld.com]
| 7:21 pm on Jun 24, 2003 (gmt 0)|
Well, I know we can't all predict Google's next move, but can anybody see a pattern? For the last 3 months Google has been giving results that don't have much explanation. Before the April update I think most of us could make sense of what happened and why, but the last 3 months have been "close your Google browser and don't bother, because you don't know what might happen." This update has already taken more than a week and hasn't settled down. Do you feel it will stop, or is this the new thing, with Google trying to keep updating the results every hour or minute?
My site's ranking changes every time I check for my preferred keyword.
I have read what everybody thinks, but not once has anybody said with confidence what's going on. Maybe nobody can predict it since it's so fragile, but there has got to be one person who has figured out what's going on, and why, and how.
| 7:36 pm on Jun 24, 2003 (gmt 0)|
|In half an hour my site is ranked No.1 then 4 then 2 then 19. |
I am no guru or senior member, but you may want to check your rank in each of the data centers.
It's likely that your rank is constant in each of the centers, but Google keeps rotating the datacenter for the main www as well as aol, yahoo, etc...
It may not be your rank that changes, just which center is giving you the data.
| 7:40 pm on Jun 24, 2003 (gmt 0)|
Well, that makes sense to me, but then which center will finally give the results that show up all the time? Any guesses why Google is doing it? Is it indexing the pages with freshbot all the time, gathering data, and changing the results of different datacenters? If so, that means it might keep doing this for a month, or might never stop.
| 7:51 pm on Jun 24, 2003 (gmt 0)|
I think Google is going deep into the context of searches.
If I had money to spend on advertising, I would not spend it on ambiguous single-word searches where I can't really be sure what people want; I would spend it on searches describing exactly what I am going to offer.
Obviously the Google advertising system functions in parallel with the web search, for users and advertisers alike.
| 8:22 pm on Jun 24, 2003 (gmt 0)|
|I think google is going deep into the context of searches. |
Well, define "context of search":
1) Is it that links from relevant pages are counted?
2) If Google sees that one site links to only one relevant site, does it discount that link?
e.g. let's say there's a site about the keyword widget, and the site on which it's advertised carries only this one link about widgets, so Google thinks it could be a paid link or some kind of spam.
3) If a website links to more than one site with the same keyword, will those links carry more weight?
| 10:12 pm on Jun 24, 2003 (gmt 0)|
>> Well that make sense to me but then what center will finally give the results which will show up all the time. <<
None of them, and all of them.
Google isn't going to use just one datacentre to serve results, and have the other 8 datacentres just sitting idle. All 9 datacentres serve data all of the time. At times when there are differences between the datacentres your results seem to vary from search to search if you are using just www for searches. I think this will always be the case; the datacentres will always have small differences between them all of the time now.
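The effect described above can be sketched with a toy simulation (the datacenter names, ranks, and round-robin rotation are all hypothetical, purely for illustration; the real selection mechanism is unknown): if each datacenter holds a slightly different but internally stable rank for a site, and www rotates between them, repeated searches appear to show a jumping rank even though nothing changes inside any one datacenter.

```python
import itertools

# Hypothetical per-datacenter ranks for one site: stable within each
# datacenter, slightly different between datacenters.
ranks = {"dc1": 1, "dc2": 4, "dc3": 2, "dc4": 19}

# Assume www answers from a different datacenter on each search
# (simplified here to a round-robin rotation).
rotation = itertools.cycle(ranks)

def observed_rank():
    """Rank seen on www: depends only on which datacenter answered."""
    return ranks[next(rotation)]

# Eight consecutive searches: the rank appears to bounce around,
# yet every individual datacenter reported a constant rank.
print([observed_rank() for _ in range(8)])
```

Checking each datacenter directly (rather than www) would show the four steady values, which matches the advice above to check your rank in each of the data centers.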
| 10:18 pm on Jun 24, 2003 (gmt 0)|
|Can you predict google moves? |
Judging by the smell of the bile from the sparrow I caught I'd say that google will be moving somewhat unevenly in the near future. She will encounter a large number of tall dark strangers, and some shorter, and not so dark ones. I believe that Google will move three steps forward and two steps back within the next week, and collateral spreading will be more prevalent than in previous mission centric stances.
| 10:22 pm on Jun 24, 2003 (gmt 0)|
I am talking about the sense that can be deduced from the combination of search words. From what I observe, Google is thinking more and more in combinations of words rather than in single words.
PageRank just helps to decide if a page is respectable; it does not help to decide if the content of the page is what the user is searching for. Especially as PageRank and the anchor text are most of the time (e.g. in directories) directed towards the index page, which is at best an entrance describing the themes of the site (thought of from the point of view of content, not presentation of a brand).
So the index page is useless for advertising, as at that point it is not clear what the user is really after.
| 10:42 pm on Jun 24, 2003 (gmt 0)|
Just a thought... No expert, mind you, but my site is built around 3- and 4-word keyphrases, and the rankings have not wavered even one place for any target page I care about, except one page which moved two places up last month. The SERPs place me at #3 to #1, with a smattering of secondary pages in the top twenty. Anchor text, title, and description are all the same. What do you think? Works for me.
| 11:05 pm on Jun 24, 2003 (gmt 0)|
Before you say there are not many returns for that many search words, think about a 75% conversion rate for each unique. On target for the searcher... that's what I think counts the most. I might get more uniques on broader SERPs, but what would the conversion rate be? Not anywhere near what I have now. Maybe I'm just lucky in my little niche.
| 11:55 pm on Jun 24, 2003 (gmt 0)|
|collateral spreading will be more prevalent than in previous mission centric stances |
Sounds like you know what you're talking about. Sure you're not really GoogleGuy?
| 9:14 am on Jun 25, 2003 (gmt 0)|
IMHO Dominic was a quality control exercise planned because of the upcoming competition to Google. The first part of Esmeralda gave results for the May update with the new algo, and what we are now seeing is the deep fresh bot adding links for the May deep crawl, which didn't happen. So this update, which initially seemed to stabilize, is still showing some wild fluctuations.
It will be some time before the index stabilizes, but my thinking is that it will stabilize for this month like it used to before. After that, I don't know. These are just my opinions, and I know they contradict what many people think. HTH :)
| 9:42 am on Jun 25, 2003 (gmt 0)|
Maybe Google is just going to be less predictable in the future. I think that would be good for them in the long run, just because people read these forums or know how Google ranks doesn't mean their sites are more worthwhile. Over the past year or so it's been easy easy easy to predict placement and results.
On most terms there are a ton of sites that could be very relevant, so a little unpredictability doesn't hurt the searcher and it allows less direct manipulation of the results. On very specific searches, I see things working much the same as always.
| 10:26 am on Jun 25, 2003 (gmt 0)|
Sounds like you know what you're talking about. Sure you're not really GoogleGuy, just trying to find out what we have?
Interesting that your question is directed to experts, i.e. little spurts. Some of us lowly folk are apparently significantly outdoing the SEOs at this point in Google's machinations. Analyze that phenomenon, please.
| 12:22 pm on Jun 25, 2003 (gmt 0)|
Unfortunately the world is not as simple as that.
One of my sites, with a very popular service/product: hardly any content, lots of pictures, just five pages chancing on a combination of keywords from anchor/referral text plus on-site text that seems to have been overlooked by everybody else on the web; very few hits, and a really good conversion rate (i.e. people phoning up for it per hit).
One of my sites trying to promote something everybody says is very nice but nobody seems to need: lots of content, really good listings on the present Google, and lots of hits for all kinds of specialist interest; conversion rate = null.
| 2:29 pm on Jun 25, 2003 (gmt 0)|
IMHO, Google would want better results for the searcher instead of worrying about how much the SEOs have figured out. When a dog chases its tail, he never quite catches it anyway.
Maybe we lowly folk just follow the rules and have enough knowledge to build good quality, relevant sites that deserve high rankings, but not enough to get fancy or out-think or over-analyze every blip on the screen.
| 2:43 pm on Jun 25, 2003 (gmt 0)|
"IMHO, Google would want better results for the searcher instead of worrying about how much the SEOs have figured out."
Maybe so, but nothing is surprising in today's world.
| 3:41 pm on Jun 25, 2003 (gmt 0)|
Considering how Google works...
Where they actually track the performance of queries. Like, what percentages of users click on the first result, how many users click on something from the first page, etc..
Considering what GG has said...
|Why am I talking about this? Well, Kalman filters have a knob that blends between how much you believe your model vs. how much you believe each new data point. If you tweak the knob all the way in one direction, you always trust the model and any new input just gets ignored. On the other extreme, you can ignore your current estimates about the state of the world, and only trust each new data point as it comes in. If you set the knob too far in that direction, the object you're trying to model jumps all over the place each time you see even a hint of new info. |
I would venture to say that, aside from the movement of data between data centers, we have unknowingly become Google guinea pigs. Who knows how many engineering teams are now perusing every click-through for every turn of the knob?
Short analysis, and figuratively speaking: Google is optimizing for click-through.
I predict that the SERPs will only stabilize once Google finds that just-right "knob" setting.
Just my 2 cents
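The "knob" in GoogleGuy's Kalman filter analogy can be sketched in a few lines. This is a minimal, hypothetical illustration, not Google's actual code: a gain of 0 trusts the model completely and ignores new input; a gain of 1 trusts each new data point completely, so the estimate jumps all over the place, just as the quote describes.

```python
def kalman_blend(estimate, observation, knob):
    """Blend a model estimate with a new data point.

    knob = 0.0 -> trust the model completely; new input is ignored.
    knob = 1.0 -> trust each new data point completely; the model is ignored.
    """
    return (1.0 - knob) * estimate + knob * observation

# With the knob turned toward "trust new data" (0.9), the estimate jumps
# around on every hint of new info; turned toward the model (0.1), it
# barely moves even when observations swing wildly.
observations = [10.0, 30.0, 12.0, 28.0]
for knob in (0.1, 0.9):
    estimate = 20.0
    trace = []
    for obs in observations:
        estimate = kalman_blend(estimate, obs, knob)
        trace.append(round(estimate, 1))
    print(knob, trace)
```

A real Kalman filter derives this gain from the model and measurement noise estimates rather than fixing it by hand; the point here is only to show the two extremes of the knob.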
| 3:51 pm on Jun 25, 2003 (gmt 0)|
<<Short analysis and figuratively speaking, Google is optimizing for click through. >>
The simple fact that someone clicks on a SERP means nothing other than they liked the title.
If they were using click tracking it would have to be monitoring a lot more than clicks. I still haven't figured out a way in which this data can be used to create better results. Maybe the duration someone stays on the page?
| 4:08 pm on Jun 25, 2003 (gmt 0)|
It would be interesting if they somehow added Alexa data into the mix, particularly the page views per user info... this might give some indicator of the credibility of a site. Perhaps the Google Toolbar monitors user activity on a site... gotta love spyware :-)
| 4:09 pm on Jun 25, 2003 (gmt 0)|
An individual click-through means nothing, but a huge shift in click-through on page 1 after an algo tweak means a lot.
The amount of click-through on page 1 can be directly analyzed in relation to the result of an algo tweak. For instance, if a lot of users are clicking results on page 2 or even page 3, that sends a signal that there's something wrong with the results. Why would the majority of users go to page 2 or 3 to find the answer to their query?
Google is known to be obsessive about the quality of search results; that's their cutting edge, and they intend to stay on top of that game. And just as an FYI, Google does monitor click-through, just like your AdWords.
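The aggregate signal described above can be sketched as follows. All click counts and the "investigate" threshold are invented for illustration; this is a guess at the kind of metric being discussed, not a known Google mechanism. The idea: compare the share of clicks landing on page 1 before and after a tweak, and flag a drop.

```python
def page1_click_share(clicks_by_page):
    """Fraction of all result clicks that land on page 1 of the SERPs."""
    total = sum(clicks_by_page.values())
    return clicks_by_page.get(1, 0) / total if total else 0.0

# Hypothetical click counts per results page, before and after an algo tweak.
before = {1: 900, 2: 70, 3: 30}
after = {1: 600, 2: 250, 3: 150}

for label, clicks in (("before", before), ("after", after)):
    share = page1_click_share(clicks)
    # If users start digging into pages 2 and 3, something may be off
    # with the tweak (0.8 is an arbitrary threshold for the sketch).
    flag = "ok" if share >= 0.8 else "investigate"
    print(label, round(share, 2), flag)
```

As the post below points out, a click alone only proves the searcher liked the title, so any real system would need richer signals than this.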
| 4:10 pm on Jun 25, 2003 (gmt 0)|
It would give more credibility if someone more credible than Alexa was doing the calculating. That data is for the birds. I think it would hurt more than it would help.
| 6:04 pm on Jun 27, 2003 (gmt 0)|
Well, I am using this thread to say that I posted a topic about Google's moves, or what I think (and what I have read) Google may be doing, but I don't know why it was not posted. I request that Brett at least tell the users of this forum why their posts are not shown. After submitting the post, it just told me that my post was up for review, and it never appeared in the forum.
Any member of this forum, please: I think I deserve an answer. It's a fair and open forum, isn't it?
| 11:08 pm on Jun 27, 2003 (gmt 0)|
As of about 6 weeks ago the Google News forum became unusable by most people here.
There were about a hundred threads, or more, open per day on some of the busiest days.
There were multiple threads on the same topic, often right next to each other in the list.
There were threads that were receiving 150 to 200 new messages per day, much of it noise.
Unusable, unworkable, unnecessary.
Moderators are cutting down on noise in the forums.
Check to see if there is already a thread covering your subject and use that one.
I'm not a Mod here, but I suspect I have covered the salient points.
Arguing further about it just increases the noise level; so just find a suitable thread and join it.
| This 32 message thread spans 2 pages |