2: Google MAYDAY Update - SERP Changes May 2010
cangoou - msg:4131672 - 10:22 pm on May 12, 2010 (gmt 0)

< continued from [webmasterworld.com...] >

Phew, this is really strange: I still see two result sets, and after 7 days I can say that SERPs #1 are shown from right at 0:00 to about 5:00 local time and SERPs #2 the rest of the day. I thought this was some kind of dancing around again, but it has been quite stable now for a week.

Funny thing is: SERPs #1 are better than before, SERPs #2 are worse (in the rankings of my main keywords).

Google is no longer following my robots.txt file. It has indexed hundreds of pages that I have disallowed.

Same here - I've got a complete set of new pages with robots=noindex in the index.

Generally it's best to make any changes when SERPs are stable, never within an update due to panic!

In general I would agree, but I haven't seen anything like "stable SERPs" since January... That makes it quite hard not to panic ;-)

[edited by: tedster at 7:43 pm (utc) on May 13, 2010]

 

dertyfern - msg:4133467 - 8:47 am on May 16, 2010 (gmt 0)

That may all very well be, TheMadScientist. I'm no expert in AI, load balancing, and all that hyper-geeky stuff :)

But what I do know is that the moment I saw the new search interface popping in and out of my browser, traffic to lots of sites I own/manage began tanking - many of which have outstanding quality links and good content.

My non-technical take is that between the new interface pushing organic down even further and the introduction of the left margin in search results, a portion of traffic that we'd all otherwise get is making use of ads, maps, and tools (to a larger degree).

Google's just slicing away another layer of organic traffic to monetize.

graeme_p - msg:4133470 - 8:56 am on May 16, 2010 (gmt 0)

I have seen some copied Wikipedia articles in the SERPs recently (and the copying is that way round, since the source is acknowledged). Something wrong with the duplicate content filters?

mrez74 - msg:4133477 - 9:26 am on May 16, 2010 (gmt 0)

TheMadScientist... so what do you suggest we use to actually get a good sense of where our site ranks if we are not able to emulate what others are seeing? I mean, how can we track our rankings across the board (what other searchers are seeing) if we can only see rankings from our end and have no clue whether what they are seeing is the same or not?

Should we stick with GWT data for now to tell us where our site's average ranking is, instead of assuming that if I do a search on my computer for my keyword and it shows my site at #3, I'm ranking the same for all searchers?

Seolearner... I currently rank #1 for a keyword that GWT says has 2,400 searches a month, and I get about 15-20 uniques / 45-50 impressions a day from that keyword.

Also, for the last 2 days the SERPs have been stable for me and the same dataset shows up every time I run a search. I am not getting any of the hourly SERP fluctuation between the two datasets like I was getting over the last month or so.

mrez74 - msg:4133480 - 9:31 am on May 16, 2010 (gmt 0)

On another keyword, for which G shows a monthly EXACT search volume of 1,900, I get about 1-3 uniques a day, with the site ranking at #5.

TheMadScientist - msg:4133484 - 9:35 am on May 16, 2010 (gmt 0)

My non-technical take is that between the new interface pushing organic down even further and the introduction of the left margin in search results, a portion of traffic that we'd all otherwise get is making use of ads, maps, and tools (to a larger degree).

I see what you're saying, to a point, but there are odd things that keep happening with my traffic on one site that I haven't had stats on long enough to understand...

In the new layout thread I posted about my traffic on it tanking at about the same time as the roll-out of the new layout, but today is the best Saturday I've had with it. It's been a long-tail site from the start, and it had consistently moved up in traffic throughout the change until last week, when things got all 'goofy', technically speaking of course.

My visits are generally trending up, with last week being down, and visit times / page view averages are all over the place - so odd I almost think there's a reporting error in the stats. But I can tell you from what I saw today, the long-tail traffic is still going to sites (more today than on any Saturday I've had the site, by about 20%), and last week could have simply been an odd week. I'm sure Google wants more traffic to stay on their site, because that's business, but IDK how long they could keep it that way: one of the reasons people go to the web, AFAIK, is to visit websites, not a single website. So I can see where ads and maps and 'stuff' are important for them to display, but I think people who use Google will continue to visit websites - that's why they call it surfing the Internet - and I think it'll be interesting to see where things go.

What I'm saying about the changes taking time is fairly easy to understand if you just think about it: it takes time to gather the data some systems need to refine themselves, and there seem to be a couple of systems (or more) they need to simply gather data within before we see the 'final product'.

I guess another way of saying it is that we could be seeing the final product 'in the rough', and as data is gathered and processed by the systems the roughness will probably wear off. If that's the direction they've gone, the system will continue to refine itself constantly, which is a huge move if you think about it. IDK if that's what they're doing or not, but it's something tedster suggested, and he's made some fairly good 'guesses' in the past if I remember correctly, so IMO it definitely could be.

Personalization and AI-type rankings both work in roughly the same way, although separately, so it's easy to think of it like 'click-based rankings', only with a really complex scoring system...

Simplistically, if you start off with ten URLs on a results page and you don't know which is the most popular or which is the best answer, you might display them in some type of random order until people start to click on them and 'vote' for each. After you have some data, you could begin displaying the ones with the most clicks at the top once a certain 'threshold' is hit (say, 20 clicks 'gets a site into the scoring' at the top of the page). The ones with fewer than 20 clicks could remain randomly displayed below...

Sites within 5 clicks of each other could even switch back and forth until there was a 'more definite better answer'...

The above could all be written into the system from the start, so it adjusts by itself based on the clicks. The thing is, you need a certain amount of data before people can see the refinements the system will make to the scoring over time; until the system has the information it needs about the specific sites (pages), from the outside it looks random, broken and not well refined, but once the scoring mechanisms have the data they need to 'kick in', the system looks 'fixed' and 'refined', and continues to look more 'refined' over time.
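To make that concrete, here is a rough Python sketch of the click-threshold scheme just described. The 20-click threshold and the 5-click swap margin come from the post above; the function name, the 50/50 swap chance and the example URLs are invented for illustration - a toy model, not Google's actual method.

import random

CLICK_THRESHOLD = 20   # clicks needed before a URL enters the "scored" tier
SWAP_MARGIN = 5        # URLs within this many clicks of each other may trade places

def rank_results(click_counts):
    """click_counts: dict mapping URL -> observed clicks for this query."""
    scored = sorted(
        (u for u, c in click_counts.items() if c >= CLICK_THRESHOLD),
        key=lambda u: -click_counts[u],
    )
    # Near-ties: adjacent scored URLs within SWAP_MARGIN clicks may swap places,
    # which from the outside looks like results switching back and forth.
    for i in range(len(scored) - 1):
        a, b = scored[i], scored[i + 1]
        if click_counts[a] - click_counts[b] <= SWAP_MARGIN and random.random() < 0.5:
            scored[i], scored[i + 1] = b, a
    # URLs below the threshold stay in random order until enough data arrives.
    unscored = [u for u, c in click_counts.items() if c < CLICK_THRESHOLD]
    random.shuffle(unscored)
    return scored + unscored

# Two URLs have crossed the threshold and are close enough to keep swapping;
# the rest still look "random" because the system lacks data on them.
clicks = {"a.example": 42, "b.example": 39, "c.example": 12, "d.example": 3}
print(rank_results(clicks))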

Hope that makes a bit of sense. :)

[edited by: TheMadScientist at 9:58 am (utc) on May 16, 2010]

TheMadScientist - msg:4133486 - 9:39 am on May 16, 2010 (gmt 0)

TheMadScientist... so what do you suggest we use to actually get a good sense of where our site ranks if we are not able to emulate what others are seeing? I mean, how can we track our rankings across the board (what other searchers are seeing) if we can only see rankings from our end and have no clue whether what they are seeing is the same or not?

Throw out the old-school thinking and watch traffic very closely? When you make a change, don't go to the results the next day to see if it's there or if your site moved - give it a week and see what happens with your traffic, maybe?

I actually haven't checked rankings in Google in about 3 months on most of mine... All I've been doing lately is watching traffic, closely! But I'm a bit different, so you might want to do something else.

Personally, I think the days of 'instant knowledge' about a change are about gone... IMO, moving forward, things are going to be more about finding a way to influence and generate traffic than about knowing exactly where you rank for a given term. They're moving away from letting us see where we rank for anyone else's searches, and the WMT data is a nice bone to throw, but IMO being able to manage traffic without seeing the SERPs is going to be the key moving forward, because it doesn't matter if you're number 1 in a generic result set when you're between number 5 and 8 in 60% of the personalized sets and your title doesn't attract any attention (AKA clicks)... ;)

dertyfern - msg:4133490 - 10:01 am on May 16, 2010 (gmt 0)

TheMadScientist, you're watching traffic closely, so I imagine you've got your nose stuck in logfiles most of the time: are you seeing a growing share of Google search referrals with your site URL and the referral keywords in the Google referrer string?

I'm seeing quite a bit of this.

TheMadScientist - msg:4133495 - 10:29 am on May 16, 2010 (gmt 0)

Only when the referrer string starts with /url rather than /search :)
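For anyone grepping their logs for the same thing, here is a small sketch of pulling those pieces out of a referrer string. It assumes, as the posts above suggest, that the /url referrer carries the keywords in a q= parameter and the landing page in a url= parameter; the exact parameters Google sends may differ, so treat the names (and the sample referrer) as assumptions.

from urllib.parse import urlparse, parse_qs

def parse_google_referrer(referrer):
    """Return path, keywords and landing URL from a Google referrer, if present."""
    parsed = urlparse(referrer)
    if "google." not in parsed.netloc:
        return None
    params = parse_qs(parsed.query)
    return {
        "path": parsed.path,                      # "/url" or "/search"
        "keywords": params.get("q", [None])[0],   # referral keywords, if present
        "landing": params.get("url", [None])[0],  # your site URL, if present
    }

# Illustrative referrer string, not a captured one:
print(parse_google_referrer(
    "http://www.google.com/url?sa=t&q=widget+cleaning&url=http%3A%2F%2Fexample.com%2F"
))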

I would guess it has to do with the switch to AJAX for the results, but I haven't used Google enough in the last 6 months to really know, and I haven't looked into it that closely. I'm more concerned with long-term trends and patterns right now, because I have a habit of, uh, micromanaging? if I don't do it that way, and I think people who constantly 'tweak' things because of a few down days present a manipulative-looking pattern of changes over time. So I've been trying to work with the ebbs and flows and look at things over time, the same way I think they have to for rankings...

It's one of the reasons I only made one change last week even though traffic was down, and I didn't actually do it for rankings but more for visitors - and I did it precisely because traffic was already down, basically thinking: what could it hurt? Of course, with personalization, visitors more directly affect rankings. Anyway, I kept asking myself what I could do better or differently, and the only thing I came up with was a navigational change I thought made the site easier to surf. It seems to have worked, because my page views have increased since the change, and I thought it was fairly minor, but, go figure...

NOTE WRT WATCHING TRAFFIC CLOSELY: I guess I would say I like to know what happens hourly, daily, weekly, monthly, and all the little details, so I can incorporate them into a major change when it's time to do something different... Watch things closely, so you can change things wisely might be another way of putting it.

pontifex - msg:4133528 - 1:13 pm on May 16, 2010 (gmt 0)

... as data is gathered and processed by the systems the roughness will probably wear off...


I think that part is now just a stronger ranking factor, but it has been there for quite a while (remember that results sometimes pointed to a redirect URL?).

The question is: HOW STRONG are these factors in the algo now?

In the past I managed quite well to make up for missing keyword density by just pouring in more links (for example).

Meaning: sloppy SEO could be compensated for by massive linking - or, put differently, the other parts of the 200 algo factors were not as strong.

Now, after this update, I think there might be some new "super factors" that influence the ranking very heavily.

If the AI theory has some grounding, I am sure it is just a new evaluation of certain signals they already used in the past.

So from my point of view...

"Super factors" before MayDay:

  • Keyword in title
  • Keyword on page in unique content
  • Named links to this page
  • Trust from the link sources

poured into any optimization with enough volume, could save the day...

Now that would be:
  • all the above
  • page loading speed
  • user clicks in the SERPs
  • user time on the destination

roughly speaking?
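As a toy illustration of that re-weighting idea - every weight below is made up for the example, nothing is a known Google value - the same page that scores well on the classic factors alone loses ground once behavioural factors take a share of the score:

PRE_MAYDAY_WEIGHTS = {
    "keyword_in_title": 0.30,
    "keyword_in_unique_content": 0.30,
    "named_links_to_page": 0.25,
    "trust_of_link_sources": 0.15,
}

# Old factors diluted to make room for the hypothetical new "super factors".
POST_MAYDAY_WEIGHTS = {
    **{factor: w * 0.6 for factor, w in PRE_MAYDAY_WEIGHTS.items()},
    "page_load_speed": 0.10,
    "serp_click_through": 0.20,
    "time_on_destination": 0.10,
}

def score(signals, weights):
    """signals: factor -> normalized value in [0, 1]."""
    return sum(weights[f] * signals.get(f, 0.0) for f in weights)

# A page strong on classic SEO but weak on speed and user response:
page = {"keyword_in_title": 1.0, "keyword_in_unique_content": 0.9,
        "named_links_to_page": 0.9, "trust_of_link_sources": 0.8,
        "page_load_speed": 0.3, "serp_click_through": 0.2, "time_on_destination": 0.2}
print(round(score(page, PRE_MAYDAY_WEIGHTS), 3))   # strong under the old mix
print(round(score(page, POST_MAYDAY_WEIGHTS), 3))  # weaker once behaviour counts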

P!

MacSeth - msg:4133634 - 6:08 pm on May 16, 2010 (gmt 0)

Because we knew website speed was going to be a heavy factor, we spent a couple of months improving site speed big time... after that we gained traffic, but since the 30th of April it has been going downhill hard. We have absolutely no clue why, other than Google pushing some buttons somewhere.

And what about location?
My wild guess is that location is becoming more and more relevant. If you have a big website like ours, which focuses on the whole world, you get pushed down by local results. At least... that is something I suspect. I have no idea other than the things that were already mentioned.

Reno - msg:4133648 - 6:52 pm on May 16, 2010 (gmt 0)

If you have a big website like ours, which focuses on the whole world, you get pushed down by local results.

I do agree that this could be important, but I don't think it matters whether your website is big or small. Even if you have a small site but sell only to the WWW as a whole, without a local bricks-and-mortar outlet, you may see the same thing happen. Add to this the speed factor (which, as far as I can see, G has entirely wrong for most people/pages), the inclusion of so many blog listings in the SERPs, "personal history", and the possibility (discussed here) that some filters may still be off... it may all add up to the decline so many of us see. For the record, I see no improvement as of this posting.

................

dusky - msg:4133663 - 7:47 pm on May 16, 2010 (gmt 0)

TheMadScientist IMO is on target: G* is experimenting with the AI concept, with the aim of finally stopping SEO firms and experts from influencing the SERPs without due merit. SEO expertise got to the point where spammy sites could outrank the largest sites on thin long-tail pages, and G* thinks this has got to stop. The only problem now is that the update is not complete, and still ongoing.

The returned results are now different for different people. Joe Blogs in Chicago searches for "best way to clean widgets" and gets site 1 (widget-cleaning.com/how-to.html) as the top result, BUT Sandra Blogs in NY gets site 2 (cleaningwidgets.com/cleaningwidgets.html) as the top result. Now both webmasters of sites 1 and 2 will be puzzled: they won't know where they rank for Sandra or Joe, only where they rank in their own search, when they do the search themselves. The returned result for webmaster 1 may be on page 23. Why? Because he/she is a webmaster: searched, never clicked, never went to the site/page, never spent any time on it, never, never... In G*'s AI algo conclusion, this searcher did not find the site interesting - he just looked. Sandra, on the other hand, did go to the site and bought something, or bookmarked it, or spent some time on it A FEW TIMES, so G* thinks: let's return it again if she searches for the same or a closely related keyword. Joe may get the site he visited before as the top result, and if he clicks it again, that site may score more...
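A minimal sketch of that Joe/Sandra effect, with invented function names and boost values (it illustrates the idea of boosting previously engaged-with results, not Google's actual personalization):

def personalize(base_ranking, user_history):
    """
    base_ranking: list of URLs in generic order for a query.
    user_history: dict URL -> engagement score from this user's past visits
                  (clicks, dwell time, repeat visits, bookmarks...).
    Pages the user previously engaged with float upward; pages they only saw
    in the SERPs and never clicked stay where they are or sink.
    """
    def adjusted_position(i, url):
        boost = user_history.get(url, 0.0)
        return i - boost          # larger engagement -> earlier position

    return [u for _, u in sorted(
        (adjusted_position(i, u), u) for i, u in enumerate(base_ranking))]

generic = ["widget-cleaning.com/how-to.html",
           "cleaningwidgets.com/cleaningwidgets.html",
           "other-widgets.example/clean"]

# Sandra bought something on site 2 a few times; Joe only ever looked.
print(personalize(generic, {"cleaningwidgets.com/cleaningwidgets.html": 1.5}))
print(personalize(generic, {}))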

Searcher satisfaction is a big influence now. However, whilst the update is still going on, many sites are flat and still being re-spidered, so the AI-influenced algo appears not to be very good, because not all sites are yet included / taken into consideration - only the sites that have been moved and ranked from scratch; the rest are still being moved, spidered, re-assessed etc.

It appears webmasters and SEO experts will now have a harder time: instead of working out keyword-ranking influence from titles, descriptions, on-page SEO, backlinks, anchors etc. - while those are still important - they will have to take into account what makes the searcher return and click again, time spent on site, call to action, speed... and all the rest of the factors that will get the site returned on page one for THAT PARTICULAR searcher.

Regionalization, language, market intelligence, social grouping... will all be more important factors. If G* is going to base its algo on an AI foundation, the common variables and factors we already know are going to matter more to SEO experts: time on site, bounce rate and all the rest of it are probably the first ones to cater for and improve, as they are likely to be among the good practices that end up being part of the AI algo.

TheMadScientist, along with many here (tedster et al) who spoke about this a while back, IMO is heading towards the right conclusion: the MayDay update includes a massive algo change based on AI and USER SATISFACTION. I can see why G* is rushing to be first on this; M*soft was rumored to want to do the same, based on the Wolfram Alpha concept (http://www.wolframalpha.com/).

In any case, the update is far from complete and data is still being moved. You'd think they should have waited with the algo change until all the data was moved to the new infrastructure; then again, the new algo was probably already written and is just being tweaked while the data is moved in chunks. So once your site is completely moved (re-spidered and re-ranked) you'll start doing well, even better than before; until then you'll be bouncing from page 1 today to page 5 a few days later, back and forth, and so on.

Andylew - msg:4133670 - 8:18 pm on May 16, 2010 (gmt 0)

I like the idea of AI, but it doesn't fit the current symptoms; it doesn't explain why adult sites are returned for basic searches, amongst other things.

I would also suggest that a big change like an AI algo would be rolled out at a different time from a major infrastructure change - see Matt Cutts' blog a while back, where he explains the difference between the different kinds of updates: Caffeine was an infrastructure change, not an algo change.

Google is fundamentally broken; even AdWords is broken - it isn't returning anywhere near the number of adverts it normally would, and those are obviously not algo-influenced.

Re AI: I don't think Google would have sufficient data on sites to implement it reliably. Yes, they can measure bounce rate, click-through etc., but how do you factor in new sites with no data? And time on site can't be measured. I can see more ways that an AI-type system could be influenced for the long tail than the current one.

TheMadScientist - msg:4133681 - 8:27 pm on May 16, 2010 (gmt 0)

Re AI: I don't think Google would have sufficient data on sites to implement it reliably

What would you call personalization a close parallel to, other than AI?

They're trying to show people the results they want to see by attempting, with personalization, to derive intent and other things from previous searches and reactions to results, aren't they?

cien - msg:4133685 - 8:30 pm on May 16, 2010 (gmt 0)

Anybody care to explain what "AI" stands for besides American Idol? You guys are killing me out here. :-)

mcdarwin - msg:4133701 - 8:46 pm on May 16, 2010 (gmt 0)

AI would be artificial intelligence. Rather than a more or less static algorithm, it could be an algorithm that can "learn" from, for example, user behavior (e.g. CTR, bounce rate and time on site) and have these factors influence the algorithm.

trakkerguy - msg:4133703 - 8:55 pm on May 16, 2010 (gmt 0)

AI - Artificial Intelligence.

londrum - msg:4133705 - 9:03 pm on May 16, 2010 (gmt 0)

This is nothing like AI, is it? It's just a load of new things that they're introducing to the algo.

AI implies that it's 'learning', but all they're doing (probably) is factoring in user behaviour - which is basically just a load of hard data that they're gathering from their toolbar, Analytics and other stuff like that.

It's no different from them factoring in old stuff like inbound links.

The only difference this time is that it looks like they're starting from scratch, and it's going to take a few iterations of the algo to get everything back to normal.

tedster - msg:4133710 - 9:16 pm on May 16, 2010 (gmt 0)

it doesn't explain why adult sites are returned for basic searches

I think it could. Why do two year olds use "bad language" when company comes to visit? Because they haven't learned enough yet.

It's the wave-like behavior of shifts in long-tail rankings that make me suspect automated ranking changes based on statistical feedback. I suspect this kind of approach was already being tested on a small scale for quite some time.

Remember past complaints about "Yo-yo rankings [webmasterworld.com]" and "traffic throttling" [webmasterworld.com]? Only a few websites saw that, but for those that did the effect was quite unsettling - and some affected sites were very strong brands. Now the same kind of reports seem to be more widespread.

The number of Google employees with PhDs in statistics is pretty large. They've got to be doing something to justify their salaries, right? Deciding between natural versus un-natural [webmasterworld.com] wouldn't take that army of statisticians.

I even suspect that the Position 6 Bug [webmasterworld.com] may have been related to building an AI algorithm. And now that Google has a lot more infrastructure to work with, the grand plan is coming into place.

What we're seeing now is not some recent brainstorm. I think it IS a multi-year grand plan, whatever it may turn out to be. And we will all live with this new approach for quite a while. I plan to study the current SERPs as much as possible while their algorithmic underwear is still showing. Soon enough, it will be mostly invisible.

dickbaker - msg:4133725 - 9:49 pm on May 16, 2010 (gmt 0)

I plan to study the current SERPs as much as possible while their algorithmic underwear is still showing.


But what to study? I'm looking at the number of inbound links to a variety of sites, the age of those sites, estimated traffic, number of pages on the site, and a couple of other factors. For most of the first-page sites, these factors in various combinations seem to explain the sites' rankings. It's when some site that has 10% of the links, 50% of the pages, and is 25% as old as the other sites pops on to the first page that I get confused.

I'll look at a site like that and make a note that it won't last, that it might stay but won't go above #6, or that it might come and go between pages one and two. More often than not I'm right, but some sites are really throwing me off right now.

I'd measure link freshness, but I don't know where to find a site's link history.

tedster - msg:4133738 - 11:16 pm on May 16, 2010 (gmt 0)

But what to study?

I'm starting out with the degree of churn: which SERPs (which types of query terms) are churning and which seem relatively stable, and which positions seem to be most involved in the churn. I'm hoping, with such data, that some further directions for investigation will become clear.

And I might not get anywhere - but it feels like an angle that might bear some fruit. Something more than the historically acknowledged factors seems to be at work now. If it is CTR (or even Site Performance), then the data for websites that I am not involved with will be hidden from me, so conclusions will be rather hard to come by. But I will give it a try.
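One way to put a number on that churn: compare the top 10 for the same query on consecutive days, using a simple overlap measure plus the average position shift of the URLs that stayed. The metrics below are generic; which queries to track, and how often, is whatever you choose.

def churn(today, yesterday):
    """today / yesterday: ordered lists of URLs, e.g. the top 10 for one query."""
    t, y = set(today), set(yesterday)
    turnover = 1 - len(t & y) / len(t | y)      # 0 = same URLs, 1 = completely replaced
    shifts = [abs(today.index(u) - yesterday.index(u)) for u in t & y]
    avg_shift = sum(shifts) / len(shifts) if shifts else 0.0
    return turnover, avg_shift

day1 = ["a", "b", "c", "d", "e", "f", "g", "h", "i", "j"]
day2 = ["a", "c", "b", "x", "e", "f", "g", "h", "j", "i"]
print(churn(day2, day1))   # roughly 0.18 turnover, 0.44 average position shift

Tracked per query class (short-tail vs. long-tail, commercial vs. informational) over a few weeks, this would show which result sets are churning and which positions move the most.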

1script - msg:4133793 - 3:17 am on May 17, 2010 (gmt 0)

Getting back to the AI topic for a moment here:


First, let's bring the sci-fi excitement down a notch: it's not AI in the sense that it will never, say, write poems (or start a nuclear war, for that matter) after a year or so of sorting search results. It will always just sort search results. So it's specialized software with a self-adjusting algo - not unlike your favorite antivirus software.

Anyway, what I'm trying to say is that such software would require looking at lots of data points to come up with a reasonable change in the algo. Your antivirus software looks at EVERY email you receive and EVERY file you run or open, so it has plenty of data to crunch. However, most searches (90%+ in my experience) do not provide enough data points, simply because there may only be one or two searches per month.

This thread has been centered on problems with long-tail searches, and I would like to bring it all back by tying the pseudo-AI and the long tail together. They simply don't mix: how do you vary the SERPs and analyze the resulting CTR if all you've got is one individual search in a considerably long period of time? On most of those extra-long-tail searches (again, 90%+ of traffic on some sites) you literally need to wait years to accumulate any statistically significant amount of data. Or, if you let your AI loose on a thin diet of data points, you are going to get wild swings on the output end. Maybe that's what explains the yo-yo traffic some people here reported?
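The back-of-the-envelope arithmetic behind that point, with purely illustrative numbers (the search volume, per-result CTR and click threshold are all assumptions):

searches_per_month = 2       # an extra-long-tail query
ctr_per_result = 0.10        # assumed chance a given result gets the click
clicks_needed = 20           # the kind of threshold discussed earlier in the thread

expected_clicks_per_month = searches_per_month * ctr_per_result    # 0.2 clicks/month
months_needed = clicks_needed / expected_clicks_per_month          # 100 months
print(f"{months_needed:.0f} months (~{months_needed / 12:.1f} years) "
      f"to collect {clicks_needed} clicks on one result")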

What I'm trying to say is that there is still an important role in this new Google for a good old fixed algo. If you are trying to rank for "buy [your favorite ED remedy brand]", may the Force be with you - you are going to need it to fool HAL 9000 ("Sorry Dave, I can't let you search for this brand. Remember last time you took it and it lasted more than 4 hours?").

For the rest of us, I think it will be business (almost) as usual once the move to the new servers completes. Another reason for continuing to use the fixed algo (however complex, with its 201 variables) is that long-tail searches are usually not that important, both from an AdWords revenue standpoint (most have no ads) and for user experience. Most reasonable people understand that feeding an entire page of text into Google will produce unexpected results and will manually adjust their query. So you don't need your CPU-intensive pseudo-AI on unimportant searches - just send them to the old algo.

What do you guys think?

CainIV - msg:4133797 - 3:37 am on May 17, 2010 (gmt 0)

If it is CTR (or even Site Performance)


Tedster, this is an interesting angle. However, don't you think that if this were being taken into account, they would want to release that information publicly, since it would imply that the website needs to provide a better user experience (which is what Google has been trying to say all along)?

TheMadScientist - msg:4133799 - 3:42 am on May 17, 2010 (gmt 0)

Sounds reasonable, but what if every action (or non-action) has influence - so the query, the click, the non-click, the SERP page views prior to a click, etc. each provide an influence both on the whole (of searches and the site) and on the individual searcher (and page in the results) specifically?

So a visitor searches for 'Widgeting Information' and clicks on the number 2 result, which is widgetworld.com... If that click influences the rankings not only for that page and that specific search but also for the site in general, you could have somewhat (fairly?) reasonable site-wide data in a shorter period of time.

IOW: rather than the click only affecting that specific search or page in the SERPs, the click, the elapsed time from the query to the click, the visitor behavior after the click, etc. all influence the site as a whole - which would make a bit of sense IMO, because the 'feedback' from a more often used query could then be used to influence 'Widgeting Information for Building Large Widgeting Systems', if you know who has the most-liked 'Widgeting Information' - provided, of course, widgetworld.com has information on Building Large Widget Systems present.

It's a really complicated process they would probably have to use, and trying to explain it simply is really difficult, but it's a thought I'm having about what they might be trying to do, and why it's causing 'oddities' for long-tail sites used to specific positions or traffic levels. And when you throw 'influences overall' together with 'personalization' - which is probably influenced more heavily by the behavior of a specific user than by the 'overall influence' - it gets really complicated and confusing.

The shorter version is: if you use short-tail terms to influence the long-tail rankings on a per-site basis, you have personalization much quicker.
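A hedged sketch of that 'site-wide influence' idea: engagement earned on frequently searched short-tail queries becomes a prior that nudges the same site's long-tail pages, which have almost no click data of their own. The function names, the dwell-time cap and the 50/50 blend are invented for illustration; nothing here is a known Google mechanism.

from collections import defaultdict

site_engagement = defaultdict(float)   # site -> accumulated engagement signal

def record_click(site, dwell_seconds):
    # Clicks on ANY query feed the site-level signal, not just that query.
    site_engagement[site] += min(dwell_seconds / 60.0, 1.0)

def long_tail_score(page_score, site):
    # Blend the page's own (sparse) score with the site-wide prior.
    return 0.5 * page_score + 0.5 * min(site_engagement[site], 1.0)

# widgetworld.com earns engagement on the short-tail query...
record_click("widgetworld.com", dwell_seconds=90)
record_click("widgetworld.com", dwell_seconds=45)

# ...and that helps its page for a long-tail query with no click data yet.
print(long_tail_score(page_score=0.2, site="widgetworld.com"))
print(long_tail_score(page_score=0.2, site="example.net"))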

[edited by: TheMadScientist at 3:49 am (utc) on May 17, 2010]

trakkerguy - msg:4133800 - 3:43 am on May 17, 2010 (gmt 0)

If it is CTR (or even Site Performance)


don't you think that if this was being taken into account then they would want to release that information


As soon as they (Google) confirm that site performance influences the SERPs, webmasters will be trying to fake stickiness, or traffic, or whatever they think Google is measuring.

I'm not saying they don't/won't use visitor behavior, I just don't see them being too forthcoming about it.

tedster - msg:4133809 - 4:05 am on May 17, 2010 (gmt 0)

Both Site Performance (as in speed) and CTR (from the search results) are currently being reported in Webmaster Tools. I'm certain that Google offers webmasters that data for a good reason.

gethan - msg:4133816 - 4:43 am on May 17, 2010 (gmt 0)

< moved from another location >

On a large site I'm seeing the query: " site:example.com " * returning a rapidly decreasing page count.

(* I know this is not a reliable or even accurate number - just looking to see if anyone else is noticing the same.)

28th April ~1,000,000 pages
15th May ~500,000 pages

Each day has been a steady drop - a few hundred thousand less. Each page corresponds to unique content (it's a community site). The WMT listing is stable. Traffic is fluctuating with Caffeine/Jazz, but not following the same trend.

The same thing is observed for " site:example.com intitle:keyword "

I haven't been tracking any other large sites to compare this with.

Anyone else?

[edited by: tedster at 4:58 am (utc) on May 17, 2010]

Reno - msg:4133817 - 4:47 am on May 17, 2010 (gmt 0)

don't you think that if this was being taken into account then they would want to release that information publicly

As trakkerguy indicates, G* is notoriously tight-lipped about indicators that might give a clue to the SEO crowd. Still, I'm perplexed - and I think others here are as well - about the Cone of Silence that everyone out there seems to be under right now. Either they have nothing to say because they think it's all going along splendidly (there's a leap of faith for you!), or they have nothing to say because everyone is scrambling to diagnose and fix the problem. Neither scenario is encouraging, but of the two, I'd hope they are in serious repair mode, and I do wish that MC or someone would say something (ANYTHING) to help put it in perspective.

............................

ohno - msg:4133835 - 6:30 am on May 17, 2010 (gmt 0)

So, who's looking forward to Monday?! Analytics says traffic is UP on previous weekends - we sold two items! Our other site tracking shows some users spent 5+ minutes on our site. From what I can gather, personalised search requires the Google toolbar? (that is what it said when I checked web history). If this AI thing is correct, then we are going to have a catch-22 for a while in my eyes: users won't click on our site unless they go past the first few pages to actually find us! I see the shopping feedback is still broken and reporting double the number of reviews (is that Google's way of helping us? Haha). Anyway, this week should be interesting either way. Good luck, everybody.

tedster - msg:4133837 - 6:35 am on May 17, 2010 (gmt 0)

Of what i can gather personalised search requires the Google toolbar?

No. Just a Google Account of any kind - anything you sign into. And even if you don't have that, if your browser stores cookies, that browser's history with Google is also used to customize search results.

If this AI thing is correct...

Please, let's not turn the AI idea into a new mythology. It's just an idea, a theory that some of us are investigating - and nothing more than that at this time. I'm hoping that our forum can become more like a think tank than a place to come for "the answer". That's the best way to approach things, IMO. Then if some new idea begins to bear fruit, we can all share the victory.

ohno - msg:4133843 - 6:48 am on May 17, 2010 (gmt 0)

Tedster, I really think you are on to something with AI; it would stack up and make Google's job easier in the future. Reno, I think the fact they haven't said anything is a GOOD thing - if it was all complete and working how they wanted, I would expect them to be shouting it from the rooftops (just praying they do not announce anything until we see things as "normal"!). Just one point: did anyone else experience erratic traffic prior to this all happening? Looking at Analytics data I could plot next week's visits to within ±5, which smacked of traffic throttling.
