Google SEO News and Discussion Forum

Mayday Algorithm Update - with video from Matt Cutts
tedster
msg:4144049 - 8:19 pm on May 30, 2010 (gmt 0)


For all who have been looking for a more official word about Google's ranking changes around the beginning of May, Matt Cutts has released a video on Google's YouTube channel for Webmaster Help.

[embedded YouTube video]

http://www.webmasterworld.com/google/4132195.htm


edited by brett tabke (embedded link)

 

CainIV
msg:4144195 - 4:15 am on May 31, 2010 (gmt 0)

It is not an affiliate-site-only algo change. I think it is an inbound link value change.


A link change doesn't seem enough, on its own, to explain the loss of long tail traffic in this update, especially where the long tail phrase is not a specific match with the inbound link.

I think it is much more likely that Tedster is on the right path here - the change is a page-level assessment, which also includes 'trumped' benefits when that specific page also has relevant and targeted inbound links.

A specific indirect issue with this may be that websites which lose long tail positioning also take some form of position loss in parent categories that were well supported, via breadcrumbs, by those lower pages.

Very few of the websites I have examined in this update are what I would deem 'perfect' matches for the queries they lost positions on. Couple that with little or no inbound links to those specific pages, and it makes sense that there may be other pages that could offer the consumer a better experience. And that is what Google is all about more than ever - offering the best possible 'page level' fit for long tail queries.

I personally work with a few affiliate websites that are informational in nature, and none of them have experienced any loss in long tail traffic specifically.

setzer
msg:4144200 - 4:57 am on May 31, 2010 (gmt 0)

Our site has actually seen an increase of close to 10% in G traffic over the past few weeks. Currently averaging 4600 UVs a day from Google. All original content, no affiliate links. Just a news blog.

solidradicle
msg:4144213 - 6:14 am on May 31, 2010 (gmt 0)

It's a mixed bag for most sites. Long tail keywords were the ones Google never had much control over before. I see this as a great opportunity for small businesses that seek traffic. It's a win-win situation for Google, the top sites in Google's SERPs, and new businesses. It will also allow Google to identify more authority sites for these long-tail keywords, giving an edge to Google's business.

I am not sure if anyone noticed, but before the Mayday change Google used to treat a few websites as authority sites for certain keywords, and it was easily visible in their search results. I hope this change helps small businesses.

seoN00B
msg:4144227 - 6:54 am on May 31, 2010 (gmt 0)

This is very bad news. I lost 40% of my traffic in the past two months. My site is really dependent on long tail keywords. =(

foolsgold
msg:4144270 - 9:18 am on May 31, 2010 (gmt 0)

Interesting that Matt says Caffeine is continuing apace, which suggests it is not fully rolled out, if it has been rolled out at all.

anand84
msg:4144291 - 10:27 am on May 31, 2010 (gmt 0)

Neither does he confirm that everything related to this update is completely rolled out. Sites are still seeing lots of Googlebot activity. The other day my Google search layout briefly went back to the old format (with the old Google logo). The SERPs are still constantly changing. I just have my fingers crossed for the moment... maybe, just maybe, the sites will go back to where they were earlier.

hughmac
msg:4144295 - 10:32 am on May 31, 2010 (gmt 0)

I have a site that almost disappeared from the listings altogether, down around position 900. When I checked the incoming links, they had disappeared as well, down to showing only 1 link (using Yahoo). The site has now come back to its original position and the links (approx. 150) have gradually reappeared. Whether the site will stay there or not I don't know. It might be just a blip as part of this particular dance.

Tedster's point about page authority being automatically inherited from site authority, and that this has changed, could be very relevant, because it would mean that any links from those pages would also be diminished. This might also be a swipe at the huge industry that has grown up around backlink building.

Of course, just speculating as usual.

rustybrick
msg:4144310 - 11:43 am on May 31, 2010 (gmt 0)

@Tedster

Didn't Brett say his rankings are the same but the referrals are down due to the new Google design?

macas
msg:4144330 - 12:49 pm on May 31, 2010 (gmt 0)

Okay, this is not the answer anyone was waiting for from Google staff; correct me if I'm wrong.

From last October to date, the changes made to the SERPs, website ranking, and indexing have gotten worse and worse month by month... and no, I did not see better ranking and relevance in Google's search engine; everything became a total mess and a disaster.

How can anyone tell me (only if I were a total noob would I believe everything) that there are much better search results and content relevance, when we see malware pages/links, dead pages/links, and scraper websites on the first pages of search queries?
Sorry, I won't buy this, and probably no one else here will either.
To be honest, things have gone bad on Google Images too; dead pages/websites and malware pages are more and more often on top.

This is not a "good" change at all.

And another thing:
Are we in some spinning game of a new algorithm, and is that why we see wild bouncing and dropping of search traffic?
My website is in that spinning game, with bouncing and dropping traffic; from time to time I go up, and a few weeks later I go down... and I have been playing this game for months now. The same goes for my indexed pages and results.

hughmac
msg:4144346 - 1:43 pm on May 31, 2010 (gmt 0)

This was certainly my first instinct too, macas. It's possible that something went askew during an algo update, or due to the Caffeine rollout, and some elements of the datasets have been corrupted and need to be rebuilt, which might go some way to explaining the excessive Googlebot crawling. It all seems too drastic and uncontrolled, with #*$! sites and other dross frequently being returned for innocuous searches. It's simply not Google's way of doing things. This appears to be affecting sites worldwide, and I don't think that marks a controlled rollout, which would surely happen on a country-by-country basis. Of course Google would put the best gloss on it to disguise what happened; perhaps we will find out in due course.

dickbaker
msg:4144351 - 2:11 pm on May 31, 2010 (gmt 0)

Hughmac, how would the number of links that Yahoo shows be caused by anything Google does?

TheMadScientist
msg:4144362 - 2:54 pm on May 31, 2010 (gmt 0)

From tigger in this thread: [webmasterworld.com...]
Waaaaaay close to the end.
...but something that has amazed me is a competitor's site that I know for a fact he's done nothing with for over 2 years (he's a friend) is now ranking for all his top terms

Some of the reports like this make me think it could be pattern based.

Link churn and historical link acquisition data patterns. The statement is that someone has not done anything with the site for 2 years, which would give it a very natural pattern of link churn and link behavior. Quite a few people here attempt to manage their links, some at an unsustainable rate of growth, which presents a different pattern, and when the growth is unsustainable, IMO the people building the links have not done themselves any long-term favors.

I think people forget some of those old patents (applications at the time) which were discussed here, and keep trying to do things now that worked then, forgetting that 3 to 5 years later quite a few of those concepts have probably been fully implemented and refined. It's not 2005 any more, people, so adjustments probably need to be made, and 'but my site has more links' doesn't count for nearly as much when you read about 10 links last month being considered more important than 10 links 12 months ago.

When you think about it on a percentage basis: if a site has 100 links and gains 2, it has increased its inbound links by 2%, whereas a site with 5,000 links has to gain 100 to reach the same 2% growth. And if they are moving even more toward patterns, all those link building campaigns could bite people in the a** when they aren't sustained...
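A minimal sketch of that percentage math, in Python with made-up link counts (this is not any known Google formula, just an illustration of why the same absolute gain reads as a very different growth signal):

def growth_rate(existing_links: int, new_links: int) -> float:
    """Relative inbound link growth for one period, as a percentage."""
    return 100.0 * new_links / existing_links

# The same absolute gain is a very different relative signal.
print(growth_rate(100, 2))     # 2.0  -> a 2% gain for the small site
print(growth_rate(5000, 2))    # 0.04 -> barely registers for the large site
print(growth_rate(5000, 100))  # 2.0  -> what the large site needs for 2%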

tigger
msg:4144363 - 3:03 pm on May 31, 2010 (gmt 0)

But to rank a site that hasn't had ANY development at all seems like a smack in the mouth to webmasters, and a complete contradiction of what G keeps bleating about - i.e. content / links.

TheMadScientist
msg:4144367 - 3:16 pm on May 31, 2010 (gmt 0)

My point is, if you look at the threads about this update, they look quite a bit like the old update threads, meaning the 'Google's broken', 'they can't keep doing this', 'my page should be number 1', 'but I have more links', 'these SERPs suck' type posts are all essentially the same, and so is the analysis of the SERPs / changes.

IOW: To me it sounds like the same old stuff, including the analysis of the changes and seems to indicate people are stuck in the past thinking what worked before should still work today and IMO that's not the case.

Has anyone looked at link building patterns and percentages to see if there is anything they are doing different than the competition, and also have they looked at the content growth / update patterns of the sites?

Keep in mind: Some things are niche specific, such as QDF scoring in results, so what applies to one site / query may not necessarily apply to another or all. Personally, I think it's getting increasingly difficult to make a determination on what changed, because there is so much 'independence' of rankings from query to query and even searcher to searcher as personalization becomes more refined.

[edited by: TheMadScientist at 3:27 pm (utc) on May 31, 2010]

maximillianos
msg:4144372 - 3:24 pm on May 31, 2010 (gmt 0)

I am one of the few who definitely don't think G is broken. I lost quite a bit of traffic, but I can understand why G is trying to spread out the spectrum of results for long tail. Long tail shouldn't be controlled by a few mega-sites that are deep-crawled and cover hundreds of thousands or even millions of key phrases. It makes sense to me to try to spread the results around to get a better sampling.

So while we took a hit, I can see why G did it, and I think they are headed in the right direction.

We've known all along that we were extremely fortunate to be getting any traffic for free from G, so you can't be upset when some of it is re-allocated.

We just continue to focus on our site and usability, and branching out our traffic streams to new sources. It has kept us around for 10 years so far, and we hope to see at least 10 more years... =)

TheMadScientist
msg:4144376 - 3:37 pm on May 31, 2010 (gmt 0)

One more note: one of the things Cutts mentions is that this update is about the long tail, not the head. So the change could be to the importance of the 'full semantic phrase': where, in the past, the site holding the 1, 2 or 3 word variation would be considered the most important / relevant 'by default' for longer searches, that's no longer the case, and the scoring is based more on the weight and use of the individual pages.

IOW: before, if you held the 2 word phrase, the 'weight' would be passed down to longer variations of the phrase. Now the weight is no longer passed; instead, pages are evaluated more granularly and individually for the longer search phrases... The difference I'm thinking of could be 'overall site weight' compared to 'individual page weight', and to really get into the difference, it leads to page level link churn, page level phrase usage, and other page level factors, rather than site level determinations 'cascading' from the shorter search phrases to the longer ones.

[edited by: TheMadScientist at 3:41 pm (utc) on May 31, 2010]
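To make the speculation above concrete, here is a toy model in Python. Both scoring rules and all the weights are invented purely to illustrate the 'cascading' versus 'page level' difference described in this post; neither is Google's actual scoring:

def cascaded_score(site_head_weight: float, page_relevance: float) -> float:
    # Old behavior (speculated): a page inherits the site's weight for the
    # short head phrase when scored against a longer variation of it.
    return site_head_weight + page_relevance

def page_level_score(page_phrase_weight: float, page_relevance: float) -> float:
    # New behavior (speculated): only the page's own weight for the full
    # long-tail phrase counts; nothing cascades down from the head term.
    return page_phrase_weight + page_relevance

# A mega-site page strong on the head term but thin on the long-tail phrase
# keeps a high score under cascading, but not under page-level evaluation:
print(cascaded_score(site_head_weight=9.0, page_relevance=1.0))      # 10.0
print(page_level_score(page_phrase_weight=2.0, page_relevance=1.0))  # 3.0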

anand84
msg:4144377 - 3:39 pm on May 31, 2010 (gmt 0)

Long tail shouldn't be controlled by a few mega-sites that are deep-crawled and cover hundreds of thousands or even millions of key phrases. It makes sense to me to try to spread the results around to get a better sampling.


I feel left out here, because mine is not a "mega site" (my website only has pages in the very low thousands), and yet I am not benefiting from the spreading of results either.

Anyway, why do you think the long tail shouldn't be controlled by a few sites? Not that my site is one of them, but from what the definition of a search engine tells me, it is about scouring billions of web pages to render the 10 or 20 links most relevant to a search query. Spreading SERP-love, or Robin Hood-ism on Google's part, makes no sense, because as we move ahead several million more web pages will be added to the Google database, and all these new sites will start sharing results. This would only lead to more dilution of search results.

Okay, from my side, I think Google has got what they wanted. My site lost out in the SERPs, and I am readying myself for AdWords marketing starting in June.

steerpikegg
msg:4144386 - 3:49 pm on May 31, 2010 (gmt 0)

Whilst not a response to TheMadScientist's post, it did make me reflect quite a bit...

I think Google's notion that lots of inbound links = quality content is, and has always been, fatally flawed. To base an algorithm on something which is by its very nature self-perpetuating is just plain madness. I'm sure this has probably been covered here many times before, but there must be untold good and interesting pages out there that have never been linked to because they've never been discovered. The fact that anybody at all has to talk about link building campaigns to get ranked, or about managing their links so that they aren't penalised for accruing too many links too quickly, just goes to show how ridiculous the situation is. Google need to accept that this is a flawed system to base their main ranking factor on. I would imagine that whoever originally conceived the concept of hyperlinking only ever considered it as a way to cross-reference information, not as a way to judge the quality of information.

Whilst search engines are supposedly advancing, I think a lot of the time the results I used to get on AltaVista back in the mid 90s were far better than what I get nowadays from Google. Granted, there are many, many more sites, but I think in the process of trying too hard, Google seem to have lost sight of the fact that a search should be largely based on what you typed in, as opposed to what a computed algorithm 'thinks you might want to see based on how many spammers and/or scammers have posted links to it'.

I also think that the supposed 'quality based nature' of Google has brought us to the current situation where the general populace's mentality is 'if it's not on page one, it's not worth looking at'. I remember (again, back in the days of AV, Yahoo, Infoseek) trawling through literally hundreds of results looking for the best information, often to great benefit, discovering many new sites and much information from a wide range of sources. These days, your average web user would not learn any more than the contents of the first-page Wikipedia result.

Quite a rant (which I am good at), but I always get a bit crazy when I think about a world where people have to 'manage their links'.

walkman
msg:4144390 - 3:51 pm on May 31, 2010 (gmt 0)

"Didn't Brett say his rankings are the same but the referrals are down due to the new Google design?"

Rusty, it's impossible for Brett to know and keep track of all his rankings with close to 800,000 indexed pages.

Reno
msg:4144392 - 4:02 pm on May 31, 2010 (gmt 0)

we were extremely fortunate to be getting any traffic for free from G

I continue to be amazed how often this sentiment is expressed, that Google is somehow bestowing on us their gifts like some sort of generous benefactor, undeserving though we may be. The truth is, Google exists at all because they TAKE free content from a gazillion sites and re-package it in response to queries. It has always been a symbiotic relationship where all parties should stand to potentially benefit. So it's equally true that THEY are "extremely fortunate" to have so much free content available, as it has made them all very very wealthy and they've not had to pay so much as one cent for any of it.

........................

Tyme
msg:4144410 - 4:38 pm on May 31, 2010 (gmt 0)

Our website is about 2.5 years old and we have never gone after links. All linking is natural, and we don't get many links. We have about 20 reciprocal links that are very much on target with our site. Every bit of text on the site has been written by us alone, except for a few pages where we invited people to post their own stories. We're not a huge site, but we cover a lot of topics. Our traffic from Google was holding steady at about 2,400 visits a day. We saw the first jump in traffic on April 17, when visits jumped to 3,200 a day, and traffic has been holding steady or risen slightly since.
We have a lot of affiliate links that are mostly followed. In the last three days, old pages that never got traffic from Google have begun to appear in the SERPs, mostly as first page results. The titles of those pages are at least 6 words long and have been almost an exact match to the query.
Don't know if any of this is helpful; I'm still a novice, but I thought I'd share because we gained traffic instead of losing it.

TheMadScientist
msg:4144413 - 4:52 pm on May 31, 2010 (gmt 0)

It has always been a symbiotic relationship where all parties should stand to potentially benefit.

Well said, but how many people here complain because they're no longer the party benefiting, when someone else is? You say yourself that all parties should stand to benefit. So what about those parties whose sites (pages) were not ranked previously, or not ranked as well, but are now replacing other pages in the SERPs and finally seeing some benefits: do they not deserve to benefit like those whose sites ranked better previously?

If Google were no longer publishing top ten results, I could see a lot more room for complaining, but I think they still do for most queries. That means people are not complaining about Google stealing their traffic so much as about someone else seeing the benefits of that traffic, which IMO is a completely different story.

@ Tyme...
Interesting results you are seeing.
Thanks for posting.

ADDED: The traffic on the main site I've been watching / working with is down 20% (it bounced all over for a while but has steadied). Page views and time on site are solidly up, and sales for the last two weeks alone set a record compared to any full month I've had the site, so I think they got something right, even though overall traffic is down.

hughmac
msg:4144421 - 5:14 pm on May 31, 2010 (gmt 0)

dickbaker, I wish I understood why, but I have published sites for three years now and monitor site links on a regular basis. I have no explanation, unless the pages being linked from were deindexed for a period? Would that do it? I'm only reporting what I've seen happen on one of my sites: links disappearing and then gradually reappearing over a period of weeks. Backlinks to other sites seem OK, no change.

pontifex
msg:4144423 - 5:23 pm on May 31, 2010 (gmt 0)

@tyme - yes, it helps to see posts saying "I gained traffic in my niche", and actually, after reading along for 30 days now, there are some who have posted that. It completes the picture, IMHO, with the right balance.

@tedster - thanks for posting the link to the video. It was about time an official statement came out, even if it is as vague as always :-)

I also second maximillianos with: "I can understand why G is trying to spread out the spectrum of results for long tail". The problem I see is that the kind of sites that cover long tail topics in more volume are mixed in with low quality scraper sites or poor content. I also think this is a simple consequence, because a hobbyist writing about the "green widget repair procedure in iowa" has:

- no idea how to create nice pages

- no relation to the topic of SEO in any form, builds no links, etc.

- an old PC with crappy tools: if he were very computer savvy, he would also be able to create better pages

... and more disadvantages, if you think that through!

That means, if you try to dig up new "long tail content" to widen the spectrum, you ultimately dig up the dirt of SE spam from the sewers. There are diamonds hidden in that crap, I agree, but I see 3 pieces of rubbish for every 1 piece of gold!

MC said: this algo update went thru a big quality process - and I say: rework that process!

I base my research mainly on the music/MP3 area, where the long tail is huge and diverse. Good listings we had went down, and in the last 8 years I have never seen so much scraped content ranking high in that field!

P!

rustybrick
msg:4144428 - 5:30 pm on May 31, 2010 (gmt 0)

"Rusty, it's impossible for Brett to know and keep track of all his rankings with close to 800,000 indexed pages."

Walkman, so Brett was wrong about that. Fine.

walkman
msg:4144433 - 5:50 pm on May 31, 2010 (gmt 0)

"Rusty, it's impossible for Brett to know and keep track of all his rankings with close to 800,000 indexed pages."

Walkman, so Brett was wrong about that. Fine.


Whether he was wrong or you misquoted him, I don't know. Check this [google.com...] and let me know how it is possible to check the rankings on all those pages, plus hundreds of new pages daily.

JoeSinkwitz
msg:4144440 - 6:20 pm on May 31, 2010 (gmt 0)

Walkman, it is doable on a daily basis with some heavy scraping from multiple IPs.

To provide a bit more detail, even though I realize it's somewhat off-topic to the discussion:
1. Pull all data from the CMS/forum into a flat-file setup, parsing for unique phrases, then map each unique phrase to a page URL.
2. Scrape the search engine of your choice for each unique phrase, cycling through based on frequency and the # of IPs at your disposal.
3. Supplement with log data to determine which phrases need more frequent checks.
4. Make pretty graphs.

Theoretical stuff. ;)
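A rough Python sketch of steps 1-3 above, purely illustrative: fetch_serp() is a hypothetical placeholder for the engine-specific scraping, and phrase_to_url is assumed to be the phrase-to-page map produced by the CMS parse in step 1:

import itertools
import time

def fetch_serp(phrase: str, proxy_ip: str) -> list:
    """Hypothetical placeholder: return result URLs for a query.
    The engine-specific scraping itself is deliberately left out."""
    raise NotImplementedError

def rank_of(url: str, results: list) -> int:
    """1-based position of url in the results, or 0 if absent."""
    for pos, result in enumerate(results, start=1):
        if url in result:
            return pos
    return 0

def track_rankings(phrase_to_url: dict, proxy_ips: list, delay_s: float = 5.0) -> dict:
    """Step 2: cycle phrases across the available IPs and record each rank."""
    rankings = {}
    ip_cycle = itertools.cycle(proxy_ips)
    for phrase, url in phrase_to_url.items():
        results = fetch_serp(phrase, next(ip_cycle))
        rankings[phrase] = rank_of(url, results)
        time.sleep(delay_s)  # throttle how often each IP is hit
    return rankings

Step 3 would simply reorder or filter phrase_to_url by query frequency from the logs before this loop runs; step 4 is whatever graphing tool you like.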

Of course, this really isn't necessary since trends can generally be seen quicker looking at a subsection of data (W% of new pages, X% of old pages, Y% topic level, Z% category level, etc).

julinho
msg:4144457 - 7:13 pm on May 31, 2010 (gmt 0)

I think Google's notion that lots of inbound links = quality content is, and has always been, fatally flawed.

Google need to accept that this is a flawed system to base their main ranking factor on.


Google have already accepted that; check out paragraphs [0008] and [0009] of this patent: [appft1.uspto.gov ]

IMO, Google's algo is relying more and more on user behavior signals.
Links will be of little help for pages which don't satisfy their visitors.
And sometimes it is not easy for a site owner to tell whether, or why, their pages are not satisfying visitors vis-a-vis the competitors' (Google, OTOH, have lots of information about that).

SUMMARY OF THE INVENTION

[0010] Systems and methods consistent with the present invention address this and other needs by identifying compounds based on the overall context of a user query. One aspect of the present invention is directed to a method of organizing a set of documents by receiving a search query and identifying a plurality of documents responsive to the search query. Each identified document is assigned a score based on usage information, and the documents are organized based on the assigned scores.
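Read literally, the mechanism in that paragraph is 'score each responsive document by usage, then sort'. A minimal illustrative sketch follows; the signal names and the scoring formula are invented here, since the patent does not specify them:

from dataclasses import dataclass

@dataclass
class Document:
    url: str
    clicks: int             # invented usage signals, standing in for
    long_dwell_visits: int  # the patent's unspecified "usage information"
    quick_bounces: int

def usage_score(doc: Document) -> float:
    """Assign a score based on usage information (made-up formula)."""
    if doc.clicks == 0:
        return 0.0
    return (doc.long_dwell_visits - doc.quick_bounces) / doc.clicks

def organize(candidates: list) -> list:
    """'The documents are organized based on the assigned scores.'"""
    return sorted(candidates, key=usage_score, reverse=True)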

tedster
msg:4144464 - 7:48 pm on May 31, 2010 (gmt 0)

@Tedster

Didn't Brett say his rankings are the same but the referrals are down due to the new Google design?

You're right, that is what Brett reported [webmasterworld.com]. It was Rand at SEOmoz who reported a long-tail traffic loss correlating with Mayday.

The full roll-out of the new Google design came on May 5 and that does make analysis more challenging.

Reno
msg:4144465 - 7:51 pm on May 31, 2010 (gmt 0)

do they not deserve to benefit

Google "owes" me nothing. The way I see it, there is a kind of "deal" between Google and all of us, meaning all webmasters everywhere. That deal is simple: They can freely present our content (text and/or graphics) and we get a fair shot in the SERPs.

Google has another kind of deal with regular visitors: "Use our search engine and we'll give you the best results".

So there's 3 parties: Google, regular users, webmasters. To G the users are the most important, as they should be for all of us too.

So using your example: if the sites replacing us are of genuine quality, then we have no real complaint, because (presumably) the algo was fairly applied to all candidates. But if, as so many have said here, those listings are populated with sites that are thin, use questionable ranking techniques, or are mostly dupes of other, better sites, etc., then there's a problem.

I think what's muddying the waters in my mind is the confusion about whether this is a completed rollout, or a rollout in progress that still has a way to go. I'm as patient as the next guy, so if we are seeing this introduced in stages, that's fine and I'll wait it out. It would be nice for someone at Google to indicate that, but if I'm hearing them correctly, they're saying it's done. That's what worries me: if it's done, then the New Google 2010 is a step backwards, and that can't be good news for those of us who try to build quality pages without the use of questionable tactics, and are now seeing our traffic drop significantly because other, lower quality sites are too often filling the top positions.

And of course there's the way the SERPs are now laid out, with a greater emphasis on sponsored results and diminished organics... but that's another discussion.

.......................

TheMadScientist
msg:4144475 - 8:05 pm on May 31, 2010 (gmt 0)

I see what you're saying better now Reno, and I'm glad you're not complaining just because you aren't getting all the traffic any more... ;)

That's what worries me -- if it's done

I think the definition of the word 'done' is probably open to interpretation... By 'done' I think they mean it's present throughout the system, so the 'integration is complete on all datacenters', which I think is a good thing, because now the 'dial turning' can begin.

So, to me, them saying it's 'done' is a good thing, because now it's an 'adjustment and fine tuning' situation rather than more implementation.

I think this is why we hear quite a bit of 'they can't keep doing this... Google's broken' while they are implementing a new piece of the puzzle and right after they get 'done': Cutts also says there are about 400 minor algo adjustments a year, and once a rollout is 'done' it can be tuned with those adjustments. So I think 'done' is subjective, and 'done' to Google may not mean exactly what 'done' means to the rest of us. 'Done' seems to imply stagnant, set, not changing, but what it really means IMO is 'there in the algo for G to adjust, edit, fine tune, update slightly, etc.'
