- I wonder what the algorithm changes are that are causing these effects.
- We all need to figure out what this specific algorithm change is and how we can fix our sites for it.
- For us it is not a long tail KW that is affecting traffic.
We manage an ecommerce site (not affiliate site) that is 3rd largest in its vertical.
80%+ of our links to this ecommerce site are from medium quality sites in terms of home page PR (PR0-PR4).
My guess is that our back links are on pages that have either lost live PR or are no longer in the G index. Basically they have either lost PR juice transfer or been filtered out by this new algorithm update, thus affecting our rank.
- We understand that some non-long tail KWs will get affected. What is at the core of the latest algorithm change? We are not seeing quality sites in the top 10 any longer... Most of the sites in our vertical are purchasing high PR links and are not affected... is that what we need to do to climb back up?
[edited by: tedster at 8:37 pm (utc) on May 30, 2010]
I agree with sGroup
there is a link component to it too. To remind people, we have had some theories about 'new' links not being counted (yet?) or Google needing to replenish its database by re-crawling the sites that link to us.
|medium quality sites in terms of home page PR ... we are not seeing quality sites in the top 10 any longer... |
Are you referring to quality sites not seen in the top 10 any longer in the same way (TBPR) as you are the quality of the sites linking to yours, or do you mean something different?
We have 3X to 4X more inbound links than anyone on the 1st page.
Our links are mostly from sites whose home page PR is between PR1-PR3.
The sites on the 1st page:
- have less links in total.
- mainly have purchased links from sites whose home page PR is between PR5-PR7.
It appears that changes in NATURAL search ranking patterns (I hate the giants favoring paid, but this is business) are working against fellow webmasters across the world?
We need to knock this change out of the park since it's an algo change and not a human evaluation. I'm seeing tons of new spam (multiple sub domain abuse) in the rankings I follow.
If you have a site that has seen significant traffic loss I'd like to know the following (and I'll answer too with one site that got hammered).
# does the site have affiliate links? answer: yes
# if yes, how many per page? answer: 15
# if yes, is nofollow used? answer: yes
# does the site repeat verbatim title and descriptions of products as worded on the affiliate site? answer: yes
# if yes does the site offer any unique content to improve the "stock" product information? answer: yes (reviews)
# is the site a thin affiliate site? answer: no
# does the site, if it is an affiliate site, have a better than 50/50 ratio of non affiliate pages? answer: yes
other questions need to be added, I'd love to see a form created to gather data from a wide range of webmasters to spot the culprit more quickly. We're dealing with an average of a change per day so we NEED to pick up the speed on figuring them out.
I have a feeling that sites that offer products for sale AND have an affiliate program are seeing their long tail traffic increase; the algo change might be to remove the middleman. I don't have enough data to conclude that just yet.
Quick point of information - this change did not only affect e-commerce sites or affiliate sites. We know from Brett that WebmasterWorld has been hit. And Rand also blogged that SEOmoz has been hit.
Your questions are aimed at affiliate sites. Not all sites on the web are affiliate sites. You may want to get some better questions together to figure out the algo change.
It is not an affiliate-site-only algo change. I think it is an inbound link value change.
# does the site have affiliate links? answer: NO
# if yes, how many per page? answer: NOT Affiliate site
# if yes, is nofollow used? answer: NOT Affiliate site
# does the site repeat verbatim title and descriptions of products as worded on the affiliate site? answer: NOT Affiliate site
# if yes does the site offer any unique content to improve the "stock" product information? answer: yes
# is the site a thin affiliate site? answer: NOT Affiliate site
# does the site, if it is an affiliate site, have a better than 50/50 ratio of non affiliate pages? answer: NOT Affiliate site
***** I'm seeing tons of new spam (multiple sub domain abuse) *****
We have been seeing the same for the last 15 months and have reported it more than 4 times; unsure why there has been no feedback!
[One reason could be that Google gives AdWords advertisers free rein to SPAM publishers across country-specific search/content networks]
|I think it is inbound link value change |
I think there's more to it than that personally, but I will say that my site has seen very little change in (Top 3) rankings or in traffic levels and also has just a handful of IBLs pointing at it, so there may be something in that. It also has affiliate links, although redirected via a nofollow script.
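For anyone unfamiliar with the pattern, a "nofollow script" like the one described usually means the on-page affiliate links point at a local redirect URL rather than directly at the merchant. A minimal sketch in Python - the slug and merchant URL here are invented examples, not this poster's actual setup:

```python
# Minimal sketch of the "nofollow redirect" pattern for affiliate links.
# The slug and affiliate URL below are hypothetical examples.

AFFILIATE_URLS = {
    "widget-shop": "https://example-merchant.com/?aff=12345",
}

def redirect_for(slug):
    """Return (status, headers) for an internal /go/<slug> link.

    On the page itself the link would be written as:
        <a href="/go/widget-shop" rel="nofollow">Widget Shop</a>
    so crawlers see only the local URL, and the redirect directory can
    additionally be blocked in robots.txt (Disallow: /go/).
    """
    target = AFFILIATE_URLS.get(slug)
    if target is None:
        return 404, {}
    return 302, {"Location": target}
```

The benefit is that every affiliate link on the site funnels through one internal URL pattern, so PR flow and crawling of the affiliate hop can be controlled in one place.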
WebmasterWorld seeing fewer long tail searches might be a devaluing of forums in general? It might also still be affiliate related, since forums repeat things like user profile links many times per page, in the same way affiliate sites send links to the same redirect page many times per page.
I did say there needed to be more questions - I only asked questions related to the site I personally saw lose long tail search traffic. That site btw has now seen its bounce rate drop 7% since the change, so maybe the algo is working - the pages visitors bounced from are ignored more now than before.
#what was your overall bounce rate? answer = before 47%, after 40%
My sense is that this is more a measure of PAGE quality than site quality. I know that there has been a long standing problem with certain pages ranking on long tail searches only because the parent domain was strong. However, when you click through the relevance was really weak or absent, even if the vocabulary was there.
Whoa - now that's an interesting change. It seems to support the idea that this change is a good one for the user. Even though the site lost traffic (and that certainly hurts sites with ad based income), it looks like the traffic you still get is better targeted. Thanks for sharing that one.
|when you click through the relevance was really weak or absent |
Can you elaborate on this some more?
Relevance is something that I see as critical , but perhaps we need to be a bit tighter in what we define as this.
Navigational structures ?
Keyword density ?
External linking ?
Internal linking ?
Some of the sites mentioned as being hit may have difficulties in their hierarchy of semantics, meaning that if the site and/or the page is supposed to be about "a", by the time it drills down to weaker pages at "z" the relevance signals have also become very weak.
My speculative hunch is that this is where the change is.
Another thought while working through all the "possibilities". Perhaps google has categorized the net. If it deems a search query to be about something in the widgets category perhaps the scope of possible longtail results is now limited to sites that fit that category. That theory would support what happened to webmasterworld. I'd be interested to know if the missing long tail search term traffic to this site could be construed as having looked for "non webmaster category" content.
That's an interesting thought. Since this is algorithmic, it would be reasonable to assume Google has been collecting data, perhaps, as you say, on a category basis. Some sort of data matching validation - site to a category database match.
I think I've been noticing this type of validation with misspellings over the last 12 months or so, which, although different in its application, provides for a similar concept of execution.
It's all highly speculative and imaginative , but it does make potential sense.
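If anyone wants to play with the idea, the category-matching theory above could be sketched roughly like this. The categories, keyword lists and site labels are invented purely for illustration - nobody outside Google knows the actual taxonomy:

```python
# Toy illustration of the "categorized web" theory: classify the query,
# then only let sites from the matching category compete for long-tail
# rankings. All categories and labels are invented examples.

CATEGORY_TERMS = {
    "webmaster": {"seo", "sitemap", "pagerank", "crawl"},
    "travel": {"hotel", "flight", "itinerary"},
}

SITE_CATEGORY = {
    "webmasterworld.com": "webmaster",
    "example-travel.com": "travel",
}

def query_category(query):
    """Pick the category whose keyword set overlaps the query most."""
    words = set(query.lower().split())
    best = max(CATEGORY_TERMS, key=lambda c: len(words & CATEGORY_TERMS[c]))
    return best if words & CATEGORY_TERMS[best] else None

def eligible_sites(query, sites):
    """Filter candidate sites down to the query's category, if any."""
    cat = query_category(query)
    if cat is None:
        return list(sites)  # uncategorized query: no filtering
    return [s for s in sites if SITE_CATEGORY.get(s) == cat]
```

Under this sketch a "non webmaster category" query simply never sees a webmaster-category site in its candidate pool, which would match the WebmasterWorld observation.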
It brought my zombie site back from top 10,000 to top 20.
|when you click through the relevance was really weak or absent |
Can you elaborate on this some more?
I'm really speaking as a search user here, not as a webmaster or SEO. And by relevance, I really mean "user intention" behind the query terms, rather than mere text matching.
It's something like this: Many pages on strong sites used to get long tail rankings apparently from a combination of a strong site and the mere presence of the query terms somewhere on the page. No proximity or semantic association between the keywords at all, just mere presence - and often in different parts of the page template.
These strong-site pages were even outranking other pages that had the exact query phrase right there in the content area. In the 1990s, I learned to put up with this kind of "matching" but who likes it? The link I clicked on wasn't relevant to my query's intention, and the need to reformulate the query, over and over again, could be quite frustrating - heck, it still is.
Early on in this Mayday discussion, I mentioned the possibility of a change or evolution in phrase-based indexing. If you add to that increased weight for content area terms, you've got the picture I'm contemplating.
|...by the time it drills down to weaker pages to " z " the relevance signals have also become very weak. |
That too, absolutely yes. Again, the template areas such as the main navigation, or user comments, or even sidebar "features" used to help generate too many long tail rankings. The deep page itself had minimal relevance to the keywords.
I guess I'm saying it would be good to look at the lost traffic at a very granular level rather than only in aggregate. Pull out some specific examples and ask whether that traffic was truly right for that page. The bounce rate observation from Sgt_Kickaxe would support this direction.
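To make the "mere presence vs. real relevance" distinction concrete, here is a toy proximity score: it rewards query terms sitting close together in the content, and gives little credit when the same terms are scattered across the page template. This is purely illustrative, not Google's actual formula:

```python
# Toy proximity scoring: query terms appearing as a tight phrase score
# far higher than the same terms scattered across the page.
# Purely illustrative - not Google's actual formula.

def min_window(tokens, terms):
    """Smallest number of consecutive tokens containing every query term,
    or None if some term never appears on the page."""
    best = None
    for i in range(len(tokens)):
        if tokens[i] not in terms:
            continue
        seen = set()
        for j in range(i, len(tokens)):
            if tokens[j] in terms:
                seen.add(tokens[j])
                if seen == terms:
                    width = j - i + 1
                    best = width if best is None else min(best, width)
                    break
    return best

def proximity_score(content, query):
    terms = set(query.lower().split())
    window = min_window(content.lower().split(), terms)
    if window is None:
        return 0.0
    # 1.0 for an exact adjacent phrase, shrinking as terms spread apart
    return len(terms) / window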
|Perhaps google has categorized the net |
IMO, they absolutely have done that. There are some patents from a while back that talked about creating an automated taxonomy for websites. I'm pretty sure something like this kicked in last year with the focus on user intention.
You may notice that some queries (1-word and 2-word) almost never show a "transactional" result, or never show an "informational" result. Other queries always show a mix. On a very high-level, that's an indication of one kind of taxonomy kicking in, but the full taxonomy is much more granular.
The takeaway for me has been not even to be concerned about ranking certain pages for certain queries - just write it off as a lost cause. Google has decided that those terms want a different type of page from a different taxonomy category, and that's that.
[edited by: tedster at 12:53 am (utc) on May 31, 2010]
Whitey, there is no question that Google has been collecting data. Log into your "Google Accounts" page https://www.google.com/accounts/ManageAccount and see all the information they have listed about you on that one page.
~ your web history unless you opted out
~ your history with every tool you've used
~ data about every blogger post you've ever made
~ Knol, subscribed links, feedburner, reader... the list is endless and your history is in each
Click on "view data stored in this account" and log into your google account. Next scroll WAY down to the very bottom and click on "show all" next to "Other Products". Your history is stored for each of those.
More importantly, click on the "Webmaster tools" link called "manage my sites"... those sites are tied to you as a webmaster too. All the info is on that one page suggesting it is all available from one central database location. I'd be very surprised if that info wasn't used in adding a quality factor to the sites because of YOU as a webmaster. If you're a known spammer do YOU hurt the quality of all sites you own?
Back on topic...
|Google has decided that those terms want a different type of page from a different taxonomy category, and that's that |
That fits with the latest change results too. I know there are owners who are on cloud 9 with all the new traffic they are getting - it has to be going somewhere - so until they speak up and their sites are analyzed, chalking it up to a lost cause makes sense. I just wish I wasn't seeing mashup sites and multi-subdomain sites with identical content ranking so well.
|That site btw has now seen its bounce rate drop 7% since the change so maybe the algo is working. |
Bounce rates dropping do not necessarily mean folks are finding what they are looking for. If I search and find what I need right away, I then leave the site (higher bounce rate). If I don't find what I'm looking for on the first page I land on, I poke around a site (lower bounce rate).
So it is really hard to say if such stats show an improvement in the algorithm or a deficiency.
I've been observing sites that have used either keyword-specific URLs and/or keyword-specific navigation of more than three words to form good rankings in the long tail.
From Google's point of view, there may have been a perception of "manipulation" in the results that came from these, and in many instances it didn't always reflect in the quality of the content. If this theory is correct, then maybe Google considered that it could do a better job of deciding what those pages were about.
So could it be that Google's data matching capability that better analyses on page content and other signals has overtaken these tactics , or at least weakened them , as one element of the change ?
Time will tell max.
Another observation - copy and paste an eBay auction title into search and the results are different.
Before - the top results invariably were eBay and whatever affiliate site had copied the auctions onto their site and used the titles as their page titles.
After - specific words are pulled and official sites for those words are returned. A product named example returns example.com, not the entire exact match string.
Perhaps long tail exact match in titles carries less weight and/or a title's core subject (as determined by Google) matches official sites more often. This change may be a simple title-weighting tweak.
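As a thought experiment, that speculated title tweak could look something like this toy scorer, where an exact query match in the title is diluted by every extra word in the title. The damping formula is invented for illustration only:

```python
# Toy title scorer: an exact query match in the title is worth less as
# the title grows longer, so stuffing a whole auction title into <title>
# no longer wins on that match alone. Formula invented for illustration.

def title_match_score(title, query):
    title_words = title.lower().split()
    if query.lower() not in title.lower():  # naive substring match
        return 0.0
    # the fraction of the title that the query actually accounts for
    return len(query.split()) / len(title_words)
```

A title that IS the query still scores 1.0, while an eBay-style title padded with "Rare Vintage ... NIB Free Shipping" scores a fraction of that, which would push the short official-site title above the copied auction titles.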
Sgt_Kickaxe - I sense we have similar observations , even though you are able to be specific.
My hunch is that if we look at factors like the one you are describing , more science may come into the discussions.
Again, these early observations taste of the relevance signals that may have been tweaked. Why would Google want eBay determining what results can appear when it prefers to control the output itself? More AdWords revenue benefit maybe, with higher click-throughs.
Never lose sight of the commercial benefit that drives Google in its search for more revenue by controlling the funnel.
|Never lose sight of the commercial benefit that drives Google in its search for more revenue by controlling the funnel. |
The key factor that drives Google revenue is people depending on Google as a search resource, over and over again. This means that the organic results need to be as good as Google can get them to be.
So I strongly doubt that Google would intentionally sacrifice organic quality to drive more PPC clicks. Yes, I know that many have expressed that suspicion and over a long time. I just don't buy it, mostly because it would be short-sighted and essentially stupid.
I also really do believe the input we get from Google people, over and over again, that the paid advertising side has little to no interaction with the organic side. I've strongly considered the opposite point of view - this isn't blind faith.
I've even "aggressively interviewed" several Google engineers and AdWords/AdSense reps at various conferences. I really am convinced - algorithm changes are aimed at improving the organic user's satisfaction, even when they miss the boat.
I can see why Google did this - it's a step in the right direction. Vanessa Fox mentions this too, and again it makes a lot of sense.
No matter what certain sites published, it was an instant hit. The power of the site was instantly passed to each page.
All Google is doing is leveling the playing field a bit.
If you're right, tedster, we need to learn what metrics Google is now relying on more than they were before. For such drastic changes to have unfolded on many sites, the change isn't trivial.
Perhaps it's time to unhook our sites from various services and start with a clean slate on these downgraded sites since they were just downgraded without changing.
I'm very tempted to remove all search engine specific services and remain as unbiased with this site as possible. That would mean...
~ removing feedburner and going with a generic script for rss or not offer it at all.
~ removing nofollow tags completely as they are specific to only one search engine which would force user content to be moderated or link free.
~ removing specific social networking services and going with only the non search engine company related standards (twitter/facebook etc).
~ removing search company based search features and going with an in-house version like WW has.
~ removing analytics and using 3rd party software that has no affiliation to a search company.
etc.. (wow, I can think of 12 other search company features right now - sure are lots).
With a clean slate I could test for effect by adding only one at a time, or realize I'm happy without them. The other benefit is cutting off the data gathered about a site, which is obviously used to grade the site, and forcing search companies to use less invasive measuring tools. While that sounds drastic, we've been given no data as to what constitutes data that adds to a site's quality rating and what doesn't. When in doubt, do nothing, the saying goes. Having the site do nothing until it is proven to add to its quality makes sense. Surely ranking factors can't be tied to using a search company's products anyway, right?
Webmasters are always asking which features can improve their sites; perhaps it's time to figure out if removing any of them removes any "poor quality" signals. Does having fewer than 500 RSS subscribers hurt? 250 readers? Keeping all these things OFF your site until it receives a lot of daily traffic might be better than offering features that scream "barely used".
[edited by: Sgt_Kickaxe at 2:08 am (utc) on May 31, 2010]
|algorithm changes are aimed at improving the organic user's satisfaction |
..... could be , but in my opinion they go hand in hand.
turning back to the relevance issue, something that strikes me about some of the sites that have been mentioned is that they lose relevance signals as the content builds up. For example, in this thread, the fresh data gets indexed almost immediately, but as it gets older, and more pages are created with only pagination links, the hierarchy loses its key indicators. Same for structures like Rand's site - although from memory the methods of ageing/archiving pages are different. I wonder if those who have experienced similar losses have similar issues (or indeed if it is correct).
The concept may also be similar for folks who cannot get pages to serve results from day one. So on the one hand sites that grow can lose relevance, and on the other hand sites that create pages from scratch without all the relevancy indicators get ignored when too far removed without support from the root domain.
[edited by: Whitey at 2:06 am (utc) on May 31, 2010]
>> My sense is that this is more a measure of PAGE quality than site quality.
Tedster, I agree with you on this. On our main e-commerce site (not affiliate) we continue to rank for the major terms which are super competitive and have *high* quality inbounds.
What we've lost out on are about 10-15% of our users who used long tail terms to find us ... think product titles, "buy whacky green widget in location". Those product pages are algorithmically weak - dupe descriptions with little value content (SKUs, Weights, descriptions, manufacturers etc..). Over 3 million of these pages.
To the person looking for that widget in our specific location, the page is VERY helpful and we converted at a significantly higher rate on those searches than others. However, I can see how they're algorithmically weak.
Another site that has lost a few percentage points in traffic is a large forum that we run. Again, highly relevant content for someone looking for something obscure but these pages are not encyclopaedic in nature - both in terms of content and variable deep inbound links. The forum topic would be something like "where can I buy a NTSC video convertor in location" and would rank for "ntsc video convertor location". Hard to describe this, as we have over 100,000 similar long tail searches over a month and most of them are so low frequency that they're not worth tracking.
New inbound and onsite strategy needs to be put into place to get external juice and distribute internal juice better than we have in the past.
I should add that there is very little or no change on our smaller sites (100-2000 pages).
|Another site that has lost a few percentage points in traffic is a large forum that we run. |
Any similarities to the forum examples provided ?
No keyword specific URLs. Too messy in my opinion.
Onsite anchor for forum topic would be "Where can I rent a furry green halloween costume?", title would be "Where can I rent a furry green halloween costume - location".
Forum and shopping sites tend to have a very hierarchical navigation ... "main page -> category -> sub category -> topic / widget". The internal link distribution relies on breadcrumbs and navigational links. Encyclopaedic / wiki type sites have a huge amount in content linking.
I have not tested this... and to be honest, I'm not even going to bother too much with this, as we've seen similar things happen in the past and it generally tends to even out as we keep working on our link building and generally improving our site, page and content navigation.