| 3:08 am on Feb 27, 2011 (gmt 0)|
|The 'lack of complete sentences' was and still is the reason for a -950 re-ranking. |
Over-optimization, or spending too much time on SEO forums, as Matt Cutts put it [seroundtable.com...]
| 3:10 am on Feb 27, 2011 (gmt 0)|
Sooooo, eHow is not a content farm according to Google. The end. Time to start hiring cheap writers.
Back to work guys. Nothing to see here.
| 3:14 am on Feb 27, 2011 (gmt 0)|
I have done my best to express my opinion here without coming right out and saying it, which I won't do. SEO all of the websites you want and you will see that the future belongs only to sites that are producing, or to ones that are producing nothing at all. A middle-of-the-road website (producing average revenue) will seldom place at the top unless it has a recognized or unique name. This is why fresh content is always needed; viewed this way, old content will not survive. A good idea for websites that sell products is a "deal of the day", which you are seeing a whole lot more of. Even for that to place, you all know what needs to happen. You will see free rapidly become a thing of the past, and "charging" will be the "new free".
| 3:20 am on Feb 27, 2011 (gmt 0)|
|Other queries only need 3 quality results |
Yeah, eHow, About and Wikipedia enough times in a row could drive anyone batty enough to use Bing! lol
|Why else would a company change their algo? |
Query deserves freshness (QDF), user behavior, comparative analysis?
|eHow is not a content farm according to Google. |
Nope, doesn't look like it... And keep in mind, their little JS link trick to artificially increase the PR passed through the internal links GoogleBot actually gets to see doesn't seem to be against Google's guidelines either.
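For anyone unfamiliar with the JS link trick being described: links written into the page by JavaScript are invisible to a crawler that parses HTML without executing scripts. A minimal Python sketch of the effect (the markup, URLs, and element names here are invented for illustration, not eHow's actual code):

```python
from html.parser import HTMLParser

# Hypothetical page: the "Source" credit link is written by JavaScript at
# render time, so a parser that does not execute scripts never sees it.
page = """
<div id="credit"></div>
<script>
  document.getElementById('credit').innerHTML =
    '<a href="http://example.com/original-source">Source</a>';
</script>
<a href="/internal-page">Internal link the bot DOES see</a>
"""

class LinkCollector(HTMLParser):
    """Collects href values from <a> tags in static HTML only."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Script bodies are treated as raw text by HTMLParser, so the
        # anchor inside the <script> block never reaches this handler.
        if tag == "a":
            self.links.append(dict(attrs).get("href"))

collector = LinkCollector()
collector.feed(page)
print(collector.links)  # ['/internal-page']
```

A crawler that skips script execution, which is how GoogleBot's crawling was widely understood to work at the time, records only the plain HTML anchor.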
|Back to work guys. Nothing to see here. |
| 3:57 am on Feb 27, 2011 (gmt 0)|
robert76, I'm sorry for asking questions that I can see now were too intrusive. I just thought that, with your site being one with 100% original hand-written content, but one that still suffered, we might be able to consider some other factors.
On the "stale old site" issue: several posts back I mentioned a site I did for a retailer in 2001 that's been sitting on page one for years for a certain phrase, despite the fact that nobody has touched the site. Well, in checking other key phrases I found that site on page one for other phrases as well. It's like an old anvil that won't budge.
In looking at phrases in other niches that I was tracking for a long time last year, I see that some of the sites that were page one for certain phrases now cannot be found even in the first 150 results for those phrases. They didn't just get demoted, they were vaporized.
| 3:59 am on Feb 27, 2011 (gmt 0)|
Something that tedster said some time ago was to make sure your site is an actual business. There are many small things that can show it is an operational, thriving business that is useful to users, not just something put up for the sake of converting leads or driving the traffic elsewhere. Here is a checklist of some things your site should have that might win you points with Google.
1. Unique content (duh! but not everyone gets this; if your content is not unique, don't cry that you are not ranking well)
2. Quality content. Is your content useful? Was it made for your visitors or for Google only?
3. Fresh content. Is your site updated at least once a week? If not, try adding a blog to your site and blogging once or twice a week. This tells Google the site is being cared for and regularly updated, not something that was last touched 2 years ago with info that might be out of date.
4. Social media. Is your site mentioned on Facebook, Twitter, etc.? This shows Google that people are sharing your site, so it must be useful.
5. Phone number / contact forms. This shows that you can be reached and are a legitimate business, not just a site driving traffic to a 3rd-party site.
6. Is your site the source for a given product, or is your goal to get your visitors to a paid site where you earn a commission? If you are driving referrals, don't be surprised to see Google cut you out as the middleman.
7. Look for alternative means of promotion such as video marketing, forum discussions, etc. The more places your site/business is mentioned, the better you look as a legit business.
| 4:00 am on Feb 27, 2011 (gmt 0)|
|I don't think that's the mechanism. Many big name sites that lost rankings are not using copied content |
Tedster, they might not have copied content, but their content is published everywhere, and this update seems to have impacted sites whose content is republished partially or fully through feed aggregators or copycats.
AlyssaS has a good point on the dial down of internal links.
1) It impacts site owners (many of them individuals) who usually don't get the external links, compared with articles from big business houses. But where do the big business houses get their links from?
1) Their own blog/site farms.
2) They also get the bulk of their external links from people who prefer to link to authoritative sources rather than to the sites those articles often credit as "Source" at the bottom of the page.
Yes, they do link back to the original source at the bottom, but they often do it with just the blog name. Some, like eHow, take it to the extreme of crediting you with JS links, since GoogleBot is "link blind" to those neat designs.
Why do people prefer linking to them and not the original source?
They are authority sites in the eyes of Google, and Google/Matt Cutts encourage people to link to authority sites.
Hence, they get the external links while our poorer "Source" doesn't. As time passes, pages from our poorer "Source" get buried while pages on those big authority sites remain at the top of the SERPs.
What form of SEO does the poorer "Source" resort to?
Highlighting his Popular pages on Home page and sitewide.
What has Google done now?
Through this update, Google is helping the authoritative ones further by dialing down internal links. Internal links are the poorer "Source's" best form of SEO; they link to their best content to keep themselves afloat in the SERPs.
What is stated in Google's official blog?
|This update is designed to reduce rankings for low-quality sites |
The poorer "Source" falls within this block. They often have internal links to their best pages, which they want to highlight on the home page or sitewide.
Why do they want to highlight?
Because it got referred by the "authoritative sites".
|At the same time, it will provide better rankings for high-quality sites |
The authoritative sites/blogs get the external links at the expense of the poorer "Source", and they get a further boost from Google through this update.
|We can’t make a major improvement without affecting rankings for many sites. |
Through this update, Google was targeting content farms that keep themselves afloat through lots of internal links. But in the process they are also killing our poorer "Source".
The above Google statement does seem to be an admission that they are aware of how the "internal links dial down" will destroy the poorer "Source".
The net result:
1) Authoritative sites which re-post stuff from the "Source" sites flourish in this update, as they have the bulk of the external links.
2) Authoritative sites that get a lot of external links through their own site/blog farms flourish in this update.
I guess dial down of internal links may be an understatement.
| 5:33 am on Feb 27, 2011 (gmt 0)|
dataguy: not necessarily. You could have been penalized across the board, but some of your competitors for specific search queries could have been hit worse, thus giving you an increase for those pages. Also, the algo looks like it promoted a good bit of spammy material into the SERPs by virtue of removing a lot of the good players... Earlier, one person was reporting a data center that was still running the old algo. You could do a side-by-side and see what things look like.
| 5:40 am on Feb 27, 2011 (gmt 0)|
Indyrank, not a bad theory here, but I'm not sure G would want to strictly devalue internal links. They are a valuable source of information about what the most important part of the site is. I think this is a sitewide aggregate penalty, not simply a link devaluation.
Case in point: several of our pages have dozens of quality external links from related sites coming into them - in fact, more high quality external links than competitor sites - but in this new algo, those competitors (some of them truly ridiculous) are coming out on top... apparently because their sites are not affected by this penalty.
| 5:42 am on Feb 27, 2011 (gmt 0)|
Remember that this has been rolled out only in the U.S. The increase for those fewer pages might have come from other geographies. You need to do a geographical study of the traffic as well.
| 5:49 am on Feb 27, 2011 (gmt 0)|
|but I'm not sure G would want to strictly devalue internal links. |
It might not have been completely devalued, but dialed down to a great extent.
|Case in point: several of our pages have dozens of quality external links from related sites coming into them - in fact, more high quality external links than competitor sites |
1) As you rightly said, the value of those links depends on the volume and the authority of the sites they come from - quality.
2) Many now resort to 301 redirects. They build links to some pages or domains and redirect them to the pages that rank. So it isn't easy to find external links these days. They might have a huge volume of links from a number of crappy farms.
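A toy sketch of the 301 tactic described above, with made-up paths (`/old-page`, `/money-page`): links are built to one URL, which then permanently redirects to the page the owner actually wants to rank. This only shows what a crawler observes when it requests the redirected URL; it is not anyone's real setup.

```python
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class RedirectHandler(BaseHTTPRequestHandler):
    """Serves /old-page as a 301 pointing at /money-page."""
    def do_GET(self):
        if self.path == "/old-page":
            self.send_response(301)  # permanent redirect: link equity is
            self.send_header("Location", "/money-page")  # meant to follow
            self.end_headers()
        else:
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"ranking page")

    def log_message(self, *args):
        pass  # keep the demo quiet

# Run the toy server on a random free local port.
server = HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# A crawler requesting the linked-to URL sees the permanent redirect.
conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("GET", "/old-page")
resp = conn.getresponse()
print(resp.status, resp.getheader("Location"))  # 301 /money-page
server.shutdown()
```

The point being made in the post is that the links you can see pointing at a ranking page may only be a fraction of what is actually feeding it through redirects like this.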
| 5:57 am on Feb 27, 2011 (gmt 0)|
What will sites like EzineArticles do now?
They now have the option to become another eHow.
They might also look for external votes as they do have the money to get them.
Who will be killed by this update then?
1) Contributors to these UGC sites, as they might shut the doors on them like eHow did. You might have seen what happened to the contributors to the Huffington Post. All they could do was create a Facebook page and bring in traffic to Facebook.
2) The poorer "Source", who will continue to be the "Source" for these big authority sites without anything in return. Would you expect anything anymore? Nope, you just can't.
| 6:34 am on Feb 27, 2011 (gmt 0)|
Guys and Gals,
The new Google does not give a $&%* about copied/duplicate content... (as much as it used to.)
| 8:41 am on Feb 27, 2011 (gmt 0)|
My first post here...hoping my data can be of some help to us all.
I've got about ten Advanced Web Ranking projects in different industries, each with 10-25 keywords and tracking 5-10 sites. So I'm seeing about 150 keywords and 80 or so sites over a variety of niches. Most of the niches don't appear to be affected other than the normal week to week movement. My e-commerce niche was though, and my site took a massive dive across the board. If we're unable to get back up we'll be out of business. It wasn't only my site. A number of my competitors also lost badly, across the board. A few "new" sites popped in, and a couple of the usual players benefited across the board...more than just rising due to other sites falling.
My e-commerce site is paired with a real physical business at the same physical address (although owned by another family member and a separate company accounting wise). The physical business has been around since the 70's, is featured in tour books, has been on the news, etc. It's a real brand, and our site gets brand searches...probably more so than most of the competition. My family has run and lived off this site for the last 6 years. All of the content is unique and was written by us. All of the pictures are ours, etc. This site is NOT populated by manufacturer descriptions, there is no duplicate content, no low quality text, etc.
I've been reading the theories regarding why sites have taken SERP dives...
Usage Data/Time On Site/Bounce Rate: I have a hard time imagining this could have much to do with anything, as our site has a larger selection and looks as professional and well designed as the other sites that are still ranking... and there's no reason any other site in our niche would have substantially better user-data indicators, if any difference exists at all.
On Site Content: Our content is all entirely unique. We're probably now above average with the amount of content we have on our site compared to what's currently ranking, but not by a lot. I imagine the "grade level" of our writing would be higher than average. My wife wrote most of it and has a PhD. I can't see how on-site content would have contributed to our massive across the board loss.
Links: My opinion is that this update is related to links. Some Googler, referenced in this thread I believe, stated a few weeks back that something big was coming with the way Google values links. Here's what I'm noticing regarding links and anchor text in THIS NICHE:
The new sites that are now ranking often have very little to zero links to the ranking page. (I know deep pages began ranking a couple months back based on links to the home page/other pages.) While these new sites do have links to the home page, some have zero links with the anchor text that matches the query. For example: if there's a page ranking for "red widgets" there are no links to that page and no links to any other page of the site with the anchor text "red widgets".
Some of these new sites are brands (Walmart, Lowes, etc.). Some of them are older sites with lower quality content. Some of them are new, spammy affiliate sites with terrible content. What they have in common (minus the affiliate sites) is that they have very few links and not much visible on-site or off-site SEO.
Then, there are the usual players who have the greatest number of spammy links. However, the common denominator appears at this point in time to be high PR, obviously paid links. So I'm seeing sites ranking with the greatest number of obviously paid links, and other sites ranking with little to zero links (and/or anchor text links for the given query). It seems to be two opposite ends of the spectrum.
My site does have some paid links. Prior to this "update" you couldn't rank in my niche without having some paid links, regardless of what anyone might like...Google included. If you didn't get paid links of some sort, you didn't rank...simple as that.
The only thing I've discovered that makes our site an outlier is the ratio of unique linking domains to total links. We do some advertising on industry related sites...site wide...which has greatly skewed our ULD to TL ratio. I'm wondering if this could have been the problem.
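To make the unique-linking-domains-to-total-links (ULD:TL) ratio concrete, here is a sketch with invented domains and counts: two site-wide ad placements can dominate the total-link count while contributing only two unique domains, exactly the skew being described.

```python
from urllib.parse import urlparse

# Hypothetical backlink list: two site-wide advertising placements plus a
# handful of editorial links. All domains and counts are made up.
backlinks = (
    ["http://industry-site-a.example/page%d" % i for i in range(500)]    # sitewide ad
    + ["http://industry-site-b.example/page%d" % i for i in range(300)]  # sitewide ad
    + ["http://blog-%d.example/review" % i for i in range(20)]           # editorial links
)

total_links = len(backlinks)
unique_domains = len({urlparse(url).hostname for url in backlinks})
ratio = unique_domains / total_links

print(total_links, unique_domains, round(ratio, 3))  # 820 22 0.027
```

Whether Google actually uses such a ratio is speculation, but the arithmetic shows how a couple of site-wide placements make a link profile look very different from one built link by link.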
In any case, if anyone from Google is reading this: High quality sites and businesses have been trashed by this update. People will lose jobs and businesses will close. These businesses are NOT content farms, do not have low quality content, and are most often MORE relevant than the big brands and low quality sites that have replaced them. Please fix this.
| 8:59 am on Feb 27, 2011 (gmt 0)|
maxmoritz, I also think this update is about external linking, more precisely sitewide linking. And the main problem is that we are actually talking about on topic linking, no links from link farms or bad neighborhood. To me, it looks like the big G is trying to force us to remove (or ask for removal) our links from sites that link to us just to rank in their engine.
| 9:06 am on Feb 27, 2011 (gmt 0)|
And here is a stupid example: a search for "international widget" returns many results with company names like: "mexico international widget", "*** International | lake tahoe widget", "santa fe widget".
| 9:11 am on Feb 27, 2011 (gmt 0)|
|maxmoritz, I also think this update is about external linking, more precisely sitewide linking. |
Well I kind of hope you're right, because then it would be relatively easy to fix. However, I think there's more to it than that.
To be more concise: From what I'm seeing this update seems to be about links and it seems to affect entire sites.
| 9:59 am on Feb 27, 2011 (gmt 0)|
No, not entire sites, mainly the pages that are linked get devalued. Of course, this can be a sitewide effect if we are talking about your homepage that gets linked the most, all internal linking from your homepage gets devalued in this case.
| 12:04 pm on Feb 27, 2011 (gmt 0)|
|The site lost its site links that Google showed, the PR dropped to 0 (from 3), and in Webmaster Tools the Crawl Rate and the Crawled Pages per day dropped to 1/3 of what they were the previous day. |
I just posted the following thought in the linked thread, but wanted to post it here for people who haven't been around as long or seen this type of an update before.
The above describes things very similar to what we used to see quite a bit of during updates, because certain systems and filters would get turned off while the update was in progress and then get added back in after it was completed.
I'm tellin ya there's still more to change...
| 12:17 pm on Feb 27, 2011 (gmt 0)|
I'm experiencing a drop again; probably it's rolling out in the rest of the world already. Anybody experiencing the same?
| 12:33 pm on Feb 27, 2011 (gmt 0)|
Apologies if already posted, although I don't think it has been: EzineArticles' response to this algo change and - after analysing its effects a bit - what they're doing to change things:
Might contain some useful information since they're one of the big sites to get hit a fair bit by the Google update.
Two of the big changes for people who use EZA are that they'll be raising the minimum word count, and they're going completely no-follow (which itself will knock some sites, since I know some - albeit unwise - marketers who rely solely on EZA for exposure and backlinks).
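Mechanically, "going completely no-follow" just means every outbound anchor gains `rel="nofollow"`, a hint that crawlers should not pass link equity through it. A crude illustrative sketch follows; the article markup is invented, and a real site would set the attribute in its templates rather than regex-rewriting stored HTML:

```python
import re

# Hypothetical stored article HTML with a dofollow author link.
article_html = '<p>More tips at <a href="http://example.com/widgets">my site</a>.</p>'

# Add rel="nofollow" to every anchor tag. This naive pattern assumes no
# anchor already carries a rel attribute.
nofollowed = re.sub(r'<a\s+', '<a rel="nofollow" ', article_html)
print(nofollowed)
# <p>More tips at <a rel="nofollow" href="http://example.com/widgets">my site</a>.</p>
```

That one attribute is the whole reason the marketers mentioned above would lose their backlink value overnight.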
Fair play to Google - whilst there are a bunch of false positives which need sorting ASAP IMO, I didn't think that Google would come along and truly sort out the increasing mess of low quality content farms we've been seeing recently.
It does look as though they're serious about the issue, and I still think there's other big algo changes (relating to backlinks) to come.
All in all, if/when Google sorts out the false positives, this could be quite a good thing for the web.
| 12:54 pm on Feb 27, 2011 (gmt 0)|
10-year-old site handled personally by me at all times. Home improvement field. 98% of the content on the site is written by me based on on-the-job experience (getting sweaty and dirty). The rest is from contributors who work their butts off in a labor-intensive field. Some major sections go into detail on how to do certain procedures; others are information about a certain application, product or tool, once again from my personal field experience. I am not a sit-behind-the-desk kind of guy who does copy research. My research is on-the-job experience. Did quite well with Google. One major setback in 2004, returned to normal one month later.
Yes, many eHow links... about 140 at last count. Approximately 100 sites recognized as copying content six months ago. Links? Low by the standards mentioned here. About 800 across 400 sites.
I think anyone could classify the content as unique. After all, how many people in this crazy game would go to the lengths of actually learning a business first hand? People are generally lazy, and it takes far too much work, especially in my trade. Traffic loss: 38%. Good living, and I can wait this one out hoping for adjustments. I am patient.
| 1:07 pm on Feb 27, 2011 (gmt 0)|
I don't know how you have any eHow links GoogleBot sees?
| 1:10 pm on Feb 27, 2011 (gmt 0)|
You're right... the count was from a few months ago. I was merely trying to show I am on the eHow regurgitation end too.
| 1:12 pm on Feb 27, 2011 (gmt 0)|
Ah... Thanks for the clarification.
Just to be sure, you HAD those links from ehow and now they don't show as backlinks, correct?
| 1:12 pm on Feb 27, 2011 (gmt 0)|
So EzineArticles says they are not a content farm? I guess no one, and I mean no one, has a website which is a content farm, then.
| 1:14 pm on Feb 27, 2011 (gmt 0)|
I'm starting to wonder more and more about the ad-to-content ratio. Some of the sites which took the biggest hits (EZA, WiseGeek) have a fairly large ad-to-content ratio. In fact, I've seen some EZA and WiseGeek pages which have more text in the ads than in the page's content.
But at the same time, I personally think that this update is fairly complex and brings in a massive array of different factors and indicators (which would be consistent with the fact that different sites and different types of sites have been hit, and that Google spent perhaps a year on this update).
I think it might well look at everything from SERP bounce rates (going to a result, then coming back soon after to click on other results) and ad-to-content ratios to quality of content and backlink profiles/signals... and much more. And with a fair few false positives, too.
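As a rough illustration of the ad-to-content ratio idea, here is a sketch that compares the text length inside ad-marked elements with the rest of a page. The class name `ad` and the sample markup are invented; a real measurement would also have to identify ad iframes and script-injected units.

```python
from html.parser import HTMLParser

class AdRatioParser(HTMLParser):
    """Counts characters of text inside elements with class "ad" vs. the rest."""
    def __init__(self):
        super().__init__()
        self.ad_depth = 0       # >0 while we are inside an ad element
        self.ad_chars = 0
        self.content_chars = 0

    def handle_starttag(self, tag, attrs):
        if self.ad_depth or "ad" in dict(attrs).get("class", "").split():
            self.ad_depth += 1  # entering an ad element or a child of one

    def handle_endtag(self, tag):
        if self.ad_depth:
            self.ad_depth -= 1

    def handle_data(self, data):
        text = data.strip()
        if self.ad_depth:
            self.ad_chars += len(text)
        else:
            self.content_chars += len(text)

# Invented page where the ad copy outweighs the article body.
page = (
    '<div class="ad">Buy cheap widgets now! Best widget deals online!</div>'
    '<p>Short article body.</p>'
)
parser = AdRatioParser()
parser.feed(page)
ratio = parser.ad_chars / (parser.ad_chars + parser.content_chars)
print(parser.ad_chars, parser.content_chars, round(ratio, 2))  # 48 19 0.72
```

On a page like this, nearly three quarters of the visible text is advertising, which is the kind of signal the posts above are speculating Google may now be weighing.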
| 1:50 pm on Feb 27, 2011 (gmt 0)|
I have a suggestion if anyone wants to follow it. I am in a niche where, to my embarrassment, mine is the only site to drop off of page 1 for my major term -- I also fell anywhere from 5-20 places for dozens, maybe hundreds of other terms.
My competitors all seem to be doing fine. So why have I fallen and they have not? My site is every bit as good as theirs in terms of content, when a human being reads it. In fact, far better than several of them.
I am going to do some more research on this later today. But I will be the first to say I am not the best at this kind of research, not being a web developer. I did not build my site. I know quite a bit about SEO, but I'm geared more to the public relations side of things. I am a writer of content, more than anything (which makes my current situation kind of ironic, no?).
Anyway, if anyone else wants to use my site and niche as a test example, just let me know and I will give you the niche and the keywords etc. As I said, it provides a test case where just one site out of the top dozen or so has been hit. Since most of the sites have similar content, there may be something about my site that I am overlooking, but which a more expert SEO would spot quickly.
Let me know if interested.
| 2:40 pm on Feb 27, 2011 (gmt 0)|
What I liked best in that blog article from EzineArticles is the striking out of this line:
|The rel=”NOFOLLOW” attribute will be added to all links on all articles very soon. |
What a comedy. They expected people to contribute without a dofollow link. "Oh, we made that statement in a hurry. People won't come to EzineArticles then. So let us strike it, as there are people already opposing it in the comments section."
Let us maintain calm now and then implement what eHow did. Wow... kudos, folks...
| 2:45 pm on Feb 27, 2011 (gmt 0)|
I remember when nofollow was first introduced, EzineArticles came out with a statement that they would never use the nofollow attribute in their writers' links, it would be a breach of trust between them and their users. Go figure.
What's funny is that a lot of people believe the use of the nofollow attribute is what triggered this penalty, and now EzineArticles wants to start using it. Good luck to them.
| 2:48 pm on Feb 27, 2011 (gmt 0)|
I've got a co.uk site, and this morning almost 90% of my traffic disappeared. The strangest thing is that the same thing happened on the 21st of Jan; then on the 2nd of Feb the traffic returned, and after a short period of joy and happiness, today the story repeats. Basically, for almost all my important keywords the site is nowhere to be found, or if I find it, it's on the 18th page. But for some of the keywords (approx. 1%) it's firmly on the 1st page, even in 1st place. Last time I thought the reason was the huge design changes I made, which affected the entire site's URL structure. Then after 2 weeks Google decided that my site is very good regardless of the changes and returned it to its old positions in the SERPs. Well, I haven't changed anything since then, and honestly, today I will have to buy a gun :-( that might help me solve this...
| This 245 message thread spans 9 pages: < < 245 ( 1 2  4 5 6 7 8 9 ) > > |