Google SEO News and Discussion Forum

Google Referals Down and Dropping More
carfac
1:18 pm on Jul 28, 2010 (gmt 0)

I have gotten a feeling of paranoia recently, with regard to Goog...

I run a clean site. It is a large, informational site. Sure, I have links to Amazon and such where appropriate, but I make most of my income from ads. My site is deep, with about 110K main topics covered. From each of those, there are probably 4-6 sub-pages. Original content, no penalties ever. Been on the web over ten years, and generally considered an authority site, I guess.

So, since March, I have seen a slow but steady bleed-off of referrals from Goog. Late April there was a big hit - maybe 15% in a week right there. I look at my logs, and I see it continuing, albeit a bit slower.... but still losing traffic.

So I have not been sitting idly by. I have gone over my site, and refined it. I have made my meta tags much more accurate and on point. I redesigned my title tags to be more informative (took out the site name, added what I considered more pertinent info). With 110K pages, it is a little hit and miss with my CMS to get accurate tags across all my different data types, but I think I have done well in that regard.

I crawl my site, and make XML feeds of my updated pages for Goog, and give them to her. I have also added a new series of pages (on the order of 110K x 4) so there is new, fresh stuff.
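
For anyone who has not seen one, the feed is just standard sitemap XML. A minimal sketch, with a made-up URL and date:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/some-page.html</loc>
        <lastmod>2010-07-20</lastmod>
        <changefreq>monthly</changefreq>
      </url>
    </urlset>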

Goog is taking the feeds. They show 878K pages sent, 420K in Google, which is higher than it ever has been. A "site:my.site" check on Goog shows 106K results (a month ago, this was 130K; a year ago, 200K). I have addressed most concerns in Webmaster Tools (dup titles, dup meta tags), and have no crawl errors. Goog shows 30K pages a day crawled, and this looks to be headed up, while my time spent to d/l a page is dropping.

It seems like I am doing everything right, and I am still bleeding traffic. I am at a loss as to what to do. If someone would care to see my site, I am happy to PM it to you. Any ideas what I can do?

Thanks!

Dave

 

morehawes
2:14 pm on Jul 28, 2010 (gmt 0)

Sounds like a webmaster's worst nightmare. Have you been through every one of the Google Webmaster Guidelines and ensured you are meeting every one? What is your hosting situation like - shared/dedicated? Do you host other websites, and if so, have you ensured that these are all clean too?

How about page load speed? I know G is taking more of an interest in this these days.

carfac
2:22 pm on Jul 28, 2010 (gmt 0)

Thanks for your reply

>>> Have you been through every one of the Google Webmaster Guidelines and ensured you are meeting every one?

Doing my best! I have looked at all the "HTML suggestions" and addressed each of those... Also, like I said, we are a 12 year old site with NO problems ever. Google has always seemed to like us.


>>> What is your hosting situation like - shared/dedicated?

I host on my own servers... MySQL back end and Apache front. Up 100% of the time, and the error logs show no outages or problems I am not aware of.


>>>> Do you host other websites and if so have you ensured that these are all clean too?

Yes, I do host a couple, all shiny clean, and all on separate IPs. Mostly they are mine and my wife's photo sites (personal) that are locked off from Goog anyway. A couple of others are professional, but high quality, no bad links...


>>> How about page load speed? I know G is taking more of an interest in this these days.

Possibly, but not bad. Goog shows us at 5-8 seconds a page, which is more than they like, but not terribly bad. I have been adding Facebook plug-ins and such, and those take FOREVER to load.... but I am seeing a positive response from them, too. I had 5 people "Like" us overnight.... and have been getting 2-3 a day since I added some plug-ins last week. But they do slow loads down...

carfac
2:24 pm on Jul 28, 2010 (gmt 0)

Oh, I forgot. Over the past month, I am starting to see a dip in Yahoo... does Yahoo follow Google in any way? Bing shows a strong, steady uptick, as do Ask and others.

As for what I have done with titles... let's say my site is about animals (it is not). Titles used to be on the order of "Lions @ My Site" - thinking I needed to promote my site name. Now, they are more like "Lions - Savannah Animals of Africa (Felix Lupis)" - MUCH more info there. This has been up over a month.... and I am seeing pages in this new format cached by Google... but I am not seeing these affect my place in the SERPs.

I used to generate fairly generic keyword and description tags. Now I have worked "lion", "Savannah", "Africa" and "Felix Lupis" into the lion page, and similar words into the other "animals". I make each of these words prominent on EACH page now, too.
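
So, in markup terms, the head of the "lion" page now looks roughly like this (illustrative only, not my real markup):

    <title>Lions - Savannah Animals of Africa (Felix Lupis)</title>
    <meta name="description" content="Facts about the lion (Felix Lupis): coloring, habitat and range across the African savannah.">
    <meta name="keywords" content="lion, savannah, Africa, Felix Lupis">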

morehawes
2:37 pm on Jul 28, 2010 (gmt 0)

Doing my best! I have looked at all the "HTML suggestions" and addressed each of those... Also, like I said, we are a 12 year old site with NO problems ever. Google has always seemed to like us.


Do a Google search for "webmaster guidelines" and you should see their help page. It's broken down into three sections, and G says that violating any one of these can cause a problem. I know you haven't had any problems before, but it's a good checklist.

Possibly, but not bad. Goog shows us at 5-8 seconds a page


Looking at "Site performance" in GWT they say that 1.5s seconds is the maximum load time for a "fast" site. So might be worth trying to get this down, but I agree - unlikely this will be causing the problem

Oh, I forgot. Over the past month, I am starting to see a dip in Yahoo... does Yahoo follow Google in any way?


Not in my experience - the results between G and the other engines don't seem to have much of a relationship.

Do you see any geographical trend in the visitors you are losing?

carfac
2:40 pm on Jul 28, 2010 (gmt 0)

>>> Do you see any geographical trend in the visitors you are losing?

Not sure how to tell this... is it in Webmaster Tools?

I went to Analytics, and compared 9/09 to 6/10. US shows a 32% decline, UK 29%, Canada 32%, OZ 28%, Germany 12%...

so they are all about even. In GOOD news, average time on site is DOUBLE, % of new visits is down 10-15%, and the Bounce Rate is slightly lower. Overall, # of pages per visit is up, too.

morehawes
2:55 pm on Jul 28, 2010 (gmt 0)

Not sure how to tell this... is it in Webmaster Tools?


What I mean is: in your statistics, are you seeing a loss of visitors from a particular country, etc.?

carfac
4:18 pm on Jul 28, 2010 (gmt 0)

Near as I can tell, they are all down by similar amounts.... nothing glaring, at least!

dataguy
4:36 pm on Jul 28, 2010 (gmt 0)

I've got a similar situation. A clean, 150K-page site with greatly varied topics. Since March, traffic has been going down steadily. It's about 30% lower than it was at the beginning of the year.

I've been working hard to try to reverse the trend, but nothing I've tried has worked. My site is mostly user-generated content, and it's not all unique, which is my best guess at why I'm hurting. I see my competitors in the same situation - all of them except for one, which has always claimed to have exclusive content.

I've been hoping it's just a Caffeine thing and it will be reversed soon, but I'm starting to lose faith in that theory.

Fortunately I've also been optimizing my ads, so ad revenue has actually increased during this period of time. I sure wish I could have the traffic back, though.

drall
4:50 pm on Jul 28, 2010 (gmt 0)

Your situation sounds just like ours, carfac. Our 12-year-old site is going through the exact same stuff, and we have done the exact same things as you.

It all really started during April/May. I have spent hundreds of hours trying to figure out what's going on, and have come to the conclusion that I no longer have any idea what Google is looking for.

First I thought they had given social networking a stronger play, but after doing research and seeing that our social networking has been stronger than ever (30,000 new SU followers just in the last 30 days), I have thrown that out the door.

Then I thought it was due to backlink profile/authority stuff, but considering that in the last 90 days we just landed several hundred new links from some of the world's biggest companies - some of which are actually referencing our site in their marketing, branding and advertising - I threw that out too.

Then I tried to find patterns in the data, but our top 1000 SERPs are literally changing daily. It's as if they have 4-5 different search engines running under the hood.

All the while, I'm watching mashup junk take over everyplace.

I'm so angry right now I could kick a kitten through a...

jimbeetle
5:24 pm on Jul 28, 2010 (gmt 0)

I went to Analytics, and compared 9/09 to 6/10.

Can you compare 6/09 to 6/10?

carfac
6:38 pm on Jul 28, 2010 (gmt 0)

Glad I am not the only one, but sorry any of us are going through this.

We have some duplicate content just by the nature of our topic (getting back to the animal site analogy, the Latin name of an animal does not change. Many other facts do not change either, and we sort of list facts. Perhaps I should put all the facts into sentence form?)

Drall>>> that kitten did nothing to google, let it be!

Jim: Just compared June 2009 to June 2010 (it was sad to see the traffic that high!):

US 25%
UK 29%
Canada 28%
OZ 27%
Germany 27%

So all have dropped pretty similarly. The order of most countries is about the same on both lists...

I think a LOT of my hits are strange, one-off searches. I have a LOT of depth, and I think that is where I am leaking.

BradleyT
8:07 pm on Jul 28, 2010 (gmt 0)

Could it be that your internal link juice is no longer strong enough to carry the whole site? Perhaps the pages still ranking have external links pointing to them or are near pages with external links and the pages that are dropping out of the SERPs don't have external links.

carfac
8:56 pm on Jul 28, 2010 (gmt 0)

Hi Bradley:

Thanks for the help. Unfortunately, I do not know what you mean. What is my internal link juice? There is a definite hierarchy to my subject matter... a natural order. Everything links in that manner. I have considered that perhaps things are down from the root a bit too much. Here is my linking structure (again, using animals):


http://www.example.com/animals/Mammals (or Birds, or Fish)

Then, within Mammals, we might have Africa, Asia, and America.

Then, we have animals listed within that... though some may be one or two directories lower.

This is the way we have always been set up, and we have thousands of inbound links based on that. Good news is the individual animals are linked like:

[mysite.com...] / animal / 65755-Lion.html

(Note "animal" vs. previous "animals")

>>>> Perhaps the pages still ranking have external links pointing to them or are near pages with external links and the pages that are dropping out of the SERPs don't have external links.

That may be it... I dunno. I do not track that, nor do I know how. I am busy entering new pages on new animals. I have a LOT of content that no other page out there covers. Wikipedia links to me a lot, and takes my information even more often with no link back... but it seems a lot like if they don't have it, it does not exist. And once they do have it, no one else matters....

I just ran linkto:www.mysite.com - it returns 76,500 results.

So I think what you are saying is that I have too much stuff no one links to... and that is diluting my site overall? Should I start dropping pages?

[edited by: tedster at 9:17 pm (utc) on Jul 28, 2010]
[edit reason] make the example URL visible [/edit]

freejung
9:33 pm on Jul 28, 2010 (gmt 0)

This really sounds like the classic Mayday Update problem. You had a lot of long-tail traffic on a very large, very authoritative site, and now it's gone down. There's been a lot of discussion of Mayday.

This didn't happen to me - I don't think any of my sites are big enough - but I've been following it pretty closely and can pass along what others seem to be saying about it. It may have to do with link juice, as BradleyT suggested. You should go through some of the Mayday threads and see how others have explained it, if you haven't already.

It appears that this has to do with Google raising the bar for ranking for long-tail keywords. On a huge site like yours, there will not be enough link juice to raise that massive number of deep pages above the new minimum requirements. So you will no longer come up for long-tail terms that used to bring traffic to these pages. I bet you could see the effect of this by delving into your analytics. Look at the total number of keywords for which you are getting traffic.

If this is correct, adding all of those new pages may actually be making the problem worse by spreading your juice even thinner.

Apparently the best thing people have been able to come up with as far as a solution is acquiring large numbers of deep links. Not sure how you would go about doing that in your niche, but it's worth thinking about.

carfac
10:33 pm on Jul 28, 2010 (gmt 0)

So, in essence, this is a Google Catch-22?

I need content to get ranking. So I add more content, but faster than I get links in, so Google stops putting some of my links up. Thus my slow death spiral.

It seems like that must be it, because the site is clean as a whistle - always has been. Validated HTML and CSS. Solid site for many years.

So I am not sure how I am going to get more links in, especially with less exposure in Goog. Should I start dropping content? If so, what? The newest? The pages that get the fewest hits?

aristotle
10:59 pm on Jul 28, 2010 (gmt 0)

You might try to direct more internal links to the pages that were bringing in the most traffic before the problem started. At the same time you could drop any pages that never brought in much traffic, or at least reduce the number of internal links that go to them.

Robert Charlton
11:48 pm on Jul 28, 2010 (gmt 0)

I need content to get ranking. So I add more content, but faster than I get links in, so Google stops putting some of my links up.

If it's possible in your field, and it may not be, I'd focus for a while on adding more unique content per page rather than more pages of content.

"Content" is no longer, I feel, about how many different ways you can sort the same pieces of data that many other people have too. While there are various takes on that model that may still be working, ultimately it's going to come down to unique content that's unique enough that it's valued by users and trusted by trusted sites.

carfac
1:42 am on Jul 29, 2010 (gmt 0)

aristotle: I have 1000s of pages that don't bring in much traffic. If I drop them, or even consolidate them, that seems to fly in the face of being an information site... more like I am building the site to please Goog rather than building it as the info dictates. I will think about consolidating them as best I can.

REGARDING THE OTHER HALF (sorry!), if I put a little widget with "Today's Popular Animals" on every page, might that 'direct more internal links to the pages that were bringing in the most traffic'?
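
Something literally as simple as this in a sidebar is what I have in mind (made-up markup and links, following the animals analogy):

    <!-- sketch: sidebar widget linking to the day's most-visited pages -->
    <div class="popular-animals">
      <h4>Today's Popular Animals</h4>
      <ul>
        <li><a href="/animal/65755-Lion.html">Lion</a></li>
        <li><a href="/animal/66123-Zebra.html">Zebra</a></li>
      </ul>
    </div>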

Robert - thanks for your direction, too. Content makes sense. A lot of my current stuff is not very original.... not copied, mind you, just facts that are the same on my page and anyone else's. Working on another idea along those lines now. Thanks for the spark!

carfac
3:02 am on Jul 29, 2010 (gmt 0)

OK, let me ask you this. In my effort to reduce internal linking, can I ban a "class" of pages via robots.txt.... or do the page links actually have to be gone? I can do a lot of page pruning by just banning through robots.txt.
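
What I mean is something like this in robots.txt - assuming the "class" shares a recognizable URL pattern, which mine does:

    # sketch: assumes the pages to prune share this path pattern
    # (the * wildcard is a Google extension, not part of the original robots.txt standard)
    User-agent: *
    Disallow: /animal/*/Coloring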

Reno
3:15 am on Jul 29, 2010 (gmt 0)

"Content" is no longer, I feel, about how many different ways you can sort the same pieces of data that many other people have too. While there are various takes on that model that may still be working, ultimately it's going to come down to unique content that's unique enough that it's valued by users and trusted by trusted sites.

This is an extremely important & relevant observation that requires no comment from me.


Lapizuli
5:06 am on Jul 29, 2010 (gmt 0)

I suspect it's what Robert Charlton said, too - content.

This may seem obvious to some, but it wasn't to me until fairly recently:

I think Google's slowly reached a new confidence threshold, making some of the established SEO techniques superfluous. In other words, Google's increased its confidence in its own reading of value signals. It doesn't need us to tell it what we were telling it before when we strategically optimized for search. So it's started ignoring obsolete signals and looking for new ones, and if those signals are there, it's ranking those pages well.

(Whether or not those value signals come from the user or Google's own definition imposed on the user is another matter.)

What I mean is, it seems like Google better knows what content means and is looking at a different part of it than it was before, like a blind robot that's become suddenly intelligent and proceeds to deactivate some of its own previous feedback sensors because now its eyes can see, and it's pretty new at seeing and rather surprised to find the world so different, and not altogether sure of what to do with what it finds. But now its eyes are opened, and no matter what it does, the world will always look different. (Yes, before you ask, I do think of Google as AI.)

Or whatever.

If it were me, after your minor SEO tweaks, I'd first tackle the content, make it better and more satisfying to the visitor, and then if that doesn't do it, only then address the SEO. But content first.

The reason? SEO might get you short term results, but then you'll be at odds with Google more and more, in a constant struggle to get seen.

SEO exists not because it's a game to be played, but because it helps search engines think. As search engines start to think better, SEO becomes less important.

With content that gives your users what they want and what nobody else who's visible gives them, all you have to do is tweak the SEO here and there over time and you're fine. At least, that's what I believe. For what it's worth. I've only been at this a couple of years, and I mostly write for other sites, not my own.

And one rather odd tip: if anybody thinks his content is the best it can be, and never doubts it, there's a very good chance that it's not as good as he thinks, because he never doubts it. Content that delivers is about the reader, not the text. In print, writers have to take into account their readers first and foremost or they won't get read; online it's not much different, especially with so much competition entering the field every day.

Google says that over and over in their help pages - think of your visitor. While website developers are busy trying to get Google to notice them by flashing their arms around, Google's saying, "Wait! I'm getting better at seeing! Give me something worth noticing, and jab me in the ribs every so often to make sure I'm listening, and I'll see you!"

And "good" content doesn't have to be in-depth content, though it often is. It can be quick and easy to read. People searching for "how to replace a widget gadget" don't always want to know every step - sometimes they just want to know if they're capable of it. If some article sites are ranking higher for these kinds of queries, it may not be a mistake - it may be because that's what those queries were really asking. And for those long-tail terms being lost by those same user-generated content sites - vice versa; for some queries, only in-depth will do.

Google's bound and determined to figure out the meaning and intent behind queries - thus their introduction of implicit triggering to definition queries.

So I hope all that doesn't come off as moralistic. I'm not trying to draw the line between good content and crappy content and say anybody's content is one or the other. As I said, it's the reader, listener, or viewer that decides whether something's worthwhile as far as search engines care, and that changes over time. And I don't think Google or any of the search engines are very good at what they're trying to do yet and they're making a lot of mistakes. I'm just talking about a way of looking at content that (so far) might help you.

Robert Charlton
7:40 am on Jul 29, 2010 (gmt 0)

...just facts that are the same on my page and anyone else's...

This is exactly what I'm seeing on many sites that are having problems.

freejung
7:47 am on Jul 29, 2010 (gmt 0)

Lapizuli, I love your blind robot analogy.

Carfac, I wasn't suggesting banning things with robots.txt. Others with more experience with very large sites would be more qualified to comment, but I wouldn't think you'd want to do that. After all, it's probably many of those low-traffic pages that were bringing in the traffic before they dropped below the threshold (or rather, before the threshold rose over them).

However, I think Robert has nailed it (not surprisingly) with a really brilliant insight. It's not that you have too much _content_; it's that you have too many _pages_ with not enough unique content _per page_. If you can consolidate the same amount of information into fewer pages (remember to do 301s to redirect link juice), and also find ways to add more unique information, that might help.
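
In Apache terms, a consolidation redirect can be as simple as one mod_alias line per retired page. A sketch with made-up paths:

    # .htaccess sketch: fold a thin sub-page back into its main page (paths are examples only)
    Redirect 301 /animal/65755-Lion/Coloring.html http://www.example.com/animal/65755-Lion.html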

It sounds like your pages are exactly the sort of thing Google was deliberately targeting with Mayday: pages that were ranking primarily because they were part of a big authoritative site, not because that particular page contained a lot of unique information or had a lot of external inbound links.

carfac
3:34 pm on Jul 30, 2010 (gmt 0)

Thank you all again VERY MUCH for your help... and your insight. I admit, I was panicked for a bit.... but I have looked at my site, gone over it, and I suddenly saw a BIG problem. I THINK I can see where to dump a huge number of pages... duplicate pages... but I am not sure how to remove these pages from Google, as they are all dynamically generated. That is, even if I remove the links, the page will still be there if Google asks for it.

Let's go back to the animals site analogy. I have a "Lion" main page with a point list of facts... and three sub-sections.... let's say Images, Coloring and Habitat. Now, Coloring and Habitat are large text fields, so on the main page the first 200 characters are displayed, with a link to the "Coloring" or "Habitat" sub-page for the full summary. I probably have something written up for 30-40 percent of each of these fields.

So my first inclination was to combine "Coloring" and "Habitat" into one page... but then (epiphany!) I saw that when I had neither a "Coloring" nor a "Habitat" write-up, both those pages were almost exactly the same! And this was probably 150K pages... maybe 1/4 or more of the total number of my pages in Google. If I can lose these worthless, empty pages (all of which duplicate each other), I think I will go a LONG way toward restoring link juice across the site.

So my idea is to remove links from pages that do not have "Coloring" or "Habitat" information. Easy enough. The tough question is how to get these pages out of Google. The paging scheme is all done through mod_rewrite of dynamic links... so even if I remove the actual page link to the "Coloring" and "Habitat" pages from my site, Google will still see the page if it asks for it out of its own existing database of my site.

So do I rename the "Coloring" and "Habitat" links to something like "Colors" and "Habitats", and then block the "Coloring" and "Habitat" pages in robots.txt? What is the proper way to get these out of the index (keeping in mind that, since they are all dynamic, they will still be generated if asked for, even if not linked to)?

Thanks!

Dave

1script
8:54 pm on Jul 30, 2010 (gmt 0)

I think you should leave robots.txt alone for now :) Using the current political speak, it would be the "nuclear option". In your particular case, you'd need to not only remove the links to "Coloring" and "Habitat" where there is none defined, but also find a way in your CMS to return a 404 if the "Coloring" or "Habitat" page is still called up, even though there is no link to it.
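
To illustrate the idea - a bare sketch only, since I have no idea what your CMS is written in; PHP here is a guess at your stack, and get_field_text() is a made-up helper:

    <?php
    // Sketch: send a true 404 when the requested section has no real content.
    // get_field_text() is a made-up helper - replace with your CMS lookup.
    $text = get_field_text($animal_id, $section); // $section = 'coloring' or 'habitat'
    if ($text === null || trim($text) === '') {
        header('HTTP/1.1 404 Not Found');
        include 'error404.html'; // your normal error page
        exit;
    }
    // ...otherwise render the sub-page as usual
    ?>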

In fact, if you wanted to remove those "empty" pages quicker, you would leave the links to them so they will be visited by Gbot with more certainty; Gbot will then get the 404 and know that they don't exist (at this time). However, to make the whole thing even more convoluted, it looks from another thread here [webmasterworld.com] that having too many internal bad links (links to non-existent pages returning a 404 HTTP code) can damage your "quality score" and further lower your rankings. So maybe you just remove the links and hope Google will eventually come back for the non-existent "Coloring" pages and learn that they do not, in fact, exist.

Does anyone have a better idea about how to SAFELY "nudge" Googlebot - or "speed up", if you will - the re-discovery of pages you intend to return 404 (or 410, for that matter) on?

From personal experience, they don't seem to like 404s in the sitemap either...

carfac
2:22 pm on Jul 31, 2010 (gmt 0)

Shoot- damned if'n I do, damned if'n I don't...

>>> return a 404 if the "Coloring" or "Habitat" page is still called up, even though there is no link to it.

Can't really think of a way to do that... mainly because even if there is no link to the nonexistent page anymore, since it is dynamically created, if you ASK for it, it will still be there. I have no way to turn off half the dynamically linked pages while still generating the good ones. The only thing I can think of is to move "Coloring" to "Colors" (that word is a sub-category IN the URL), and 404 all the "Coloring" links....

Unless anyone else thinks I should do differently, I think I will just do that, and let Gbot slowly discover that those pages no longer exist.

carfac
3:54 pm on Jul 31, 2010 (gmt 0)

OK, I made some changes. I found a good thread by Apache forum mod jdmorgan regarding sending 410s rather than 404s... So I changed the canonical "Coloring" to "Colors"... the linking system now has an "if... then" check, so the link to "Colors" only exists IF there is a "Colors" field. I rewrote the entire site link system and 410'd the old "Coloring" links...
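
The 410 itself is just mod_rewrite's G flag. A simplified sketch of the sort of rule that thread describes (the pattern here is illustrative; adjust it to your real URL layout):

    # .htaccess sketch: old "Coloring" URLs are gone for good (410)
    # pattern simplified - not my exact rules
    RewriteEngine On
    RewriteRule ^animal/([0-9]+)-[^/]+/Coloring\.html$ - [G,L]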

Then I dropped all my sitemaps with "Coloring" links... luckily, I have a couple of sitemap versions... and I started a new sitemap spider. When that is done (in 2-3 days!) I will post the result to Goog.

We'll see how this goes.

1script
1:22 am on Aug 1, 2010 (gmt 0)

even if there is no link to the nonexistent page anymore, since it is dynamically created, if you ASK for it, it will still be there.
This is actually pretty bad, because (if you are in a competitive environment) someone could create those links on a spammy blog of sorts and have Googlebot visit your site and get 1 million identical (and empty) pages. Definitely not good. I know it is difficult with most out-of-the-box CMSes, but you really should find a way to return the proper code (404) if the page is supposed to be non-existent.

Also, regarding this:
I found a good thread by Apache Forum mod jdmorgan regarding sending 410's rather than 404's...
Since they are different codes (404 is "not found, for now"; 410 is "gone forever"), you may want to reconsider the change if you plan on eventually getting around to more animals and filling in more of the data that the Coloring pages are made up of. Same for Habitat, basically. If you are planning to have these pages in the future, 410 is not an appropriate response.

tedster
2:37 am on Aug 1, 2010 (gmt 0)

Right - and if you never had a particular URL, then 410 is also not appropriate.
