Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

This 44-message thread spans 2 pages.
50% Google traffic drop - identifying problem, steps to take
silverbytes




msg:3705566
 5:30 pm on Jul 23, 2008 (gmt 0)

Noticed a 50% Google traffic drop on one of my sites. The site is about 7 years old and has some 170 pages in 3 different languages, including English and Spanish.
Several pages were #1 and most others were in the first 10 positions.

Other search engines seem to be OK with the rankings, including some #1 positions in Yahoo.

Even now, with half the usual traffic, I get lots of bookmarks, which is a good indicator to me.

Regarding Google: some pages are still #1, but other pages are now dozens or hundreds of positions down for their old keyphrases.

However, I noticed that if I pick just some words of my keyphrase, my page ranks in the first 10 or 30 results, while the old two-word keyphrase that used to rank #1 now puts me at around #300.

I haven't changed the site, so I wonder why Google went from love to hate with those pages...

So what I'm doing is "deoptimizing" pages, lowering keyword density, and waiting, but I'm not sure about it.
I'm assuming a kind of -950 penalty [webmasterworld.com...]
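For anyone attempting the same kind of "deoptimizing" pass, keyword density is easy to measure before and after an edit. A minimal sketch in plain Python - the page body and target phrase here are made up for illustration:

```python
import re

def phrase_density(text, phrase):
    """Occurrences of `phrase` per 100 words of `text`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    target = phrase.lower().split()
    n = len(target)
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == target)
    return 100.0 * hits / max(len(words), 1)

body = ("blue widgets for sale, the best blue widgets, "
        "buy blue widgets today")
print(round(phrase_density(body, "blue widgets"), 1))  # -> 25.0
```

There's no magic threshold, but comparing the number before and after an edit at least shows whether the "deoptimization" actually changed anything.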

Perhaps I just need to wait (in case my case matches this: [webmasterworld.com...] )

The site was banned from Wikipedia months ago, but it never had problems with Google over that, and I don't think Google uses Wikipedia to judge whether a site is good or not.

[edited by: Robert_Charlton at 6:00 pm (utc) on July 23, 2008]
[edit reason] fixed link [/edit]

 

tedster




msg:3706023
 6:47 am on Jul 24, 2008 (gmt 0)

Do you have many different urls where the content is close but slightly modified in each case to target different keywords and phrases, especially semantically related ones?
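One rough way to answer that question objectively is to compare pages by word-shingle overlap - a quick sketch, with two hypothetical near-duplicate pages:

```python
def shingles(text, k=3):
    """Set of k-word shingles of a page's visible text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a, b):
    """Jaccard overlap of two pages' shingle sets (0 = unrelated, 1 = identical)."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa and sb else 0.0

page_a = "cheap flights to paris book cheap flights today"
page_b = "cheap flights to rome book cheap flights today"
print(round(similarity(page_a, page_b), 2))  # -> 0.33
```

Scores close to 1.0 across many URL pairs would suggest exactly the pattern being asked about here.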

silverbytes




msg:3706406
 3:56 pm on Jul 24, 2008 (gmt 0)

I'd say no. Obviously it's the same page structure, but the text and photos are completely different in each page's body...

Marcia




msg:3708902
 1:42 am on Jul 28, 2008 (gmt 0)

Is there any kind of boilerplate text, especially containing keywords, repeated at the beginning of the main page content, that runs across a lot of pages?

I've seen that have a negative effect, and it helped when I tested removing it from some pages. If suspected, it's easy to see in the text cache.
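Checking for that kind of shared opening text can also be scripted rather than eyeballed in the cache - a small sketch, with a hypothetical slogan repeated at the top of several pages:

```python
def shared_opening(texts, min_words=10):
    """Word-for-word opening that every page shares, if long enough to matter."""
    split = [t.split() for t in texts]
    prefix = []
    for group in zip(*split):
        if len(set(group)) == 1:  # same word at this position on every page
            prefix.append(group[0])
        else:
            break
    return " ".join(prefix) if len(prefix) >= min_words else ""

slogan = "acme widgets the best widget store on the web since 1999"
pages = [slogan + " red widgets here",
         slogan + " blue widgets here",
         slogan + " about our company"]
print(shared_opening(pages))  # -> the shared slogan
```

If this returns anything substantial across many pages, that is the boilerplate worth testing for removal.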

Quadrille




msg:3708925
 2:21 am on Jul 28, 2008 (gmt 0)

Have to ask:

1. Unique meta descriptions, or one across the site?

2. Plain vanilla HTML, or a CMS with (possibly) duplicate URLs?

3. Is it in any way an affiliate site - in particular, does it use any 'standard' text in any serious amounts?

And, because it's Google, have you checked ALL of your outgoing links lately, and deleted all suspect links, all non-related reciprocals, all links you cannot personally recommend, and any recips to directories or pseudo-directories (e.g. someone's 'link pages')?

Run Xenu for good measure.
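Point 1 of the checklist is easy to automate with the standard library - a sketch that flags meta descriptions reused across pages (the URLs and markup here are hypothetical):

```python
from collections import defaultdict
from html.parser import HTMLParser

class MetaGrabber(HTMLParser):
    """Pull the meta description out of one page's HTML."""
    def __init__(self):
        super().__init__()
        self.description = None
    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "description":
            self.description = a.get("content")

def shared_descriptions(pages):
    """Map each meta description to the URLs that reuse it (two or more)."""
    seen = defaultdict(list)
    for url, html in pages.items():
        grabber = MetaGrabber()
        grabber.feed(html)
        if grabber.description:
            seen[grabber.description].append(url)
    return {d: urls for d, urls in seen.items() if len(urls) > 1}

pages = {
    "/red": '<meta name="description" content="Widget shop">',
    "/blue": '<meta name="description" content="Widget shop">',
    "/about": '<meta name="description" content="About us">',
}
print(shared_descriptions(pages))  # -> {'Widget shop': ['/red', '/blue']}
```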

Marcia




msg:3708934
 2:35 am on Jul 28, 2008 (gmt 0)

And, because it's Google, have you checked ALL of your outgoing links lately, and deleted all suspect links, all non-related reciprocals, all links you cannot personally recommend, and any recips to directories or pseudo-directories (e.g. someone's 'link pages')?

Agreed, especially important! I just recommended to someone that they remove some seriously tacky outbound links from their homepage (just as you've described), a couple of them sitewide, after their traffic and rankings totally sank.

Run xenu for good measure.

I always forget to do that, link checking utilities can save a lot of time.

silverbytes




msg:3709599
 7:30 pm on Jul 28, 2008 (gmt 0)

Is there any kind of boilerplate text, especially containing keywords, repeated at the beginning of the main page content, that runs across a lot of pages?

Bingo!

That may be the cause. Instead of using an image slogan, I use text on that site, and I can see it's not helping at all.

About the questions:
1) Yes, unique meta descriptions
2) I don't find duplicate URLs
3) No

I'll check out Xenu.

silverbytes




msg:3710278
 2:41 pm on Jul 29, 2008 (gmt 0)

Wait a sec:
have you checked ALL of your outgoing links lately, and deleted all suspect links, all non-related reciprocals, all links you cannot personally recommend, and any recips to directories or pseudo-directories (e.g. someone's 'link pages')

Don't you have any links to link pages or reciprocal link exchanges at all? I think I might find some of these - not on this particular site, but on others (which rank better than this one).

Quadrille




msg:3710292
 2:55 pm on Jul 29, 2008 (gmt 0)

I have no suspect links, no non-related reciprocals, no paid links, no links I cannot personally recommend, and no recips to directories or pseudo-directories.

Any links placed by 'third parties' (e.g. guestbooks, forums) have nofollow attached.

I check ALL my outgoing links regularly, with Xenu - and by visual checking.

With Google, bad links are by far the single biggest cause of problems. High-risk behaviour on money-earning sites makes no sense to me.
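A quick audit script can catch outbound links that slipped through without nofollow - a regex-based sketch, fine for a spot check but not a substitute for a real HTML parser (the domains are hypothetical):

```python
import re

def external_links_missing_nofollow(html, own_domain):
    """Hrefs of outbound links that carry no rel="nofollow"."""
    flagged = []
    for m in re.finditer(r"<a\s+([^>]*)>", html, re.I):
        attrs = m.group(1)
        href = re.search(r'href="([^"]+)"', attrs, re.I)
        if not href:
            continue
        url = href.group(1)
        if (url.startswith("http") and own_domain not in url
                and "nofollow" not in attrs.lower()):
            flagged.append(url)
    return flagged

page = ('<a href="http://example.org/partners">partners</a> '
        '<a href="http://example.org/forum-post" rel="nofollow">guest link</a> '
        '<a href="/about">about us</a>')
print(external_links_missing_nofollow(page, "mysite.example"))
# -> ['http://example.org/partners']
```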

ichthyous




msg:3712328
 6:29 pm on Jul 31, 2008 (gmt 0)


Is there any kind of boilerplate text, especially containing keywords, repeated at the beginning of the main page content, that runs across a lot of pages?

I've seen that have a negative effect, and it helped when I tested removing it from some pages. If suspected, it's easy to see in the text cache.

Please explain how one can see this in the text cache, Marcia... I have seen my traffic declining steadily over two months now - a slow and steady downward loss since May, and so far it's down 40%. I haven't lost any good links that I can see, and I've even gained a few good ones. I do have very similar lists of keywords repeated across many pages at the top, though. Thanks.

silverbytes




msg:3719997
 11:51 pm on Aug 10, 2008 (gmt 0)

Following up here.
The site was deoptimized a week ago. I noticed some "bugs", like 2 h1 tags on the same page, which I guess is poison.
A few pages keep a #1 result. Some others are at #11-20. Most are "gone".
The site has all its pages indexed
(I guess that is good, but not good enough?)

Searching
site:example.com + "example.com."
I see all my pages with the new titles updated, so that tells me Google crawled the pages.

I hope that means I will see those in the search results again soon. But that is not happening now for 90% of my pages.

My understanding is the site is -950'd and will probably take some weeks or months to come back to reasonable SERPs.

I've read that filing a reinclusion report is useless. So do you think sit-and-wait is the proper thing to do in this case?

Marcia




msg:3720033
 1:16 am on Aug 11, 2008 (gmt 0)

To see the text cache, you click on the cache link either in the SERPs or the toolbar with TBPR enabled, and then click on the "text only" link. It'll show not only anchor text of links and regular text, but also, it'll show the alt attribute of images that are used as links.
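The text-only cache view can also be approximated locally, which helps when checking many pages: strip the markup but keep image alt attributes, since those show up in the cache for image links. A rough sketch using Python's standard HTML parser (the sample markup is hypothetical):

```python
from html.parser import HTMLParser

class TextOnly(HTMLParser):
    """Collect roughly what a text-only cache view shows: visible text,
    plus the alt attribute of images (shown in brackets here)."""
    def __init__(self):
        super().__init__()
        self.parts = []
    def handle_starttag(self, tag, attrs):
        if tag == "img":
            alt = dict(attrs).get("alt")
            if alt:
                self.parts.append("[" + alt + "]")
    def handle_data(self, data):
        if data.strip():
            self.parts.append(data.strip())

def text_view(html):
    parser = TextOnly()
    parser.feed(html)
    return " ".join(parser.parts)

print(text_view('<p>Buy <a href="/w"><img src="w.png" alt="Acme Widgets"></a> now</p>'))
# -> Buy [Acme Widgets] now
```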

tedster




msg:3720048
 1:55 am on Aug 11, 2008 (gmt 0)

I've read that filing a reinclusion report is useless

This forum also has accounts of success. When you don't know the specifics of the site involved, you can't take every complaint you read to heart. Your situation could be entirely different.

The Reconsideration Request - when you've fixed all the problems you know of, and you're not trying to sneak anything past - can be an effective step. That's especially true, I feel, if your particular penalty involved a manual intervention.

Just be brutally honest with yourself. Remembering that the top SERPs may be evaluated by a human editorial staff, ask if your site makes a good showing. This isn't just an automated contest - Google is in the business of giving their users the best resources that they can, especially on common query terms. It's our job to create something that measures up.

silverbytes




msg:3720271
 1:16 pm on Aug 11, 2008 (gmt 0)

Good tip, Marcia!

Tedster: I never thought a 950 would be manually applied... Do you know for sure whether Google editors check results and manually penalize sites for "optimization excess" or similar?

Is that what happens to sites reported through Google's spam reports?

tedster




msg:3720382
 3:55 pm on Aug 11, 2008 (gmt 0)

The squad of folks who manually check the SERPs for top keywords are not looking for over-optimization as such, at least as I understand their function. They are looking at whether the results are "good" results - in other words, will the Google user find a particular result to be useful and helpful for the query, with quality that makes it really better than the URLs below it.

So, if a URL is ranking on the first page because of SEO, but the page is pretty hard to use or it offers "nothing special", then that's when the team of humans might all agree that it should be demoted. And one opinion won't do it - it takes agreement from several people who are all working independently.

Marcia




msg:3720384
 4:00 pm on Aug 11, 2008 (gmt 0)

From what I've read the quality raters don't have the authority to edit the search results.

silverbytes




msg:3720592
 8:30 pm on Aug 11, 2008 (gmt 0)

Do you think setting up an AdWords campaign on the affected domain may speed up reinclusion? I mean, might there be a connection between showing those ads and making Google notice the site is clean?

stcrim




msg:3720655
 10:01 pm on Aug 11, 2008 (gmt 0)

We have blogs that are squeaky clean - as white hat as a blog can be - that are dropping like rocks out of the sky. It started about last Thursday. For us it's more of a filtering than a penalty.

Some terms still do well, but 80 percent don't anymore. Google must have made some type of change that sank a lot of sites. Ours are small, 200 and 300 pages each...

-s-

Robert Charlton




msg:3720680
 10:36 pm on Aug 11, 2008 (gmt 0)

What I see happening is that Google is increasingly making it tougher to rank on phrases that used to be less competitive. So, pages that have ranked well for years just by virtue of an optimized title and heading and some sort-of-OK links may disappear for lack of "trust" factors that Google currently looks for.

To oversimplify... Google has been saying that it wants to return those pages that have the best content and attract the most good-quality link votes. Unless your site is genuinely useful, sooner or later Google is likely to demote it.

It's getting harder and harder to "trick" your way back to the top. Content, and promotion of that content, are what will ultimately win out.

silverbytes




msg:3721338
 6:20 pm on Aug 12, 2008 (gmt 0)

Stcrim: I've heard that about blogs specifically. My guess is blogs are going down, and I hope they do, especially for image search, since they get top listings ahead of the original sites. (I have nothing against yours ;)

potentialgeek




msg:3721548
 11:34 pm on Aug 12, 2008 (gmt 0)

silverbytes:

I had several 950'd sites which were adjusted and got their rankings back. The first one returned after many months, because I didn't know what I'd done wrong. The last one came back in a few days, because by then I did.

I didn't file any reinclusion requests. I just removed all anchor text with competitive words or phrases, and substituted the text with graphics (icons + links with no alt text). It's easy to forget about alt text when you're reviewing a penalized site, because you don't necessarily see it.

Incidentally, I'm starting to suspect Google is going to - or is already starting to - incorporate the idea behind the 950 anchor-text spam penalty into its SERPs algo, without all the drama of a 950 penalty. So you can lose position (even one position) because of spammy anchor text.

After I cleaned up my site's spammy anchor text, I moved to #1 (after being #2 for a long time). The former leader, I noticed recently, still has a fair bit of spammy anchor text all over his site.

p/g

silverbytes




msg:3723492
 1:57 am on Aug 15, 2008 (gmt 0)

Wow. Alt text? And what do your text links look like now? You completely blew away your text links and turned them into images? How could you possibly rank at all that way?

tedster




msg:3723495
 2:28 am on Aug 15, 2008 (gmt 0)

You certainly can rank without text links in the menu - I see it all the time. We sometimes get tunnel vision from all that we think we know about SEO, whereas Google has been evolving its algo in all kinds of ways, and the "truths" of a few years ago are often not so absolute anymore.

If someone has a phrase-based "over-optimization penalty", I can see how removing a text menu might relieve that -950 pressure.

silverbytes




msg:3723909
 4:29 pm on Aug 15, 2008 (gmt 0)

Wait a second: how does Google identify image links at all without a title attribute or alt text?
My guess is they can't, since letters in an image don't count, so to me that's a blind link.

But is that the idea?

tedster




msg:3723920
 4:48 pm on Aug 15, 2008 (gmt 0)

Right - it's a link that sends PR but no link-text information, except that some context can be derived from nearby content.

There's no doubt that Google has OCR capability. I even heard about them experimenting with it several years ago. But to what degree they use it for image links, if at all, is a topic I've never heard or seen anything official about, or even rumored.

silverbytes




msg:3723960
 5:57 pm on Aug 15, 2008 (gmt 0)

I can't believe it. I found the real problem: some crappy freelancer copy-pasted the text I paid him to write. That triggered some duplicate content penalty.
I wish I'd known before paying...

Now things are clear: will removing the old text and writing new original copy put me back in the SERPs? Any experiences with this?
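Before rewriting everything, it's worth confirming which passages were actually copied. A common manual check is to paste a distinctive quoted phrase into a search engine; this sketch just picks a few evenly spaced phrases from a page to test that way (the article text is synthetic):

```python
import re

def candidate_phrases(text, n_words=8, count=3):
    """A few evenly spaced word runs to paste into a quoted search,
    to spot copies of your text on other sites."""
    words = re.findall(r"\S+", text)
    step = max(len(words) // (count + 1), 1)
    phrases = []
    for i in range(1, count + 1):
        chunk = words[i * step:i * step + n_words]
        if len(chunk) == n_words:
            phrases.append(" ".join(chunk))
    return phrases

article = " ".join("word%d" % i for i in range(40))
for p in candidate_phrases(article):
    print('"%s"' % p)
```

Pages whose phrases turn up verbatim on other sites are the ones to rewrite first.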

ken_b




msg:3723964
 6:03 pm on Aug 15, 2008 (gmt 0)

how does Google identify image links at all without a title attribute or alt text?

Maybe the file name, at least in some cases?

silverbytes




msg:3724031
 7:32 pm on Aug 15, 2008 (gmt 0)

Unfortunately, filenames may not have anything to do with the real image.

Sorry to insist, but: how do I get rid of a duplicate content penalty once I've removed or rewritten the detected or suspicious paragraphs? Sit and wait, or beg Google to love me again?

tedster




msg:3724178
 11:50 pm on Aug 15, 2008 (gmt 0)

Duplicate content is almost never a penalty - it's just that your URL gets filtered out of the SERPs. If the content is no longer duplicate, then your URL is no longer filtered.

That information is available in the Hot Topics [webmasterworld.com], which is always pinned to the top of this forum's index page.

Since there's no penalty, there is no need for a reconsideration request. When the unique version of the URL is spidered and indexed, then it will not be filtered out.

However, I note that you began this thread asking about a 50% traffic drop. That sounds like more than one duplicate article.

silverbytes




msg:3724269
 4:00 am on Aug 16, 2008 (gmt 0)

Perhaps 50% of my pages had duplicate content, since 75% of the content was developed by a freelancer. Sounds reasonable to me. On top of that, the site might be "overoptimized", which makes for a poison cocktail. Let's see how the fixes work out.

WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved