Forum Moderators: Robert Charlton & goodroi


Does my page fit some new spam profile?


kennylucius

4:15 pm on Sep 22, 2006 (gmt 0)

10+ Year Member



The main page of my site <link removed> has a PR 5 and had a SERP position in the teens for keywords <removed>. This was accomplished without any intense SEO--I just created the site I wanted, and over the years it crept toward the first page. Recently, perhaps on Sept 15, my SERP position fell into the 400s for most data centers. Today, the final DC also knocked me down.

The only reason I can think of is that it now fits a spam profile.

A few months ago, my navigation submenus grew in size to hundreds of links. I decided to trim the initial page size by post-loading the submenus and sidebars. With that HTML out of the way, Google sees only the page's unique content and far fewer links than before. Google seemed to like this at first. Might Google's Sept 15 change be punishing me for this type of post-loading?

My keyword density is probably higher now that the page is shorter. How high is too high? The keyword tool I use reports <one keyword> at 9.4%. Would it be worth it to use <synonyms> more often?
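For readers unfamiliar with how such tools arrive at a figure like 9.4%, keyword density is generally just occurrences of the term divided by the total word count. A minimal sketch follows; the function name and sample text are illustrative, not taken from any actual tool or from the poster's page:

```javascript
// Rough keyword-density calculation as most SEO tools of the era
// computed it: occurrences of the exact keyword divided by total words,
// expressed as a percentage. Sample inputs are made up for illustration.
function keywordDensity(text, keyword) {
  var words = text.toLowerCase().split(/\s+/).filter(Boolean);
  if (words.length === 0) return 0;
  var target = keyword.toLowerCase();
  var hits = 0;
  for (var i = 0; i < words.length; i++) {
    if (words[i] === target) hits++;
  }
  return (hits / words.length) * 100;
}
```

By this measure, a page where one term accounts for nearly one word in ten is unusually dense, which is why substituting synonyms lowers the figure.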

<Sorry, no specifics.
See Forum Charter [webmasterworld.com]>

[edited by: tedster at 6:05 pm (utc) on Sep. 22, 2006]

kennylucius

8:36 pm on Sep 22, 2006 (gmt 0)

10+ Year Member



In another thread, tedster may well have identified a problem with my site:

No, I would NOT suggest altering the text used in the same navigational link on different pages. For one thing, that can confuse a visitor. They expect and need consistency from page to page.

For another thing, Google recognizes the page template and to a degree can "see" the navigation as contrasted to the body copy. The repeating parts of the template have a distinctive footprint, in other words. If you start varying that anchor text from page to page, it just "might" look like you're trying to play a game with Google.

When I began to post-load approx. 40K worth of navigational links, I set a placeholder for them (these are in hidden DIVs that pop up) containing a single link to a site-map page. I varied the link text between pages, so the template's footprint probably looked suspicious. Now I am removing the textual variation so that the template is identical on all pages.

It seems unlikely that this is the cause of my recent SERP problem. My SERP improved slightly when I implemented this change a few months ago, but went sour about a week ago. Might Google's alleged analysis of template footprints take that long? If so, how will I ever know if this is the source of the problem?

tedster

10:28 pm on Sep 22, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I decided to trim the initial page size by post-loading the submenus and sidebars. With that HTML out of the way, Google sees only the page's unique content and far fewer links than before. Google seemed to like this at first.

I suggest taking a look at what you are calling "post-loading" here. Are you using a scripted solution of some kind? If so, is there still a regular html click path (<a href="a url">) to all the urls that used to rank well? Is this click path to your deeper pages longer than it used to be? Were any urls changed with this new approach? Do your deep pages still interlink as well as they used to -- in other words, is PageRank still voted around your domain in the same manner?

A lot of times widespread changes, especially in linking patterns, can take a while to make their effect fully felt in the SERPs. I think this may be more what you are noticing, rather than having accidentally matched some spammer profile. I mean, you're not doing meta refreshes or javascript redirects, are you? In other words, nothing that changes the page without some kind of intentional user interaction.

However, widespread changes may also be causing some temporary filter so that the trustability of your new pages can be tested.

kennylucius

12:12 am on Sep 23, 2006 (gmt 0)

10+ Year Member



Thanks for your interest, tedster.

Currently, the page loads with 3 hidden DIVs containing 3 links that are usable in case the post-load fails. The post-load is a "body onload" script that retrieves the submenus using AJAX and replaces the innerHTML of the hidden DIVs. I use this script to load sidebars as well.

The result is that the page displays very quickly, and hidden/peripheral stuff fills in some seconds later.

That means the template has 3 navigation links readable by SEs (without javascript). These link to the main page and 2 full-page versions of the submenus.
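For anyone trying to picture the setup, here is a minimal sketch of this kind of post-load. The function name, element id, and URL are illustrative placeholders, not the poster's actual code:

```javascript
// Post-load sketch: after the page renders, fetch a submenu's HTML via
// XMLHttpRequest and swap it into a hidden placeholder DIV. Until the
// script runs, the DIV holds a plain <a href> link that search engines
// (and no-JS visitors) can still follow.
function loadPanel(divId, url) {
  var xhr = new XMLHttpRequest();
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      document.getElementById(divId).innerHTML = xhr.responseText;
    }
  };
  xhr.open('GET', url, true);
  xhr.send(null);
}
// Wired up roughly as: <body onload="loadPanel('submenu', '/submenu.html')">
```

The SEO-relevant point is the last comment: crawlers that don't execute the onload script only ever see the placeholder link, not the fetched submenu HTML.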

Google has indexed 52,000 pages and crawls 2,000-3,000 pages/day, so I would expect to see some indication of a problem fairly quickly. I re-organized my site structure (changed some URLs) about nine months ago without any SERP problems.

I do use a meta refresh in one PHP file. I provide nofollow links to associated booksellers and the like, and they are all channeled through one disallowed PHP file that creates a "thank you" page and does a meta refresh to the associated site (and counts the click-through). I've been doing that for years. Do you think the recent Google update might be discouraging that?
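For context, the "thank you" page described would emit something along these lines. This is a hypothetical reconstruction -- the delay, URL, and wording are placeholders, and the actual PHP that counts the click-through is the poster's own:

```html
<!-- Hypothetical output of the disallowed redirect script: a brief
     "thank you" page that meta-refreshes the visitor to the affiliate
     site after 2 seconds. example.com is a placeholder destination. -->
<meta http-equiv="refresh" content="2;url=http://www.example.com/">
<p>Thank you! Taking you to the bookseller now&hellip;</p>
```

Because the PHP file is disallowed in robots.txt, a compliant crawler should never fetch this page or see the refresh at all, which is presumably why it has been safe for years.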

tedster

12:41 am on Sep 23, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Not as far as I can see. I'd still look closely at the major change you described, how it altered click paths and PR circulation around the site.

Don't just look at toolbar PR, which may be way out of date. Instead map out how your pages interconnect with javascript turned off and see what you can discover. It sounds to me like at least some of your PR may have re-channelled to your "2 full-page versions of the submenus" instead of flowing directly from one important inner page to another. Just a guess, but worth studying, IMO.

Also, you just made a pretty global change, so it may need to sit for a while to re-establish full trust. Do you keep a changelog, especially for major changes to the site? This simple step, which can go unutilized for years, can suddenly be an essential tool if things start to go south.

[edited by: tedster at 2:33 am (utc) on Sep. 26, 2006]

kennylucius

1:59 am on Sep 23, 2006 (gmt 0)

10+ Year Member



Yes, I'll let it sit for a while. SERPs are changing wildly right now. Mine has not been worse than 50 in years, but in the last few hours it has fluctuated from the 400s to low 100s. Perhaps Google will soon fix their problem and everything will be fine again.

kennylucius

1:13 am on Sep 26, 2006 (gmt 0)

10+ Year Member



The SERPs seem to have settled down a bit, and I noticed something that has me confused. My preferred keyword phrase, "blue widgets", has fallen from 18 to 160. The singular "blue widget" has only fallen from 9 to 32.

I have tried to use the plural form in my front-page text, but it was always second to the singular in the SERPs. I assumed it was because "widgets" contains "widget", and so the density for the singular would be higher.

I can't understand why the plural keyword has been kicked so much farther than the singular. (Actually, I can't understand why any of this has happened.) If I were more cynical, I would suspect this had something to do with my deleting an AdWords campaign that used those keywords. I deleted it the day before this happened. Sorry, everyone!

Seriously, as I said before, I have reduced the density of these keywords on my front page (by substituting "gadget" and so forth), and I'm waiting for Google to crawl it again. Hopefully this is just a matter of the density threshold having been changed.