Forum Moderators: open

Message Too Old, No Replies

Indexed w No Title, No Snippet, Not in SERPs,


Phrankenscents

3:40 pm on Jul 7, 2004 (gmt 0)

10+ Year Member



OK, I know it's been discussed, but did anyone definitively find out why this recently happened to a bunch of previously fine pages? What is the remedy?

Marcia

9:50 pm on Jul 7, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Some people seem to think that Google is cutting down on the number of pages or amount of data indexed because they're "running out of room," but I don't buy that. All they'd have to do is a search for "cheap hard drives" and that problem would be solved.

Nothing can be proven, but I've got a gut feeling that they're going after duplicate or near-duplicate pages. Don't take my word for it, but see if your dropped pages come close to duplicating each other - check what percentage of each page is shared text and see if that's a possibility; it can't hurt.
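For anyone who wants to put a rough number on "check percentages," here is a minimal sketch of one common way to estimate page overlap: word-shingle Jaccard similarity. This is just an illustrative heuristic; Google's actual duplicate-detection method is not public, and the shingle size and any threshold you pick are guesses.

```javascript
// Estimate how much of two pages' visible text is shared, using
// overlapping word "shingles" (here, 3-word sequences) and Jaccard
// similarity: |intersection| / |union|. A score near 1.0 means the
// pages are near-duplicates; near 0.0 means mostly unique text.

function shingles(text, size) {
  var words = text.toLowerCase().split(/\W+/).filter(function (w) {
    return w.length > 0;
  });
  var set = {};
  for (var i = 0; i + size <= words.length; i++) {
    set[words.slice(i, i + size).join(" ")] = true;
  }
  return set;
}

function similarity(textA, textB, shingleSize) {
  var size = shingleSize || 3;
  var a = shingles(textA, size);
  var b = shingles(textB, size);
  var both = 0, total = 0, key;
  for (key in a) { total++; if (b[key]) both++; }
  for (key in b) { if (!a[key]) total++; }
  return total === 0 ? 0 : both / total;
}
```

You would feed this the stripped body text of two of your dropped pages; if most pairs score very high, the duplicate theory is at least plausible for your site.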

Disclaimer: Not intended to be taken as fact. When all else fails, the Accidental SEO relies on hunches but doesn't recommend the practice to anyone else.

Phrankenscents

10:48 pm on Jul 7, 2004 (gmt 0)

10+ Year Member



Thanks for the reply... actually, I've conceded that this is probably due to a dup penalty, but I don't think my case really warranted it (doesn't everyone think that?)... see my other thread regarding duplicate content remedies... would love to get input there. Thanks.

somerset

7:11 pm on Jul 9, 2004 (gmt 0)

10+ Year Member



Duplicate content can be a real balancing act. Take, for example, large OSCommerce sites where the architecture is the same across 1,000s of products. The only differences are the specific product details.

The site is using the same template throughout (as do most shopping sites).

Would that trip the dupe penalty?

I cannot find any other reason as to why a site I care for has dropped.

The site in question had a PR5 but was assigned a PR1 (even though the same quality links are present and shown in backlinks). The site now performs as a PR1 site would - awfully in a competitive area.

Have any other OSC sites suffered?

I wonder if Google is trying to cut out duplicate content generally on the web, and has just gone too far, with innocent sites suffering.

sasha

11:44 pm on Jul 11, 2004 (gmt 0)

10+ Year Member



The recent Google index shows bare URLs (no title, no description) for about 90% of my pages.

- I did not see a Google deep crawl for about 3-4 weeks.

- The site was down for about 12 hours recently.

- About 6 weeks ago I added a large amount of visible common text to almost all of the pages, so that only 10-15% of each page's content remained unique. The rest of the content is identical throughout the site.

Is this a duplicate penalty or lack of deep crawling or something else?

I have already eliminated the common text. It was great for site users to have that instructional text on each of the individual pages, but if it gets me in trouble with the great Google - I guess I have no choice!

Perhaps in the future I will consider 'wrapping' that text inside JavaScript or Flash so that Google would not spider it...
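For what the 'wrapping' idea might look like in practice: in this era, search crawlers generally did not execute JavaScript, so text written into the page by a script at load time was usually invisible to them while still visible to users. This is only a sketch of that approach, not a recommendation (hiding content from crawlers carries its own risks, and crawlers that do render JavaScript will see it anyway); the element id and function names here are made up for illustration.

```javascript
// Escape the shared instructional text so it can be safely injected
// into the page from a script rather than shipped in the static HTML.
function escapeHtml(text) {
  return text
    .replace(/&/g, "&amp;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;");
}

// At page load, write the shared copy into a placeholder element
// (e.g. <div id="instructions"></div>) so it is not in the crawled
// HTML source. Call this from an onload handler.
function injectInstructions(targetId, text) {
  var el = document.getElementById(targetId);
  if (el) {
    el.innerHTML = "<p>" + escapeHtml(text) + "</p>";
  }
}
```

The same effect is sometimes achieved by serving the script from an external .js file that is disallowed in robots.txt, so even the script body is never fetched by the crawler.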

Any suggestions would be appreciated.