Google News Archive Forum

    
Could Google be penalising me for having too many sites?
kid80
msg:52402
9:38 am on Sep 11, 2003 (gmt 0)

Hi,

I was wondering if anyone could shed some light on my problem.

I have a number of sites that offer a hire service to customers based on their location. I created a site per location, and this strategy has been working very well for the last three months. When customers typed in the product and their town, my site appeared at #1, and from about #8 onwards the rest of my sites were listed (I have 70 in total). For some of the smaller towns my sites actually occupied the first 50 positions of the search.

On Monday I noticed that my sites no longer appear on the first page; they don't show up until the third page of results.

All the towns but one are subdomains of my main domain, and the one site that has a separate domain is unaffected by this problem.

I have searched the different datacentres and found that one datacentre brings back more favourable results, with my sites appearing from #10.

One other thing I noticed about the results Google brought back: for the search terms I was using, the suggested Google Directory category didn't match the search I had done, and clicking on the suggested category took me to a 'category no longer exists' page.

Any ideas as to what might have happened?

 

The Subtle Knife
msg:52403
1:04 pm on Sep 11, 2003 (gmt 0)

Looking back at comments on this site and the Google knowledge base:

Google will penalise sites that are too similar, and any linking that is obviously unnatural (make sure link text actually makes sense to the user and is relevant to the page).

It's not the number of sites, but how relevant and unique each site is for the search.

It's important to make sure all the sites you have offer value to the customer for their search, i.e. that each is unique - duplication of any kind will quickly be spotted.

By focusing on user needs, penalties for duplication can be avoided. In particular, look out for indicators that Google can use to associate a group of sites (like shared headers and footers). You need to avoid Google deciding "hey, these sites are actually the same, or the same people showing the same thing" and hence that they have no value. Google tends to group all such sites in the same place with something like "other sites", as if they were pages of one company site.

Can anyone else confirm this?

Hardwood Guy
msg:52404
1:19 pm on Sep 11, 2003 (gmt 0)

Did we welcome kid to WebmasterWorld? Welcome, kid :)

There's a way of doing this without so many pages: the visitor reaches one page for the keyword and location, as long as someone else hasn't optimized for it. Use a table and include all the locations in bold--at least it works for me. I've seen something similar in the SERPs and it looks kinda ugly. It's also kind of misleading, as it eventually takes you to auto dealers and not what the original title said. The first 90-plus results are the same title but with a different state or city mentioned.

Btw... you may be surprised how many people use state abbreviations or other regional phrases in their search as well. Nice to include them too.

[edited by: Hardwood_Guy at 1:30 pm (utc) on Sep. 11, 2003]

BlueSky
msg:52405
1:29 pm on Sep 11, 2003 (gmt 0)

You may not want to hear this, but I think what Google did was good -- maybe not for you, but for the user. The worst thing from a user's perspective is to get five pages of results all coming from one site, albeit on different subdomains.

Any webmaster would love to box out everyone the way you've been doing. Having been a surfer way longer than I've been running a site, I'll tell you it's very, very frustrating having to dig past two or three pages just to get to another site. When I see results like you described, the first thing that goes through my mind is that the site is pure spam and Google is broken. With them no longer all bunched up, who knows, you may find yourself getting better results.

Small Website Guy
msg:52406
6:22 pm on Sep 11, 2003 (gmt 0)

I get really mad when I do a search, and I see the same site pop up 50 times, each version only slightly different.

I hope Google figures out how to make this stop happening, so people can find honest websites.

Yidaki
msg:52407
6:35 pm on Sep 11, 2003 (gmt 0)

Welcome to WebmasterWorld, kid80!

You aren't the one whose network of sites was mentioned here [webmasterworld.com], are you!? ;)

kid80
msg:52408
10:13 am on Sep 18, 2003 (gmt 0)

Thanks for the feedback.

I would agree with the comments about it getting rid of spam and how annoying it can be to get pages of results that all lead to the same site.

I do believe that my sites don't fall into this category, because each site does not contain duplicate content for the user. Each site is specialised for the user's local area, i.e. the town they live in and the smaller towns and villages that surround it. Each site may look the same from a design standpoint, but it provides a specialised service for its targeted area and so really is unique.

I wonder at what point Google decides that two sites are too similar, or duplicates?

dougmcc1
msg:52409
2:43 pm on Sep 18, 2003 (gmt 0)

I wonder at what point Google decides that two sites are too similar, or duplicates?

I think pages aren't considered duplicates as long as the title, META tags and copy are different. Of course, that's assuming you have more than a 2-word title, a 5-word description and one line of copy on every page.

Some things to consider:
-The amount of code on each page
-The amount of text on each page
-The amount of difference in code on each page
-The amount of difference in text on each page

If you have a lot of code and little text and you only change the text on each page, then that might not be enough to make each page different. Similarly, if you have a lot of text and little code and you only change the code on each page, then that might not be enough to make each page different.

I have heard that a 25% difference on each page is enough. So if you have equal amounts of text and code on each page, you could change half the text or half the code and that would be enough to avoid duplication. I haven't actually tested this, though.
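
For what it's worth, here's one way you could eyeball those numbers yourself. This is only a rough sketch in Python: the 25% figure is hearsay, and difflib's similarity ratio is just one plausible stand-in for whatever metric Google actually uses, which nobody outside Google knows.

    # Rough estimate of how different two pages are, separating the
    # markup ("code") from the visible text, in the spirit of the
    # rumoured 25% threshold above. Purely illustrative.
    import re
    from difflib import SequenceMatcher

    TAG_RE = re.compile(r"<[^>]+>")

    def split_page(html):
        """Split a page into its markup and its visible text."""
        text = TAG_RE.sub(" ", html)
        code = " ".join(TAG_RE.findall(html))
        return code, text

    def percent_difference(a, b):
        """How different two strings are, as a percentage (0 = identical)."""
        return (1.0 - SequenceMatcher(None, a, b).ratio()) * 100

    page1 = "<html><body><h1>Widget hire in Springfield</h1></body></html>"
    page2 = "<html><body><h1>Widget hire in Shelbyville</h1></body></html>"

    code1, text1 = split_page(page1)
    code2, text2 = split_page(page2)
    print("code differs by %.1f%%" % percent_difference(code1, code2))
    print("text differs by %.1f%%" % percent_difference(text1, text2))

On these two toy pages only the town name changes, so both percentages come out tiny -- which, if the rumour has any truth to it, is exactly the situation to avoid.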

[edited by: dougmcc1 at 3:10 pm (utc) on Sep. 18, 2003]

plasma
msg:52410
2:56 pm on Sep 18, 2003 (gmt 0)

my site appeared at #1, and from about #8 onwards the rest of my sites were listed (I have 70 in total). For some of the smaller towns my sites actually occupied the first 50 positions of the search.

And you wonder why you were penalized?

martinibuster
msg:52411
3:00 pm on Sep 18, 2003 (gmt 0)

kid80,
If I'm looking for martini bar services in Chicago, I don't want to have ALL the Chicago martini bars drowned out by your martini bar listings for Des Moines, Hilton Head, Palm Beach, etcetera and etcetera.

Does that make sense to you? The above scenario is bad for the user.

It could be that you don't have enough competitors, and that's why you can dominate positions 8-70; or it could be that you are dominating a less-trafficked keyword than other, more competitive ones.

But, the martini enthusiast in Hilton Head does not benefit from seeing your listing for a Chicago area martini bar. And I'm certain that Google would like to avoid this.

Duplicate content filters: the threshold for duplicate content, in my opinion, is bigger than a one-page or two-page deal. I think it's more of a sitewide process of comparing one site to another.

What is the threshold of duplication? Nobody can say with certainty.
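
For what it's worth, the textbook technique for this kind of sitewide comparison is w-shingling with a Jaccard overlap score; whether Google's filter works anything like this is pure guesswork. A minimal sketch:

    # Minimal sketch of shingle-based near-duplicate detection, the
    # textbook technique for comparing documents or whole sites.
    # Google's real filter and threshold are unknown.
    def shingles(text, w=4):
        """Every contiguous run of w words in the text."""
        words = text.lower().split()
        return set(tuple(words[i:i + w]) for i in range(len(words) - w + 1))

    def jaccard(a, b):
        """Overlap of two shingle sets: 1.0 = identical, 0.0 = disjoint."""
        if not a and not b:
            return 1.0
        return len(a & b) / float(len(a | b))

    # Pool all the text from each site and compare the pools sitewide.
    site_a = "cheap widget hire in springfield with widgets delivered daily to your door"
    site_b = "cheap widget hire in shelbyville with widgets delivered daily to your door"
    print("similarity: %.2f" % jaccard(shingles(site_a), shingles(site_b)))

Two sites that differ only in the town name share almost all of their shingles, so pooled over whole sites the score climbs towards 1.0 -- the sitewide pattern described above.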

kid80
msg:52412
12:00 pm on Sep 19, 2003 (gmt 0)

martinibuster, I agree with you: if you search for a location-specific product, you only want to see sites for that location. It was never my intention to dominate the listings the way the sites did.

When you break my sites down, the top half contains the information for that local area and also offers the user the chance to buy the product from a local supplier. The lower half is a sitemap with links to each site in the network.

What appears to be happening is that Google is picking up the references to other towns from the sitemap list, and displaying all the sites for a search on any of the targeted towns, even though 69 of them have only one reference to that search term.

I don't have a problem with Google filtering out duplicate content; the problem is that at the moment they're not showing the sites even when they're relevant.

The frustration I have is that other sites in my category don't actually offer the user anything but a page of links, whereas on mine they can actually get the product they were looking for. Yet despite this, my site is the one that gets affected.

plasma
msg:52413
5:25 pm on Sep 19, 2003 (gmt 0)

The frustration I have

The frustration _YOU_ have?
What about all the GOOD guys who try to stick to the rules? The ones who, day by day, try to compete against cheaters like you without cheating themselves.

It's people like you who destroy good services like Google.

martinibuster
msg:52414
6:46 pm on Sep 19, 2003 (gmt 0)

Ok, ok, let's not judge. kid80 says they didn't intend to dominate the SERPs, so more than anything it sounds like their competition simply isn't very tough. I believe kid80 when they say this was unintentional, so let's move on with the question.

Any ideas as to what might have happened?

Possible scenarios:

  • Regardless of what you may believe about the uniqueness of your content, you may have triggered a duplicate content filter. If there is enough that is the same throughout all the sites, it's a dupe.
  • Examine your linking structure. Who links to your different sites? The link structure you mentioned could very likely have caused a penalty. It can look like you were kiting your PR. You have to seek different sources of inbound links for EACH web site. Variety of inbounds is highly important (see the sketch after this list).
  • With the dominance you achieved, innocently or not, it is fairly likely that someone ratted you out to the spam police.
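
On the inbound-link point, here's a quick sanity check, assuming (hypothetically -- Google gives you no such export) that you have your known backlinks as a list of (linking page, target site) pairs:

    # Count distinct linking domains per site in the network.
    # The backlink list below is hypothetical sample data.
    from urllib.parse import urlparse

    backlinks = [
        ("http://towns-portal.example/links.html", "springfield.example-hire.com"),
        ("http://towns-portal.example/links.html", "shelbyville.example-hire.com"),
        ("http://local-news.example/widgets.html", "springfield.example-hire.com"),
    ]

    linking_domains = {}
    for source, target in backlinks:
        linking_domains.setdefault(target, set()).add(urlparse(source).netloc)

    for site, domains in sorted(linking_domains.items()):
        print("%s: %d distinct linking domain(s)" % (site, len(domains)))

If every site in the network shows the same one or two linking domains, that's exactly the footprint of interlinked sites being treated as one entity.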

There are many who come to WW looking for keyword advice, or title tag advice, etc. Some folks will point them to the Overture Tool, etc. I think this is like handing out guns to people who are unprepared to use them. This case is a perfect illustration of that.

That's why I encourage people to read before they touch their websites. The next website blowing up could very well be yours.

kid80
msg:52415
9:35 am on Sep 23, 2003 (gmt 0)

Thanks everyone for the feedback.

I'm making some changes at the moment, so I guess I'll see if the situation improves.
