I was wondering if anyone could help shed some light on my problem?
I have a number of sites that offer a hire service to customers based on their location. I have created a site per location and this strategy has been working very well for the last three months. When customers type in the product and their town, my site has appeared #1 and then from about #8 all the rest of my sites were listed (I have 70 in total). For some of the smaller towns my sites actually occupied the first 50 positions of the search.
On Monday I noticed that my sites no longer appear on the first page and they don't appear until you get to the third page of results.
All the towns but one are subdomains of my main domain and the one site that has a separate domain is unaffected by this problem.
I have searched the different datacentres and found that one datacentre brings back more favourable results, with my sites appearing from #10.
One other thing I noticed about the results Google brought back: for the search terms I was using, the suggested Google Directory category didn't match the search I had done, and clicking on the suggested category took me to a 'category no longer exists' page.
Any ideas as to what might have happened?
Google will penalise sites that are too similar, and any linking that is obviously unnatural (make sure link text actually makes sense to the user and is relevant to the page).
It's not the number of sites that matters, but how relevant and unique each site is for the search. It's important to make sure all the sites you have offer value to the customer for their search, i.e. that each is unique; duplication of any kind will immediately be spotted. By focusing on user needs, penalties for duplication can be avoided; in particular, look out for indicators that Google can use to associate a group of sites (like shared headers and footers).
You need to avoid Google concluding, "hey, these sites are actually the same, or run by the same people, showing the same thing," and hence have no value in Google's eyes. Google tends to group all such sites in the same place with something like "other sites", much as it does with the pages of a single company's site.
Can everyone else reciprocate?
There's a way of doing this without so many pages, where the visitor reaches one page for the keyword and location, as long as someone else hasn't optimized for it: use a table and include all the locations in bold. At least it works for me. I've seen something similar in the SERPs and it looks kind of ugly. It's also kind of misleading, as it eventually takes you to auto dealers rather than what the original title said; the first 90-plus results all have the same title but with a different state or city mentioned.
By the way, you may be surprised how many people use state abbreviations or other regional phrases in their searches as well. It's nice to include those too.
[edited by: Hardwood_Guy at 1:30 pm (utc) on Sep. 11, 2003]
Any webmaster would love to box out everyone the way you've been doing. But having been a surfer far longer than I've been running a site, I'll tell you it's very, very frustrating having to dig past two or three pages of results just to get to another site. When I see results like you described, the first thing that goes through my mind is that your site is pure spam and Google is broken. By not having them all bunched up, who knows, you may find yourself getting better results.
You aren't the one whose network of sites has been mentioned here [webmasterworld.com], are you? ;)
I would agree with comments about it getting rid of spam and how annoying it can be to get pages of results that all lead to the same site.
I do believe that my sites don't fall into this category, because each site does not contain duplicate content for the user. Each site is specialised for the user's local area, i.e. the town they live in and the smaller towns and villages that surround it. Each site may look the same from a design standpoint, but it provides a specialised service for its targeted area and so really is unique.
I wonder at what point does Google say that two sites are too similar or duplicate?
Some things to consider:
-The amount of code on each page
-The amount of text on each page
-The amount of difference in code on each page
-The amount of difference in text on each page
If you have a lot of code and little text and you only change the text on each page, then that might not be enough to make each page different. Similarly, if you have a lot of text and little code and you only change the code on each page, then that might not be enough to make each page different.
I have heard that a 25% difference on each page is enough. So if you have equal amounts of text and code on each page, you could change half the text or half the code, and that would be enough to avoid being flagged as duplicate. I haven't actually tested this, though.
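For what it's worth, nobody outside Google knows how their duplicate filter actually works, and the 25% figure above is hearsay. But as a toy illustration of how textual overlap between two pages might be measured, here's a common technique: Jaccard similarity over word "shingles" (overlapping word n-grams). The example pages and the n=3 shingle size are just made-up assumptions for the demo.

```python
def shingles(text, n=3):
    """Return the set of overlapping n-word shingles in text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard_similarity(a, b, n=3):
    """Fraction of shingles the two texts share (0.0 = disjoint, 1.0 = identical)."""
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa and not sb:
        return 1.0
    return len(sa & sb) / len(sa | sb)

# Two hypothetical doorway-style pages that differ only in the town name:
page_a = "cheap tool hire in Leeds with same day delivery to your door"
page_b = "cheap tool hire in York with same day delivery to your door"

print(round(jaccard_similarity(page_a, page_b), 2))  # prints 0.54
```

Note how changing a single word still leaves the pages more than 50% similar by this measure, which gives some intuition for why swapping only the town name on 70 otherwise identical pages might trip a filter.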
[edited by: dougmcc1 at 3:10 pm (utc) on Sep. 18, 2003]
Does that make sense to you? The above scenario is bad for the user.
It could be that you don't have many competitors, which is why you can dominate positions 8-70, or it could be that you're dominating a less-trafficked keyword rather than more competitive ones.
But the martini enthusiast in Hilton Head does not benefit from seeing your listing for a Chicago-area martini bar, and I'm certain that Google would like to avoid this.
Duplicate content filters: the threshold for duplicate content, in my opinion, is broader than a one-page or two-page comparison. I think it's more of a sitewide process of comparing one site to another.
What is the threshold of duplication? Nobody can say with certainty.
When you break my sites down, the top half contains the information for that local area and also offers the user a chance to buy it from a local supplier. The lower half is a sitemap with links to each site in the network.
What appears to be happening is that Google is picking up the references to other towns from the sitemap list and displaying all the sites for a search on any of the targeted towns, even though 69 of them contain only a single reference to that search term.
I don't have a problem with Google filtering out duplicate content; the problem is that at the moment they're not showing the sites even when they're relevant.
The frustration I have is that other sites in my category don't actually offer the user anything but a page of links, whereas on mine they can actually get the product they were looking for. Yet despite this, my site is the one that gets affected.
Any ideas as to what might have happened?
Possible scenarios:
There are many who come to WW looking for keyword advice, title tag advice, and so on, and some folks will point them to the Overture tool, etc. I think this is like handing out guns to people who are unprepared to use them, and this case is a perfect illustration of that.
That's why I encourage people to read before they touch their websites. The next website blowing up could very well be yours.