Googlebot is still crawling the site, but doesn't seem to have added any pages since this happened.
Here are some things I may have done wrong:
1. The site is only a few weeks old, and I got quite a lot of links from 'do-follow' blogs (about 200). I didn't mean to get that many, but some sites show my comment in the sidebar on every page.
2. The site is mostly built on a database of place names, so the titles and descriptions are nearly the same on every page, i.e. "Lovely great Widgets in Placename...", with only the place name changing. I was also using the place name a lot in headings and text on the page, but I have cut a lot of this out now.
3. I had a lot of internal links; I didn't realise this was a problem until I started reading about filters. I have since removed a lot of them and gone for more of a breadcrumb navigation.
4. I messed up some of my scripting, so Googlebot found a few hundred 404s before I realised and fixed it.
5. There were two duplicate pages which I didn't notice for a few days.
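Points 2 and 4 above are the easiest to check for mechanically before Googlebot finds them. As a rough sketch (the URLs, titles, and the "drop the last word" fingerprint below are made up purely for illustration, not a real crawl of the site), a small script can flag templated near-duplicate titles and 404 responses from a list of crawled pages:

```python
from collections import Counter
import re

def title_fingerprint(title):
    # Crude illustration: lowercase the title, then drop the final word,
    # so "Lovely great Widgets in London" and "... in Paris" collapse to
    # the same template. A real check would be smarter about slot position.
    words = re.findall(r"[a-z']+", title.lower())
    return " ".join(words[:-1])

def audit(pages):
    """pages: list of (url, http_status, title) tuples from a crawl."""
    not_found = [url for url, status, _ in pages if status == 404]
    fingerprints = Counter(title_fingerprint(t) for _, _, t in pages)
    duplicates = {fp: n for fp, n in fingerprints.items() if n > 1}
    return not_found, duplicates

# Hypothetical crawl results for illustration only.
pages = [
    ("/widgets/london", 200, "Lovely great Widgets in London"),
    ("/widgets/paris", 200, "Lovely great Widgets in Paris"),
    ("/blog/launch", 200, "Why we built a widget directory"),
    ("/widgets/old-page", 404, "Not found"),
]

not_found, duplicates = audit(pages)
print(not_found)   # broken URLs to fix or redirect
print(duplicates)  # title templates shared by more than one page
```

Running something like this regularly would have caught both the scripting 404s and the boilerplate titles much earlier; the fingerprinting here is deliberately naive and would need tuning for real titles.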
Can anyone suggest which of these is the most likely culprit? I guess all of them added together doesn't look good... Am I doomed to be hidden in the SERPs for months, or forever?
I am ranking OK in Yahoo, but who cares :s
Google is not nearly as impressed by self-generated links as it used to be. You'll need a lot more than just links in blog comments for Google to take you seriously.
the titles and descriptions are nearly the same on every page
So what is the value of your site, exactly?
The site does have a blog section with unique titles and descriptions on each page. The majority of pages are location-specific with user-generated content and the generic titles.