| 4:06 am on Jan 17, 2008 (gmt 0)|
I've been slapped by Google on my index page, going from a high position on the first page, to a low position on the first page, to position 60 or thereabouts.
Re-doing some on-page factors brought me back to the low 30s; working on links got me back to the first page. Total time: 5 months.
I suppose the first question should be what changes have you recently made that could account for the drop?
| 4:32 am on Jan 17, 2008 (gmt 0)|
I've got the same problem - number 1 for years, and now stuck on page 2.
My guesses:
- You've used the same anchor text over and over in link exchanges
- You've been at the top for a long time, but you have no good (natural) links - a sign that your site may not deserve the number 1 spot
- You've made changes to your site too frequently -> yours is not seen as a reliable source
- Google has discounted or devalued exchanged links
Have you done any of the above?
Another Florida update. :)
| 4:45 am on Jan 17, 2008 (gmt 0)|
When this happens, I check
1. Accidental empty links, e.g. <a href=""></a>
2. Links to dead sites. Xenu works fine, if your CMS does not do it for you.
3. Links to bad sites. Add rel="nofollow" to all links unless you are sure the target is good.
4. Dupe content. Run something like GSiteCrawler - yes, it will also find your /index.htm vs / problems.
5. Obvious over-optimisation: repetition of a word 10 times in a paragraph, or a word appearing in 100 anchors on a page, etc.
6. Check if someone has copied your site.
This list invariably fixes things for me within 2-4 weeks of a sudden drop.
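Check 1 on the list can be automated with a short script. This is a minimal sketch using Python's standard-library HTML parser; the sample markup at the bottom is hypothetical - feed it the HTML of your own pages instead.

```python
from html.parser import HTMLParser

class EmptyLinkFinder(HTMLParser):
    """Counts <a> tags whose href attribute is missing or empty."""
    def __init__(self):
        super().__init__()
        self.empty_links = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href", "")
            if not href.strip():
                self.empty_links += 1

# Hypothetical page snippet: two broken anchors, one good one
html = '<p><a href="">oops</a> <a href="/page.htm">fine</a> <a>no href</a></p>'
finder = EmptyLinkFinder()
finder.feed(html)
print(finder.empty_links)  # 2
```

For dead links (check 2) you would additionally need to fetch each href and look at the response code, which is what Xenu does for you.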
| 5:25 am on Jan 17, 2008 (gmt 0)|
Also, I've read much debate over whether links back to the homepage should be /index.htm or just /.
Can someone explain if this matters for Google?
You need to study the duplicate content threads section here [webmasterworld.com...] - it answers your question, but read the rest as well, as they are all very informative.
| 6:42 am on Jan 17, 2008 (gmt 0)|
I'm interested in your thought process with the duplicate content issue. I'd think, and I could be way off, that a duplicate content issue would be more inclined to produce a more gradual decline in SERP position for given keywords.
OP states they dropped 20 pages overnight on some keywords, but other keywords have retained position. That seems to me like reaction to an event, rather than adjustment based on availability of content.
I run mostly static pages, so duplicates aren't as big of an issue for me as they could be for other people, but I'd really like to know if it can whack you that big, that fast.
On a different note, I've also experienced the same issues lakr and nippi have noted - taking care of routine maintenance helps keep sites viable.
| 7:52 am on Jan 17, 2008 (gmt 0)|
It was only an answer to nkarnold's question.
Have a look at this thread: [webmasterworld.com...] Basically, my take on it is this: robots are pretty dim - if you give them an option, sometimes they get it right and sometimes they don't. So to stop them getting confused, you only allow them to index one version of a page, not multiple versions. Redirecting /index.htm -> / prevents the bot from having to make a choice. Google appears to prefer the shortest version of a URL, so having large numbers of inbound links to your home page as example.com/index.htm can sometimes cause the bot issues - not because the links point at a different URL, but because that URL has content identical to the / version.
I'm guessing from nkarnold's statements that it's his homepage SERPs that fell away, as he mentions the other pages are still ranking.
So while Google is trying to decide which is the correct URL to appear in its SERPs, it will sometimes filter those pages.
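For anyone wanting to apply this, one common way to do the /index.htm -> / redirect is a mod_rewrite rule in Apache .htaccess. This is a sketch, assuming an Apache server and a homepage file actually named index.htm - adjust the filename (index.html, index.php, etc.) to match your site:

```apache
RewriteEngine On
# 301-redirect any request for index.htm to the bare directory URL,
# so bots and links consolidate on the / version
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /(.*/)?index\.htm[\?\ ]
RewriteRule ^(.*/)?index\.htm$ /$1 [R=301,L]
```

The condition on %{THE_REQUEST} matters: it makes the rule fire only on direct external requests, not on the internal lookup Apache does when serving / itself, which would otherwise loop.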
| 9:13 am on Jan 17, 2008 (gmt 0)|
Thanks for the time taken for responses,
I'll start looking at some of the suggestions straight away.
| 10:26 am on Jan 17, 2008 (gmt 0)|
Just to follow up.
GSiteCrawler shows my forum containing duplicate content
http://www.example.com/forum/search.php?search_author=CHEAPTRAMADOLONLINE (CompanyName ::)
http://www.example.com/forum/search.php?search_author=Dan (CompanyName ::)
http://www.example.com/forum/search.php?search_author=DarkRED (CompanyName ::)
http://www.example.com/forum/search.php?search_author=Devilbod (CompanyName ::)
http://www.example.com/forum/search.php?search_author=edigiompitoni (CompanyName ::)
http://www.example.com/forum/search.php?search_author=Fernando (CompanyName ::)
Should I let Google index my forum at all? It's not huge by any means.
Thanks in advance
| 6:30 pm on Jan 17, 2008 (gmt 0)|
Looks like you should not let Google index your forum's SEARCH URLs - that's where the duplicate trouble is showing up. The rest of your forum's URLs probably hold unique content that can help you.
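Assuming your forum's search lives at /forum/search.php as in the URLs you posted, a couple of lines in robots.txt will block just the search script while leaving the rest of the forum crawlable:

```
User-agent: *
Disallow: /forum/search.php
```

That matches every URL starting with /forum/search.php, including all the ?search_author= variations, so the duplicate search result pages stop being crawled while your actual threads stay indexable.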
| 7:53 pm on Jan 17, 2008 (gmt 0)|
Yes, thanks for the info.
I'm beginning to suspect this may be the root cause of my problem.