
Home / Forums Index / Google / Google SEO News and Discussion
Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

This 97 message thread spans 4 pages; this is page 2.
Dropped from Google - a checklist to find out why.
Let all the sites dropped fill this checklist so we can narrow it down.
HostingDirectory

10+ Year Member



 
Msg#: 29656 posted 3:54 pm on May 29, 2005 (gmt 0)

We all know a lot of sites have been dropped from Google, but we don't know why. Perhaps this update has a long way to go, perhaps it has not. If it is finished, we need to find out why we got dropped from the index.
I have assembled a checklist that I feel covers all the angles. If dropped sites could fill in the checklist, we might see a pattern occur. Then we can work out what we might need to change to get back in.
My checklist is below; for some of the points I've explained why they might be a factor.

1) Site size
It is reasonable to believe that a homepage that is too large will have too many people click away, so it may lose relevancy to show in the top results.
2) Outbound links
How many do you have on your homepage?
3) Inbound links
How many does your site have?
4) Adsense
Since it may connect innocent sites with scraper sites.. do you use it?
5) Content updated regularly?
Some sites do not have content updated much because they offer tools over info, but Google may consider sites with rare content updates to be poor quality and drop positions for them.
6) Adwords
Do you use paid advertising like AdWords? Maybe losing some places will make you pay more, or perhaps Google protects their paying clients?
7) Age of site?
How old is your site, perhaps older sites are likely to be better because they survived... so Google keeps them listed high?
8) Use of no follow tags on forums?
If you offer forums or blogs, do you use the no follow tag? Maybe we need to stop bad sites linking inside our sites?
9) Location of host server
Maybe our host location plays a part in how high we rank to certain users?
10) Dedicated or shared hosts?
Are we being punished for what other sites do in a shared hosting environment?
11) Redirects?
Do you use any kind of redirects that Google may be having trouble with?
12) Scrapper sites linking to you / content theft?
Do you have lots of scrapper sites suddenly linking to you or using parts of your content?
13) Are you listed in dmoz?
Perhaps Google pays more respect to dmoz listed sites?
14) Listed in Yahoo directory?
Perhaps Google doesn't want Yahoo directory pages to be listed high, or maybe it prefers them ranking high?
15) RSS feeds on site?
Using rss feeds might be causing some kind of duplicate content penalty?
16) Pagerank (before it disappeared)
What was your pagerank? Maybe a high pagerank is immune to any penalisations?
17) Extra domains pointing to main domain?
Do you have other domains pointing to your main domain that might be causing problems in Google's eyes?
18) Search engine friendly archives producing same content on different urls inside site?
Some forums like vBulletin have a search engine archive that produces the same content with a static html url.. maybe this might be picked up as duplicate content?
19) Did you bother taking LSI into consideration with onpage content?
Basically it seems Google is now using Latent Semantic Indexing in search results - so a search for zoo trips may look at page content and realise that zoo, wildlife and trips are related. So search results could give you wildlife trips for the term zoo trips.

It's a long list, but if we all fill it in we could then put the results in Excel and compare them.. maybe see a pattern that all dropped sites share. Then we can test that pattern against sites that did not get dropped.

Might be useful.

 

trillianjedi

WebmasterWorld Senior Member trillianjedi is a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 29656 posted 10:09 am on May 31, 2005 (gmt 0)

[webmasterworld.com...]

MHes

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 29656 posted 1:04 pm on May 31, 2005 (gmt 0)

danny

It's very unlikely to be scraper sites. We have a site that was running top for 2 years in a competitive sector. During February it dropped 70% of its Google traffic. It was replaced by another site of ours, an old PPC directory with loads of dead links, few good links in, which had not been touched for a year. Both sites have scraper sites linking to them in large numbers.

This month the original site is back with a vengeance and the old site has dropped a bit. We have not touched either. Another seriously good site in the same sector has now dropped.

It is impossible to work out why sites fluctuate. We don't even touch them and if you do, you may prevent the possible 'come back'.

The truth is, sites go up and down. If you are completely out of the index then you are very unlikely to ever recover.

PatrickDeese

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 29656 posted 2:13 pm on May 31, 2005 (gmt 0)

> I am unable to do a 301 re-direct

If I were you, I would change all your internal links to absolute links in order to combat links to the non-WWW - that will at least prevent the indexing of the entire site without the WWW subdomain.
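For anyone reading along who does have .htaccess access, the usual fix is a server-side 301 rather than link surgery. A mod_rewrite sketch along these lines (example.com is just a placeholder; assumes Apache with mod_rewrite enabled):

```apache
# Send every non-WWW request to the WWW hostname with a 301.
# Place in the site root .htaccess; example.com is a placeholder.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```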

danny

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 29656 posted 10:49 pm on May 31, 2005 (gmt 0)

MHes wrote:
The truth is, sites go up and down. If you are completely out of the index then you are very unlikely to ever recover.

I'm in the index - Googlebot indexes every page on the site at least daily - just showing up on page five for any searches.

But you're right, things do go up and down - my site was fine in January, got worse in February, and now has been hammered nearly completely, all without my doing anything (except steadily adding more reviews to it).

shrimp

10+ Year Member



 
Msg#: 29656 posted 3:39 am on Jun 1, 2005 (gmt 0)

A very popular, old and frequently updated site and subsite on that domain were very hard hit in Bourbon. Here is what I see: both sites now have 1000's and 1000's of scraper site links (yuk)

AND--first time for this--hosting geo location problem

Both sites just happen to be hosted by a fine hosting company located in Canada. Neither site has anything at all to do w/ Canada; the subject of one is non-geographical, just a family recreational niche..... but it has lost 1/2 its USA GG traffic,

The subweb, which was never intended to be a stand-alone website but just ended up that way through odd circumstances, has nothing to do w/ Canada and has mostly associations with the USA deep south, but is getting NO USA Google traffic, only Google Canada traffic. Major bummer - to me and probably to the searchers. Huge AdSense decline. HUGE.

and a search for one type of lighting fixture that is featured on the subweb site, where my former number 1 page covered ordering said fixture from overseas, step-by-step onsite assembly, etc... GG now has a sexual position website in the #1 SERP. Weird, and surely not what the majority of searchers are looking for under that term. I agree with others that Google has thrown out the (honeybaked) ham with the spam.

Reid

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 29656 posted 3:39 am on Jun 1, 2005 (gmt 0)

thanks for the post trillianjedi - good thread.

If you are using relative linking then I would recommend a base tag on every page to prevent confusion of user-agents, especially ones that are plotting your linking structure.
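Something like this in the head of every page does it (the URL here is a placeholder; use your own canonical WWW address):

```html
<head>
  <!-- Placeholder URL: point this at your canonical WWW address -->
  <base href="http://www.example.com/">
  <title>Page title</title>
</head>
```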

Reid

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 29656 posted 3:47 am on Jun 1, 2005 (gmt 0)

AND--first time for this--hosting geo location problem


That's interesting - did you get around it, and how?

I have a location-specific site about a Canadian locale but the site is hosted in LA. I get lots of US and CA GG traffic. And that's good because my major market is tourism (i.e. Americans).

shrimp

10+ Year Member



 
Msg#: 29656 posted 4:15 am on Jun 1, 2005 (gmt 0)

How did I get around it?
It was never an issue before. My US to CA traffic seemed about proportional to the number of searchers you'd expect from those locations for the subject at hand. It's just that the US traffic is now gone. Only Canada traffic remains.

max_mm

10+ Year Member



 
Msg#: 29656 posted 4:18 am on Jun 1, 2005 (gmt 0)

Don't be bothered by posts trying to discredit the scrapper theory. The posters are probably running a few of those. The scrapper theory (conveyed in threads like these) can't be good for their business, aka AdSense $$.

Scrapper operators enjoy the ride. It will probably take Google months if not years to fix it. Their algo is based on the quality of incoming links, and their only way to fix this is to ditch their current algo/filters and get back to the drawing board.

Google's recent problems start and end with millions of scrappers linking to innocent web sites. Full stop.

PatrickDeese

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 29656 posted 4:44 am on Jun 1, 2005 (gmt 0)

> Don't be bothered by posts trying to discredit the scrapper theory.

A) It's not scrapper - it's scraper. A screenscraper is a program which "scrapes" the visible content from a page. People are using screenscrapers to go to sites like Yahoo and copy the SERPs.

B) Just about every top 10 site for any commercial niche in the world has links from scraper sites. Why didn't every top 10 site in the world get hurt by the Bourbon update?

C) I don't have any scraper sites - and I am discrediting the "scraper" theory.

The algo updates on a fairly regular basis. Sometimes a few good sites get taken down with the bad ones. Sometimes a good site that was successfully using borderline techniques gets caught.

Getting caught sucks. But whether you are a good site that got caught in the sweep, or a good site that was using borderline techniques, the point is moot.

Look at what sites stayed in your SERPs and take them apart. Learn from the survivors and make the appropriate adjustments.

danny

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 29656 posted 5:10 am on Jun 1, 2005 (gmt 0)

Look at what sites stayed in your SERPs and take them apart. Learn from the survivors and make the appropriate adjustments.

I've never competed for any particular keywords - except maybe "book reviews", but that was more vanity than anything else - and the SERPs mostly look fine to me. So the "survivors" include sites of all kinds - it's just my pages that are on page six instead of page one.

So I have absolutely no idea whatsoever what the "appropriate adjustments" might be - I realise taking AdSense off my site is clutching at straws, but I just can't think of anything else.

Reid

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 29656 posted 5:13 am on Jun 1, 2005 (gmt 0)

That's funny.
I'm getting referrals from Google USA, CA and UK, but USA outweighs the others by far. Maybe it's the geo-location of the server combined with the niche.
I wonder what language code you are using?
Is it en-us or plain en? That may help; making it en-us would at least give the site an American accent.
Plain en reads as British and (by extension) Canadian.
But I would be more inclined to believe that it is the server location.
(pssst, that's why they came up with .ca, which you need a registered Canadian business # to get)
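One way to declare it, for what it's worth (just the standard http-equiv form, nothing site-specific):

```html
<!-- Declares American English; use "en" for plain English -->
<meta http-equiv="Content-Language" content="en-us">
```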

max_mm

10+ Year Member



 
Msg#: 29656 posted 5:16 am on Jun 1, 2005 (gmt 0)

A) It's not scrapper - it's scraper. A screenscraper is a program which "scrapes" the visible content from a page. People are using screenscrapers to go to sites like Yahoo and copy the SERPs.

B) Just about every top 10 site for any commercial niche in the world has links from scraper sites. Why didn't every top 10 site in the world get hurt by the Bourbon update?

C) I don't have any scraper sites - and I am discrediting the "scraper" theory.

The algo updates on a fairly regular basis. Sometimes a few good sites get taken down with the bad ones. Sometimes a good site that was successfully using borderline techniques gets caught.

Getting caught sucks. But whether you are a good site that got caught in the sweep, or a good site that was using borderline techniques, the point is moot.

Look at what sites stayed in your SERPs and take them apart. Learn from the survivors and make the appropriate adjustments.

Crap!

I already checked the competition out.

Here is what I discovered. Most of them are new sites with very few incoming links, and almost 0 scraper links. They are not performing well on Yahoo for their keywords.

And let's see if you can answer this question:
Wouldn't a site with thousands of incoming links rank well for its keywords? Then why does the exact opposite happen? The site disappears!?

More questions for you, Mr. know-it-all:
1) Is it not a fact that poogle penalizes for too many incoming links gained in a very short time?

2) Is it not a fact that poogle penalizes sites for duplicate content?

3) Is it not a fact that poogle can't handle 301 and 302 redirects properly and penalizes the innocent linked sites?

What do all the above 3 questions have in common? They describe the techniques scrapers are using to link to your content. Simple common sense, I'm sure you, and many others, would agree. Unless their brains are poisoned by poogle's false/outdated webmaster guidelines.

Now I would love to see you continue to discredit the scraper theory.

The amount of scraper links has not reached a certain threshold yet; this is the only reason it hasn't caught up with your sites yet. Give it some more time, pal, and you'll see for yourself what I am talking about.

Reid

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 29656 posted 5:39 am on Jun 1, 2005 (gmt 0)

Danny
try linking to relevant sites (or even non-relevant but clean sites that are a synonym of the keyword in some cases <this is gold because the others don't do it, so you are the only 'link' between the 2 indistinguishable niches>) who are ranking top 10 for the keyword you are targeting.
Who you link to does nothing for PR, so they say, but it can go a long way in placing yourself into their specific niche for that keyword.
This will only give the algo 'hints' of what niche you belong in, for whatever reasons unknown. Then you are truly competing with the top 10 for that word.
Just pick the best ones, even 2 or 3 out of the top 10. What better way can you let Google know that you belong in that niche?

PatrickDeese

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 29656 posted 5:50 am on Jun 1, 2005 (gmt 0)

max_mm I would recommend that you get your nose out of your tiny niche, and try to get grasp of the "big picture".

I have over 90 sites in different niches, and only one was affected by Bourbon, and I've already managed to recover its positions by adding 3 new links to the site from on-theme pages.

danny

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 29656 posted 6:42 am on Jun 1, 2005 (gmt 0)

try linking to relevant sites (or even non-relevant but clean sites that are a synonym of the keyword in some cases) who are ranking top 10 for the keyword you are targeting

I'm not targeting any particular keywords - but it's book titles and authors that are in my page titles and headings, so I guess I can be considered to be targeting those.

Typical SERPs for the more obscure titles I've reviewed (searching on title or title and author) now look like this: 1) Amazon's page; 2) a review on an academic site; 3) an Amazon associate page with no original content; 4) a shop site with purchasing information but no other content; 5) another Amazon associate page 6) an academic site that mentions the book briefly; 7) through 119) academic pages or PDFs that cite the book in their bibliography, with more Amazon affiliates and other shops mixed in, plus pages that don't even mention the title but have all the words in it; and 120) my 600 word review.

Apart from my Amazon links, I only have links to publisher sites, official author sites, and one other quality review site. For many of the titles I've reviewed, that's all there is for real content - and for many there aren't even decent publisher and author pages.

In any event, I'm a book reviewer, not a directory maintainer... I don't have the time or infrastructure to manage definitive collections of links for popular books or authors.

max_mm

10+ Year Member



 
Msg#: 29656 posted 8:19 am on Jun 1, 2005 (gmt 0)

I have over 90 sites in different niches

Wow, I'm impressed. You must be an extremely busy webmaster to run and manage 90+ web sites. Makes one wonder... unless 90% of them are auto-generated, of course.

Good on yah. Happy scraping pal.

experienced

10+ Year Member



 
Msg#: 29656 posted 9:59 am on Jun 1, 2005 (gmt 0)

My site is 2 years old and was doing well without the link exchange thing so far. I had started link exchanges, and of course I was getting good links from high PR sites. The site is not too heavy byte-wise: approx 60k for the home page and approx 25-30k for the rest of the 450 pages. The site had PR 4; now it's 3, with only the URL listed in the index. I don't have DMOZ & Yahoo listings for any of my sites. Is there any way to get listed in Yahoo & DMOZ? Tried a lot but nothing happened.

Exp...

MHes

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 29656 posted 12:52 pm on Jun 1, 2005 (gmt 0)

>Wow, I'm impressed. You must be an extremely busy webmaster

PatrickDeese may have 400 copywriters. You jump to conclusions, like the scraper theory.

The scraper theory is pure speculation and I don't buy it. Looking at the links shown by AllTheWeb for various sites we run, there is a clear correlation between the number of links in and the sites that do well on Google. Despite most links in being scraper sites, the more we have, the better our Google ranking.

This may not mean scraper site links help with google rankings, but it does suggest they do no harm. The fact we have a lot of scraper sites showing as back links may be a signal that the site is generally 'visible' throughout the net and thus we acquire the important 'natural links' as well, in a comparable proportion.

max_mm

10+ Year Member



 
Msg#: 29656 posted 1:23 pm on Jun 1, 2005 (gmt 0)

This may not mean scraper site links help with google rankings, but it does suggest they do no harm. The fact we have a lot of scraper sites showing as back links may be a signal that the site is generally 'visible' throughout the net and thus we acquire the important 'natural links' as well, in a comparable proportion.

You'll do fine as long as it is a comparable proportion. Watch your rank tank once the scales tip.

oddsod

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 29656 posted 1:36 pm on Jun 1, 2005 (gmt 0)

max_mm, you don't advance your theory by attacking others. I have great respect for PatrickDeese and have learnt a lot from his posts. Do yourself a favour and consider the possibility of him being correct. The further you go down the wrong road, the longer the walk back to the start.

I have several sites that have been scraped extensively. Not all of them have fared badly in Bourbon.

PatrickDeese

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 29656 posted 3:15 pm on Jun 1, 2005 (gmt 0)

> Happy scraping

I've been making content websites since 1996, and not all of my 90 sites were made yesterday.

As a matter of fact, I have had at times, up to 4 full-time employees writing content for me.

I closed my business last December because I was making so much more via paid advertising, affiliate programs, Adsense that administering the business wasn't worth it to me any more.

My strategy is to only target Long Tail [google.com] terms, and I've been doing that since Adsense started, and that technique has paid off in spades.

Hate to break it to you, but not everyone started making websites last week, last month or even last year. I have a slight head start. ;)

Just because someone disagrees with your tinfoil hat [google.com] theory, it doesn't mean that they run "scraper sites". That is what is called an ad hominem [google.com] argument, generally employed by people who cannot defend their ideas with facts.

I am sorry to have to insist that you're wrong, but honestly, the sooner you get your head out of your tiny niche and try to figure out what's really going on, the better off you will be.

Reid

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 29656 posted 6:52 pm on Jun 1, 2005 (gmt 0)

That sounds like a pretty tough niche, Danny, competing with lots of Amazon stuff with the same book titles.
You are definitely targeting specific keywords and phrases - author and title.
You need to get pretty sly to compete in there.
There are some things you could make sure of.
1. Look at your pages with a bot sim; you may be surprised at how they look to a bot. You may see room for improvement.
2. Link to the page using the book title in the anchor text.
3. Make sure you have a well defined header structure
<h1></h1>
<h2></h2>
<h3></h3>
<h2></h2>
<h3></h3>
this is very important to bots.
The book title and author should also appear in bold, italic and normal text. Try playing with the number of occurrences.
Got an image? Put the title and author in the alt attribute even if the image is not of the book - let Google think it is.

I'm sure you've done a lot of that already.
Also try to get the keywords up in the first 200-300 characters (including code) AND near the bottom of the page.
You've just got to find the right mix to rank in your niche without going overboard. Then use the same technique on all your reviews.
For the author I would also try firstname lastname AND lastname firstname, to cover all bases.
I don't know if I'm preaching to the choir here, but it sounds like you may be more of a writer than an SEO, and some of these techniques may sound 'spammy'. But to a bot, the same word occurring in different contexts (bold, italic, header, alt, title, etc) gives the word weight. Hope there are some tidbits of info you could use.
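To pull those pieces together, a skeleton for one review page might look like this (titles and names are placeholders, not anyone's actual markup):

```html
<h1>Book Title - Firstname Lastname</h1>
<p><b>Book Title</b> by <i>Firstname Lastname</i>, mentioned within
the first couple of hundred characters of the page.</p>
<h2>The Review</h2>
<p>Body of the review, with the title and author appearing again
in normal text.</p>
<img src="cover.jpg" alt="Book Title by Firstname Lastname">
<h3>Verdict</h3>
<p>Closing paragraph repeating Book Title and Lastname, Firstname
near the end of the document.</p>
```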

tigherr

5+ Year Member



 
Msg#: 29656 posted 10:16 pm on Jun 1, 2005 (gmt 0)

I suspect we will find the only commonality between the dropped sites will be a total lack of commonality, but for what it's worth, here are my responses to the proposed questionnaire:

1) Site size
Pretty big - no idea how many pages, probably >10k, maybe >20k. index page 26k.

2) Outbound links
Couldn't say.

3) Inbound links
LinkPop reports 60,818, of which ZERO are from Google.

4) Adsense
Yes

5) Content updated regular?
Updated often.

6) Adwords
No.

7) Age of site?
About 1 year.

8) Use of no follow tags on forums?
No forum.

9) Location of host sever
UK

10) Dedicated or shared hosts?
Shared.

11) Redirects?
At the time the site disappeared from Google's listings, one. Now, 3.

12) Scrapper sites linking to you / content theft?
Maybe. Gambling sites keep offering me reciprocal links; I always turn them down and ask to have any inbound links removed. Who knows if they do it, though?

13) Are you listed in dmoz?
Not yet.

14) Listed in Yahoo directory?
Yes.

15) RSS feeds on site?
Both in and out.

16) Pagerank (before it disappeared)
Pagerank 4.

17) Extra domains pointing to main domain?
All my sites link to all my other sites, pretty much.

18) Search engine friendly archives producing same content on different urls inside site?
Not sure what this means, but yeah, if it means do I have directory pages.

19) Did you bother taking LSI into consideration with onpage content?
No.

20) Do you use any of the following words on your homepage - under construction, updating, re-design, upgrading, etc.
I used re-design, I think, for a little while, after a re-design (whaddyaknow).

21) Do you use more than 1 way to link back to your homepage from every other page in your site?
Two links back to homepage on every page.

A point worth bringing up: has anyone else been hacked? My home page was removed and replaced by a page that was blank except for the words 'we are nerds' or something like that. It was shortly after that that I discovered I had zero entries on Google, from thousands previously, and no pagerank either.

PatrickDeese

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 29656 posted 11:34 pm on Jun 1, 2005 (gmt 0)

> Has anyone else been hacked? My home page was removed

You might want to check your robots.txt file. Someone in the supporters' forum had their vbulletin powered site hacked and the robots.txt file was changed to block all SE bots.
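For reference, here is a healthy permissive robots.txt next to the kind of hostile edit to look for (the blocking form is commented out so both fit in one example):

```
# Permissive: every bot may crawl everything
User-agent: *
Disallow:

# The hostile edit to watch for would instead read:
# User-agent: *
# Disallow: /
```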

Reid

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 29656 posted 2:20 am on Jun 2, 2005 (gmt 0)

My home page was removed and replaced by a page that was blank except for the words 'we are nerds' or something like that. It was shortly after that that i discovered i had zero entries on google, from thousands previously, and no page rank either.

Did you look at the code on the nerds page? Obviously they hijacked your site somehow - you need to check everything from the ground up, especially offsite URLs.
And change your password.

oldpro

5+ Year Member



 
Msg#: 29656 posted 3:59 am on Jun 2, 2005 (gmt 0)

Also try to get the keywords up in the first 2-300 characters (including code) AND near the bottom of the page.

Reid...Excellent post!

Have a question regarding your comment. About what percentage of the characters in the first 200-300 should one shoot for in regards to the targeted keywords? With doctype and meta tags included, this seems very hard to do, with the exception of meta keywords. Or, as I see a lot of high ranking sites on Google not using doctype or meta tags... would this explain their high ranking, as it is easier to do as you suggest?

And when you say at or near the end of your code, is it best to have a short sentence with your keywords right before the </body>?

tigherr

5+ Year Member



 
Msg#: 29656 posted 6:05 am on Jun 2, 2005 (gmt 0)

re checking robots.txt file

it looks OK - same as on my own system - but I never did know for sure what it SHOULD say anyway

any comments?

Reid

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 29656 posted 10:37 pm on Jun 2, 2005 (gmt 0)

Have a guestion regarding your comment. About what percentage of the characters in the first 200-300 should one shoot for in regards to the targeted keywords? With doctype, Meta tags included, this seems very hard to do with the exception of meta keywords.

I would say 'at the earliest and latest possible convenience'. You should write for the browsers but keep these concepts in the back of your mind while writing.
The description tag is a good place for keywords. And try not to have a huge amount of code between <body> and the first <p>.
I use a doctype - to get browsers out of quirks mode.
But strip away any unnecessary meta tags and refine your code as much as possible to get the text started.
At the end of the document, the last paragraph is close enough to the end in my mind.
Keyword density is a big topic. I have some pages where it's hard not to be repetitive:
reddish-blue widgets fall into the class of red-widgets and blue-widgets, but if you have reddish-widgets in with your blue-widgets then you must put a note on the red-widgets bin saying that the reddish-blue widgets are in the blue-widgets bin.
While in other cases it is very hard to be repetitive.

Richie0x

10+ Year Member



 
Msg#: 29656 posted 3:19 pm on Jun 4, 2005 (gmt 0)

1 Size: 395 pages, front page is 119kb inc images
2 Outbound: 1 outbound link to another site on front page, around 200 over the entire site
3 Inbound: 240 inbound links, but Google only seems to recognise several of them for pagerank
4 Adsense: Yes, only on the 30 most visited pages of the site
5 Updates: A few pages are updated weekly including the front page
6 Adwords: No
7 Age: Site is 6 years old, moved to a new domain 3 years ago
8 No follow tags: N/A
9 Server loc: Gloucester, UK
10 Server type: Shared
11 Redirects: Only 301 PHP redirects from deleted pages to new ones
12 Scrapper links: No
13 Dmoz: Yes
14 Yahoo dir: Yes
15 RSS: No
16 Pagerank: 3/10
17 Extra doms: No
18 Archives: No
19 LSI: Didn't know what LSI was ;)
20 Words: No
21 Homepage links: Only one link back to front page from every page of site
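For anyone wondering what a "301 PHP redirect" looks like, it is just a header pair sent before any output (the target URL here is a placeholder):

```php
<?php
// Put this at the very top of the deleted page, before any HTML output.
header("HTTP/1.1 301 Moved Permanently");
header("Location: http://www.example.com/new-page.html"); // placeholder
exit;
?>
```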

zeus

WebmasterWorld Senior Member zeus is a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 29656 posted 8:16 pm on Jun 4, 2005 (gmt 0)

I think most who have lost rankings now, compared to 12 months ago, have trouble with

11) Redirects?
Do you use any kind of redirects that Google may be having trouble with?
12) Scrapper sites linking to you / content theft?
Do you have lots of scrapper sites suddenly linking to you or using parts of your content?

and the googlebug 302 redirecting links.

For safety, search inurl:yourdomain.com and look for other domains with your title and description.

WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved