Forum Moderators: Robert Charlton & goodroi
1) Site size
It is reasonable to believe that a homepage that is too large will have too many people click away, so it may lose relevancy and fail to show in the top results.
2) Outbound links
How many do you have on your homepage?
3) Inbound links
How many does your site have?
4) Adsense
Since it may connect innocent sites with scraper sites, do you use it?
5) Content updated regularly?
Some sites do not update content often because they offer tools rather than information, but Google may consider sites with rare content updates to be poor quality and drop their positions.
6) Adwords
Do you use paid advertising like AdWords? Maybe losing some positions will make you pay more, or perhaps Google protects its paying clients?
7) Age of site?
How old is your site? Perhaps older sites are likely to be better because they survived, so Google keeps them listed high?
8) Use of nofollow tags on forums?
If you offer forums or blogs, do you use the nofollow tag? Maybe we need to stop bad sites from linking inside our sites?
9) Location of host server
Maybe our host location plays a part in how high we rank for certain users?
10) Dedicated or shared hosts?
Are we being punished for what other sites do in a shared hosting environment?
11) Redirects?
Do you use any kind of redirects that Google may be having trouble with?
12) Scrapper sites linking to you / content theft?
Do you have lots of scrapper sites suddenly linking to you or using parts of your content?
13) Are you listed in dmoz?
Perhaps Google pays more respect to dmoz listed sites?
14) Listed in Yahoo directory?
Perhaps Google doesn't want Yahoo directory pages to be listed high, or maybe it actually prefers sites listed there?
15) RSS feeds on site?
Using RSS feeds might be causing some kind of duplicate content penalty?
16) Pagerank (before it disappeared)
What was your pagerank? Maybe a high pagerank is immune to any penalisations?
17) Extra domains pointing to main domain?
Do you have other domains pointing to your main domain that might be causing problems in Google's eyes?
18) Search engine friendly archives producing the same content on different URLs inside the site?
Some forums like vBulletin have a search engine archive that produces the same content with a static HTML URL; maybe this might be picked up as duplicate content?
19) Did you bother taking LSI into consideration with on-page content?
Basically it seems Google is now using Latent Semantic Indexing in search results, so a search for zoo trips may look at page content and realise that zoo, wildlife and trips are related. So search results could give you wildlife trips for the term zoo trips.
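For anyone curious what LSI actually looks like, here is a toy sketch in Python: a plain SVD over a made-up term-document matrix. The tiny corpus, the choice of two latent topics, and the whole setup are illustrative assumptions on my part, not anything Google has published; it just shows the textbook idea that co-occurring terms like "zoo" and "wildlife" end up close together in the reduced space.

```python
# Toy Latent Semantic Indexing (LSI) sketch: factor a term-document
# count matrix with SVD so that terms which co-occur land near each
# other in a low-dimensional "topic" space. Corpus is invented.
import numpy as np

docs = [
    "zoo trips wildlife animals",
    "wildlife trips safari animals",
    "zoo animals wildlife",
    "car repair engine parts",
    "engine parts repair garage",
]

# Build a simple term-document count matrix (rows = terms, cols = docs).
terms = sorted({w for d in docs for w in d.split()})
A = np.array([[d.split().count(t) for d in docs] for t in terms], float)

# Truncated SVD: keep the top k latent "topics".
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
term_vecs = U[:, :k] * s[:k]  # each row: one term in the latent space

def similarity(t1, t2):
    """Cosine similarity between two terms in the latent space."""
    v1 = term_vecs[terms.index(t1)]
    v2 = term_vecs[terms.index(t2)]
    return float(v1 @ v2 / (np.linalg.norm(v1) * np.linalg.norm(v2)))

# "zoo" and "wildlife" co-occur, so they should score far higher
# than the unrelated pair "zoo" and "engine".
print(similarity("zoo", "wildlife"))
print(similarity("zoo", "engine"))
```

On this toy data the zoo/wildlife similarity comes out near 1 and zoo/engine near 0, which is the whole point of the technique: a query for "zoo trips" could plausibly match a page about "wildlife trips".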
It's a long list, but if we all fill it in, we could then put the results in Excel and compare them; maybe we'd see a pattern that all the dropped sites share. Then we could test that pattern against sites that did not get dropped.
Might be useful.
It's very unlikely to be scraper sites. We have a site that ranked top for 2 years in a competitive sector. During February it lost 70% of its Google traffic. It was replaced by another site of ours, an old PPC directory with loads of dead links and few good links in, which had not been touched for a year. Both sites have scraper sites linking to them in large numbers.
This month the original site is back with a vengeance and the old site has dropped a bit. We have not touched either. Another seriously good site in the same sector has now dropped.
It is impossible to work out why sites fluctuate. We don't even touch them; if you do, you may prevent the possible 'come back'.
The truth is, sites go up and down. If you are completely out of the index then you are very unlikely to ever recover.
I'm in the index - Googlebot indexes every page on the site at least daily - just showing up on page five for any searches.
But you're right, things do go up and down - my site was fine in January, got worse in February, and now has been hammered nearly completely, all without my doing anything (except steadily adding more reviews to it).
AND, a first time for this, a hosting geo-location problem:
Both sites just happen to be hosted by a fine hosting company located in Canada. Neither site has anything at all to do with Canada; the subject of one is non-geographical, just a family recreational niche... but it has lost half its USA Google traffic.
The subweb, which was never intended to be a stand-alone website but just ended up that way through odd circumstances, has nothing to do with Canada and is mostly associated with the USA deep south, but it is getting NO USA Google traffic, only Canadian Google traffic. Major bummer, to me and probably to the searchers. Huge AdSense decline. HUGE.
And in a search for one type of lighting fixture featured on the subweb site, where my former number 1 page covered ordering said fixture from overseas, step-by-step onsite assembly, etc., Google now has a sexual position website in the #1 SERP. Weird, and surely not what the majority of searchers are looking for under that term. I agree with others that Google has thrown out the (honeybaked) ham with the spam.
Scrapper operators, enjoy the ride. It will probably take Google months if not years to fix it. Their algo is based on the quality of incoming links, and their only way to fix this is to ditch their current algo/filters and go back to the drawing board.
Google's recent problems start and end with millions of scrappers linking to innocent web sites. Full stop.
A) It's not scrapper - it's scraper. A screenscraper is a program which "scrapes" the visible content from a page. People are using screenscrapers to go to sites like Yahoo and copy the SERPs.
B) Just about every top 10 site for any commercial niche in the world has links from scraper sites. Why didn't every top 10 site in the world get hurt by the Bourbon update?
C) I don't have any scraper sites - and I am discrediting the "scraper" theory.
The algo updates on a fairly regular basis. Sometimes a few good sites get taken down with the bad ones. Sometimes a good site that was successfully using borderline techniques gets caught.
Getting caught sucks. But whether you are a good site that got caught in the sweep, or a good site that was using borderline techniques, the point is moot.
Look at what sites stayed in your SERPs and take them apart. Learn from the survivors and make the appropriate adjustments.
I've never competed for any particular keywords - except maybe "book reviews", but that was more vanity than anything else - and the SERPs mostly look fine to me. So the "survivors" include sites of all kinds - it's just my pages that are on page six instead of page one.
So I have absolutely no idea whatsoever what the "appropriate adjustments" might be - I realise taking AdSense off my site is clutching at straws, but I just can't think of anything else.
Crap!
I already checked the competition out.
Here is what I discovered: most of them are new sites with very few incoming links, and almost zero scraper links. They are not performing well on Yahoo for their keywords.
And let's see if you can answer this question:
Wouldn't a site with thousands of incoming links rank well for its keywords? Then why does the exact opposite happen? The site disappears!?
More questions for you, Mr. Know-It-All:
1) Is it not a fact that poogle penalizes sites for too many incoming links gained in a very short time?
2) Is it not a fact that poogle penalizes sites for duplicate content?
3) Is it not a fact that poogle can't handle 301 and 302 redirects properly and penalizes the innocent linked sites?
What do all 3 of the above questions have in common? They describe the techniques scrapers are using to link to your content. Simple common sense, I'm sure you and many others would agree, unless their brains are poisoned by poogle's false/outdated webmaster guidelines.
Now I would love to see you continue to discredit the scraper theory.
The amount of scraper links has not reached a certain threshold yet; this is the only reason it hasn't caught up with your sites. Give it some more time, pal, and you'll see for yourself what I am talking about.
Try linking to relevant sites (or even non-relevant but clean sites that are a synonym of the keyword, in some cases) that are ranking top 10 for the keyword you are targeting.
I'm not targeting any particular keywords - but it's book titles and authors that are in my page titles and headings, so I guess I can be considered to be targeting those.
Typical SERPs for the more obscure titles I've reviewed (searching on title or title and author) now look like this: 1) Amazon's page; 2) a review on an academic site; 3) an Amazon associate page with no original content; 4) a shop site with purchasing information but no other content; 5) another Amazon associate page 6) an academic site that mentions the book briefly; 7) through 119) academic pages or PDFs that cite the book in their bibliography, with more Amazon affiliates and other shops mixed in, plus pages that don't even mention the title but have all the words in it; and 120) my 600 word review.
Apart from my Amazon links, I only have links to publisher sites, official author sites, and one other quality review site. For many of the titles I've reviewed, that's all there is for real content - and for many there aren't even decent publisher and author pages.
In any event, I'm a book reviewer, not a directory maintainer... I don't have the time or infrastructure to manage definitive collections of links for popular books or authors.
Exp...
PatrickDeese may have 400 copywriters. You jump to conclusions, like the scraper theory.
The scraper theory is pure speculation and I don't buy it. Looking at the links shown by AlltheWeb for various sites we run, there is a clear correlation between the number of links in and the sites that do well on Google. Despite most links in being from scraper sites, the more we have, the better our Google ranking.
This may not mean scraper site links help with Google rankings, but it does suggest they do no harm. The fact that we have a lot of scraper sites showing as backlinks may be a signal that the site is generally 'visible' throughout the net, and thus we acquire the important 'natural links' as well, in a comparable proportion.
You'll do fine as long as it stays a comparable proportion. Watch how your rank tanks once the scales tip.
I have several sites that have been scraped extensively. Not all of them have fared badly in Bourbon.
I've been making content websites since 1996, and not all of my 90 sites were made yesterday.
As a matter of fact, I have had at times, up to 4 full-time employees writing content for me.
I closed my business last December because I was making so much more via paid advertising, affiliate programs, Adsense that administering the business wasn't worth it to me any more.
My strategy is to only target Long Tail [google.com] terms. I've been doing that since AdSense started, and that technique has paid off in spades.
Hate to break it to you, but not everyone started making websites last week, last month or even last year. I have a slight head start. ;)
Just because someone disagrees with your tinfoil hat [google.com] theory, it doesn't mean that they run "scraper sites". That is what is called an ad hominem [google.com] argument, generally employed by people who cannot defend their ideas with facts.
I am sorry to have to insist that you're wrong, but honestly, the sooner you get your head out of your tiny niche and try to figure out what's really going on, the better off you will be.
I'm sure you've done a lot of that already.
Also try to get the keywords up in the first 200-300 characters (including code) AND near the bottom of the page.
You've just got to find the right mix to rank in your niche without going overboard. Then use the same technique on all your reviews.
For the author I would also use firstname lastname AND lastname firstname, to try to cover all bases.
I don't know if I'm preaching to the choir here, but it sounds like you may be more of a writer than an SEO, and some of these techniques may sound 'spammy'. But to a bot, the same word occurring in different contexts (bold, italic, header, alt, title, etc.) gives the word weight. Hope there are some tidbits of info you can use.
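If you want to sanity-check that kind of placement advice on a real page, a quick script helps. This is my own illustration, not a documented ranking rule: the helper name and the 300-character cutoff are made up to match the rule of thumb discussed in this thread.

```python
# Report where a keyword first and last appears in raw HTML (code
# included), per the "early and late in the page" rule of thumb.
# The 300-character threshold is folklore from this thread, not a
# documented Google figure.
def keyword_positions(html: str, keyword: str, head_chars: int = 300):
    low = html.lower()
    kw = keyword.lower()
    first = low.find(kw)   # earliest occurrence, counting markup
    last = low.rfind(kw)   # latest occurrence
    return {
        "in_head": first != -1 and first < head_chars,
        "first_offset": first,
        "chars_after_last": len(html) - last - len(kw) if last != -1 else None,
    }

page = ("<html><head><title>Red Widgets</title></head>"
        "<body><h1>Red widgets</h1><p>All about red widgets.</p>"
        "</body></html>")
print(keyword_positions(page, "red widgets"))
```

Running it on the sample page shows the keyword landing inside the first 300 characters (via the title tag) and again close to the end of the document.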
1) Site size
Pretty big - no idea how many pages; probably >10k, maybe >20k. Index page is 26k.
2) Outbound links
Couldn't say.
3) Inbound links
LinkPop reports 60,818, of which ZERO are from Google.
4) Adsense
Yes
5) Content updated regularly?
Updated often.
6) Adwords
No.
7) Age of site?
About 1 year.
8) Use of nofollow tags on forums?
No forum.
9) Location of host server
UK
10) Dedicated or shared hosts?
Shared.
11) Redirects?
At the time the site disappeared from Google's listings, one. Now, three.
12) Scrapper sites linking to you / content theft?
Maybe. Gambling sites keep offering me reciprocal links; I always turn them down and ask to have any inbound links removed. Who knows if they do it, though?
13) Are you listed in dmoz?
Not yet.
14) Listed in Yahoo directory?
Yes.
15) RSS feeds on site?
Both in and out.
16) Pagerank (before it disappeared)
Pagerank 4.
17) Extra domains pointing to main domain?
All my sites link to all my other sites, pretty much.
18) Search engine friendly archives producing same content on different urls inside site?
Not sure what this means, but yeah, if it means do I have directory pages.
19) Did you bother taking LSI into consideration with onpage content?
No.
20) Do you use any of the following words on your homepage - under construction, updating, re-design, upgrading, etc.
I used re-design, I think, for a little while, after a re-design (whaddyaknow).
21) Do you use more than 1 way to link back to your homepage from every other page in your site?
Two links back to homepage on every page.
A point worth bringing up: has anyone else been hacked? My home page was removed and replaced by a page that was blank except for the words 'we are nerds' or something like that. It was shortly after that that I discovered I had zero entries on Google, from thousands previously, and no pagerank either.
Reid...Excellent post!
Have a question regarding your comment. About what percentage of the characters in the first 200-300 should one shoot for in regards to the targeted keywords? With doctype and meta tags included, this seems very hard to do, with the exception of meta keywords. Or, as I see a lot of high-ranking sites on Google not using doctype or meta tags, would this explain their high ranking, as it is easier to do as you suggest?
And when you say at or near the end of your code, is it best to have a short sentence with your keywords right before the </body>?
I would say 'at the earliest and latest possible convenience'. You should write for the browsers but keep these concepts in the back of your mind while writing.
The description tag is a good place for keywords. And try not to have a huge amount of code between <body> and the first <p>.
I use a doctype, to keep browsers out of quirks mode.
But strip away any unnecessary meta tags and refine your code as much as possible so the text starts early.
At the end of the document, the last paragraph is close enough to the end in my mind.
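The amount of code between <body> and the first <p> is easy to measure, if you want to check your own pages against that advice. A rough sketch, where the function name and the sample pages are made up for illustration:

```python
# Measure how much markup sits between <body> and the first <p>.
# A crude character count on the raw HTML; it doesn't parse tags
# properly, it just shows the idea of "heavy" vs "lean" page tops.
def code_before_first_paragraph(html: str) -> int:
    low = html.lower()
    body = low.find("<body")
    p = low.find("<p", body)
    if body == -1 or p == -1:
        return -1  # no <body> or no paragraph found
    return p - body

lean = "<html><body><p>Hello</p></body></html>"
heavy = ("<html><body>"
         + "<div class='nav'>menu</div>" * 20   # simulated nav bloat
         + "<p>Hello</p></body></html>")

print(code_before_first_paragraph(lean))   # small gap
print(code_before_first_paragraph(heavy))  # large gap
```

The lean page reaches its first paragraph almost immediately, while the nav-heavy one buries it under hundreds of characters of markup, which is the pattern the advice above says to avoid.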
Keyword density is a big topic. I have some pages where it's hard not to be repetitive.
Reddish-blue widgets fall into the class of red widgets and blue widgets, but if you have your reddish-blue widgets in with your blue widgets, then you must put a note on the red-widgets bin saying that the reddish-blue widgets are in the blue-widgets bin.
while in other cases it is very hard to be repetitive.