First some background. The site is a well-established, deep-information site with many, many thousands of pages and a PR 6 on the home page. While we have attempted to get some links to us, most of the hundreds of links to us are spontaneous from a variety of professionals who find our content useful. Therefore, we're not at all dependent on the "latest" SEO tricks - totally white hat.
Up until this week, we got >15,000 Google referrals a day. We are not dependent on ranking for "blue widgets" or any other identifiable term - our referrals come from thousands of different keywords a day, which reflects the diversity of our content. Therefore, only a massive drop in the SERPs across the board could cause the >90% drop in referrals we are seeing.
We still are in the index with the same number of pages and our backlinks don't seem changed. We still have the same PR showing throughout the site (for whatever that's worth since if there are changes, they probably wouldn't show immediately anyway).
Here's the kicker: Another site we own, let's call it widgetville.com, is showing up ahead of our real site, widgetville.org, in the SERPs when you search Google for "Widgetville". The higher widgetville.com site is shown without title or description. Widgetville.com has been 301 redirected to widgetville.org. Widgetville.com does have a backlink or two out in the world, but not the hundreds that the real site, widgetville.org, has so I don't understand the higher ranking.
If you search for "a bunch of widget words that you find on the front page", three other web sites who quote our mission statement appear on the page and our page doesn't. However, if you click on the link to show "omitted" results, we are listed as the omitted page.
In a way, it seems almost like our home page has been hijacked by our own non-functioning site. And it also seems to be like the whole canonical root problem that trips up some site owners except in our case it is between two domains, not a problem of Google getting confused between widgetville.com and widgetville.com/index.html.
We've had this problem before - a year ago - and I queried Google about the problem. I was told that it was a problem on their end, not mine, and they would fix it. The widgetville.com listing was removed, and within weeks my traffic grew from a trickle to where I started hiring people to deal with the blossoming new customer base. Now all of that is threatened.
So, anyone want to take a crack at explaining this or giving advice on how to handle it?
I have taken one step to see what happens. I've removed the 301 redirect from widgetville.com and put a simple sentence on the page that says to click on the link to widgetville.org. I did this to disassociate widgetville.com from widgetville.org in case Google was somehow seeing duplicate content from the 301. Not sure how that would happen exactly, since a 301 is the preferred method of dealing with pages that are no longer valid, but this whole thing throws me for a loop.
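For anyone unfamiliar with the mechanics, a domain-to-domain 301 is typically just a line of Apache configuration on the old domain (a sketch assuming Apache with mod_alias; other servers have equivalents):

```apache
# .htaccess (or vhost config) on widgetville.com:
# permanently redirect every request to the same path on widgetville.org
Redirect permanent / http://www.widgetville.org/
```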
The graph starts to skyrocket around the beginning of April and goes from 150 to 20,000 links by mid-May.
I checked other affected sites and I get a very similar picture: exponential growth in incoming links over a very short period of time.
Another temporary solution may be to change all file names on the server and resubmit to G while deleting the old pages. I think the penalty (over-linking from scrapers) is applied on a page-by-page basis. This would be a very temporary fix, though, lasting only until Yahoo's next crawl makes all the new file names available to scrapers again via its free feeds.
If a site magically gets 20,000 links in a short time from scraper sites, then doesn't that kind of say something about the quality of the site? It means it was a top result to begin with, or it would not have been found by the scrapers.
I think google tried to get a little too clever with their algo the past few years. There was nothing wrong with google two years ago.
I've had fair success with my hotel site <snack> for about 4 years.
Friday it completely dived from top 5 to pages 7 or 8.
This happened over a number of days across different DCs, but finally went completely this morning (it hung on in google.co.uk).
I've done nothing recently but add legit content and back-links.
When I set the domain up 4 years ago I pointed another domain at it:
I'm not sure if I'm doing something wrong with redirects.
When I do a site: check for www.example.co.uk I get 36,300, exactly the same for example.co.uk
The .com domain gets 15 for both www.example.com and example.com on the same check.
When I check the http header response I get 200 for all of them.
Is this a problem?
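For anyone wanting to interpret those header checks: a 200 on every hostname means no variant is redirecting to the canonical one, which can leave duplicate URLs in the index. Here is a minimal sketch of how to classify a raw response when testing whether a secondary domain 301s to the canonical host (the sample response text and the helper name are my own, purely illustrative):

```python
# Classify the status line and Location header of an HTTP HEAD check,
# to see whether a domain variant 301s to the canonical host or
# answers 200 (and so may be indexed as a duplicate).

def check_canonical(raw_headers: str, canonical_host: str) -> str:
    lines = raw_headers.strip().splitlines()
    status = int(lines[0].split()[1])   # e.g. "HTTP/1.1 301 Moved Permanently"
    location = next((l.split(":", 1)[1].strip()
                     for l in lines[1:] if l.lower().startswith("location:")),
                    None)
    if status in (301, 308) and location and canonical_host in location:
        return "ok: permanent redirect to canonical host"
    if status == 200:
        return "warning: 200 on a non-canonical host (duplicate risk)"
    return f"check manually: status {status}, location {location}"

# Example with a made-up response from a non-canonical domain variant:
resp = "HTTP/1.1 301 Moved Permanently\nLocation: http://www.example.org/\n"
print(check_canonical(resp, "example.org"))
# prints: ok: permanent redirect to canonical host
```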
[edited by: Brett_Tabke at 2:12 pm (utc) on May 24, 2005]
[edit reason] examplified [/edit]
I've done some additional research. I have found crazy scraper sites, with nothing but a bunch of links to our sites, and many others.
Example: www.a-keyword-phrase-here.biz complete with the dashes and everything. What a pile of crap.
I highly recommend surfing your referrer URLs; if you run a "deep content" site like many of us do, you'll likely find similar sites, and it's hard to believe this doesn't have *something* to do with all our sites getting slammed.
Give it more time. It will catch up with them in the next updates, unless G does something about their crippled algo.
If I understand the concept, quickly adding a relatively LARGE number of links (compared to what they already have) is what kills them in Google. But what if these sites ALREADY have a disproportionately large number of links grandfathered in from a couple of years ago, before G started checking? Like, say, 250,000-300,000 (according to atw anyway - G only shows about 1,200 for them)? Albeit some 220,000 of them are site-internal. Adding another couple THOUSAND every month is barely a 1% increase, and so probably well under G's radar, and thus only serves to make their PR that much stronger, such that even a well-mounted toppling campaign would be futile! Thus, no matter what happens, the slimeballs never deviate from #1.
I GAINED backlinks... Bad, bad site. Going down in the SERPs.
Therefore, if I want to do better in the next update, I should try and lose some backlinks!
I did look at the source code of one site that was complaining (efv), and it appears to me the problem may be this.
The AdSense code is above the meat and potatoes of the site, meaning it is taking up the top part of the page, where the most important information, and kw density, should be. Given the changes in this update, that one act alone was likely enough to get it removed from a premium SERP.
In the one I looked at, the source code shows that the kws that webmaster does have above the ad code are his second-level kws, not level one. Level-one kws are below the ad code. That is a problem. A big problem. The level-one kws are too low in the code.
I suggest that the html code for the AdSense ads be moved to a position that places it lower in the html of the doc. Yes, I know, for those of you who hand-code your pages, that:
1. You cannot put ads at the top of your page with the code for them at the bottom without an extensive rewrite. That is unfortunate. Now those who have made negative comments about my server know one of the most important reasons I have stayed with that server for so many years. I said it before, and I will say it again: SEO. I can put content visually at the top of my page, while the actual code sits at the bottom, in under one minute. It enables me to SEO a page that would otherwise be difficult, if not impossible, to SEO.
Sometimes we just need something to be at the top of the page for visitors, but that page element may not be really relevant or important to our content... off-topic, or perhaps ad code. It is a juggling act to get the visitor's attention and not lose SEO. So the code has to be written so that the most important content of the page is at the top, for maximum SEO, and the less important content is at the bottom of the code.
Think that thought avenue through, and you will know what I mean.
2. You need the ads above the fold for max CTR. It is an unfortunate position you find yourself in. But, there is a solution. See #1...then look below.
Look at your code. For those who didn't write it and don't write code, view your source. What is in the top 50-60 lines of your code? Is it your major kws, in proper density, or is it fluff and ad code? You should not have ad code there. Get it out.
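For those who do hand-code, the general idea can be done with CSS positioning (a bare sketch of the technique only; the class names are made up, and this is not anyone's actual page):

```html
<style>
  /* reserve space at the top of the page, then draw the ad into it */
  .page   { position: relative; padding-top: 100px; }
  .top-ad { position: absolute; top: 0; left: 0; width: 100%; height: 90px; }
</style>
<div class="page">
  <!-- main content comes first in the source, so spiders see it early -->
  <h1>Primary keyword heading</h1>
  <p>The most important copy sits in the first lines of the HTML...</p>
  <!-- ad markup is last in the source, but displays at the top -->
  <div class="top-ad"><!-- ad code here --></div>
</div>
```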
It is not an Adsense penalty, it is a bad design problem, originating with the webmaster.
Hope this helps.
We could sure use GG at a time like this to either explain or debunk the "scraper penalty" concept.
The problem with the idea that large numbers of incoming links would downgrade a site is that incoming links are the heart of the ranking process. G does make a distinction between "good links" and "bad links," but unless you linked back to the bad guys (showing cooperation and a link-farm deal), it seems G would penalize the site doing the linking, rather than the recipient of the massive incoming links. In what case would that type of penalty make sense as part of G's algo?
But think about it: what if thousands of new links are acquired within a very short time frame? In many cases we are talking within a month or two (in my case, one of my sites which was affected heavily went from 150 incoming quality links to 20,000 within one month [19,850 incoming scraper links within one month]). If you were Google, wouldn't you suspect spam, link farming and link buying in such a case? AND PENALISE?
And let me ask the question backwards, to further prove this theory. Wouldn't a site with 20,000 backlinks enjoy top position for my keywords on G? (And we are talking a very old site, an authority site in its niche, a very niche industrial topic.) Then why did it disappear from G altogether, or come up only for very obscure key phrases? Why did its G traffic go from 2,500 referrals per day to almost zero?
Think about it: how can Google possibly determine that it wasn't you who created all those scrapers to improve your site's rank? This point may also explain G becoming a domain registrar recently. They are lost; they can't tell who is who, who is linking to whom, how many links, and why. And everyone falls into the "suspect" category immediately until proven innocent (penalised for an exponential growth in incoming links).
As I said earlier, to me this says that G's algo for determining site quality, and its reliance on incoming links, is busted. I don't think you'll see any of their representatives here admitting it. What you will see, however, is more and more webmasters starting "my site is lost and nowhere to be seen in the SERPs" threads here every update, unless G fixes this problem, and fast.
To be honest, I don't see how they are going to fix this problem without reverting to the way they used to count links a year or two ago.
I think we should assume they give huge surges of incoming links a good look, but many times these are legitimate (e.g. some big new software product launches). It would seem they might *ignore* them or not count them highly, but penalizing the victim seems like it hurts us and their results. I think all reasonable hypotheses assume Google is sincerely seeking good results. Have you eliminated all the usual suspects at your site, such as canonical problems, duplicate content, 301 redirects and outright 302 hijacking?
Same issues you are seeing as well. If you search ourdomainname it is showing the .net version of the name (owned by someone else - mine was registered before it and is registered for a longer period of time) and us (.com) a few spots down. Checked all the usual suspects - nothing "amiss".
So, I'll add a few new pages to show someone still cares and let it ride and see what happens. I think if anything, take it as a prime example on why diversification is so important. You know what they say about a watched pot...
Have you eliminated all the usual suspects at your site such as canonical problems, duplicate content, 301 redirects and outright 302 hijacking?
Goes without saying. Nope, that ain't it.
It is the scrapers problem I described, and I'm willing to bet my house on it. It all started when Y started to offer its free feeds to web sites. I have thousands upon thousands of scrapers linking to 4 of my sites and dragging them down (in G's SERPs, of course; who else's?).
Crappy search engine, crappy algo, crappy serps, incompetent PHDs and my business and years of hard work down the toilet.
"Can't Google address majority of scraper links, ROS links problem by devaluing links from pages that have same set of outgoing links?, instead of penalizing the sites?"
I mean, credit the site with the benefit of just one link, in the same way that a link from DMOZ will not also benefit you with a second link from the Google Directory.
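The idea in that quote can be sketched simply: fingerprint each linking page by its set of outgoing links, and count at most one inbound "vote" per distinct fingerprint. This is purely illustrative pseudologic of the suggestion, not Google's actual algorithm; all names and URLs below are made up.

```python
from collections import defaultdict

def effective_inbound_links(linking_pages):
    """linking_pages: dict mapping page URL -> set of outgoing link URLs.
    Returns target URL -> counted links, crediting each distinct
    outgoing-link set only once."""
    seen_fingerprints = set()
    votes = defaultdict(int)
    for page, outlinks in linking_pages.items():
        fingerprint = frozenset(outlinks)
        if fingerprint in seen_fingerprints:
            continue                  # same link list already counted once
        seen_fingerprints.add(fingerprint)
        for target in outlinks:
            votes[target] += 1
    return dict(votes)

# Three scraper pages carrying identical link lists, plus one genuine page:
pages = {
    "scraper1.example/a": {"victim.example/", "other.example/"},
    "scraper2.example/b": {"victim.example/", "other.example/"},
    "scraper3.example/c": {"victim.example/", "other.example/"},
    "genuine.example/review": {"victim.example/"},
}
print(effective_inbound_links(pages))
# victim.example/ is credited with 2 links, not 4
```

Under this scheme the 19,850 identical scraper links would collapse to a handful of votes instead of either boosting or penalizing the target.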
Just hoping the site doesn't fall any further than #9 out of a field of 9,100,000.
Did make it to #8 but slid back down.
I think it is becoming clearer to me now what happened to all these people in the last update. You have been relying on longevity, inbounds, and traffic levels to carry you, and you did that with bad code.
People... check your code and quit whining and rambling about penalization for this and that. It is the site code. Period. You can no longer ride on longevity and inbounds, and the other long term industry standards at google.
If you fix the code and get your ducks in a row, all the rest will fall into place.
How do I know, you ask? You read my post and dismiss me... and keep rambling. The disaster of 2001 is how I know (see my other posts). We had this problem back in 2001. After looking at my code for hundreds of hours, one day it finally jumped out at me, and it was obvious.
All I did was test it on one page and fix my code; the visual look of the page never changed at all. Next index, I was #1 again, after three months of toiling away, banging my head, and yes, whining just like you people. Nothing was added to the page, nothing was removed. We just plain fixed the code, so everything was in the right order.
Fix the code. Let me say it again. Fix the code.
In any event, I know sites that are plain HTML with nothing at all fancy that still got hammered. (And my own site is hardly elaborate: the "content to infrastructure" ratio on my book reviews is pretty high.)
Sorry, but I have checked all of the top 10 in my SERPs (I used to be number 1). I don't have actual content (or affiliate code) anywhere in the top 250 lines of code
(very large site, so the navigation is pretty big).
None of my competitors even have content in the top 300 lines of code, and some do have affiliate code.
Whilst this may be a factor, I have a hard time believing it is the problem for everyone.
Still, it's a theory, and I'll do what I can to implement it without causing fallout.
P.S. FYI, I was hijacked 2 months ago now. I started to come back just before the update, but the update hasn't helped as yet. Only a fraction of our backlinks have been restored to the domain so far, and much of the site was deindexed during the hijacking. Our home page had no cache last week, but over the weekend it got cached.
Unfortunately, I am now MIA on my main money phrase, but I am in the rankings for all of the other home page terms. No internals are ranking at the moment.
Can't be without Yahoo! traffic -> What's a Yahoo!? -> Can't be without IE -> IE stinks -> Google is Great -> Can't be without Google traffic -> Oh no, Google dropped me I am doomed -> I can live without Google's traffic
Only one step left: "What's Google?" - that is when/if searchers switch to a better SE.
All I did was test it, one page, fixed my code, but the visual look of the page, never changed at all. Next index, I was #1 again, after three months of toiling away banging my head, and yes whining just like you people....
If you'd changed the color of your type from black to blue, would you assume that the color change was responsible for your improvement in rankings?
FWIW, I just checked the top-ranking page on the Google.com SERP for one of my most important keyphrases (one where I sometimes ranked #1, nearly always ranked in the top 5, and never ranked lower than #8 until last weekend). It has *far* more navigation and other code before the body text than mine does. Yet it's moved into the #1 position, while mine has dropped several pages down in the rankings. Ditto for the #2 site (which is only vaguely related to the keyphrase's topic and has the keyphrase only twice on the visible page although the keyphrase *is* in the domain name).
Even if the evidence tended to support your hypothesis (and it doesn't), common sense would suggest that Google wouldn't take that approach in calculating rankings. Why do I say that? Because, if Google made code placement or organization a major factor in search rankings, the SERPs would quickly be dominated by companies that have the profit incentive, financial wherewithal, and technical resources to optimize all of their pages' code for Google. Pretty soon you wouldn't be able to search on "Jesus" without finding Jesus Jeans in the #1 spot.
I've read through this entire thread, and think I've covered the basics:
- we don't have any 302 redirects in place
- straight html
- white hat SEO
- no duplicate domains
- no major changes to the site recently
- same number of backlinks as before (link:www.mysite.com)
- PR stayed the same
Even stranger, last month I got a phone call from a Google rep. They were preparing Local Search for the UK, and thought my site was a good source of info and wanted to find out more about our company. Wow! Great!
Now this. Phrases I have ranked in the top 5 for several years - now I'm nowhere to be found. Not in Local Search, not anywhere. Other SERPs show my pages dropping from the top 10 to #98 or whatever.
Now, given that G loved my site enough last month to warrant a personal phone call, and now it's dropped waaay down the rankings, what's likely to be up? I emailed the rep from Google and have had no reply - he could be away on vacation, for all I know :-)
Perhaps they screwed up while integrating the Local Search results with the main results?
Investigating this, I found that 2 sites had scraped my content. Aha! Perhaps I was penalized for duplicate content - I'm talking hundreds, if not thousands, of copied pages! But these 2 sites seem to have been online for many months, and they hadn't hurt my rankings before (AFAIK).
I checked to see how many of my pages are indexed and found
- site:www.mysite.com shows 43,400 results
- "mysite.com" shows 53,000
Certainly some of those links could be from bad-neighborhood sites, but it's hard to check through all 53,000 to find out.
I'm tempted to just sit tight and work on adding content; that's what got my site the traffic in the first place. Is there anything else I can do other than keep bugging Google about it?
Maybe "mostly original content" doesn't cut it anymore.
I am just as guilty of not having totally original content but I took my licks already with the last few updates. Maybe I had "somewhat original content".
I'd be interested to know what the phrase was that you searched for. I worded my comment to avoid saying "100% original content"!
We get a database of accommodation from a booking agency, so we have several thousand hotel descriptions that are undoubtedly the same as those used by other travel sites that work with the same booking agency. And we do republish some legitimate travel press-releases and a few articles by freelance writers, but aside from that all our content is written in-house.
Just now, if I search for our branded site name - we come up # 1, like we did before.
However, if I search for one of our popular articles, we don't rank in the top 50.
In fact, if I type in "popular article title" and "oursitename" - I get a scraper site first, and a few other sites, and then our page at # 6.
Traffic and revenue are even lower than they were before today. Even the two other small sites that did so well are performing poorly today. Previous to today, we were making 1/2 our normal revenue - enough to survive. The way things are going today, we may be below even that.
I wonder - all our articles are in a database. If our domain is getting burned, I wonder if we should buy a similar domain name, and "start again"?
In the future, if our main site came back, the other site would just disappear in the rankings, wouldn't it?