Forum Moderators: Robert Charlton & goodroi
I mean absolutely no backlinks AT ALL - currently #6 out of 14m results for its keyword. It has been sitting dormant for over 3 years (I don't have the opportunity to make money from it).
There are many other niche websites which rank like that - with only a few links - probably the majority of the internet. For example: a guy who crafts something in his garage. He doesn't need millions of backlinks to rank high for his crafted goods, but his competitor needs only a few dollars to send him away. The new algo would punish him instantly, no review needed. Is he a "bad webmaster" because he did not get thousands of natural links?!?
Non-$$$ phrases isn't a valid claim, especially for this thread... your competitors are equally disinterested in making money, so I don't see any reason for them to want to harm a "no backlink website"... even with free spammy backlinks that only cost time.
Lol, assumptions again. When did I say that it ranks for a non-money phrase?!? It actually ranks for "buy widgets online", hahaha. I said the opportunity to make money was gone, because you actually have to sell the widgets to make the money.
Ya, I didn't just fall off the turnip truck yesterday... so your posts do need to make a little sense... not a lot - just a bit.
fathom wrote:
How do they determine "NOT PROVIDED" on search queries?
How many Google products do you use?
How often are you logged in?
How many of your websites are included in your accounts?
Surely, for the company that controls all this and so much more, it isn't too much of a stretch to develop footprints from your online activities.
Still waiting for your website. If you have the guts to stand behind your words, then put up or shut up. I am willing to put my money where my mouth is; now it's your turn.
I think you're implying that if I did a Google search for [link sellers] or [linking schemes], or had an email conversation using my Gmail account discussing linking schemes, all while having my website in Google Webmaster Tools or Google Analytics, and then Google notices a sudden increase in links pointing to my website, that they could reasonably determine that I am responsible for those links.
It's a logical scenario, but it's also based on no evidence that I am aware of, nor have you provided any in support. It's 100% speculation. Even worse, it comes off as a "Big Brother is watching you" style conspiracy theory. I don't think it currently qualifies as a valid explanation.
Edit: Also, I hadn't looked into the issue in any detail, but as far as I'm aware, it's never been confirmed (at least publicly) exactly who was responsible for J.C. Penney's spammy backlinks. That they didn't continue after the firing of their SEO company is a pretty good indication, but that's after-the-fact. If Google penalized them based on an accusation, that's a bit frightening.
Competitors can harm your rankings with links. I am 100% certain.
The number of links and amount of money it will take will depend on many different factors, e.g. the number of links already pointing to the domain.
Though I guess if you're that certain your rankings can't be hurt, hand your URL over to the guy offering to prove the point. I wouldn't, but if you're so certain and keen to prove you're right, go for it.
Negative SEO works. I tried it on two of my domains that had ranked on page 1 for years, for a keyword with 400-500k searches/month. They dropped to pages 6-7 after the negative SEO. All it cost me was a few hundred bucks using Xrumer.
[edited by: fathom at 9:22 pm (utc) on Apr 13, 2012]
enigma1 wrote:
I know for sure there are ways to make spiders see irrelevant or spammy content and penalize someone, but that implies there are issues with the site's code or configuration. Those issues can cause duplicate content, invoke error pages for valid links, generate new links, and much more, all artificially created pretty quickly with the aid of a botnet. But that's not up to Google to fix, so I would make sure my site is clean of errors before jumping to the conclusion that it's the spider's fault.
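To illustrate the kind of exposure being described: if a site returns the same page for any junk query string, every artificially generated URL variant a botnet points at becomes duplicate content in the index. One defensive cleanup, sketched below in Python (the parameter whitelist and URLs are invented for illustration, not taken from any real site), is to collapse unknown parameters down to a single canonical URL server-side, or emit that URL in a rel=canonical tag:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical whitelist: the only query parameters this site actually uses.
ALLOWED_PARAMS = {"page", "sort"}

def canonicalize(url):
    """Drop any query parameter not on the whitelist, so artificially
    generated URL variants all collapse to one canonical address."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k in ALLOWED_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), ""))

# A junk parameter tacked on by a link spammer maps back to the clean URL:
print(canonicalize("http://example.com/widgets?page=2&junk=xyz123"))
# -> http://example.com/widgets?page=2
# The clean URL passes through unchanged:
print(canonicalize("http://example.com/widgets?page=2"))
# -> http://example.com/widgets?page=2
```

The point, as in the post above, is that this is a fix the webmaster controls; no one can manufacture duplicate-content URLs for a site that refuses to serve them.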
[edited by: fathom at 10:33 pm (utc) on Apr 13, 2012]
Otherwise, Google isn't smart enough to see through a $200 scam. Do you really buy that?
Do I think a $200 trick would take down a solid authority site? Of course not. But it could certainly take out a site that might have some issues and has not yet established itself as a trusted source.
[edited by: fathom at 11:01 pm (utc) on Apr 13, 2012]
So if you are already doing shady stuff, should you remain trusted because you got away with it?
You must work for Google or something because you sure do seem to think everybody is doing something wrong.
I never said that. Those problems could come from the way the site is designed, and it may not yet have enough juice to counter a negative attack.
My comment was in general, and not really about the OP of this thread, so take it for what it's worth.
If you were doing an experiment, you would have much more than a posted claim that you did this; you would have stats and graphs and... well, honestly... when you spend $200 on something for a domain
"I know for sure"... a bold statement based on what?
I know for sure there are ways to make spiders see irrelevant or spammy content and penalize someone, but that implies there are issues with the site's code or configuration. Those issues can cause duplicate content, invoke error pages for valid links, generate new links, and much more, all artificially created pretty quickly with the aid of a botnet. But that's not up to Google to fix, so I would make sure my site is clean of errors before jumping to the conclusion that it's the spider's fault.
Yes, I know for sure that a code side-effect can be exploited in many different ways. If in doubt, you can read other threads of people wondering how they end up with all sorts of errors in GWT, or just browse questions in the Apache server forum. Just one mistake can make the spider see the planet through the site. Whether that is intentional or unintentional wasn't my point, and I've no idea how it relates to the Google webspam team.
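The "one mistake" being described is typically an infinite URL space: a relative link that resolves one level deeper on every page, combined with a server that returns 200 for any path depth, sends the crawler off into endless duplicate URLs. A toy simulation of that failure mode (the site and link names are invented; real crawlers resolve links with exactly these RFC 3986 rules):

```python
from urllib.parse import urljoin

# Hypothetical buggy site: every page, whatever its URL, contains the same
# *relative* link, and the server happily returns 200 for any path depth.
RELATIVE_LINK = "gallery/index.html"

def follow(url, hops):
    """Simulate a spider resolving and following the relative link `hops` times."""
    for _ in range(hops):
        url = urljoin(url, RELATIVE_LINK)
    return url

start = "http://example.com/index.html"
print(follow(start, 1))
# -> http://example.com/gallery/index.html
print(follow(start, 3))
# -> http://example.com/gallery/gallery/gallery/index.html  (and so on, forever)
```

Each hop nests the path one level deeper, so the spider never runs out of "new" URLs; the fix is on the site's side (absolute links, or 404/redirect for the bogus depths), which is the point being made above.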
Or perhaps you meant that code problems may exist but no one knows about them, and so you want to see a step-by-step instruction manual on how to do it in order to be convinced? From the angle I'm considering, the fact that this may happen is not guesswork.
Instead, you should be asking how webmasters react to these problems. Some blame the spiders, some blame themselves, and others are in limbo. IMO, instead of worrying first about whether others are doing shady stuff, fix your own problems.