
Forum Moderators: Robert Charlton & goodroi


Penalised Due To Link Spam?

3:02 pm on May 6, 2010 (gmt 0)

New User

5+ Year Member

joined:Dec 3, 2009
votes: 0

New Domain
I bought a new domain on 15th March 2010 for a new site that we are building. The domain is heavily targeted at our main phrase and consists of a three-word search term.

Holding Page and Kick Start SEO
We created a holding page with optimised content, a title tag and meta data, and made it live about a month ago. I started getting a few inbound links to the holding page to get a head start. Google indexed it within a couple of days and the page started to show up on page 2 for our three-word search term. It climbed to the first page after a few more days. Everything seemed to be going in the right direction.

Website Completed and Goes Live
The website was completed and made live last Friday (30th April 2010) with 18 pages, all with optimised content, titles and meta data. **I also amended the homepage title slightly**. A sitemap.xml was created and submitted to Google, a robots.txt was created, and everything was verified within Webmaster Tools.
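For reference, a sitemap for a small 18-page site like this is just a flat list of `<url>` entries; something roughly like the following (the URLs below are placeholders, not the actual site):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2010-04-30</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/services.html</loc>
    <lastmod>2010-04-30</lastmod>
  </url>
  <!-- ...one <url> entry per page... -->
</urlset>
```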

Dropped From Google SERPS
Now here's where it gets interesting: on the very next day (Saturday 1st May 2010) the site dropped from Google's first page for the same three-word term it had been appearing for and completely disappeared. I got a call from my colleague at around 13:00, while I was out shopping with my girlfriend, notifying me of this. I got home around 16:00 to check, and in Webmaster Tools Google said the sitemap was downloaded about 3 hours ago and robots.txt about 15 mins ago. So I left it alone for Google to do its thing and index the new site. Between Saturday and Monday the site kept coming back onto the first page and then disappearing. It also kept swapping between the old title tag and the new title when displayed in the SERPs. I checked a Google dance tool and the results were mirrored. On Tuesday it disappeared again and hasn't been back on the first page since.

Possible Causes?
1. This may just be a temporary sandbox-related issue that will resolve itself, but it seems a bit weird. As mentioned in this post [webmasterworld.com...] I'm seeing the same incorrect readings in Google WMT: submitted 18 URLs, indexed 1. But when I perform a site: search it shows all my pages.

2. Another possible cause is a link spamming issue that I noticed. While doing my initial link building to get the site indexed I came across a directory that competitors were using. I submitted to this directory, and when searching for my domain name "example.com" in Google it now shows over 8k results, most of them from this directory. I checked the pages that are supposed to be linking to my site, but my link doesn't appear anywhere on the page or in the source code. The only time I can see my link is when I check the cached versions of these pages and the source code of those cached versions. How bizarre...
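One rough way to sanity-check the cloaking suspicion is to request the same directory page with a normal browser User-Agent and with a Googlebot User-Agent, and see whether the link only appears in one of them. A quick sketch, with the caveat that the UA strings are illustrative and that directories can also cloak by IP address, which a UA swap won't catch:

```python
from urllib.request import Request, urlopen

# Illustrative User-Agent strings; a serious check would test several.
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")
BROWSER_UA = "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36"

def fetch(url, user_agent):
    """Fetch a page while presenting the given User-Agent header."""
    req = Request(url, headers={"User-Agent": user_agent})
    with urlopen(req, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

def link_present(html, domain):
    """Crude check: does the raw HTML mention the target domain at all?"""
    return domain.lower() in html.lower()

def looks_cloaked(url, domain):
    """True if the crawler-facing copy and the visitor-facing copy
    disagree about whether the link exists."""
    as_bot = link_present(fetch(url, GOOGLEBOT_UA), domain)
    as_visitor = link_present(fetch(url, BROWSER_UA), domain)
    return as_bot != as_visitor
```

If `looks_cloaked(directory_page_url, "example.com")` comes back True, the directory is serving crawlers something different from what visitors see.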

Any advice will be greatly appreciated. I'm unsure of whether to ask the directory to remove my link or wait for Google to settle down.

[edited by: tedster at 5:40 pm (utc) on May 6, 2010]
[edit reason] fix formatting [/edit]

5:43 pm on May 6, 2010 (gmt 0)

Senior Member

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:May 26, 2000
votes: 0

Two quick impressions:

1. Yes, it is too early to expect good rankings to be stable.

2. It sounds like the directory may be cloaking - I'd get out of there.
8:07 am on May 7, 2010 (gmt 0)

New User

5+ Year Member

joined:Dec 3, 2009
votes: 0


I will certainly contact the directory now to get them to remove my listing. I'm just concerned it's going to take a while before Google gets rid of all those cloaked pages from its index.
8:34 am on May 7, 2010 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member themadscientist is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Apr 14, 2008
votes: 62

WOW, what a tactic...

Buy or create a directory. Put links to your site where the competition can see them, and even add a couple more for good measure. Then cloak it: remove your own links from the cloaked version, but multiply everyone else's by 8,000, knowing the worst that can happen is zero gain, while more likely your competition gets penalized and you take their rankings. (I'm not saying this is what the directory did, but it sure sounds like they might have done something.)

I don't like to post possible spam tactics, but maybe someday they'll wake up over there at the 'plex, start completely discounting the questionable links (IOW: create a G-side 'nofollow' reference) and quit penalizing sites for inbound links.

IMO, all inbound link penalties are silly, because anyone can manipulate your inbound links and inbound link text if they work at it.
9:05 am on May 7, 2010 (gmt 0)

Full Member

5+ Year Member

joined:Dec 30, 2009
posts: 249
votes: 0

Some serious insight there, TheMadScientist; that would be a pretty full-on tactic. I also agree that inbound link penalties are stupid: bad links should just be worth nothing.

Going back to the OP, there could be other problems here. If the site was doing well in search, only had a smallish number of pages and no spidering issues, what was the need for an XML sitemap? In situations like that they are more likely to do harm than good.

I would seriously investigate the robots.txt too; seeing only one page indexed can indicate problems there. Unless there are pages or directories that you don't want indexed, there is little need for a robots.txt, and the main thing it really does is let you make errors that harm your site.
6:11 pm on May 7, 2010 (gmt 0)

Junior Member

10+ Year Member

joined:Apr 29, 2008
votes: 0

Hoegaarden, would this directory be something along the lines of "Gadgetworld"?

If so, chill. I think it's just Google coincidence, with all the different data floating around out there, and the situation will likely rectify itself with no action from you.

I've seen this same question countless times, since I used that directory myself and had the same experience (once out of maybe 25 times back then).

It's easy to jump to that conclusion, but it's almost definitely not correct.

If it's the same directory, you just pick up a sitewide link for a few days, but that can be a lot of pages. After a week or so you drop back to one link. Visiting the pages at a later date, you're not in the sidebar anymore, so you don't see the link. A loss of rankings is 99.99% not due to that.

This all applies only if it's the same directory; otherwise disregard the above :)
10:08 pm on May 8, 2010 (gmt 0)

New User

5+ Year Member

joined:Dec 3, 2009
votes: 0

Well, I'm still waiting to hear from this directory regarding my request for removal.

I totally agree with you guys, though; bad links are too easy to deploy in order to knock out competition. I've never been a believer in this tactic actually working.

The sitemap was uploaded because I've always found it good practice to use one. Can you elaborate on the point you made about it causing more harm than good, please?

The robots.txt is there because the site was built in Joomla, which comes with a standard file to block the administrator and other directories. I've used it many times on other sites and never had any issues.
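For anyone wondering, the stock Joomla file looks roughly like this (exact entries vary slightly by version), so it only blocks the system directories, not the actual pages:

```
User-agent: *
Disallow: /administrator/
Disallow: /cache/
Disallow: /components/
Disallow: /includes/
Disallow: /installation/
Disallow: /language/
Disallow: /libraries/
Disallow: /media/
Disallow: /modules/
Disallow: /plugins/
Disallow: /templates/
Disallow: /tmp/
```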

Oh, and thanks for the info, but the directory I used is not "Gadgetworld".