
Google SEO News and Discussion Forum

    
Is this a hi-jack?
hijack google loss of rankings
webdevfv
5+ Year Member
Msg#: 30061 posted 1:28 pm on Jun 22, 2005 (gmt 0)

Hi All

Last week my rankings in Google tanked: from numerous #1 spots for good keywords to nothing in the top fifty, with almost everything dropping by 100 or 200 positions. My rankings actually increased during the Bourbon update, so my feeling is that isn't the reason, and I'm trying to find out the root cause.

Have I been given a penalty for 'over optimisation'?
Was it because I removed all my images from google images using robots exclusion?
Or is it more sinister than that?

I've been looking at the hi-jack thing and I want to know whether this is an example.

There are only 4 results using inurl:mysite.co.uk - all are supplemental: two are my site, one is a search directory, and another is a redirect which shows my title and description. This last one appears distinctly dodgy and I want to know whether it is what you call a hi-jack.

 

Lorel
WebmasterWorld Senior Member 10+ Year Member
Msg#: 30061 posted 3:47 pm on Jun 22, 2005 (gmt 0)

Hi Webdevfv

When keyword rank drops, validate all your pages first, and not just with the W3C validator (it doesn't catch broken table tags and other broken tags). Also run a link checker on your site.
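If you don't have a link checker handy, a rough pass is easy to script. Here's a minimal sketch in Python (standard library only; www.example.co.uk is a placeholder for your own domain) that fetches one page, collects its href values, and reports the status each link returns:

    import urllib.request
    from html.parser import HTMLParser
    from urllib.parse import urljoin

    class LinkCollector(HTMLParser):
        """Collect href values from anchor tags."""
        def __init__(self):
            super().__init__()
            self.links = []
        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    base = "http://www.example.co.uk/"  # placeholder - use your own page
    html = urllib.request.urlopen(base).read().decode("utf-8", "replace")
    collector = LinkCollector()
    collector.feed(html)

    for link in collector.links:
        url = urljoin(base, link)
        try:
            status = urllib.request.urlopen(url).status
        except Exception as exc:  # broken link, bad scheme, timeout, etc.
            status = exc
        print(url, "->", status)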

If you're concerned about over optimization, do a keyword density check to make sure you haven't focused too much on particular phrases/words. Anything over 20% is suspect. (Google doesn't use the meta keywords tag, so eliminate that from your counts, but do include the meta description tag.)
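There are tools for this, but the arithmetic is simple enough to script yourself. A minimal sketch in Python - the sample text and phrase are made up, and "density" here means phrase-word occurrences as a share of total words, which is one common way of counting it:

    import re

    def keyword_density(text, phrase):
        """Percentage of the page's words accounted for by the phrase."""
        words = re.findall(r"[a-z0-9']+", text.lower())
        target = phrase.lower().split()
        hits = sum(
            1 for i in range(len(words) - len(target) + 1)
            if words[i:i + len(target)] == target
        )
        return 100.0 * hits * len(target) / max(len(words), 1)

    body = "blue widgets and more blue widgets from the blue widget shop"
    print(round(keyword_density(body, "blue widgets"), 1))  # 33.3

By that method, 20% means one word in five belongs to your target phrase - which reads as spam to a human, never mind a bot.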

Excluding Googlebot from images actually improved traffic for a new site I installed it on, at a time when the only traffic it was getting was Google Images. So that is probably not the reason for your fall in rank.

Re hijacks -- GoogleGuy said to look at the site: command. Hijacking URLs will appear there and not in the allinurl:/inurl: commands. The ones you're seeing are probably tracking codes.
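If you want to see for yourself what a suspect link is doing, fetch it without following redirects. A minimal sketch in Python (standard library; the host and path are placeholders for the suspect URL):

    import http.client

    # http.client does not follow redirects, so a 302 shows up as-is
    conn = http.client.HTTPConnection("directory.example.com")
    conn.request("GET", "/track?id=12345")
    resp = conn.getresponse()
    print(resp.status, resp.reason)    # e.g. "302 Found"
    print(resp.getheader("Location"))  # where the redirect points

A 302 whose Location header points at your page is the temporary-redirect pattern people here worry about; a plain 200 is just an ordinary page that happens to mention you.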

However, scraper sites that steal your content and title can indeed hurt your rank, so it's best to try to get them removed.

I wrote an article on how to stop hijackings which you can find from my profile.

Clint
Msg#: 30061 posted 3:52 pm on Jun 22, 2005 (gmt 0)

Just how do you do a keyword density check anyway... manually counting? Or is there a tool to use? And the percentage you arrive at - is that counting every single word on a page including the meta tags, then counting how many times the keyword(s) appear in the tags and the page's body, and getting the percentage from that? Or is it some other method?
Thanks.

webdevfv
5+ Year Member
Msg#: 30061 posted 4:15 pm on Jun 22, 2005 (gmt 0)

My validation is near perfect - I am a web developer by trade.

I'm completely baffled because I don't employ black hat techniques (at least I didn't - maybe the rules have changed), and while everyone was having problems since mid-May my site was slowly moving up the SERPs for all manner of keywords and keyword combinations. It's only in the last few days that things have gone pear-shaped.

Visits from Googlebot have been sparse during this time, but I noticed that today the bot has been spidering a great deal. Maybe I was forgotten about - hopefully, once the bots have been and gone, things will improve.

Hmmm?

My gut feeling is that all this movement is purely to increase AdWords revenue from those shaken out.

Clint
Msg#: 30061 posted 1:47 pm on Jun 23, 2005 (gmt 0)

Bump for msg #3. ;)

I found a "Keyword density check" tool, and it said "Keyword density is terrible" and that it was 25% on the page I tested. I assume it's saying that is too low since I tried another page that was 14% and it said the same thing. HUH? I heard that something like 25% is way too high!? So what is the correct amount?

helleborine
10+ Year Member
Msg#: 30061 posted 3:17 pm on Jun 23, 2005 (gmt 0)

This morning I noticed a referral from a scraper directory firing 302s like there's no tomorrow, madroll dot com.

I am really worried that my site is going to tank again out of the blue at the next update.

I know GG stated that a site: command would reveal hijacks, but when my site tanked, all my backlinks had been attributed to the phantom page Google created for the script link. And that didn't show up in the site: command at all.

I repeat: a link:http://directory.com/php?mywebsite query showed exactly my backlinks.

So, no, I'm not reassured. I'd really like some very firm reassurances that this is NOT going to happen again, to me or anyone else whose site gets picked up by these parasites.

Reid
WebmasterWorld Senior Member 10+ Year Member
Msg#: 30061 posted 6:02 pm on Jun 23, 2005 (gmt 0)

If I had an inbound link with my title and description showing up in allinurl: or anywhere in the SERPs, I would nuke first and ask questions later.

What GoogleGuy said is that they have 'filtered' them out of the site: results. Read into that what you want.

theBear
WebmasterWorld Senior Member 10+ Year Member
Msg#: 30061 posted 6:29 pm on Jun 23, 2005 (gmt 0)

What this critter reads into that is that they are now hidden from folks looking at site: searches.

Sort of like test accounts in real production databases: they just sit there till all hell breaks loose - like various regulators finding out about them in insurance company databases or bank databases or credit card databases, etc... and more etc...

I could name some companies; however, you can all do the searches yourselves.

helleborine
10+ Year Member
Msg#: 30061 posted 9:38 am on Jun 24, 2005 (gmt 0)

When did GG say Google was filtering out the site: results? Since he also stated that the site: command was the only way to find a hijack, does that mean that Google has rendered us incapable of telling whether we've been hijacked, or identifying our hijacker?

I must be understanding this incorrectly.

Reid
WebmasterWorld Senior Member 10+ Year Member
Msg#: 30061 posted 10:48 am on Jun 24, 2005 (gmt 0)

[webmasterworld.com...]

Msg #105 is the one where GoogleGuy talks about the filtering. He didn't actually say 'filtering', but read the posts afterwards - we got no reply about it, so we ended up assuming he meant 'filtering' the site: results.
Google HAS made a lot of changes since then - I'm not even sure the 302 hijack exists anymore.

helleborine
10+ Year Member
Msg#: 30061 posted 11:02 am on Jun 24, 2005 (gmt 0)

I'm not even sure if the 302 hijack exists anymore.

We are busy testing just that, under controlled conditions. A large number of "victim" pages have been created for the sole purpose of carrying out the experiment, on a non-sandboxed domain. We are waiting for them to rank. When they do, we will attempt to hijack them in a myriad of ways and see whether there is any effect on rank.
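For anyone unclear on the mechanism under test: the "attacking link" is just a URL that answers with a 302 pointing at the victim. A toy redirector in Python (standard library; the hostname and port are placeholders - this illustrates the pattern, it is not our actual harness):

    from http.server import BaseHTTPRequestHandler, HTTPServer

    VICTIM = "http://victim.example.com/page.html"  # placeholder

    class Redirector(BaseHTTPRequestHandler):
        def do_GET(self):
            self.send_response(302)               # "Found" - moved temporarily
            self.send_header("Location", VICTIM)  # point the crawler at the victim
            self.end_headers()

    HTTPServer(("", 8000), Redirector).serve_forever()

If a crawler treats that redirecting URL as the canonical address of the victim's content, that is the hijack.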

Maybe hijacks exist, maybe they don't, but we'll find out once and for all.

We don't know when we will have results, because we don't know the timeline of execution of a googlejack. It might depend on which googlebot picks up the attacking link, or on the schedule of duplicate-content filter application - who really knows? We might see no effect for another update or two.

We'd love to be wrong.
