Last week my rankings in Google tanked: from numerous #1 spots for good keywords to nothing in the top fifty, with most everything dropping by 100 or 200 positions. My rankings increased during the Bourbon update, so my feeling is that isn't the reason, and I'm trying to find the root cause.
Have I been given a penalty for 'over optimisation'?
Was it because I removed all my images from google images using robots exclusion?
Or is it more sinister than that?
I've been looking at the hi-jack thing and I want to know whether this is an example.
There are only 4 results for inurl:mysite.co.uk - all are supplemental: two are my site, one is a search directory, and one is a redirect that shows my title and description. That last one looks distinctly dodgy, and I want to know whether it's what you call a hi-jack.
When keyword rank drops, validate all your pages first, and not just with the W3C.org validator (it doesn't catch broken table tags and other broken tags). Also run a link checker on your site.
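If you want to roll your own link checker, the first step is pulling every href out of a page. Here's a minimal sketch using only the Python standard library (the class and function names are mine, just for illustration); you'd then request each collected URL and flag anything that doesn't come back 200:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects every <a href> value so they can be fed to a checker."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html):
    """Return all hrefs found in an HTML string, in document order."""
    parser = LinkCollector()
    parser.feed(html)
    return parser.links
```

Because HTMLParser is forgiving, this will still walk pages with the kind of broken table tags the validator misses.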
If you're concerned about over optimization, do a keyword density check to make sure you haven't focused too much on particular phrases or words. Anything over 20% is suspect. (Google doesn't use the meta keyword tag, so eliminate that from your checks, but do include the meta description tag.)
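There's no single agreed formula for "keyword density", which is why different tools disagree; the common one is words consumed by the phrase divided by total words on the page. A rough sketch of that calculation (my own helper, not any particular tool's method):

```python
import re

def keyword_density(text, phrase):
    """Percentage of the page's words taken up by occurrences of `phrase`.

    Counted as: (occurrences * words in phrase) / total words * 100.
    This is one common convention; tools vary, so treat thresholds loosely.
    """
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    if not words or n == 0:
        return 0.0
    # Slide a window over the word list and count exact phrase matches.
    hits = sum(
        words[i:i + n] == phrase_words
        for i in range(len(words) - n + 1)
    )
    return 100.0 * hits * n / len(words)
```

Run it against your page text (with the meta description appended, per the advice above) for each phrase you're targeting, and compare the result to that ~20% ceiling.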
I installed the Googlebot image exclusion on a new site at a time when Google Images was its only source of traffic, and traffic still improved afterwards, so that is probably not the reason for your fall in rank.
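For reference, the standard way to keep your images out of Google Images is a robots.txt rule targeting Google's image crawler (this blocks the whole site from it; you could narrow the Disallow to an images directory instead):

```
User-agent: Googlebot-Image
Disallow: /
```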
Re hijacks -- GoogleGuy said to look at the site: command. Hijacking URLs will appear there, not in allinurl/inurl results. Those results of yours are probably tracking codes.
Those scraper sites that steal your content and title can indeed hurt your rank, however, so it's best to try to get them removed.
I wrote an article on how to stop hijackings which you can find from my profile.
I'm completely baffled because I don't employ black hat techniques (at least I didn't think I did - maybe the rules have changed), and while everyone else has been having problems since mid-May, my site was slowly moving up the SERPs for all manner of keywords and keyword combinations. It's only in the last few days that things have gone pear-shaped.
Visits from Googlebot have been sparse during this time, but I noticed that today the bot has been spidering a great deal. Maybe I was forgotten about - hopefully, once the bots have been and gone, things will improve.
My gut feeling is that all this movement is purely to increase Adwords revenues from those shaken out.
I found a keyword density check tool, and it said "Keyword density is terrible" and that it was 25% on the page I tested. I assume it's saying that's too low, since I tried another page at 14% and it said the same thing. Huh? I heard that something like 25% is way too high! So what is the correct amount?
I am really worried that my site is going to tank again out of the blue at the next update.
I know GG stated that the site: command would reveal hijacks, but when my site tanked, all my backlinks had been attributed to the phantom page Google created for the script link, and that didn't show up in the site: command at all.
I repeat: a link:http://directory.com/php?mywebsite search showed exactly my backlinks.
So, no, I'm not reassured. I'd really like some very firm reassurances that this is NOT going to happen again, to me or anyone else whose site gets picked up by these parasites.
What googleguy said is that they have 'filtered' them out of the site: results. You read into that what you want.
Sort of like test accounts in real production databases. They just sit there till all hell breaks loose. Like various regulators finding out about them in insurance company databases or bank databases or credit card databases, etc... and more etc...
I can name some companies, however you can all do searches.
I must understand incorrectly.
msg# 105 is the one where GoogleGuy talks about the filtering. He didn't actually say 'filtering', but read the posts afterwards - we got no reply about it, so we ended up assuming he meant 'filtering' the site: results.
Google HAS made a lot of changes since then - I'm not even sure if the 302 hijack exists anymore.
"I'm not even sure if the 302 hijack exists anymore."
We are busy testing just that, under controlled conditions. A large number of "victim" pages have been created for the sole purpose of carrying out the experiment, on a non-sandboxed domain. We are waiting for them to rank. When they do, we will attempt to hijack them in a myriad of ways and see whether there is an effect on rank.
Maybe hijacks exist, maybe they don't, but we'll find out once and for all.
We don't know when we will have results, because we don't know the timeline of execution of a googlejack. It might depend on which Googlebot picks up the attacking link, or on the schedule of duplicate-content filter application - who really knows? We might see no effect until the next update or two.
We'd love to be wrong.
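For anyone wanting to check whether a suspect link to their site is the 302 pattern being discussed, the core of it is fetching the URL without following redirects and looking at the raw status code. A rough Python sketch (the function names and the classification labels are mine, purely illustrative):

```python
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from following redirects so the raw status is visible."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

def classify_redirect(status, location):
    """Label a (status code, Location header) pair.

    A 302/307 pointing at your page is the temporary-redirect pattern
    behind the hijack discussion; a 301/308 is a permanent hand-off.
    """
    if status in (302, 307) and location:
        return "temporary redirect - possible hijack vector"
    if status in (301, 308) and location:
        return "permanent redirect"
    return "no redirect"

def check(url):
    """Fetch `url` without following redirects and classify the response."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        resp = opener.open(url, timeout=10)
        return classify_redirect(resp.status, resp.headers.get("Location"))
    except urllib.request.HTTPError as e:
        # With redirects suppressed, a 3xx surfaces here as an HTTPError.
        return classify_redirect(e.code, e.headers.get("Location"))
```

Point check() at the suspect directory/redirect URL; a 302 whose Location is your page is the pattern this experiment is probing.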