Forum Moderators: open
On a site I've optimized, Google has gone totally bananas.
The site has about 20 links like this:
index.php?page_id=12&lang_id=1
We initially had no problems getting all pages spidered properly, but lately, Google has started to come up with links like:
index.php?page_id=12&lang_id=w
.. and about 500 other non-existent URLs. This has caused our rankings to drop dramatically.
I fear some algorithm over at Google has run loose, and I'm wondering how we can make Google stop visiting all those URLs it has started to index.
Any ideas?
Design an error page that is delivered if the wrong params are used.
Ensure the error page includes :-
1) the robots meta tag NOINDEX.
2) a static link to what you believe is the correct page.
3) a javascript redirection to the same.
2 & 3 are for users in case they are sent to the wrong pages.
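A minimal sketch of such an error page, covering all three points, might look like this (the target URL and wording are just placeholders for whatever your correct page is):

```html
<html>
<head>
<title>Page not found</title>
<!-- 1) keep this error page out of the index -->
<meta name="robots" content="noindex,nofollow">
<!-- 3) javascript redirection to what we believe is the correct page -->
<script type="text/javascript">
window.location.replace("index.php?page_id=12&lang_id=1");
</script>
</head>
<body>
<p>Sorry, that page does not exist.</p>
<!-- 2) static link for users without javascript -->
<p><a href="index.php?page_id=12&amp;lang_id=1">Click here for the page you were probably looking for.</a></p>
</body>
</html>
```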
You should also use the link: command on ATW (AllTheWeb) and AV (AltaVista) to try to locate the source of the problem - it may not be Google.
If there are bad links to your site out there, you should also consider using HTTP redirects. I've never used them myself, but people around here talk about 301 Moved Permanently a lot.
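If the bad links all share the same bogus pattern (a non-numeric lang_id, as in the thread), a 301 could be issued from an Apache .htaccess file with mod_rewrite. This is only a sketch for the URLs shown in this thread, assuming mod_rewrite is enabled:

```
RewriteEngine On
# Catch requests where lang_id is not numeric, and 301 them
# to the same page_id with a default language (lang_id=1).
RewriteCond %{QUERY_STRING} ^page_id=([0-9]+)&lang_id=[^0-9&]+$
RewriteRule ^index\.php$ /index.php?page_id=%1&lang_id=1 [R=301,L]
```

Because the substitution contains its own query string, mod_rewrite replaces the incoming query string rather than appending to it, so the bad lang_id value is dropped.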
Hope this helps,
Kaled.
PS
You could consider being sneaky and trying to fool Googlebot into thinking you have more content than you really do, i.e. make the fault work for you rather than against you.
I would definitely do a search in Google and other search engines for the incorrect link. I use PHP all the time and I very much doubt it was Google's fault.
Google can now spider three query strings deep, which is great.
A search should easily find the page, or a cache of the page, that is causing the fault.
I always make sure that if an id= is not found in the database, the script returns an error message by using IF statements.
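The site in question is PHP, but the lookup-then-error logic is the same in any language. Here is a sketch in Python; the page table and ids are made up for illustration:

```python
# Hypothetical lookup table standing in for the database query.
PAGES = {12: "Home", 13: "About", 14: "Contact"}

def render(page_id_param):
    """Return page content, or an error marker for any bad page_id."""
    # Reject non-numeric values, like the lang_id=w links Google found.
    if not page_id_param.isdigit():
        return "ERROR: invalid page id"
    page_id = int(page_id_param)
    # Reject ids that are numeric but not in the database.
    if page_id not in PAGES:
        return "ERROR: no such page"
    return PAGES[page_id]

print(render("12"))   # Home
print(render("w"))    # ERROR: invalid page id
print(render("999"))  # ERROR: no such page
```

In practice the error branch would serve the NOINDEX error page described earlier in the thread, so the bogus URLs drop out of the index.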