In reviewing my website logs, I've noticed Google 'fishing' for dynamic URLs that don't exist, more often recently. The pages being requested have never existed and do not show up in GWT as 404s. In some instances 4-5 seemingly random parameters are tested.
My site is dynamic, and parameters do in fact change the content of the page, although Google hasn't hit a combination that would make a change (yet). I have a list of roughly 100 parameters that change the page in ways I intend, and potentially 10,000 that change it in ways I do not want.
Should I place the 100 parameters in an array and whitelist them, returning a 404 status code if a passed parameter is NOT on the list?
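Something like this is what I have in mind, sketched here in Python with made-up parameter names (my real list would hold the ~100 allowed names):

```python
# Hypothetical whitelist of allowed query-parameter names.
# A frozenset gives fast membership tests and can't be mutated accidentally.
ALLOWED_PARAMS = frozenset({"page", "sort", "category", "lang"})

def params_are_valid(query_params):
    """Return True if every parameter name in the request is whitelisted.

    If this returns False, the handler would respond with a 404
    instead of rendering the page.
    """
    return all(name in ALLOWED_PARAMS for name in query_params)

# Known parameters pass; any unknown parameter triggers the 404 path.
print(params_are_valid({"page": "2", "sort": "price"}))  # True  -> serve page
print(params_are_valid({"page": "2", "bogus": "x"}))     # False -> return 404
```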
Would checking each request against a 100-entry array/whitelist noticeably slow down page rendering over time, or is that an acceptable number?
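From what I can tell, 100 entries should be trivial if the whitelist is a hash-based set rather than a plain array, since set membership tests don't scan the whole list. A rough microbenchmark (timings will vary by machine; the whitelist names here are placeholders):

```python
import timeit

# Hypothetical 100-entry whitelist; a frozenset gives constant-time
# membership tests, so the list's size barely matters per lookup.
whitelist = frozenset(f"param{i}" for i in range(100))

# Time 10,000 worst-case lookups (a name that is not on the list).
elapsed = timeit.timeit(lambda: "not_on_list" in whitelist, number=10_000)
print(f"10,000 lookups took {elapsed:.4f}s")
```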
I just want this 'loose thread' tucked away before Google yanks on it. Any other suggestions?