Forum Moderators: Robert Charlton & goodroi
We just installed a new firewall/IDS, and I noticed that my sites started dropping out of Google altogether.
I just talked to the sysadmin, and he said the IDS had a filter enabled that silently drops any packet for a request to robots.txt.
Would this make Google think the site was offline, and drop the pages, since the server never responded to the robots.txt request at all?
I had the filter removed, and I can see Googlebot coming back now, whereas before it wasn't making it through. Am I on the right track?
The firewall is a SonicWall, and the filter is one of the security settings in the IPS and GAV (Gateway Anti-Virus) modules.
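One way to confirm the filter really is gone is to fetch robots.txt from outside your network and check that you get an HTTP response at all. Even a 404 is fine (Google treats a missing robots.txt as "crawl everything"); what hurts is a silent drop, where the connection just hangs or dies. Here's a minimal sketch in Python; `check_robots` and the URL are just illustrative, not part of any SonicWall tooling:

```python
import socket
import urllib.error
import urllib.request

def check_robots(url, timeout=10):
    """Fetch a robots.txt URL and return the HTTP status code,
    or None if no response came back at all (dropped, refused,
    or timed out) -- roughly what Googlebot would experience
    if a firewall silently drops the packets."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        # An HTTP error (403, 404, ...) is still a response;
        # the server answered, it just said no.
        return e.code
    except (urllib.error.URLError, socket.timeout):
        # No response at all: connection dropped, refused,
        # or timed out before any bytes arrived.
        return None
```

Run it from a machine outside the firewall (a home connection, a cheap VPS) so the request actually traverses the IPS, e.g. `check_robots("http://www.example.com/robots.txt")`. A numeric status code means Googlebot can at least reach the file; `None` means the filter (or something like it) is still eating the request.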