I am trying to get a web site back onto the front page of its search results and have rebuilt it three times to improve its SEO and phone friendliness, making it better each time. I also hired an SEO expert to refine things and work on its ranking. That part is working OK, but with amusing results, because each week we check rankings on a set of keywords by doing searches in India, the USA and Australia and counting how far back each keyword sits. What is amusing is the inconsistency each week: instead of moving forward at a steady pace, the rankings jump back and forth like a Mexican jumping bean. There is also a huge difference between the localities.
But that is not why I am posting here today.
My AdWords account had been stopped by me for some time, but I have found it most useful in the past for tracking keywords and their usage, so I re-enabled the account and revised the ads and landing pages. After about two weeks I started getting emails about "Ad disapproved" due to its landing page not working.
Well, all the landing pages worked perfectly and could be verified from several locations around the globe, so I contacted them and reported the anomaly. To which they replied... "Get a web designer to fix my web site".
So I sent them screenshots of my main landing page taken from different locations around the world. To which they replied... "See my network administrator to fix the problem".
So then I sent them a link to click from their end, which would send me an email confirming the activity and the IP address. They must have clicked that link, or someone on their end did, because their IP address was logged, and that was useful in confirming that we had no IP ranges blocked that would prevent Google access.
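For anyone wanting to do the same kind of check, here is a minimal sketch of such a "click this and I'll know" link, using only Python's standard library. It's an assumption about the mechanics, not the actual setup described above: instead of sending an email, it just records the caller's IP address, which is the part that mattered for ruling out an IP block.

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

logged_ips = []  # in a real setup this would be emailed or written to a log file

class TrackingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Record the remote address of whoever clicked the link.
        logged_ips.append(self.client_address[0])
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"Thanks - your visit has been logged.\n")

    def log_message(self, *args):
        pass  # silence the default stderr request logging

# Bind to an ephemeral port on localhost and serve in the background.
server = HTTPServer(("127.0.0.1", 0), TrackingHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# Simulate the support person clicking the link.
urllib.request.urlopen(f"http://127.0.0.1:{port}/verify").read()
server.shutdown()
print(logged_ips)
```

With the logged address in hand you can compare it against your firewall rules or server block lists and see whether that range could ever have been refused.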
So then I sent them a screenshot taken from their very own mobile-friendly testing pages. That's right... they reckon that from their workstation they cannot reach my web site, but their own site and mobile testing tools can!
Now the latest excuse is that robots.txt is blocking them. This is what was in my robots.txt...
User-agent: *
Allow: /
Nothing more and nothing less. But they reckon that it should be like this...
User-agent: Googlebot
Disallow:
User-agent: Googlebot-image
Disallow:
Are they pulling my leg or what? They are suggesting that each search engine must be specifically allowed, and that a generic allowance is no longer acceptable?
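For what it's worth, the two files should behave identically for Googlebot: under the robots.txt convention, `User-agent: *` with `Allow: /` and a per-bot empty `Disallow:` both permit everything. A quick sanity check with Python's standard-library parser (a sketch only; Google's actual parser is its own implementation, and `example.com` is a placeholder) bears that out:

```python
from urllib.robotparser import RobotFileParser

# My original robots.txt: generic allowance for every crawler.
original = """\
User-agent: *
Allow: /
"""

# The version Google support suggested: per-bot empty Disallow lines.
suggested = """\
User-agent: Googlebot
Disallow:

User-agent: Googlebot-image
Disallow:
"""

def can_fetch(robots_txt: str, agent: str, url: str) -> bool:
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch(agent, url)

for name, txt in [("original", original), ("suggested", suggested)]:
    print(name, can_fetch(txt, "Googlebot", "https://example.com/"))
# Both lines print True: Googlebot is allowed either way.
```

So whatever was blocking their workstation, it was not this robots.txt.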