
Forum Moderators: phranque


Decoding webmaster tools

     
8:12 am on Aug 9, 2017 (gmt 0)

Full Member

Top Contributors Of The Month

joined:Mar 10, 2017
posts:227
votes: 94


For those of you who haven't read my sorry tale, I switched from m.dot to responsive in June, and lost all my mobile traffic during the switch. The designer then set up a redirect from m.dot to my homepage (regardless of what page they searched for). Eventually the m.dot pages were removed from the index and the correct responsive pages showed for mobile. But, my Adsense dropped from day one and has not increased despite all the errors being fixed.

I emailed them, and they are very cagey about it all. They say it's a huge drop in 'valid' mobile traffic. According to Analytics, all is normal, but Adsense emphasise VALID.

I'm not the most techy of people, but know there's a bug in the system somewhere. I've now been looking at webmaster tools and see a few strange things.

1) When I click on Mobile Usability I see 'clickable elements too close together' with 2,673 results (I only have 800 pages on my site, though I do have forums too). I am seeing two recurring patterns: example.com/blue-widgets/html?=s or example.com/blue-widgets/html?s=&no_redirect=true

2) 132 results for 'content wider than screen', again the results are html?=s or html?s=&no_redirect=true

3) I go to HTML improvements and see 173 duplicate title tags. For every one I see the same...

example.com/blue-widgets.html
example.com/blue-widgets.html?s=

Could this be the cause of Google invalidating 2/3rds of my mobile traffic, and what does it mean?
9:19 am on Aug 9, 2017 (gmt 0)

Administrator

WebmasterWorld Administrator phranque is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Aug 10, 2004
posts:11293
votes: 135


mobile usability issues aside, those requests for urls with query strings attached (the part after the question mark) should be 301 redirected to the url without the query string.
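For anyone wanting to implement that suggestion, here is a minimal .htaccess sketch, assuming an Apache server with mod_rewrite enabled. Test it on a staging copy first, and skip it if your site search depends on ?s= URLs:

```
# 301-redirect any request carrying an s= query parameter
# (e.g. /blue-widgets.html?s=) to the same path with the
# query string stripped.
RewriteEngine On
RewriteCond %{QUERY_STRING} (^|&)s= [NC]
RewriteRule ^ %{REQUEST_URI}? [R=301,L]
```

The trailing ? in the substitution is what drops the old query string; on Apache 2.4+ the [QSD] flag does the same job.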
1:34 pm on Aug 9, 2017 (gmt 0)

Administrator from US 

WebmasterWorld Administrator not2easy is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Dec 27, 2006
posts:3723
votes: 205


I routinely use "Disallow: /?s=" in robots.txt for sites on the WP platform because it appears as a link for the search function and turns up taxonomy results which are noindexed (as duplicates).

When you see 'clickable elements too close together' and 'content wider than screen' the important detail is not the pages or number of pages affected, it is that those messages indicate Blocked Resources. If you follow through those notices where the >> is on the right, it will tell you what resources are being blocked. It will indicate some css or js file(s) that is blocked in robots.txt. You can use the robots.txt tester with those blocked resource URLs and see what is preventing their being available.

Since robots.txt is also mined by unfriendly bots fishing for hints, you can disallow a specific directory and then allow (the disallow must come first) the resources you want available for bots to evaluate the rendering results. For example, I use these lines for a WP site:
Disallow: /wp-includes/
Allow: /wp-includes/js/
Allow: /wp-includes/css/
Allow: /*.css
Allow: /*.js
to allow access to blocked resources.

Unfortunately afaik, Googlebot is the only bot that follows the Allow: directive in robots.txt though I have not checked into that recently. That may have changed.
8:32 pm on Aug 9, 2017 (gmt 0)

Full Member

Top Contributors Of The Month

joined:Mar 10, 2017
posts:227
votes: 94


Thanks for your replies.

When I click on the >> I see three options.

1) Confirm the issue still exists. When I select that, I'm taken to a search page on my site, not the actual URL which contains the .html?s=. So all those pages www.example.com/blue_widgets.html?s= just go to a search page, but the URL example.com/blue_widgets.html takes you to the blue widgets page.

2) Learn about sizing links and buttons. When I select that I'm taken to a Google developers page.

3) Replace your Joomla template. Erm, I switched to Wordpress.

I'm going to do as you both suggested and change my robots file and set up redirects. Thanks again.
10:26 pm on Aug 9, 2017 (gmt 0)

Administrator from US 

WebmasterWorld Administrator not2easy is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Dec 27, 2006
posts:3723
votes: 205


I was taken aback last month by a site that changed their theme in April and then moved to a newer version of that (WP) theme in June. GSC was showing me pages with the same notations, 'clickable elements too close together' and 'content wider than screen', timestamped April 7, 2017. That was prior to the theme change and to the updated version of that theme. Two new robots.txt files had been submitted, yet Google was still using one from before the first change. No wonder their traffic was slowly going downhill. I used the robots.txt tester to hasten the update to the newer version. I would have thought that with all those requests for robots.txt between April and late July they might have noticed it was not the same file at all, but no. Not impressed with finding such stale data in their UI, data they were using to decide the site's position. :(
3:27 am on Aug 10, 2017 (gmt 0)

Full Member

Top Contributors Of The Month

joined:Mar 10, 2017
posts:227
votes: 94


Why would they be using old robots files? That sounds crazy.

My file manager is in such a mess at the moment that I need to find the robots.txt file. I can see one, but it relates to my old Joomla site. I'll have to contact the guy who installed WP.

I heard back from Adsense again. Initially they said it was a huge drop in 'valid' mobile traffic, and I can understand that, as it all bounced initially, but that was fixed. She also said my mobile site is slow, so I switched to AMP. That didn't fix it.

She emailed today and said it's to do with my desktop site and invalid clicks. She was cagey again but talked about buttons near my right-hand side ad (she said she can't get too specific). I was aghast, as there's so much white space around that ad unit you could drive a truck through it. Then it dawned on me: my WP theme has previous and next 'buttons' (for want of a better word) on the left and right, and the right-hand one does cover my ad when you hover over it and it expands. So I've now removed that.

Still, it's very confusing: my AdSense is down by two-thirds, and one minute she's saying it's invalid mobile traffic, the next it's my desktop site. This is costing me $XXX every day.
3:50 am on Aug 10, 2017 (gmt 0)

Administrator from US 

WebmasterWorld Administrator not2easy is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Dec 27, 2006
posts:3723
votes: 205


You can see what version of robots.txt they are using in your GSC account. Go to the robots.txt tester, it will show what version they are using and there's a date at the top of it.

When you checked for Blocked Resources, you should ignore the URLs with "?s=", but on actual page URLs follow the >> on the right to see the Blocked Resources. It goes to the page, but the results tell you what files are blocked, causing Google to think that the content is wider than the screen and clickable elements are too close together. It is because you are blocking those .css and .js files that they think the site is not mobile friendly.

When you have a list of blocked resources, you can edit the version of robots.txt and test it in GSC to see what changes you should make so that those resources are not blocked. When you have a good version of robots.txt, upload it to replace the old version, then visit GSC's robots.txt tester to submit the new file. Check back in a few minutes, as their instructions say, to be sure they have fetched it.

As long as they see it as desktop only they may be serving different size ads if you use responsive sizes and they may not list your pages for mobile search.
6:46 am on Aug 11, 2017 (gmt 0)

Full Member

Top Contributors Of The Month

joined:Mar 10, 2017
posts:227
votes: 94


Wow, thanks for that not2easy. I didn't know about the robots.txt tester. I just checked, and it's using the Joomla robots file, so I'll have to see if I can find the WP one. This is what I see...

# If the Joomla site is installed within a folder such as at
# e.g. www.example.com/joomla/ the robots.txt file MUST be
# moved to the site root at e.g. www.example.com/robots.txt
# AND the joomla folder name MUST be prefixed to the disallowed
# path, e.g. the Disallow rule for the /administrator/ folder
# MUST be changed to read Disallow: /joomla/administrator/
#
# For more information about the robots.txt standard, see:
# [robotstxt.org...]
#
# For syntax checking, see:
# [sxw.org.uk...]

User-agent: *
Disallow: /administrator/
Disallow: /cache/
Disallow: /cli/
Disallow: /components/
Disallow: /includes/
Disallow: /installation/
Disallow: /language/
Disallow: /libraries/
Disallow: /logs/
Disallow: /media/
Disallow: /modules/
Disallow: /plugins/
Disallow: /templates/
Disallow: /tmp/

I'm going to look into the blocked resources a bit more now. Thanks, it's a bit over my head so I have to go very slowly with this.
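Worth noting how that pasted file ties back to the Blocked Resources notices: on a typical Joomla install, several of those disallowed folders are exactly where the CSS and JS live, for example:

```
Disallow: /media/      # Joomla stores much of its CSS/JS here
Disallow: /templates/  # template stylesheets and scripts
Disallow: /modules/    # module assets
```

Since the site now runs WordPress, those paths mostly no longer exist, so the file as a whole is due for replacement rather than patching.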
4:42 pm on Aug 11, 2017 (gmt 0)

Administrator from US 

WebmasterWorld Administrator not2easy is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Dec 27, 2006
posts:3723
votes: 205


WP does not create its own robots.txt file, you need to create one. It needs to be placed in the root folder to be found. It is a simple .txt file that usually starts with:
User-agent: *
Disallow:

translation:
User-agent: * = "all bots in general"
Disallow: = "Nothing is disallowed, crawl everything"

Then you add specifics, the files and folders you do not want crawled. There are some commonly disallowed resources, such as:
Disallow: /?s=
Disallow: /cgi-bin/

where you list any folders or file formats you do not want crawled.
You can disallow a folder that contains files you don't want crawled and then Allow file extensions that you want accessible. That needs to be in the format of Disallow: /Folder/ and then Allow for the types of files within that folder, as in the example I posted earlier:
Disallow: /wp-includes/
Allow: /wp-includes/js/
Allow: /wp-includes/css/

There is a useful list of parameters [support.google.com] from Google to help you learn how to instruct their bot. I do not know if any other bots are capable of following their Allow: syntax - something to keep in mind when adding Disallows. Also remember that this is a wish list and it does not force any bot to do or not do anything. It is primarily a list of instructions for Googlebots.
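Pulling the advice in this thread together, a starting-point robots.txt for a WP site like the OP's might look like the sketch below. It is not a drop-in file: check each path against what actually exists on the server, and remember that only some bots (Googlebot among them) honour the Allow: and wildcard syntax.

```
User-agent: *
Disallow: /?s=
Disallow: /cgi-bin/
Disallow: /wp-includes/
Allow: /wp-includes/js/
Allow: /wp-includes/css/
Allow: /*.css
Allow: /*.js
```

After uploading it to the site root, submit it via GSC's robots.txt tester as described above.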
 
