Forum Moderators: Robert Charlton & goodroi

Message Too Old, No Replies

"Indexed Pages" in GSC not showing AJAX and JS pages of new site

         

Aneela

9:57 am on Jul 18, 2017 (gmt 0)



We built our website using AngularJS, with dynamic HTML content loaded via AJAX calls. We have already submitted a sitemap in Google Search Console, but GSC shows only 2 of 18 pages indexed. When we checked manually on the Google SERP (site:example.com), it showed 11 of the 18 pages (we are happy that 11 pages are indexed). But in Google Search Console, the indexed page count is still showing "2" out of 18.


Can anyone suggest how we should deal with Google Search Console to resolve our indexed page count?

keyplyr

10:10 am on Jul 18, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Hi Aneela, and welcome to WebmasterWorld [webmasterworld.com]

First, the reports in GSC can sometimes be buggy or show misinformation.

However, if your pages haven't acquired many backlinks, getting them indexed can be difficult. It may also take a while for the same reason.

You can try waiting another week and if there isn't any movement, try resubmitting the sitemap.

However, with the upcoming Mobile-First Index [webmasterworld.com], many secondary, low-traffic pages seem to be getting removed.

Robert Charlton

10:32 am on Jul 19, 2017 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Mods note...
Thread title changed to make it more descriptive, from...

Can any one suggest us on how to deal with google search console to so...
- to...
"Indexed Pages" in GSC not showing AJAX and JS pages of new site

-----

Aneela, as keyplyr suggests, it's very likely that either you haven't yet gotten the necessary inbound links to your pages, or that GSC (Google Search Console) hasn't yet received the new information to include them. This would be true of any type of page.

You might try searching a unique text string from one of the pages you don't see indexed, by putting the text string in quotes to test whether Google returns it. If Google does return the page with the string in quotes, then that page is in the index. Whether the page will rank for anything competitive is another question entirely.

I should mention that until Google was confident it could crawl and index JS and AJAX, it relied on an AJAX crawling proposal it had suggested in 2009, which it deprecated in October 2015. If your indexing problem doesn't resolve itself over time, there's a possibility that your implementation is the problem, though it appears that the new approach is backwards compatible.

I'm not a developer, but here's an overview of the changes, along with links to Google's recommended new implementations...

Deprecating our AJAX crawling scheme
Wednesday, October 14, 2015
[webmasters.googleblog.com...]

I'm assuming that if your pages render correctly in Chrome, then technically they should be OK with Googlebot. Look also at whether the AJAX states you've chosen should work as addressable pages. Again, I'm not a developer.
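To illustrate what "addressable pages" means here — this is a hypothetical sketch, not Aneela's actual setup, and the function names (stateToUrl, navigateTo) are invented for illustration — each AJAX state should map to a real URL that the server can also return directly, typically via the History API rather than fragment-only navigation from the deprecated scheme:

```javascript
// Hypothetical sketch: give each AJAX state its own crawlable URL.
// Names (stateToUrl, navigateTo, 'products') are illustrative only.
function stateToUrl(section, id) {
  // A real path Googlebot can request directly, rather than a
  // fragment like #!/products/42 from the deprecated AJAX scheme.
  return id == null ? '/' + section : '/' + section + '/' + id;
}

function navigateTo(section, id) {
  var url = stateToUrl(section, id);
  // In the browser, update the address bar so each state is a
  // distinct, indexable page; guarded so this also runs elsewhere.
  if (typeof history !== 'undefined' && history.pushState) {
    history.pushState({ section: section, id: id }, '', url);
  }
  return url; // the server must also serve real content at this URL
}
```

The key design point is the last comment: pushState alone isn't enough — the server has to respond with rendered content when that URL is fetched directly.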

Please keep us posted on what works or doesn't work, and how long it takes. Good luck.

NickMNS

1:40 pm on Jul 19, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Hello Aneela,

I have recently dealt with some of the same issues, not specifically for an Ajax/Angular site but for a dynamic website.

1- Given that you only have 18 pages you can easily submit your pages to the index manually in GSC (fetch and render).
2- The GSC index report is flawed and lags. If you find your pages with "site:", then they are indexed. But be aware that "site:" is also flawed, just differently.
3- The best tool I have found for assessing indexing and ranking is the Search Analytics report. If you check the "Pages" checkbox, the report will show you all the pages that appeared in the search results. This gives you a good idea not just of what is indexed (because if it shows up in search, it is indexed) but also of what is being seen and how. That can give you valuable insight into how Google perceives your pages.
4- Sitemaps and manual fetches are useless unless your content is linked to internally. If you simply throw up a form and hide all the content behind it, Googlebot will never get to it; you can submit manually all you want, but Google will not keep the content in the index unless it can reach it. And yes, Google can crawl forms, but it generally doesn't, especially for a new website with no links pointing to it. I know this from experience! So make sure there are links to the content, and the more important the content, the closer to the home page the link needs to be — i.e., if it is really important, place a link on the home page directly to that page.
5- Make sure the link is in the page before the page finishes loading; the link must be there at document ready. I once added links to a page with an AJAX call wrapped in

$(window).on('load', function() {
    // AJAX call that injected the links
});

so that the call was made immediately after the document loaded. The links appeared in Fetch and Render's "how Googlebot sees your page," but they were never crawled.
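To make point 5 concrete — a hypothetical sketch, not NickMNS's actual code; the page list and the renderNav helper are invented for illustration — build the links synchronously (or server-side) so they exist in the DOM at document ready, instead of injecting them from a window-load AJAX callback:

```javascript
// Hypothetical sketch: build internal links synchronously so
// crawlers see them in the initial DOM, not after a late callback.
// The renderNav name and page list are invented for illustration.
function renderNav(pages) {
  // Plain anchor tags with real hrefs — the form Googlebot follows.
  return '<nav>' + pages.map(function (p) {
    return '<a href="' + p.url + '">' + p.title + '</a>';
  }).join('') + '</nav>';
}

var navHtml = renderNav([
  { url: '/about', title: 'About' },
  { url: '/services', title: 'Services' }
]);

// In the browser this would run at document ready, not window load:
// $(function () { $('#nav').html(navHtml); });
```

The design point is timing: the same markup injected from a window-load AJAX handler may show up in Fetch and Render yet still go uncrawled, as described above.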

Writerly

3:39 pm on Jul 26, 2017 (gmt 0)

5+ Year Member Top Contributors Of The Month



GSC is updated at intervals, not in real time, so it takes more time for the various metrics to show correctly. If nothing changes for a week or two, even after you have checked manually, try resubmitting your sitemap or consult another keyword research tool.