I implemented the change on the page. Specifically,
$(window).on('load', function() {
    // After the page has fully loaded, request the extra links via AJAX
    $.post("/my-url",
        $("#form").serialize(),
        function(j) {
            // Expected JSON response: { ok: <boolean>, data: <HTML string> }
            if (j.ok) {
                $("#added-links").html(j.data);
            }
        });
});
This loads links (typically one or two) to the page after the page has fully loaded.
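For reference, this assumes the endpoint returns JSON of the form { ok: <boolean>, data: <HTML string> } — the shape implied by the snippet above; the field names are specific to my setup. A rough sketch of the success handling factored into a standalone function, easy to test outside the page:

```javascript
// Hypothetical response from /my-url (field names as in the snippet above):
//   { ok: true, data: '<a href="/other-model">Other model</a>' }

// Pure helper: given the parsed JSON, return the HTML fragment to inject,
// or null when there is nothing to add.
function linksHtmlFromResponse(j) {
    if (j && j.ok && typeof j.data === "string") {
        return j.data;
    }
    return null; // error flag set or malformed response: inject nothing
}

// Examples:
const resp = { ok: true, data: '<a href="/other-model">Other model</a>' };
console.log(linksHtmlFromResponse(resp));        // the anchor markup
console.log(linksHtmlFromResponse({ ok: false })); // null
```

Keeping the decision logic in a pure function like this makes it trivial to unit-test the contract without a browser or a live endpoint.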
Experiment methodology:
A group of pages was selected to meet the following criteria:
- The pages must use and benefit from the feature: at least one other page exists that will be linked to from the submitted page, and the only link to that other page is the AJAX-injected link.
- None of the pages in the group was ever in the index, nor was any of them ever submitted to the index in the past.
- There must be a reasonable expectation that, once the pages are submitted to the index, they will be accepted into it.
- The page to submit and the linked pages are similar, in that one could easily be substituted for the other with little impact, but sufficiently different that they would not constitute duplicate content. (Example based on an e-commerce site: one page is for a specific model of a product and the other page is for a different model of the same product; both pages display the information and specifications for the product, but that information varies by model.)
The page to submit was tested for speed before and after the change with PageSpeed Insights and with the performance tab in Chrome (Opera) developer tools.
Only once the changes were implemented was the "page to submit" run through the Fetch and Render tool in GSC. Both rendered views, "How Googlebot sees the page" and "How a visitor to your website would have seen the page", were compared to check whether the injected links appear in both. The page was then submitted to the index using the "Indexing requested for URL and linked pages" option. Once submitted, a site: search was done to see whether the pages in the group appear in the index.
Results:
There was no measurable impact on page speed: the PageSpeed Insights score did not change from before to after, and there was no change in load time up to the DOMContentLoaded event when checking performance with developer tools. Obviously, after the DOMContentLoaded event there is additional loading as the AJAX call for the links and the rendering of the links to the page are executed. In this case the AJAX call takes less than 50 ms.
Google Fetch and Render shows the links in both views.
The page was submitted to the index, and after a short time (1 minute) a "site:" search was done; only the submitted page was found, the linked page was not included. The search was repeated 30 minutes later with no change.
Conclusion:
Googlebot is able to find content that is loaded after the DOMContentLoaded event has fired, as the links appear in both views of the Fetch and Render feature of GSC. But the linked content is not immediately indexed when submitted using the "Indexing requested for URL and linked pages" option. Some questions remain unanswered:
1- Will the linked content appear in the index given more time?
2- Is the fact that the content is linked to after the DOMContentLoaded event the cause, or is linked content in general, even links that appear in the static HTML of the page, less likely to be indexed when submitted via the Fetch and Render tool?
One more aspect that was not controlled for: there was no check of whether including the links directly in the static HTML would itself have had a measurable impact on page speed.
These are my findings and I would love to hear some feedback.