Google crawling Angular JS site?

     
3:49 pm on Apr 7, 2016 (gmt 0)

New User

joined:Mar 29, 2016
posts: 5
votes: 0


We have an AngularJS eCommerce site (Windows server).

We tried using a caching service (Prerender) that served up a
<meta name="fragment" content="!">
on each of the internal product pages. But something was preventing the cached page from being shown to Googlebot, and we could not figure out what that something was. We're not running any scripts or anything else that would interfere with Googlebot receiving the cached page from the third-party service.
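
For context on how that tag is meant to work (a minimal sketch of Google's old AJAX crawling scheme, with a hypothetical product URL, not our actual markup): the fragment meta tag in the <head> tells Googlebot to re-request the page with an _escaped_fragment_ parameter, and that request is what the prerendering service is supposed to intercept and answer with a fully rendered HTML snapshot.

<!-- index.html returned for /product/123 (hypothetical URL) -->
<html ng-app="storeApp">
<head>
  <!-- tells crawlers an HTML snapshot is available -->
  <meta name="fragment" content="!">
</head>
<body>
  <!-- content is rendered client-side by AngularJS -->
  <div ng-view></div>
</body>
</html>

<!-- Googlebot then fetches /product/123?_escaped_fragment_= and should
     receive the prerendered snapshot instead of the empty shell above. -->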

I was brought in to do the marketing/SEO, and my technical experience (and past success) has been with WordPress and Linux servers.

Do you think that if we removed the third-party caching system, Google would be able to crawl our site? (I was told the reason they went with a third-party caching system was all the soft 404s they were seeing in GWMT... I just started the job a few weeks ago.)

Or do you have any advice on how to get Google to crawl and index our site normally? "Switch to WordPress or another CMS" isn't an option at this point in time (maybe towards the end of the year).


thanks

<<edit: spelling>>
4:26 pm on Apr 7, 2016 (gmt 0)

Full Member

5+ Year Member

joined:July 11, 2008
posts:215
votes: 2


Hi, I have some experience with this. I recently built a JS site using Node. Google crawled it, but the source code didn't include any of the content/keywords etc., so from an SEO standpoint it was dead in the water. I also went to prerender.io and, using the .htaccess method, it works flawlessly, really great. It's one line of code in that file and all requests from bots get re-routed. No fragment URLs either, nice clean structure. Obviously you're on a Windows server, and for a whole load of reasons, including this one, I'd move over to Linux.
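
For reference, a rough sketch of what that .htaccess approach typically looks like (a hypothetical example, not the poster's actual file: the domain, token, and bot list are placeholders, and it assumes mod_rewrite, mod_headers, and mod_proxy are enabled; the core of it is the single RewriteRule at the end):

RewriteEngine On
# send the prerender.io account token with proxied requests (placeholder value)
<IfModule mod_headers.c>
  RequestHeader set X-Prerender-Token "YOUR_PRERENDER_TOKEN"
</IfModule>
# if the request comes from a known crawler, or is an _escaped_fragment_ request,
# proxy it to the prerender service so the bot receives rendered HTML
RewriteCond %{HTTP_USER_AGENT} (googlebot|bingbot|yandex|baiduspider|twitterbot|facebookexternalhit) [NC,OR]
RewriteCond %{QUERY_STRING} _escaped_fragment_
RewriteRule ^(.*)$ http://service.prerender.io/http://www.example.com/$1 [P,L]

The [P] flag is what proxies the matching request to the prerender service; ordinary visitors never match the conditions and get the normal AngularJS app.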
4:29 pm on Apr 7, 2016 (gmt 0)

Administrator from US 

WebmasterWorld Administrator not2easy is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Dec 27, 2006
posts:3723
votes: 205


You can find out for certain in GWMT, or nowadays GSC, by using the Fetch as Google tool. It isn't very obvious, but once the page renders you can see any blocked resources listed there. It shows you what a visitor would see vs. what Google can see, and if you follow the >> at the far right of the line where it shows the results and click it, it lists all the blocked resources, whether they are yours or a third party's.
7:18 pm on Apr 7, 2016 (gmt 0)

New User

joined:Mar 29, 2016
posts: 5
votes: 0


Hi, thanks.

We're using Prerender and Google is not picking up the cached URLs. We can't figure out why. Yeah, I dislike Windows with a passion, but the devs are trained in Windows Server.

We blocked the parameters in both GWMT and the Prerender admin control panel. I fetched as Google and the cached pages are not being served.
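
Since the .htaccess route above is Apache-only, here is a rough sketch of the equivalent idea on IIS/Windows (a hypothetical example, not this site's actual config: it assumes the URL Rewrite module plus Application Request Routing with proxying enabled, the bot list and domain are placeholders, and forwarding of the X-Prerender-Token header is omitted for brevity). If whatever rule the devs installed isn't matching Googlebot's requests, that would explain why Fetch as Google never sees the cached snapshot.

<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <rule name="Prerender for crawlers" stopProcessing="true">
          <match url="(.*)" />
          <conditions logicalGrouping="MatchAny">
            <!-- known crawler user agents (placeholder list) -->
            <add input="{HTTP_USER_AGENT}" pattern="googlebot|bingbot|yandex" />
            <!-- requests made via the old _escaped_fragment_ scheme -->
            <add input="{QUERY_STRING}" pattern="_escaped_fragment_" />
          </conditions>
          <!-- proxy the request to the prerender service (requires ARR proxying) -->
          <action type="Rewrite" url="http://service.prerender.io/http://www.example.com/{R:1}" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>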
 
