Beyond the question of what Google sees now:

> This way google bot sees zero changes when I setup a pay wall.

(My emphasis added.) I'm thinking that what Google might see once you set up the paywall is a change in user behavior, most likely time on page.
I've consulted for sites that added registration requirements after a certain number of visitor views. I predicted there would be roughly a 50% loss of traffic past that point; it turned out to be much larger than 50%. If you keep the first 5 PDFs freely accessible to users, you may avoid this problem.
When setting something like this up, you need to be careful that you're not deliberately frustrating users in an attempt to get them to pay. I've seen a site that offered users links to paid PDFs, but when users clicked they were shown tiny, unreadable thumbnails and a message that they could see more by paying. IMO, this is an extremely poor approach.
Regarding just keeping the PDF content out of the index, you might look into using `X-Robots-Tag: noindex`. I don't offhand see how you can apply this tag to some PDFs on a page and not others, but it's very possible you can figure something out. See this Google Developers article:
Robots meta tag and X-Robots-Tag HTTP header specifications [developers.google.com...]
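Since `X-Robots-Tag` is an HTTP response header, it travels with each PDF file's own response rather than with the page that links to them. One way to apply it selectively is to serve the paid PDFs from their own directory and set the header only there. A minimal Apache sketch, assuming a hypothetical `/paid-pdfs/` directory and that `mod_headers` is enabled:

```apache
# Hypothetical layout: paid PDFs live under /paid-pdfs/,
# free PDFs anywhere else. Requires mod_headers.
# Every PDF served from this directory gets an
# X-Robots-Tag: noindex header, telling crawlers to
# drop it from the index without affecting other PDFs.
<Directory "/var/www/html/paid-pdfs">
    <FilesMatch "\.pdf$">
        Header set X-Robots-Tag "noindex"
    </FilesMatch>
</Directory>
```

The equivalent in nginx would be an `add_header` directive inside a `location` block matching the same path. Either way, the links on the HTML page itself don't change at all, which fits the goal of Googlebot seeing an unchanged page.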