So let's say I don't want pages in the folder "/some_folder/" to be indexed by search engines, BUT I still want to be able to display ads on these pages (as well as anywhere else). Should my robots.txt file look like this?
User-agent: *
Disallow: /some_folder/

User-agent: Mediapartners-Google
Allow: /
I am still confused about whether we should use "Allow:" in the robots.txt file, or "Disallow:" with an empty value (nothing after the colon).
You can disallow bots in general and allow the AdSense bot as in your snippet above, BUT there is a catch: if those pages carry a noindex directive AND Googlebot is not permitted to crawl them, Googlebot will never see the noindex, because it has to fetch a page to read the directive.
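In other words, if your goal is to keep the pages out of the index, a common alternative is to leave them crawlable in robots.txt and instead put a noindex robots meta tag in the head of each page in /some_folder/. This is a sketch of that approach, not your exact setup:

```html
<!-- Placed in the <head> of each page under /some_folder/.
     Googlebot must be allowed to crawl the page for this tag to be seen. -->
<meta name="robots" content="noindex">
```

With this in place, Googlebot can fetch the page, read the directive, and drop it from the index, while the Mediapartners-Google bot can still crawl it to serve relevant ads.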