|Does Robots.txt make a difference for AdSense|
| 2:45 am on Jul 29, 2003 (gmt 0)|
We have been running AdSense for just under 2 weeks now and for the most part everything is very targeted. There are a few dynamic pages with session IDs, though, and the ads are having a hard time targeting those.
Currently we are not using the MediaPartners-Google robot rule. Will using this help us out? Has anyone found that adding the AdSense robots.txt rule helped?
They say to add this line:
Where in the below robots.txt would you place it?
This is what we are using
# Welcome to SiteName.com
# and thanks for asking about our robot rules
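For reference, the rule Google documents for its AdSense crawler is a User-agent block for Mediapartners-Google with an empty Disallow, which grants the ad crawler access to everything. A sketch of where it could sit in a robots.txt like the one above (the SiteName.com comments and any other rules shown here are just placeholders standing in for your real file):

```
# Welcome to SiteName.com
# and thanks for asking about our robot rules

# Let the AdSense crawler fetch any page it is sent to
User-agent: Mediapartners-Google
Disallow:

# ...your existing rules for other robots follow below...
```

Order of the User-agent blocks does not matter; each crawler obeys only the block that matches its own user-agent string.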
Also, how do you edit the subject line? I made a typo and can't change it! :o
| 3:12 am on Jul 29, 2003 (gmt 0)|
I didn't add anything to my robots.txt yet, and they are handling it fine. I think it was mentioned primarily for those who have robots banned from certain parts of the site where they may place AdSense. The bots do request the robots.txt file on nearly every visit, if not all of them.
If you do decide to make changes, be sure to run your robots.txt through the validator here before unleashing it on your site.
I had session IDs on some of my pages, and with every page view, the AdSense bot would come along behind it to get the page URL - including the session ID. Because it considered each URL with a SID a unique URL, it would only show PSA ads, unless the person with that SID happened to revisit that page during the same session. I removed all session IDs from those pages, and AdSense has been showing targeted ads ever since.
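The fix described above, stripping the session ID so every visitor's view of a page shares one canonical URL, can be sketched in a few lines. This is only an illustration, assuming the session ID travels as a query parameter; the parameter names in `SESSION_PARAMS` are hypothetical and would need to match whatever your own site actually uses:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical parameter names a session ID might travel under;
# adjust to whatever your backend actually emits.
SESSION_PARAMS = {"sid", "sessionid", "phpsessid"}

def strip_session_id(url: str) -> str:
    """Return the URL with any session-ID query parameters removed,
    so a crawler sees one canonical URL per page instead of one
    unique URL per visitor session."""
    parts = urlsplit(url)
    kept = [
        (name, value)
        for name, value in parse_qsl(parts.query, keep_blank_values=True)
        if name.lower() not in SESSION_PARAMS
    ]
    return urlunsplit(parts._replace(query=urlencode(kept)))
```

With a SID in the query string, `strip_session_id("http://example.com/page?sid=abc123&topic=5")` yields `"http://example.com/page?topic=5"`, so the crawler's cached copy and the live page line up.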
And a mod will have to come along to correct the title :)
| 3:21 am on Jul 29, 2003 (gmt 0)|
|Thanx for the great answers...|
How do I make pages like this more spider-friendly?
Most of these pages have a lot of great content and none of the AdSense Ads will work at present.
Is there something you could recommend to transform URLs like this into something AdSense-friendly? A Perl program or a URL rewrite? What's the best option?
| 3:59 am on Jul 29, 2003 (gmt 0)|
I asked the techies who did my SID pages, and they told me what to edit to remove them. The pages were static URLs, but with session IDs on the end.
Yours appear to be dynamic. The AdSense FAQ says "Form-loaded content, dynamic content... cannot be used to target ads to your pages." From the looks of your link, it seems that this applies. Can you check with the techies who programmed it?
There are ways to make dynamic URLs static, but it is too far out of my area of expertise ;) You might also try in one of the programming forums here - I know I have seen threads on the active list recently that deal with turning dynamic URLs into static ones.
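One common way to do what's described above is an Apache mod_rewrite rule, which lets a dynamic script answer under a static-looking URL. This is only a sketch under assumed names: the script path `/cgi-bin/view.pl` and its `id` parameter are hypothetical stand-ins for however your own pages are generated:

```
# Hypothetical .htaccess sketch: serve a dynamic page under a
# static-looking URL. Requires mod_rewrite to be enabled.
RewriteEngine On

# /article/123.html  is internally handled by  /cgi-bin/view.pl?id=123
RewriteRule ^article/([0-9]+)\.html$ /cgi-bin/view.pl?id=$1 [L]
```

The visitor (and the crawler) only ever sees `/article/123.html`; the rewrite happens internally, so each piece of content gets one stable URL the AdSense bot can cache against.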
| 4:21 am on Jul 29, 2003 (gmt 0)|
My feeling is that your problem has nothing to do with robots.txt and everything to do with the dynamic nature of these pages and their long dynamic URLs. What Jenstar says is correct, I think. I don't think AdSense is capable of, or designed for, effectively indexing such pages.

Even if you could rewrite your URLs to be more spider-friendly, if the content changes on every call it won't work very well at getting relevant ads, for various reasons. The main one is that ads are served not "on the fly" but based on what your page looked like the last time the media partner robot called by (anywhere from a few minutes to days or weeks ago), which may differ depending on the form input used to generate the page this time.

The only thing I could suggest is to have some significant block of descriptive content at the top (and maybe the bottom) that is always common to the page, no matter what dynamic content appears in the rest of it, and that summarises whatever dynamic output could show up there.
| 4:34 am on Jul 29, 2003 (gmt 0)|
Thanx for the info everyone!
I got all the pages changed from dynamic to static through a backend script that the programmers had developed.
The pages now look like this
AdSense is now working!