Forum Moderators: goodroi
My site only has a few dozen pages, but because of all these product links, search engines show hundreds of pages for it in their databases!
Can I use robots.txt to prevent the spider from following any external links that lead to Amazon?
All the product feed link addresses are very long and they all start out the same:
[rcm.amazon.com...]
(Each one eventually becomes a different address, of course, since each points to a different product.)
I'm wondering if something like this might work in my robots.txt file?
User-agent: *
Disallow: [rcm.amazon.com...]
That is probably too simplistic, but thought it best to ask.
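For what it's worth, a `Disallow` value is matched against URL paths on the host that serves the robots.txt file, not against full external addresses. A quick way to see how a crawler would interpret a rule is Python's standard-library `urllib.robotparser` (the local path `/amazon-feed/` below is just a hypothetical example, not from my site):

```python
import urllib.robotparser

# Disallow lines are compared against the URL path on your own host;
# an external hostname like rcm.amazon.com is not a valid path value.
rp = urllib.robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /amazon-feed/",  # hypothetical path prefix on my own site
])

# A URL under the disallowed path prefix is blocked:
print(rp.can_fetch("*", "https://example.com/amazon-feed/item123"))  # False

# Any other path on the site remains crawlable:
print(rp.can_fetch("*", "https://example.com/about.html"))  # True
```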
Thanks for any help...