
Paid Listing URLs showing in SERPs

Are there good ways to prevent Google from indexing these?


sublime1

9:17 pm on Jun 18, 2004 (gmt 0)

10+ Year Member



We're seeing the URLs from paid listings showing up in our Google SERPs. We tag our listings with source=overture&kw=widget (and various other parameters for the PPC and paid-inclusion programs we use) so we can track internally. I guess the pseudo-directories have harvested these links and Google has gobbled them up.

Needless to say, it screws up our tracking, looks bad, and is generally confusing.

Would adding a meta noindex tag to pages that have the tracking codes in their query strings do the trick? Any other, better ways?
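For reference, the tag I have in mind would sit in the page's head section. Using "noindex,follow" (rather than plain "noindex") should keep the tagged URL out of the index while still letting the spider follow the links on the page:

```html
<meta name="robots" content="noindex,follow">
```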

This seems obvious, but I don't want to discourage Google from indexing the real page. I would rather have ugly URLs and confusing logs than no traffic!
I just wanted to make sure I'm not doing something stupid first :)

Thanks in advance.

Marcia

5:05 am on Jun 19, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I've been seeing that too, with AdWords URLs and others showing up - with PageRank, mind you, but not the same as the homepage: www.example.com/?adwords

I'm wondering if by any remote chance it could be fouling up the works with duplicate content. Too many homepages have been disappearing lately for no apparent reason.

jcoronella

2:44 pm on Jun 19, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



You could use robots.txt to exclude them. Put them in a directory, and exclude that directory.
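For example, if the campaign URLs could all be made to land under one path (the directory name here is just a placeholder), the exclusion is a two-liner:

```
User-agent: *
Disallow: /landing/
```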

sublime1

6:52 pm on Jun 21, 2004 (gmt 0)

10+ Year Member



Marcia -- I am also seeing different PR when I check a page with and without the tracking parameters (e.g. www.mydomain.com/someproduct.html vs. www.mydomain.com/someproduct.html?source=looksmart), but it only appears when the page is in the (site:) index under both URLs. One has the usual PR8; the one with the tag has PR6.

And in our case, neither shows up in the SERPs, since our site was a victim of the recent issues I have been blathering about elsewhere.

Curious...

sublime1

6:54 pm on Jun 21, 2004 (gmt 0)

10+ Year Member



jcoronella -- the robots.txt solution doesn't really work for us without changing our ad campaign approach; would a meta noindex accomplish the same thing without our having to change a bunch of things?
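For what it's worth, the kind of server-side check I have in mind is roughly this (a sketch in Python; the parameter names are just the ones from our own tracking scheme, adjust to taste):

```python
from urllib.parse import parse_qs

# Tracking parameters we append to paid-listing URLs (example names)
TRACKING_PARAMS = {"source", "kw", "trackcode"}

def needs_noindex(query_string: str) -> bool:
    """True if the request carries any of our tracking parameters,
    meaning the page template should emit a robots noindex tag."""
    params = parse_qs(query_string)
    return any(p in TRACKING_PARAMS for p in params)

def robots_meta(query_string: str) -> str:
    """Meta tag to drop into the page head: noindex for tagged URLs,
    nothing (i.e. normal indexing) otherwise."""
    if needs_noindex(query_string):
        return '<meta name="robots" content="noindex,follow">'
    return ""
```

That way the clean URL stays indexable and only the tagged duplicates get excluded.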

my3cents

8:46 pm on Jun 21, 2004 (gmt 0)

10+ Year Member



I am also seeing in allinurl that many of our "missing pages" have been replaced with tracking URLs, like:

www.domain.com%2F&engid=1694&gid=2370&af=0&qtype=0&qw=+keyword1+keyword2&ts=1082392794&cs=1dc0f/2

www.domain.com/pagename.html?trackcode=variousads

and dozens of others. How can you make a robots.txt file tell Google not to spider these pages when the pages are on another server, or when there are far too many different tracking URLs to list? We must have over 3,000 different tracking URLs that we've used over the past year, for dozens of different advertising networks and hundreds of keywords. Now there are all of these others that have been dynamically generated. I don't think a robots.txt file can be made for these.
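The only robots.txt angle I can see would be wildcard patterns, which Google's crawler honors as a nonstandard extension to the robots.txt protocol - something like the following, using the parameter names from the examples above - but even that doesn't help with URLs sitting on somebody else's server:

```
User-agent: Googlebot
Disallow: /*?trackcode=
Disallow: /*&trackcode=
```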

Also, I note that allinurl has been showing several old pages that have been gone from our site for over a year, along with cached versions of other pages that are also very old.

Our website took a pretty big hit when our home page and several main category pages were replaced in the index with tracking URLs. Our rankings are way down for these pages, and the listings show no titles or descriptions, probably because the pages have several duplicate listings with different tracking URLs behind them.

Now watch G slap me with a duplicate content penalty. I wish they could get this straight: 1 page = 1 URL