A few months ago that section dropped to supplemental. Now I'm losing about $2k/month in potential revenue.
I'm wondering whether being dropped in the rankings is due less to the affiliate feed content and more to the fact that there are many affiliate (well-known affiliate) links on the page (i.e., for <edited>). Can I tone this down and get back into the regular index? Or is it better to start new pages from scratch on the same domain?
In other words, I would have mainly the same content on the page (both original and affiliate), but perhaps I could remove a lot of the affiliate links (or redirect them). Is that going to get me back in the ballgame?
Any opinions?
[edited by: tedster at 11:12 pm (utc) on Mar. 31, 2006]
Imo this was/is a planned attack on affiliates. Companies with affiliate programs are feeling the negative impact and dropping their programs. All affiliate companies should pressure Google to give their affiliate webmasters fair treatment by letting them compete in the full index.
Google has a cadre of very smart intellectuals who could remedy this problem. Why won't they?
What is funny is that if you run an affiliate directory that's directly selling your own affiliate links, with the link URLs going via your own site before they go on to the customer, your site will rank at the top of Google.
The reason being that Google is stupid enough to think that those are links to your own content, and doesn't pick up on the affiliate issue.
Meanwhile, if your links show Overture, AdWords, Miva or another network, you have a problem, imo.
However, I just wanted to add that I have done some testing on whether my pages are coming up as "duplicate content" (i.e., via Copyscape), and on average they are not, or have only one dupe result (which I also have on other sites that are not flagged supplemental).
Therefore this corroborates my belief that they are not flagged for duplicate content. The other thought is that they are simply ranked too low (a minimum of inbounds/PR). More thoughts later... thanks
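As an aside, beyond Copyscape, a rough way to sanity-check overlap yourself is to compare word shingles between your page copy and the raw feed text. A minimal sketch in Python; the technique is generic, not anything Google has confirmed, and the two strings below are just stand-ins for real page and feed text:

def shingles(text, n=5):
    # Break the text into overlapping n-word shingles, the rough
    # unit that duplicate-content checks tend to compare.
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a, b):
    # Jaccard overlap of the shingle sets: 1.0 = identical text,
    # 0.0 = no shared 5-word runs at all.
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if (sa | sb) else 0.0

page = "the quick brown fox jumps over the lazy dog near the river"
feed = "the quick brown fox jumps over the lazy dog by the river"
print(similarity(page, feed))  # high overlap, i.e. a near-duplicate

Anything close to 1.0 means the page is essentially the feed verbatim.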
I get the database of all the products from them with info on the product, image etc.--so that part is what I am calling the duplicate content part, since other sites are using the same exact info. I pull the product images from their servers, so the img src would show their url. Also on each page there are several links and a form submit that go to their site with my affiliate code on it.
This is mixed in with some unique content.
Previously it was not supplemental; now it is. I'm just wondering what changed: did the algo/dupe filter get stricter, or is anybody else having similar issues?
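For context, pages like the ones described are generated along these lines. This is only a sketch: the feed columns, affiliate parameter and merchant domain are hypothetical placeholders, not the poster's actual setup.

import csv

AFF_ID = "12345"                              # hypothetical affiliate ID
MERCHANT = "http://www.merchant-example.com"  # placeholder merchant domain

def product_page(row):
    # One page per feed row: the image is hotlinked from the merchant's
    # server and the buy link carries the affiliate ID, so the bulk of
    # the page is identical on every site running the same feed.
    return (
        "<h1>%(name)s</h1>\n"
        '<img src="%(m)s/images/%(sku)s.jpg">\n'
        "<p>%(description)s</p>\n"
        '<a href="%(m)s/product/%(sku)s?aff=%(aff)s">Buy now</a>\n'
        % dict(row, m=MERCHANT, aff=AFF_ID)
    )

# Expects a feed.csv with name, sku and description columns.
with open("feed.csv", newline="") as feed:
    for row in csv.DictReader(feed):
        with open("product_%s.html" % row["sku"], "w") as out:
            out.write(product_page(row))

Only the surrounding copy is unique, which is exactly the mix being described.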
I mean not exactly duplicate, but rather "nothing especially new compared with the original".
35% original content probably is not enough to count as original.
Google's priority, probably, is the interest of the searchers and not the site owners.
For searchers it is in general better to find the original content. So I believe this feature is going to stay.
Vadim
I get the database of all the products from them with info on the product, image etc.--so that part is what I am calling the duplicate content part, since other sites are using the same exact info. I pull the product images from their servers, so the img src would show their url. Also on each page there are several links and a form submit that go to their site with my affiliate code on it.
Even without Google's evident elan for complex algorithms, this reads like duplicate content to me.
In my experience there are many online traders whose idea of ecommerce is to throw a load of images and boilerplate descriptions online and call it a 'store'. That's fine for eBay, but the trend online is toward more unique, hand-crafted sites that offer something new. Why would you expect your similar/identical site to be considered any better than all the other people doing exactly the same thing?
In my view this is a throwback to bricks and mortar style stores. A big supplier might supply thousands of retailers across the country, each with the same products and info on the products, which was fine because they were all in different towns and cities. Customers wouldn't really have the opportunity to see lots of them in one go because of the geographical distance.
In short, this model doesn't transfer well online. If I search for consumer goods, e.g. a TV, I can sample lots of different stores in one go.
It makes sense for Google to try and do something about this. A very simple method is to make sure unique content is considered more important than duplicate content. At the very least this favors people who are doing something different.
It has been a long time coming, but I reckon all you're seeing is the death of ecommerce operations that have put little or no effort into developing and selling their products. It was only a matter of time really.
I have been persistently seeing affiliate URLs (of the form [affiliate.networksite.com...] etc.) in the SERPs and actually ranking for terms, albeit not so great.
It seems that if it smells like a URL, might be a URL, could be a URL, Google is gonna try and index it.
I am not sure whether Google is targeting actual affiliates, the merchants, or both. HiltonHead made a comment that sounded like merchants are dropping their affiliate programs. Are they really dropping their entire affiliate program to remain in the Google index?
Does anyone have any info on whether Google has really declared war on both affiliates and merchant sites w/ aff programs, or do you think it could have more to do with the type of links/redirects that are being employed, i.e. 302's or other techniques Google may find too spammy?
As for links with affiliate IDs actually showing up in the SERPs, you should disallow those in your robots.txt file so that they cannot be indexed.
You want their page to be indexed with their URL. You want your page to be indexed using its native URL. You do not want your page with their affiliate code in the URL to be indexed.
You can control that using robots.txt, or by adding <meta name="robots" content="noindex"> to the page whenever it has been called with an affiliate ID tagged onto the URL. The code for doing that is very easy.
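For example, assuming the affiliate ID rides in a query parameter called affid (substitute whatever your URLs actually use), a robots.txt rule along these lines keeps the tagged URLs out:

User-agent: Googlebot
Disallow: /*affid=

Note that the * wildcard is a Googlebot extension to the original robots.txt standard, so put it under a Googlebot user-agent line rather than relying on every crawler to understand it.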
Also, regarding a URL being supplemental for one search and normal for another: are you sure it's the same URL from Google's end? I have URLs that I can bring up as both supplemental and normal at the same time. Clearly they are two different URLs to Google.
I changed a phone number on a page.
If you search for the old phone number, the page is returned as a supplemental result, and the snippet shows the old phone number.
If you search for the new phone number, the page is shown as a normal result, and the snippet shows the new phone number.
In both cases the page links through to a cache page that is only a week old, and shows the new phone number.
As far as robots.txt goes, I don't think that really covers my problem, although it may be tangentially related. My URLs pointing to the affiliate's site are not being indexed. The problem is that the pages on my site (which contain those URLs) are not being indexed.
Right, that's what I see. Different versions of the same URL. But you are saying it's the same URL. I am saying it appears to me that the supplemental version and the normal version are in fact different URLs to Google.
In other words, Google sees them as two different URLs, and my observation is that you can get both to appear in a single search.
[edited by: soapystar at 9:36 pm (utc) on April 2, 2006]
MikeAA - Does anyone have any info on whether Google has really declared war on both affiliates and merchant sites w/ aff programs
or do you think it could have more to do with the type of links/redirects that are being employed, i.e. 302's or other techniques Google may find too spammy?
I don't use any spammy techniques.
All my affiliate webpages include links directly back to each company, and they are all Supplemental.
No one from Google will answer these questions, because it was part of their plan to bust affiliates and increase their own revenues. I've had a ton of cash flow based on historic sales of my websites go somewhere, and the suspected recipient is Google.
It would be shocking to receive an explanation from Google - they will not respond to email requests.
Personally, right now I see an increase in pure affiliate links being returned for competitive phrases. However, the domain is the main site, but with the affiliate ID in the URL. Not isolated URLs, but different affiliate URLs for different searches.
Of course these are the links FROM the webpages. The pages that link to them don't rank. Is that what he was saying?
Personally, right now I see an increase in pure affiliate links being returned for competitive phrases. However, the domain is the main site, but with the affiliate ID in the URL. Not isolated URLs, but different affiliate URLs for different searches.
They simply need to make those URLs serve a <meta name="robots" content="noindex"> tag if the affiliate ID is present in the URL. This involves a very simple change to the script that serves the page.
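A minimal sketch of that check, here in Python; "affid" is just a stand-in for whatever parameter name the site actually uses:

from urllib.parse import urlparse, parse_qs

def robots_meta(url):
    # Emit a noindex robots meta tag whenever the requested URL
    # carries an affiliate ID, so only the clean URL gets indexed.
    params = parse_qs(urlparse(url).query)
    if "affid" in params:  # hypothetical parameter name
        return '<meta name="robots" content="noindex">'
    return ""

print(robots_meta("http://www.example.com/page?affid=12345"))  # noindex tag
print(robots_meta("http://www.example.com/page"))              # empty; indexable

The page template just drops the returned string into its <head>; everything else about the page stays the same.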
Sadly this is already happening: one of my clean sites was manually checked and then penalized by Google. The only reason was that the main content came from an outside source (think of about 99% of all ringtone sites). I think it's outrageous. Yes, I will use AdWords, but how willing am I to do that if my domain has been manually filtered?
The Internet is a good place to find shopping information and do your shopping, and affiliates are doing a good job helping people find good online stores. Now many affiliate sites are gone from the Google index and have been replaced with non-relevant content, forum messages and splogs. For some keywords I follow, I can no longer find what I'm looking for, whereas 6 months ago the same keyword listed several pages of what I was searching for. Mostly affiliate content, but it was better than nothing.
The Big Daddy DCs now try to resolve all index pages to one URL. So suppose domain.com has its affiliate links as
domain.com/index.jsp/pageName=hotNetList&cid=#*$!&city=xxxxxx&stateProvince=&country etc. etc.
All those URLs are given the PR of domain.com, not of the linking page. The URLs are then shown in the SERPs as though they were the best choice for domain.com. So if it has millions of pages all under index.jsp, then they are all considered relevant for thousands of different searches, with the most relevant URL being shown. Do you follow? It's as though Google now allows thousands of pages all to be considered the legitimate index page, and will root through them all to get the most relevant one for each search.
Did you mean that all affiliate URLs created under domain.com/index.jsp/ (targeting product A), such as...
www.domain.com/index.jsp/pageName=hotNetList&cid=#*$!&city=#*$!xxx&stateProvince=&country
...will rank better than the merchant's real URL? Such as...
www.domain.com/abcde/abce.htm (targeting product A as well)...
Please advise more...
A header check of the URL gives this:
HTTP/1.1 302 Found
Date: Mon, 10 Apr 2006 09:53:20 GMT
Server: Apache
Set-Cookie: R~JLX_##_XX=xxs9BJORDDj.PaQN3xXYsqN#*$!x7PfzEUmSg7N5bP38;etc
Cache-Control: private, max-age=0
Pragma: no-cache
P3P: policyref="#*$! CP="NOI DSP COR NID CUR OUR NOR"
Location: [xxx.com...]
Connection: close
Content-Type: text/html; charset=iso-8859-1
#2 Server Response: [xxx.com...]
HTTP Status Code: HTTP/1.0 302 Object moved
Date: Mon, 10 Apr 2006 09:54:55 GMT
P3P: CP="ALL DSP ETC"
X-Powered-By: ASP.NET
Location: [xxx.com...]
Content-Length: 202
Content-Type: text/html
Set-Cookie: Session etc etc
Cache-control: private
Redirect Target: [xxx.com...]
#3 Server Response: [xxx.com...]
HTTP Status Code: HTTP/1.0 200 OK
Date: Mon, 10 Apr 2006 09:54:56 GMT
P3P: CP="la di la"
cache-control: no-store, must-revalidate, private
Pragma: no-cache
Content-Length: 29220
Content-Type: text/html; Charset=ISO-8859-1
Expires: Mon, 10 Apr 2006 09:53:56 GMT
Set-Cookie: expires=
Cache-control: private
The final target page is relegated to [more results from www.xxx.com] beneath another page that appears in the results yet seems less relevant for the search phrase, with the affiliate URL positioned 3 or 4 places below.
If anyone can tell me what's going on here, I'd be interested to know.
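For anyone who wants to reproduce a header check like the one above, here is a minimal sketch in Python. The starting URL is a placeholder, and it assumes absolute Location headers, as in the trace above; it disables automatic redirect handling so each 302 hop is printed separately:

import urllib.request
import urllib.error

class NoRedirect(urllib.request.HTTPRedirectHandler):
    # Returning None stops urllib from silently following redirects,
    # so each 3xx hop surfaces as an HTTPError we can inspect.
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

opener = urllib.request.build_opener(NoRedirect)
url = "http://affiliate.example.com/tracking-link"  # placeholder

for hop in range(5):  # trace at most five hops
    try:
        resp = opener.open(url)
    except urllib.error.HTTPError as err:
        resp = err  # 3xx responses raise once redirects are disabled
    print(resp.code, resp.headers.get("Location", ""))
    if resp.code not in (301, 302, 303, 307):
        break
    url = resp.headers["Location"]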