
from #2 to #9

MelissaLB

4:11 pm on May 25, 2011 (gmt 0)

10+ Year Member



For any of the paid members: I've posted a very detailed outline of some changes we had our developer make to our site in the past few weeks, as a continuation of a Site Review thread I began a few months ago here: [webmasterworld.com...]

It seems that one of our most coveted 2-word keywords, which directs users to a category page and for which we have held the #2 spot on google.com for years, has plummeted to #9! This happened either early today or sometime yesterday.

I had been certain that our site was not affected by Panda, but now I'm not so sure. This decline in the SERPs happened to some other category pages as well, but not all.

My big question is: how can we find out whether this decline was caused by something we did to our site (with the intention of improving the user experience), by Google testing its algorithms, or by the dreaded Panda?

[edited by: tedster at 6:07 pm (utc) on May 25, 2011]
[edit reason] fixed the link [/edit]

tedster

4:43 pm on May 25, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I have one client who had 20 keywords that ranked in the top 3. On the date of Panda 2.1, 19 of those rankings fell to the bottom 3 positions on the first page, and one fell to #15.

The key, I think, is to look at what kind of site leaped past yours. In my client's case, they are a reseller (not an affiliate) and the manufacturers' pages are what got boosted. And his pages include chunks of manufacturer content reproduced verbatim.

Of course, the manufacturers usually are not offering sales, and my client is. But for some reason, right on the date of Panda 2.1, the taxonomy for these query terms shifted and information became the prominent part of the SERP, rather than e-commerce choices.

Planet13

5:07 pm on May 25, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Moderators:

The link MelissaLB posted in her original post is broken - it throws a 404 error. (I know it is supposed to go to a subscriber-only page, but it currently returns a file-not-found error.)

@ MelissaLB

Are you in the US, the UK, Europe or other?

Is it only your site that has dropped? Or has there been a lot of change in the top ten?

indyank

5:16 pm on May 25, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Of course, the manufacturers usually are not offering sales, and my client is. But for some reason, right on the date of Panda 2.1, the taxonomy for these query terms shifted and information became the prominent part of the SERP, rather than e-commerce choices.


Is this because Matt Cutts felt (based on feedback he said he received) that e-commerce pages were ranking too high for many keywords?

indyank

5:18 pm on May 25, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



MelissaLB, did you make any changes to titles or links (anchor text), etc.? I have found Google reacting this way whenever changes are made to these elements.

tedster

6:11 pm on May 25, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Is this because Matt Cutts felt (based on feedback he said he received) that e-commerce pages were ranking too high for many keywords?

It goes even further, I think. Google clearly maintains taxonomies for various query phrases by user intention, and it dynamically updates those taxonomies and their assignments on a periodic basis. Something shifted with Panda, in that those queries used to be nearly 100% "transactional". Now they seem to be much more mixed, with informational taking on a large portion of the results.

MelissaLB

6:35 pm on May 25, 2011 (gmt 0)

10+ Year Member



MelissaLB, did you make any changes to titles or links (anchor text), etc.? I have found Google reacting this way whenever changes are made to these elements.


No changes to any titles on the site in about a year. We updated the anchor text about 6 months ago...

MelissaLB

6:40 pm on May 25, 2011 (gmt 0)

10+ Year Member



Are you in the US, the UK, Europe or other?

Is it only your site that has dropped? Or has there been a lot of change in the top ten?


We are located in Canada, but our server is in the USA, as is the majority of our customer base. Also, the results I speak of are the google.com results, not google.ca (those are still the same).

There doesn't look to be any movement other than our site. The only site that ranked above us prior to this was the official store for a particular license, and we were always #2. Now #2-#5 are all UK-based stores that traditionally would land below us in a US search.

walkman

7:45 pm on May 25, 2011 (gmt 0)



"Now #2-#5 are all the UK based stores that traditionally would land below us in a US search. "

Very odd. After Panda, Google has, IMO, given more country-specific results (based on links and server location). I used to rank #1 for a very popular "domain name" in most countries, and now I'm #2-#5+, which is fair since I don't offer the "domain name" for those areas as much and almost all the links are from US-hosted sites.

Can you maybe add a paragraph to that category page? I know that with templates and some CMSs that might not be as easy, but you should try. It could also be a temporary thing, lasting days... Google is changing a lot more these days.

HuskyPup

8:25 pm on May 25, 2011 (gmt 0)



Now #2-#5 are all UK-based stores that traditionally would land below us in a US search.


Do they have multiple links within the stores?

I have seen a lot of Chinese trade-widget sites, with multiple store owners all "selling" (I suppose I should say "featuring") the widget, appear from nowhere.

MelissaLB

11:12 am on May 26, 2011 (gmt 0)

10+ Year Member



Update: still no movement for that keyword. Also, upon further investigation, this downward movement is affecting more of our category pages!

We have made a few changes to our site in the past few weeks, such as:
-anti-hotlinking
-enabling right-click capability
-adding static links to our Twitter and Facebook company profiles in the left column
-changing the pagination on the category pages from noindex/nofollow to noindex/follow (see the snippet below)
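For the last item, the change is just to the meta robots tag on pages 2 and onward of each category - roughly this (simplified for illustration, not our exact templates):

  <!-- before: paginated category pages were kept out of the index and their links were not followed -->
  <meta name="robots" content="noindex,nofollow">

  <!-- after: the pages still stay out of the index, but bots may now follow the links on them -->
  <meta name="robots" content="noindex,follow">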

We are a reseller, not an affiliate, but we are also a wholesaler and a manufacturer of some goods. In most cases the only site that would ever place ahead of ours is the manufacturer. Now, many UK and AU sites place ahead of us! I'm pretty certain this happened on the 24th.

However, based on the changes we've made, I'm still very concerned that something our developer updated has inadvertently allowed Google to read through the JavaScript that was keeping the bots from seeing the massive list of hover-menu links we have static on each page. (This was a fix we implemented in February, and it reversed a decline in the SERPs that looked almost identical to this one: mostly select category pages triggered by 2-word keywords.) I was told that using JavaScript isn't foolproof. Are there times when it works for a while, and then Google eventually catches on and starts reading it? (We're not trying to break any rules here; we just decided we didn't want Google reading the same list of over 300 links on every page.)

I'm no genius, I'm learning as I go, so is there a tool or a way I can check how Google is crawling these pages, and whether it is crawling our hover menu again?

MelissaLB

11:46 am on May 26, 2011 (gmt 0)

10+ Year Member



Just one more thing: in the past 2 days, although sales are down, the only sales coming in are from the UK and Australia! We've lost our Canadian and US sales!

So the UK sites in our niche are doing well in the US, and now our US site is doing well in the UK! Something's wrong! I submitted a spam report (although these may not exactly be spammy sites, it's a spammy search result)... Fingers crossed!

Kenneth2

1:04 pm on May 26, 2011 (gmt 0)

10+ Year Member



-anti-hotlinking

Hmmm... anti-backlinks?

indyank

1:41 pm on May 26, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Why don't you block that JavaScript via robots.txt?
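For example, if the menu script is pulled in from its own file, a couple of lines in robots.txt would keep Googlebot away from it (the path below is only a guess at how your site is laid out):

  User-agent: Googlebot
  # hypothetical path - point this at wherever the hover-menu script actually lives
  Disallow: /js/hovermenu.js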

And yes, a sudden increase in the number of links on any page will trigger an adverse reaction.

MelissaLB

2:16 pm on May 26, 2011 (gmt 0)

10+ Year Member



--Hmmm... anti-backlinks?

Hmmm, possibly. About 3,000 of our images were being hotlinked by a site in India; anti-hotlinking was our measure to stop this.

--Why don't you block that JavaScript via robots.txt?

I'm not sure I understand this. Our developer used JavaScript as a means to keep the bots from reading all the links in our hover menu. So, would "blocking" that JavaScript via robots.txt then allow the bots to read those hundreds of links again, destroying any keyword density the pages' content has?

------
I'm considering undoing some of the minor changes we've made in the past few weeks to determine if something we did has had this negative effect (but it would be guesswork at best). As far as changing site content, we've only added Facebook and Twitter buttons. Other than that, the only changes were a few bot directives regarding follow and nofollow in pagination, as well as a few 301s that were missed a few months ago.

Planet13

3:33 pm on May 26, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Our developer used JavaScript as a means to keep the bots from reading all the links in our hover menu.


Have you used the Fetch as Googlebot function in Webmaster Tools to look at your pages? It might tell you whether those hover links are visible to bots or not.

MelissaLB

4:44 pm on May 26, 2011 (gmt 0)

10+ Year Member



---Have you used the Fetch as Googlebot function in Webmaster Tools to look at your pages? It might tell you whether those hover links are visible to bots or not.

Yes, I have done that, and in fact, as I scroll through I can see the list of hover-menu links. Unfortunately, I do not have any historical fetch data showing those hover-menu links not being visible back when our rankings were OK, but I will pass it along to my developer. Thanks.

indyank

5:26 pm on May 26, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



As long as you don't block the JavaScript from Googlebot, it will fetch and execute it, and you will see the hover menu.

If you don't want Googlebot to read and parse the JavaScript, you might consider blocking it via robots.txt. But this may also be considered cloaking, as this is what eHow was doing earlier.

indyank

5:35 pm on May 26, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Yes, I have done that, and in fact, as I scroll through I can see the list of hover-menu links.


If you are seeing the hover menu through the Fetch as Googlebot tool, then it is probably created by something other than JavaScript.

This tool will usually display the JavaScript code and not the menu created by it. It is equivalent to what you see through "View source" in a browser. If you are seeing the hover menu, I suspect it is created by some server-side code.

MelissaLB

5:57 pm on May 26, 2011 (gmt 0)

10+ Year Member



Any thoughts on this:

I brought this to the attention of our developer and here is what he did:

Our JavaScript was previously still showing a link as <a href="/page_name">Page Title</a> within the JavaScript. They still shouldn't have been picking it up, but if Google changed its process to identify links by parsing all links out of the code rather than out of what's displayed on the page, the potential exists that it might still have counted those links. Now we've changed them to look more like this, which, if Google used that same approach, wouldn't be viewed as a link: ["/store/Widget","license_row","Widget Merchandise"].
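As I understand it (this is my paraphrase of his explanation, not our actual code, and the function and element names are made up), the menu entries now live in plain arrays and the anchor tags are only built in the browser:

  // menu data as plain strings - no <a href> appears anywhere in the script source
  var menuItems = [
    ["/store/Widget", "license_row", "Widget Merchandise"],
    ["/store/OtherWidget", "license_row", "Other Widget Merchandise"]
  ];

  // build the hover-menu links only when the page actually runs in a browser
  function buildHoverMenu(containerId) {
    var container = document.getElementById(containerId);
    for (var i = 0; i < menuItems.length; i++) {
      var link = document.createElement("a");
      link.href = menuItems[i][0];                                // "/store/Widget"
      link.className = menuItems[i][1];                           // "license_row"
      link.appendChild(document.createTextNode(menuItems[i][2])); // "Widget Merchandise"
      container.appendChild(link);
    }
  }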

Still, fingers crossed.

indyank

6:05 pm on May 26, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



You have just made it one step more complicated for Googlebot to parse by including relative links rather than absolute links in the JavaScript code. Since we all know that Googlebot can parse JavaScript, it can probably read through the relative links as well (though I haven't tested it).

But you were saying that you could see the hover-menu links through the "Fetch as Googlebot" tool. Were they within the JavaScript code or outside it?

If they are outside the JavaScript code, then the menu is probably created by some server-side code.

MelissaLB

6:20 pm on May 26, 2011 (gmt 0)

10+ Year Member



Apologies, I don't have a perfect grasp of what I'm looking at in Fetch as Googlebot. I don't actually see the links, as in URLs; I see the titles of all the links that are in the hover menu. I have directed my developer to this thread to view the conversation. I'm doing my best, but it's a language I don't speak well!

I'm feeling a little stumped now. The reason I am so focused on the hover menu as the cause is that adding the JavaScript in February is what brought our SERPs back during that fiasco!

I am inclined to remove the anti-hotlinking code we added 2 weeks ago, but I don't want to just take shots in the dark. If adding the anti-hotlinking code messed with our backlink structure, is it possible that it could have taken upwards of 2 weeks to be reflected in the SERPs? We've grown accustomed to making a change and seeing the effect within 24 hours.

Thanks for all the help and suggestions, guys; we really appreciate it.

MelissaLB

1:44 pm on May 27, 2011 (gmt 0)

10+ Year Member



Still no change in the SERPs, so I'm still trying to chase down the culprit that is hurting our placement.

I am trying to backtrack through the recent changes we've made and am wondering now if implementing the anti-hotlinking could have hurt us. Has this happened to anyone before, where disallowing hotlinking hurt your backlink profile (someone mentioned this above)?

Logic tells me that it wouldn't, because they aren't actually linking to us (are they?); they are just stealing an image and using our bandwidth. I need some clarity here!

I found an old thread here [webmasterworld.com...] about targeting one specific domain that is hotlinking. Would this be recommended rather than a "catch-all" anti-hotlinking script?

Thanks again for the help and suggestions, everyone. For someone who is learning, like me, this place is priceless!

indyank

8:15 am on May 28, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



What is the hotlinking script you are using? Are you on Apache?
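If it is Apache, the usual approach is a few mod_rewrite lines in .htaccess, roughly like this (a generic sketch, not necessarily what your developer used; the domain and the bot exception are placeholders):

  RewriteEngine On
  # let through requests with no referer at all (direct hits and many bots)
  RewriteCond %{HTTP_REFERER} !^$
  # let through requests coming from your own pages
  RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
  # optionally let Google's crawlers fetch images regardless of referer
  RewriteCond %{HTTP_USER_AGENT} !Googlebot [NC]
  # everything else asking for an image gets a 403 instead of the file
  RewriteRule \.(jpe?g|gif|png)$ - [F,NC]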

indyank

8:17 am on May 28, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Another question that I have is: do you still find all the images in Google Images? You can check this by doing a site:domain.com search in Google Images.

lucy24

9:07 am on May 28, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Logic tells me that it wouldn't, because they aren't actually linking to us (are they?); they are just stealing an image and using our bandwidth. I need some clarity here!

google can't tell. (Or they don't feel like telling, since it would be trivial to count only links to pages, not to non-page material like images or sounds.) Hotlinks show up in "links to your site" in gwt right alongside the real links. Same principle as the "referer" in your logs: for an image file it's the page that requested the image, whether it's your own page where the image really lives, or some other page in someone else's site; for a page it's the link somebody clicked. The logs don't distinguish between <img src ... and <a href ..., and google doesn't seem to either.

But unless google is counting sheer numbers-- which it keeps swearing it doesn't-- getting rid of those hotlinks shouldn't have a bad effect. Sites that do a bunch of hotlinking tend to be pretty worthless anyway. Might even do you good, if the computer can say "Oh good, they've cut their ties with those losers".

MelissaLB

11:05 pm on May 28, 2011 (gmt 0)

10+ Year Member



indyank: Yes, we are on Apache, and yes, we can still find our images in Google Image Search. Our developer added something to the anti-hotlinking script to still allow access to Googlebot. (I'm not exactly sure what the script is, as I can't seem to find it in the source code.)

lucy24: We did notice a major change in GWT when we added the anti-hotlinking. For instance, our internal links went from about 9 million down to 3 million (the next day) and bounced back to 6 million another day or two later. So it looks like it may have had a major effect on our internal linking structure.

We tend not to get a lot of traffic from image search; most comes from shopping results and organic.

It's interesting to know that Google would consider it a link either way, as the main culprit (that we knew of) hotlinking us was a site out of India (a dot-com) with an Alexa rank in the top 300. So is it possible that in trying to solve one problem we are figuratively cutting off our nose to spite our face?

aakk9999

12:28 am on May 29, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



With a hotlinking script, you will only lose the benefit of that link to the image if the Indian site that hotlinked your images has removed the links to your images from its pages.

However, if they have not removed the links to your images, then these would still be backlinks, even though your images would not be displayed on their pages owing to the hotlinking script.

For instance, our internal links went from about 9 million down to 3 million (the next day) and bounced back to 6 million another day or two later.


I do not think this is connected, because links hotlinking to your images are external links, not internal links, so the hotlinking script should not have any influence over the number of internal links. I would also find it unusual for the effect to show the very next day - it usually takes much longer than that and is more gradual, rather than a big drop or rise. I am more inclined to think this is a GWT error, as the general opinion is that these numbers are not really accurate anyway and GWT is known to be buggy.

If, however, this drop does reflect what Google saw on your site, then I would look into other changes you made to your site in the period starting a few weeks before you saw the changes in GWT.

indyank

3:11 am on May 29, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



indyank: Yes, we are on Apache, and yes, we can still find our images in Google Image Search. Our developer added something to the anti-hotlinking script to still allow access to Googlebot. (I'm not exactly sure what the script is, as I can't seem to find it in the source code.)


If they have taken care to let all of Google's bots in (not just Googlebot), you should still find the images in Google Images.

Since the anti-hotlinking script doesn't seem to block Google's bots, I feel it shouldn't be a problem either.
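If you want to double-check from the command line, you can request one of your images while pretending to come from another site (the hostnames and path below are placeholders, and I'm assuming the bot exception keys off the user agent):

  # foreign referer - should come back 403 if the hotlink protection works
  curl -I -e "http://some-other-site.example/" http://www.example.com/images/widget.jpg

  # same request sent as Google's image crawler - should come back 200 if the bot exception works
  curl -I -e "http://some-other-site.example/" -A "Googlebot-Image/1.0" http://www.example.com/images/widget.jpg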

Dropping by 4 to 5 positions looks more like a Panda symptom, but no one has figured out the real causes.

MelissaLB

9:04 pm on May 29, 2011 (gmt 0)

10+ Year Member



Yes, Google Images does seem to be fine, and I'm now convinced that this isn't due to the anti-hotlinking or the JavaScript in the hover menu.

I've been looking at the site to try to understand why it may have been "Pandalized". We do use wiki content in our product descriptions (we stopped doing this around 3 months ago but left all previous products with their wiki content), as well as original content describing the product that we write ourselves.

Our pages are also templated, so they generally all look the same (what you would expect from an e-commerce site). For instance, there are many static items on every page, such as our menu, hot items, and about 9 "ads" pointing to different categories on the site (which we update weekly).

Before just going ahead and removing all the wiki content, which could take days or weeks to do manually, it's difficult to know what kind of help (if any) that might be.

I've been having a hard time finding information on Panda affecting e-commerce sites, as we're not a content farm and aren't producing loads of useless content like many of the sites that have been affected.