Forum Moderators: Robert Charlton & goodroi


GWMT - Links to your site. Should I be first?


JacoRoux

6:46 pm on Apr 5, 2011 (gmt 0)

10+ Year Member



Hi,

In Google Webmaster Tools, under "Links to your site", we are listed first.

Isn't this list supposed to show only external linking domains?

JacoRoux

7:21 pm on Apr 5, 2011 (gmt 0)

10+ Year Member



A bit more:

Our site is listed first, with a staggering 495,745,762 links. When you drill into any of them, it shows links from an old subdomain (m.), but that domain has been 301-redirected to our www site for more than a year. Yet the link count in Webmaster Tools is still growing?

Any suggestions?

tedster

7:31 pm on Apr 5, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



A different subdomain is considered a different "website" in webmaster tools. If you are not experiencing ranking issues I'd suggest ignoring the Webmaster Tools report - outside of verifying that the 301 redirect is actually what googlebot is seeing (sample a few URLs using the "Fetch as Googlebot" tool.)
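Sampling can also be sanity-checked outside of Webmaster Tools. As a rough sketch (site.com is a placeholder hostname, and this only classifies a response you have already fetched, not necessarily what googlebot sees), a redirect is "clean" here if it is a single-hop 301 to the same path on the www host:

```python
# Sketch: decide whether a sampled response from the old subdomain looks
# like the clean single-hop 301 that googlebot should be seeing.
# "www.site.com" is a placeholder for the real canonical hostname.
from urllib.parse import urlparse

def is_clean_301(status: int, location: str, requested_path: str,
                 www_host: str = "www.site.com") -> bool:
    """True if the response is a permanent redirect straight to the
    www equivalent of the requested path (no 302s, no chains)."""
    if status != 301:
        return False
    target = urlparse(location)
    return (target.scheme in ("http", "https")
            and target.netloc == www_host
            and target.path == requested_path)

# Example: m.site.com/page.html answering 301 -> www.site.com/page.html
print(is_clean_301(301, "http://www.site.com/page.html", "/page.html"))  # True
print(is_clean_301(302, "http://www.site.com/page.html", "/page.html"))  # False
```

A 302 or a redirect chain would fail the check, which is exactly the kind of thing worth ruling out before trusting the report.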

JacoRoux

7:41 pm on Apr 5, 2011 (gmt 0)

10+ Year Member



That is just the thing: we are experiencing massive ranking drops.

Another point is that it shows links to pages that were removed months ago. They are blocked in robots.txt, so when you Fetch as Googlebot on the sub-domain, it says blocked by robots.txt.

metalhead

10:24 am on Apr 6, 2011 (gmt 0)

10+ Year Member



I noticed the same thing on one of our sites, and it really confused me until I found out that it's the SSL version of the site. It seems that links from the SSL site show up under external rather than internal links. But they're not marked with https or anything, so it's not easy to figure out.

BenFox

11:00 am on Apr 6, 2011 (gmt 0)

10+ Year Member



Hi JacoRoux,

So am I right in thinking that you have 301'd the m. subdomain and blocked the destination pages of those links in robots.txt?

The links will show up even if the destination page is blocked somehow - if you want to remove that link then you need to go to the source page (the page that it links from) and address the problem there.

indyank

11:13 am on Apr 6, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Wow... this is one more thing I noticed. In GWT, many of the pages on the site have suddenly seen a huge jump in the number of links to them. On clicking them I do find a list of domains and the number of links from each, but what is reported in the main report is way too high. This wasn't the case before (pre-Panda).

I checked the links to a few top-ranking pages on the site. Pages that have dropped just one or two positions report a lower number of "Total links", while those that have dropped to page 2 or 3 report far too many. I am not sure where G discovered so many "total links".

g1smd

5:59 pm on Apr 6, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



The WMT reports are at best strange, and at worst completely wrong.

For a 5000 page site, WMT reports 140 www pages of the site having internal links pointing at them.

In the external links report, it also says there are 25,000 links from non-www URLs within the site. There are no such links: all non-www URL requests are redirected to the www version.

JacoRoux

6:05 pm on Apr 6, 2011 (gmt 0)

10+ Year Member



Should I be concerned about the wrong figures, or should I just ignore them...

I am thinking of changing the way the redirects work on non-www links..

Currently the redirect is set up so that it just replaces whatever hostname was typed with www, so if a user goes to m.site.com/page.html we redirect them to www.site.com/page.html. I am thinking of changing this to redirect m.site.com/page.html to the root of www.site.com.
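For comparison, the current and proposed behaviours can be sketched as two small functions (site.com is a placeholder hostname; this is an illustration of the mapping, not the actual redirect code):

```python
# Sketch of the two redirect strategies being weighed.
# "www.site.com" is a placeholder for the real destination host.
from urllib.parse import urlsplit, urlunsplit

def redirect_keep_path(url: str, www_host: str = "www.site.com") -> str:
    """Current behaviour: swap the hostname for www, keep the path."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, www_host, parts.path,
                       parts.query, parts.fragment))

def redirect_to_root(url: str, www_host: str = "www.site.com") -> str:
    """Proposed behaviour: send every old-host URL to the www homepage."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, www_host, "/", "", ""))

print(redirect_keep_path("http://m.site.com/page.html"))  # http://www.site.com/page.html
print(redirect_to_root("http://m.site.com/page.html"))    # http://www.site.com/
```

The path-preserving version keeps a one-to-one mapping between old and new URLs; the root version collapses every old URL onto a single target.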

Another concerning part of the link report is that it even shows links to pages that are in the robots.txt and that have been removed with the WMT URL Removal tool...

We have been trying to find reasons why our rankings have dropped and cannot find any, but I'm not sure if I am grasping at straws here :)

JacoRoux

6:12 pm on Apr 6, 2011 (gmt 0)

10+ Year Member



BenFox

Examples:

Link to www.site.com/products/product.htm
Link from m.site.com/category1/

Currently when a user goes to m.site.com/category1/ he will be redirected to www.site.com/category1/

/category1/ and /products/ are in robots.txt, and were removed with the WMT URL Removal tool. The URL removals, however, were only done for the www domain; now I am thinking I should maybe do the same for the m. domain as well.
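One thing worth checking here: robots.txt rules apply per hostname, so a Disallow in www.site.com/robots.txt does not automatically cover m.site.com (and, as noted later in the thread, the m. host's robots.txt request is itself redirected). If the m. pages should be blocked and removable via the URL Removal tool, the m. host may need to serve its own file along these lines (paths copied from the example above; hostnames are placeholders):

```
User-agent: *
Disallow: /category1/
Disallow: /products/
```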

JacoRoux

6:23 pm on Apr 6, 2011 (gmt 0)

10+ Year Member



Just noticed this:

http://m.site.com/robots.txt according to WMT exists and is all fine, although it redirects to http://www.site.com/robots.txt

[edited by: tedster at 9:10 pm (utc) on Apr 6, 2011]
[edit reason] make example urls visible [/edit]

indyank

3:34 am on Apr 7, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



g1smd, I do hope that what you say is true, but the "total links" to internal pages changed post-Panda. Earlier they were reporting almost the right numbers, but now the jump is huge.

E.g.: a page pre-Panda had 4 external links from 4 different domains. Now GWT reports it to have 3K+ external links from 5 domains. The 5th domain hasn't been listed, but it could be anything, like the www version. (Note that the www version is redirected to the non-www version of the corresponding pages on the site.)

Interestingly, a few months back we were asked to separately verify ownership of the www version of the domain in GWT. I reported it here at the time and found quite a few people seeing similar notifications in GWT, but not many.

I did verify the ownership, but it sits idle in the dashboard and I haven't done anything else with it.

Significantly, this happened only to one site and not to any of the others in the dashboard. This site is now affected by the Panda update.

Is Google ignoring 301 redirects when it comes to determining links? Are these being reported as external links? Should we be blocking www versions of the site instead of doing a 301 redirect? What is the best way to do that?

JacoRoux

3:56 am on Apr 7, 2011 (gmt 0)

10+ Year Member



I am now thinking of taking the redirecting on the sub-domains off.

m.site.com will have a landing page with an href link to the www site; all other pages requested will return 404 errors, and every page that Google claims links originate from will be in robots.txt.
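A rough sketch of that setup, assuming an Apache host for the m. subdomain (hostnames and module choice are assumptions; the real server may differ): serve the landing page at the root, let robots.txt through so the blocking rules stay fetchable, and answer 404 for everything else.

```
# Hypothetical Apache (mod_rewrite) config for the m. host.
RewriteEngine On
# Allow the landing page and robots.txt to be served normally.
RewriteCond %{REQUEST_URI} !^/$
RewriteCond %{REQUEST_URI} !^/robots\.txt$
# Everything else answers 404 instead of redirecting.
RewriteRule ^ - [R=404,L]
```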

Maybe this will help to clean all references from this domain, as I am not sure how Google handles the 301s at this stage.