
Webmaster Tools Link Reports Are Very Inconsistent

     
2:31 am on Feb 12, 2008 (gmt 0)

5+ Year Member



I find Webmaster Tools very inconsistent when it comes to internal and external links. Some days it lists a hundred or so sites and dozens of my pages with links pointing to them. Then suddenly it lists none, even though the external links pointing to my site are still there. A few weeks go by and the links magically reappear. Very frustrating...
4:40 am on Feb 12, 2008 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Yes, this can be very frustrating. I'm not sure if the Webmaster Tools team is struggling under a massive job, or if the variation is one way that Google tries to protect all its link data from in-depth reverse engineering.

But there's one consolation in the chaos: these reports are just that - reports. That is, they do not directly represent the full data set that is currently powering the ranking calculations, and Google never promised us that kind of report.

The Webmaster Tools reports are usually more in-depth than the publicly available link: operator results. So only a validated site owner can see the fuller list, but even that list is still incomplete compared to all the link data Google has gathered.

5:31 am on Feb 12, 2008 (gmt 0)

10+ Year Member



I can add that, as of late, many reports in WMT seem odd to me.

My sitemaps currently report zero URLs in them one day, and then the next day they report the correct counts.

Meanwhile, an index sitemap (one that just links to the other sitemaps) is reporting the full number of URLs on the site.
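For reference, a sitemap index is just a small XML file that lists the child sitemaps and contains no page URLs of its own. Here is a rough Python sketch of how one could be built with the standard library (the file names and example.com URLs are placeholders, not my real sitemaps):

    # Build a minimal sitemap index pointing at two child sitemaps.
    # File names and the example.com domain are placeholders.
    import xml.etree.ElementTree as ET

    NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
    ET.register_namespace("", NS)

    index = ET.Element("{%s}sitemapindex" % NS)
    for child in ("sitemap-pages.xml", "sitemap-articles.xml"):
        sitemap = ET.SubElement(index, "{%s}sitemap" % NS)
        loc = ET.SubElement(sitemap, "{%s}loc" % NS)
        loc.text = "http://www.example.com/%s" % child

    # Writes one <sitemap><loc> entry per child sitemap.
    ET.ElementTree(index).write("sitemap_index.xml",
                                encoding="utf-8", xml_declaration=True)

Since the index itself lists no page URLs, it's odd that it shows the full count while the child sitemaps show zero.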

So, I tend to think that Google is working on some changes and not all reporting is correct, IMHO.

And as Ted says, they are just reports and we have no idea when/how they are updated.

6:53 am on Feb 12, 2008 (gmt 0)

WebmasterWorld Senior Member jetteroheller is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



For me, the link report is too consistent!

The reason: my main domain, with all its subdomains, has been filtered since 27th October.

I think this is because in the year 2000 I made one of the subdomains the central host for all CGI services, mainly forms.

Each page had a contact link to a form on this subdomain.
The service page builds up the complete navigation of the calling site; only the content area is replaced by the form to fill out and send.
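To illustrate, here is a rough sketch of that kind of central form CGI (the hostnames, navigation data and the "site" parameter are invented for this example, not my real code):

    #!/usr/bin/env python
    # Rough sketch of a central "forms" CGI living on its own subdomain.
    # Hostnames, navigation entries and the "site" parameter are invented.
    import cgi

    # Per-site navigation, keyed by the calling domain.
    NAVIGATION = {
        "www.example-client.com": [
            ("Home", "http://www.example-client.com/"),
            ("Products", "http://www.example-client.com/products.html"),
        ],
    }

    def main():
        form = cgi.FieldStorage()
        caller = form.getfirst("site", "www.example-client.com")

        print("Content-Type: text/html")
        print("")
        print("<html><body>")
        # Rebuild the calling site's complete navigation ...
        for label, url in NAVIGATION.get(caller, []):
            print('<a href="%s">%s</a>' % (url, label))
        # ... and put the contact form where the content would normally be.
        print('<form method="post" action="send_form.py">')
        print('<textarea name="message"></textarea>')
        print('<input type="submit" value="Send">')
        print('</form>')
        print("</body></html>")

    main()

Every generated form page reproduces the calling site's navigation, so each one links back out to the client domain.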

So it looked as if this subdomain, with about 10,000 pages, had about 300,000 links pointing to all my client domains.

I removed this completely at the beginning of December.
I think the filtering will stop when all these links disappear.

But when I look at the client domains in Webmaster Tools, the external links report still shows all these links from that subdomain, mostly with a last-discovered date of November 2007.

4:58 pm on Feb 12, 2008 (gmt 0)

5+ Year Member



It makes perfect sense for Google not to return all your link info. If it did, it would be very easy to see which links were being counted and which weren't, and to start putting up or taking down links based on that report. I think that would help me enormously, but Google isn't really into helping us on the link front, huh?

As with every other piece of Google info, it's best to use it as a guide only and continue to try to add value with great content, good on-page SEO, and plenty of effort to earn great quality links.

Yahoo's link data is MUCH worse!

7:53 pm on Feb 12, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



It's not only the inconsistency but also the stale data that makes this experiment quite useless for me.
 
