Welcome to WebmasterWorld
I am one of those webmasters who has suffered a penalty for the past few months, purportedly for trying to manipulate pagerank.
After reviewing the links report for my penalized site, I found that a page on another one of my sites links to the penalized site, and because of query string values, the link report shows about 6,000 pages linking to that site instead of one.
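For anyone else trying to figure out whether their "thousands of links" are really one page, here's a rough sketch of the check I'm describing (the URLs below are made up): strip the query string and fragment from each reported link and count what's left.

```python
from urllib.parse import urlsplit, urlunsplit

def canonical(url):
    """Drop the query string and fragment so ?id=1, ?id=2, ... collapse to one page."""
    scheme, netloc, path, _query, _frag = urlsplit(url)
    return urlunsplit((scheme, netloc, path, "", ""))

# A few stand-in URLs of the kind the link report might show:
links = [
    "http://example.com/banner.php?id=1",
    "http://example.com/banner.php?id=2",
    "http://example.com/banner.php?session=abc123",
]
unique_pages = {canonical(u) for u in links}
print(len(unique_pages))  # prints 1 -- thousands of reported links, one real page
```

Run over a full export, this tells you how many distinct linking pages you actually have.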
Could this be the reason for the penalty? I don't know but I have nofollow'ed the link and have just finished submitting a reinclusion request. My fingers are crossed.
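For reference, the nofollow change I made is just the standard rel attribute on the link (hypothetical URL below), which tells Google not to count it for ranking purposes:

```html
<!-- hypothetical example of the change I made on the linking page -->
<a href="http://penalized-site.example.com/" rel="nofollow">my other site</a>
```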
I have been incredibly frustrated over the past few months after being told that I was trying to manipulate page rank with no means to prove or disprove the accusation. I know there are a lot of other webmasters in the same situation. Do you think it's possible that this was one of the main reasons for the release of this new report?
If there was any miscommunication up until now, and their facts were not correct, hopefully with this new info, and your honest immediate attempt to fix it, they may give you a second look and help you out.
When I hear of issues with thousands of links pointing to this or that, I always think that some variation on "duplicate content" may be at play. You may have already fixed your problem with your nofollow, and you may now be back in 2 days to 6 weeks (even without their direct human involvement), which is the time frame I've experienced in the past for getting things right again after I fix a goof-up.
If that is really the reason for your drop (the sitewide link) let me state something:
I do have the same problem with a Russian guy who built a banner rotation solution he really likes, and it generates more than 1,000 identical links to me from all of his sites. I don't need them, I don't want them, and I never asked for them... and I hardly get new visitors from his sites...
Who is Lasnik? Is he one of the guys who is directly and deeply involved with the team that really builds and decides on the core algos? Which search engine would count 6,000 links from one domain (sitewides) several times? Braindead! They would force webmasters worldwide into this "Can I place a link here or not, mummy?" rat race... even the (unaware) ones who still place their "Designed by ABC WEBBIMBO CORP" on the bottoms of their customers' pages...
On the other hand, I do believe it COULD be one of the reasons for my gorgeous way down... I just wrote to those guys to remove their links.
It does sound like you may have discovered at least one issue that could trip up an algorithm, dataguy. Best wishes on a speedy resolution all around.
I was really hoping to discover some other ways that have been found to use the new link report to help diagnose site problems.
Not your fault - it was my harsh reply. Sorry to Adam... I didn't want to get personal. It's just that my website has basically been wiped out from Google, and that means much too much these days. In fact it means you're finished.
Google has too much market power and Adam works for them. That made his statement that you had tried to "increase your pagerank" feel different from all the stuff I hear and read from others nowadays.
No offence... just fired a trigger when I read it...
-Irrelevant links for PR gain
.... and probably a whole host of other things.
The bottom line is that these should be easily identifiable by the webmaster without the use of the Link report.
With the Link Report, you may like to remind yourself of links that could cause a problem - but it's not complete in its coverage yet, from what I can see. I'm also seeing site links (in our case) from the same C block, used for the legitimate purpose of multi-language translation, excluded, so I'm wondering what's happening based on the new link tool.
Basically, I'm unsure of its use at this point. Maybe that's why members are a bit quiet (IMO).
On the downside, there are two issues that limit its usefulness for me:
1. They still don't show every link they have in their index
2. They do show links that are not passing any influence
That said, I can see where some new information becomes available - especially when you're over the 1,000 backlink mark. Yahoo's sample will definitely be different than Google's sample then, so you do stand an extra chance of uncovering "things that should not be".
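To make that concrete, the comparison I'm suggesting is just a set difference between the two exported backlink lists (the URLs below are stand-ins for whatever your exports contain):

```python
# Pretend these came from the Google and Yahoo backlink exports for one site.
google_links = {"http://a.example/", "http://b.example/page"}
yahoo_links = {"http://b.example/page", "http://c.example/odd-link"}

# Links Yahoo reports that Google's sample doesn't show -- the
# "things that should not be" worth a closer look.
only_in_yahoo = yahoo_links - google_links
print(sorted(only_in_yahoo))  # prints ['http://c.example/odd-link']
```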
Using ANYTHING in Webmaster "Tools" is exceedingly dangerous. Doing things like downloading the Links report is basically saying: "Hello, I'm trying to do SEO."
Webmaster Tools is a brilliant coup on Google's part. "Hey Serge, let's not bother trying to invent algorithms to catch these SEO dudes, let's just get THEM to TELL us who they are and which sites they manage." Hahaha, that is sooooo clever!
With digg, I found I get linked for every comment people make, and for being "dugg" by members, as it's listed in their profiles. Some threads have hundreds of backlinks to digg, totally generated by their own programs. I don't think that is necessarily good.
I was also wondering if the links listed at the top of the listing are more favourable, in terms of strength, than those at the bottom of the listing.
From my experience, yes I believe that Yahoo ranks higher PR sites first (there are often a few lower PR mixed in) but the average is usually high on the first few pages. After a few pages the PR definitely starts falling.
A spammy site will never have original content that makes sense (i.e. isn't gibberish words stuck together in a mock paragraph), while legitimate sites will have content that doesn't appear anywhere else. That would identify a spam vs. non-spam site.
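One rough way to put "original content" into numbers - my own sketch, not anything Google has published - is to compare word shingles between two pages and score their overlap:

```python
def shingles(text, k=5):
    """Break a page's text into overlapping k-word sequences."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(max(len(words) - k + 1, 1))}

def overlap(a, b):
    """Jaccard similarity of two pages' shingle sets: 0.0 = all original, 1.0 = copied."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb)
```

Two pages with identical text score 1.0; pages sharing no five-word run score 0.0. Duplicate-content detectors in the literature work along these lines, whatever Google's actual algorithm does.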
I would hope Webmaster Tools will eventually be expanded to include better communication from Google to the webmaster/owner. For instance, duplicate content alerts so we can fix them before they become a problem.
It's to Google's advantage to help legitimate webmasters improve their sites, as that will in turn improve Google's SERPs. We all know how many quality URLs on quality sites are buried somewhere, or ranked improperly due to silly errors or lack of knowledge on the part of the webmaster. How many good sites are penalized right now because of a forum issuing multiple session IDs? Or a server in a shared environment programmed wrong so it doesn't return 301s properly?
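For what it's worth, the shared-server 301 problem above is often fixable by the webmaster in .htaccess alone. A hypothetical sketch (assuming Apache with mod_rewrite, and example.com standing in for your domain) that sends non-www requests to the www host with a proper 301:

```apache
RewriteEngine On
# Redirect example.com/... to www.example.com/... with a real 301
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```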
The average webmaster probably doesn't understand all this, and likely doesn't want to. So is it right to deprive others of good information on their site just because they aren't server techs?
I've learned more about htaccess, server environments, html coding, and related things due to issues with Google than I really ever wanted to know. It's been good in that I've fixed problems that I never guessed would be problems. But I've also not been adding to my site at the rate I would have liked to, due to the time spent learning how to fix these problems. So I lose, my visitors lose, and Google loses because there are hundreds of new, unique, original content pages still waiting to be created and published.
I hope Google expands Webmaster Tools and improves communication; it will only benefit everyone if they do. Spammers will never be helped by this, because they will never build sites with original, unique content. If they did, they would no longer be spammers.
I hear they are also secretly developing Grassy Knoll Tools for the CIA. It's just a matter of time now until we know who really shot JFK. (During early beta testing last summer I hear they got a lead on the whereabouts of Jimmy Hoffa's remains... Turns out that was wrong but they have since tweaked the algo.)
How does webmaster tools help me to improve my site?
OK, the crawl report
There used to be something like site stats à la Webalizer. All I can see is whether Google's errors match the site stat errors. Where can I report that Google gets this or that wrong, or that my site was down due to a MySQL bug and that's why the 500 errors are there?
and I can send my sitemaps
and then I can opt into that image game.
Ok Google wants the game so let's do it.
OK, I can tell them to crawl www.example.com instead of example.com.
Crawl rate: OK, I can tell Google to nuke my site off the index.
I can tell Google which backlinks it can discount because they are my domains (my sites).
Backlinks: nice, and? The best strategy now is to not fiddle with links and hope someone else will link to the site.
Keywords: fine, it's the same in Webalizer, and it will have changed in the SERPs tomorrow, so why bother?
Analytics, nice gimmick, I play with it in case Google gets a kick out of it and uses the return rate.
Where is the control in there that would increase my sites traffic tenfold so I finally get the country retreat I deserve?