He's stated that one of the reasons he started his blog was to see the 'net from a webmaster's point of view. So, what has he found out? Has it changed his thinking from viewing the web purely from the search side? What would Matt, as a webmaster, like to see SEs do better?
A site has been reported to have hidden text on its home page and on at least 10 interior pages. These pages rank on first-page SERPs. The home page and the interior pages with hidden text were all reported via Google's spam report.
After spam reports were filed every six weeks to two months, the site was finally removed from the Google index. (Yay!)
Three weeks later, the site was back in the index and SERPs, but only the home page had changed: it no longer has hidden text, while the ten interior pages still do (boooooooo).
It was reported to Google again, but no action has been taken.
Matt specifically says in his blog that you're supposed to promise not to do it again before they let you back into the index, but that certainly did not happen in this case.
Have him publicly say that AdWords reps have no influence on getting banned sites back into the index. This rumour is running rampant in the bars.
What is he going to do about it?
I have to vote for this one. I think a lot of people are feeling that they are being hurt by internal site duplicate content penalties.
I also think that a lot of people are trying to flush them out of Google's index.
I also think that getting old versions of pages out of the index is in Google's best interest, because they waste space; if a page or site is removed via the removal tool, it should be gone from the index, period.
What upsets me is that someone can have hidden links, hidden text and all, yet be reinstated 30 days later. On the other hand, I'm nowhere even for my "domain.com," because the algo apparently thought that I cheated, and I have no one to contact. This week it will be a full 30 days since I removed a link from my other site to the one now penalized, thinking that caused the huge drop.
Some crime...huh? Linking from one site to the other.
Also on that note, what are some of the common reasons a website will drop back and not be one of the lucky ones to make it through the filter, when it made it through before?
Is there anything we can do on our end, or is it all in the hands of the adjustments you make to your regional filtering algo?
1) What _really_ is the supplemental index, why is it so problematic, and when will it be fixed?
2) What's so bad about a site review process where owners pay a fee to have their sites examined for technical errors or honest mistakes?
MY QUESTION:
What is Google up to? What is Google looking for from webmasters in terms of good sites?
I have followed Google's guidelines properly and did everything ethically, but got no good results. On the other hand, a competitor that is simply using spam is ranking well in Google, with no action from Google even after the spam was reported.
A set of graphical results could show how well the page scores in, for example:
Off-page optimization (low quality if spammy backlink patterns exist, high quality if stable backlinks from relevant sites);
Technical factors (stability, dedicated IP better than shared IP, etc.)
A webmaster tool might save people from running around in circles when their site drops, and save workload for Google in dealing with webmaster queries.
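The report card being proposed above could be as simple as a weighted average of per-factor scores. Here is a toy sketch of that idea; every factor name, score, and weight is hypothetical, and nothing here reflects any actual Google ranking signal:

```python
# Toy sketch of the proposed webmaster "report card".
# All factor names, scores, and weights are made up for illustration.

FACTORS = {
    # factor: (score 0-100, weight)
    "off_page_links": (72, 0.40),   # e.g. stable backlinks from relevant sites
    "on_page_content": (85, 0.35),  # e.g. no hidden text, sane keyword use
    "technical": (60, 0.25),        # e.g. uptime, dedicated vs. shared IP
}

def composite_score(factors):
    """Weighted average of the per-factor scores."""
    return sum(score * weight for score, weight in factors.values())

def report(factors):
    """Print a crude bar-chart version of the graphical results."""
    for name, (score, weight) in factors.items():
        bar = "#" * (score // 10)
        print(f"{name:16} {bar:<10} {score:3d} (weight {weight})")
    print(f"{'overall':16} {composite_score(factors):.2f}")

report(FACTORS)
```

The point of the sketch is only that a handful of coarse, pre-weighted scores would tell a webmaster roughly where to look first when a site drops, without exposing the underlying algorithm.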
This one is not so much for Matt but related to spam sites: why can't the AdSense code be amended so that one publisher ID will only work on one domain? Result: spam AdSense sites gone overnight.
I think the answer is obviously financial, but is increasing AdSense revenue going to offset a future drop in shareholder value if the web community loses confidence in the quality of Google search?
Does this signify that the homepage has been penalized/filtered/re-sandboxed for that phrase?
Any other questions of a more general nature for Google?
URL-only results, caching of 5-year-old pages, supplemental results, etc., etc.
Realising this is now a business, do they actually care enough to make the free service they were once so well known for work properly again?
Oops, didn't really read the post before I posted. Sorry, bret.
How about this: now that Google is in what can only be thought of as a fight with the main software developer and supplier on the planet, has their vision to take over the world changed at all over the last few months?
How to break the speed of light? How to marry quantum mechanics and classical physics? Any question at all -- truly anything -- and Matt will answer?
Do you know anything about the use of chaos theory in predicting weather cycles? No? What about given that God is infinite, and that the universe is also infinite, would you like a toasted tea-cake?
Has Matt ever got to third base with a hot apple pie?
What about -
Has Google ever thought about genetically engineering monkeys with wings to stop employees from charging room service to the company?
In all seriousness....
How long do the local versions of the servers (.co.uk, for example) take to come into line with the main datacentres with regard to updates?
And the GG question.
2. Why does Google.com continue to allow foreign domains to rank well for competitive terms? If I want a UK site, I'll search at Google.co.uk!
3. How much [insert bribe medium] to get added to the weekly Algo Update email? :-)
How does a web page get marked as "supplemental"? What can I do as a webmaster to get my page that's marked as "supplemental" out of the supplemental results?
Why does Google have a supplemental index?
Re-index the darn supplementals: none of my 30+ supplementals still exist, or they changed many moons ago.
And how about looking at my robots.txt exclusion list? Seems like you pick up 1 entry every two weeks (only 29 weeks to go).
If I use base href, then please read it, believe it, and don't index the non-www pages.
Maybe add a new meta tag that tells you www or non-www for the site?
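For what it's worth, the usual workaround for the www/non-www duplication complained about above is to canonicalize at the server rather than wait for the crawler to honour base href. A minimal sketch, assuming Apache with mod_rewrite enabled and a placeholder domain:

```apache
# Hypothetical .htaccess sketch: 301-redirect non-www requests to www
# so the crawler only ever sees one version of each URL.
# "example.com" is a placeholder, not any site mentioned in this thread.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

A permanent redirect like this sidesteps the need for the proposed www/non-www meta tag entirely, since both hostnames collapse to one before indexing.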
2. Why does a new site, with just a couple of straight HTML links, have its pages listed as supplementals? (I have seen this with two of my sites. Forget unnatural linking; there was practically NO linking.) Is this normal Google behavior with new sites?
3. After all the Googlebowling and Googlewashing and 302 hijacks and meta refresh hijacks, does the old statement that there is 'almost' nothing anyone else can do to make a site lose its rankings still hold?
** I have sort of made up my mind about all of them; just need to know what G has to say.
What is in the works relative to pre-fetching? Where do you see it all going and what is the expected impact on webmasters and users/visitors?