Well, this kind of combines many of the points above, but being very open ended it would be interesting to see what Matt has to say.
He's stated that one of the reasons he started his blog was to see the 'net from a webmaster's point of view. So, what has he found out? Has it changed his thinking from viewing the web purely from the search side? What would Matt, as a webmaster, like to see SEs do better?
The following scenario:
A site has been reported to have hidden text on its home page and at least 10 interior pages. These pages rank on first page SERPs. The home page as well as the interior pages that have hidden text are all reported via Google Spam Report.
After filing a spam report every six weeks to two months, the site is finally removed from the Google index. (Yay!).
Three weeks later, the site appears back in the index and SERPs, but only the home page has changed - it no longer has hidden text, though the ten interior pages still do (boooooooo).
It was reported to Google again but no action has occurred.
Matt specifically says in his blog that you're supposed to promise not to do it again when they let you back into the index, but that certainly did not happen in this case.
Have him publicly say that AdWords reps have no influence on getting banned sites back into the index. This rumour is running rampant in the bars.
What is he going to do about it?
When a scraper (made for adsense) site is removed from Google search results, why can't Google block or remove adsense at the same time?
I have to vote for this one. I think a lot of people are feeling that they are being hurt by internal site duplicate content penalties.
I also think that a lot of people are trying to flush them out of Google's index.
I also think that getting old versions of pages out of the index is in Google's best interest, because they waste space. If a page or site is removed via the removal tool, it should be gone from the index, period.
He is admitting that Google does manually change things; they have to. Math is good, but in this case it has a margin of error.
What upsets me is that someone can have hidden links, hidden text and all, yet be reinstated 30 days later. On the other hand, I'm nowhere even for my "domain.com," because the algo apparently thought that I cheated, and I have no one to contact. This week it will be a full 30 days since I removed a link from my other site to the one now penalized, thinking that was what caused the huge drop.
Some crime...huh? Linking from one site to the other.
In the regional Googles, what are some of the common factors that help websites filter through to the default (the web), so they can rank under quality search terms that would not have been possible on the .com?
Also on that note, what are some of the common reasons why a website drops back and is not one of the lucky ones to be filtered through, when it was filtered through before?
Is there anything we can do on our end, or is it all in the hands of the adjustments you make to your regional filtering algo?
1) What _really_ is the supplemental index, why is it so problematic, and when will it be fixed?
2) What's so bad about a site review process where sites pay a fee to have the sites examined for technical errors or honest mistakes?
If it were me I'd ask him if MS has made him an offer. :)
I'm new here, but not new to the SEO industry. I follow this forum regularly.
MY QUESTION:
What is Google up to? What is Google looking for from webmasters in terms of good sites?
I have followed Google's guidelines properly and did everything that is ethical, but with no good results. Meanwhile, a competitor that is simply using spam is ranking well in Google, with no action from Google even after the spam was reported.
Many legitimate websites appear to have been caught in the net by Google’s tougher anti-spam measures (e.g. those that have changed domain name using a 301 redirect). Is it time to accept that even the most ingenious algorithms have limitations and offer a human checking service for sites that have been unfairly penalised?
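For context, a permanent domain move of the kind mentioned above is usually signalled with a 301 redirect at the old domain. A minimal sketch, assuming Apache with mod_alias enabled (both domain names here are placeholders):

```apache
# Sketch of a whole-domain move via 301 (Apache, mod_alias assumed).
# old-domain.example and new-domain.example are placeholders.
<VirtualHost *:80>
    ServerName old-domain.example
    # Permanently redirect every path to the same path on the new domain
    Redirect permanent / http://new-domain.example/
</VirtualHost>
```

The "permanent" keyword is what makes this a 301 rather than a temporary 302, which is the signal search engines are expected to treat as "the site has moved for good".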
Will Google ever produce any 'webmaster tools' that allow someone to scan a webpage for that hard-to-define attribute: "Quality"?
A set of graphical results could show how well the page scores in, for example:
Off-page optimization (low quality if spammy backlink patterns exist, high quality if stable backlinks from relevant sites);
Technical factors (stability, dedicated IP better than shared IP, etc.)
A webmaster tool might save people from running around in circles when their site drops, and save workload for Google in dealing with webmaster queries.
Different topic, new post:
This is one not so much for Matt but related to spam sites: why can't the AdSense code be amended so that one publisher ID will only work on one domain? Result = spam AdSense sites gone overnight.
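As a purely hypothetical sketch of the idea, the ad snippet could check the page's hostname against a whitelist registered for the publisher ID before rendering anything. All names and IDs below are invented; this is not how the real AdSense code works:

```javascript
// Hypothetical sketch: bind a publisher ID to a registered domain list,
// so the ad code refuses to render anywhere else.
// The registry would live server-side in practice; it is inlined here
// only to make the example self-contained.
const PUBLISHER_DOMAINS = {
  "pub-1234567890": ["example.com", "www.example.com"],
};

function adsAllowed(publisherId, hostname) {
  const allowed = PUBLISHER_DOMAINS[publisherId] || [];
  return allowed.includes(hostname.toLowerCase());
}

// In a browser the snippet would gate ad injection on the check:
//   if (adsAllowed("pub-1234567890", window.location.hostname)) { ... }
console.log(adsAllowed("pub-1234567890", "www.example.com")); // true
console.log(adsAllowed("pub-1234567890", "scraper-site.net")); // false
```

A scraper copying the snippet wholesale would then serve ads on a hostname the publisher never registered, and the check would simply fail.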
I think the answer is obviously financial, but is increasing AdSense revenue going to offset a future drop in shareholder value if the web community loses confidence in the quality of G Search?
Does changing the title of an existing page cause it to be filtered/resandboxed?
And lastly, why, following Jagger, is the first page from a particular domain to rank for a particular phrase no longer the homepage, but instead an inner page with far fewer (perhaps zero) external BL's?
Does this signify that the homepage has been penalized/filtered/re-sandboxed for that phrase?
Cool batch of questions. As usual, most slant towards the algo side, and as usual, G is not going to talk about algo issues too much. The second they talk about them, someone devises a way to exploit them.
Any other q's of a more general nature for google?
Google, for a large portion of professional webmasters, appears to have been broken for some time.
URL-only results, caching of five-year-old pages, supplemental results, etc.
Realising this is now a business, do they actually care enough to make the free service for which they were once so well known work properly again?
>Any other q's of a more general nature for google?
Oops, didn't really read the post before I posted. Sorry, Brett.
How about: now that Google is in what can only be thought of as a fight with the main software developer and supplier on the planet, has their vision to take over the world changed at all over the last few months?
How to break the speed of light? How to marry quantum mechanics and classical physics? Any question at all -- truly anything -- and Matt will answer?
Do you know anything about the use of chaos theory in predicting weather cycles? No? What about given that God is infinite, and that the universe is also infinite, would you like a toasted tea-cake?
Has Matt ever got to third base with a hot apple pie?
What about -
Has Google ever thought about genetically engineering monkeys with wings to stop employees from charging room service to the company?
In all seriousness....
How long do the local versions of the servers (.co.uk, for example) take to come into line with the main datacentres with regard to updates?
And the GG question.
Did he hear about any plans for expansion outside the search market?
Weird. I thought I had a question in this thread.
What does it take for Google to like a "new site"?
Why no default Brit English spelling on G uk/ireland/aus/nz?
1. Does having a large number of pages in the Supplemental Index decrease the perceived validity and relevancy of the site as a whole, especially for those pages which are NOT supplemental?
2. Why does Google.com continue to allow foreign domains to rank well for competitive terms? If I want a UK site, I'll search at Google.co.uk!
3. How much [insert bribe medium] to get added to the weekly Algo Update email? :-)
How about introducing a location metatag, e.g. <meta name="location" content="UK">?
How about a question about the Supplemental Results.
How does a web page get marked as "supplemental"? What can I do as a webmaster to get my page that's marked as "supplemental" out of the supplemental results?
Why does Google have a supplemental index?
Here's a new one:
Re-index the darn supplementals - none of my 30+ supplementals still exist, or they changed many moons ago.
And how about looking at my robots.txt exclusion list? Seems like you pick up one entry every two weeks (only 29 weeks to go).
If I use base href, then please read it, believe it, and don't index the non-www pages.
Maybe add a new meta tag that tells you www or non-www for the site?
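As a sketch of what the two suggestions above might look like together in a page head - note that the "preferred-host" meta tag is the poster's hypothetical proposal, not a tag Google actually supports, and the hostname is a placeholder:

```html
<head>
  <!-- base href already exists in HTML: it declares the base URL
       that relative links on this page resolve against -->
  <base href="http://www.example.com/">
  <!-- hypothetical tag per the suggestion above: tell the crawler
       which hostname (www vs non-www) is canonical for this site -->
  <meta name="preferred-host" content="www.example.com">
</head>
```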
1. Does the sandbox really exist, or is it a side effect of something else?
2. Why does a new site, with just a couple of straight HTML links, have its pages listed as supplemental? (I have seen this with two sites of mine. Forget unnatural linking; there was practically NO linking.) Is this normal G behavior with new sites?
3. After all the Googlebowling and Googlewashing and 302 hijacks and meta refresh hijacks, does the old statement that there is 'almost' nothing anyone else can do to make a site lose its rankings still hold?
** I have sort of made up my mind about all of them; just need to know what G has to say.
So there you have it Brett,
Ask about the Supplementals,
then sneak one in about Supplementals
and while he's not looking ask again about Supplementals,
and if he still has time, see if he can shed any light on Supplementals!
Haven't heard much lately on our friend;
What is in the works relative to pre-fetching? Where do you see it all going and what is the expected impact on webmasters and users/visitors?
You got it, Cleanup! :-)
Is Google ever going to provide an incentive for people to build sites/pages that are web standards-compliant and accessible?