|Some Q&A answers|
GoogleGuy answers some questions from last week.
Okay, I sat down and thought about answers to some of the questions that people asked last week. Brett, how about we work it like this: I'll do the next several posts, with one answer per post. Please wait for me to finish posting answers, and then we can keep this thread open for people to discuss. Does that sound good? I'll let people know when I'm done posting my responses, and if everyone is courteous to each other, I'll try to post some more responses in a few days. As always, I'll try to do my best to give good answers, but bear in mind that this is my personal take on things. Does that sound fair?
I never thought of "off-theme" as being bad, I just think that there is a lot of value to having links coming from the majority of on-theme pages on the web.
And when I speak of theming I am usually talking about lexical theming on-page: something like what is being done with Google Sets, but not quite.
You have to be able to understand the theme of a page and the words and grammar that is appropriate for that theme before you would be able to compare the theme and the language of one page to another.
For example, if you enter "ford", "chevrolet", "dodge" into Google Sets, you get a list of other manufacturers. But there are a lot of other related words that could give you a good idea of how on-theme a page might be lexically.
Those three names can represent many different things that people might be looking for, and theming might be used to make sure that the first page of the SERPs serves up a selection of differently themed pages so that the searcher has a choice.
Give them a couple of pages with the "truck transfer-case off-road" results. Some pages of car parts, and some pages of dealers and collectors.
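The lexical-theming idea above can be made concrete with a toy comparison. This is purely a hypothetical sketch, not anything Google has described: it scores how "on-theme" two pages are by the cosine similarity of their raw term-frequency vectors. A real system would add stemming, stop-word removal, and an expanded vocabulary of related terms (the Google Sets idea), but the shape of the comparison is the same.

```python
from collections import Counter
import math

def theme_similarity(text_a: str, text_b: str) -> float:
    """Cosine similarity between the term-frequency vectors of two pages."""
    a, b = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

# Toy "pages" built from the ford/chevrolet/dodge example:
car_page = "ford chevrolet dodge dealer parts truck"
offroad_page = "truck transfer-case off-road dodge"
cooking_page = "flour butter sugar oven recipe"

# The car page shares vocabulary with the off-road page but not the recipe page.
print(theme_similarity(car_page, offroad_page) >
      theme_similarity(car_page, cooking_page))  # True
```

The point is only that on-theme pages share vocabulary beyond the query terms themselves, which is what lets a scorer diversify the first page of results.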
|GG wrote: ...Lots of web designers don't think about how search engines will see a site (lots of session IDs, or framesets, or dynamic urls, etc.). I think one of search engines' big jobs will be indexing a site intelligently even if the site wasn't designed with search engines in mind. Our bots do a pretty good job, but it would always be nice to do more so that people (users and web designers) don't have to think as much about search engines and how they work. |
This begs the question > Is Google getting more involved in "web standards", e.g. developing with, say, the W3C?
Part of the problem is that the current guidelines say "design for the visitor" and never really hint at what designing for the visitor actually means... therefore everyone who reads this dialogue draws their own interpretation, such as:
OK, lots of session IDs, framesets, and dynamic urls; we have now designed for the user! Obviously not what Google had in mind, but nonetheless the outcome of vague guidance.
This is my only real pet peeve with G.
I see a lot of questions, answers and speculation about penalisation etc. I think it would be really nice to have an overview, to put an end to the speculation once and for all.
Things I like to know:
- How do you know when you're being penalised?
- Does PR0 automatically mean you are penalised, or does it mean you just naturally have a low PR? In the latter case, would it be a good idea to introduce a separate "status" (e.g. "red" means you should review that page and find out what's wrong with it)?
I know GG says that webmasters usually will know when they are doing things wrong, but I see a lot of very paranoid people here, and with a lot of discussion on things like in- and outbound links, mirror domain names etc., it is sometimes not that obvious.
It looks like a lot of people are often as unsure as I am. For example, I have a PR3 on my homepage, a PR1 on the search result page (for some unexplainable reason?!) and PR0 on every other page (almost no gray bars). I do have mirror domains (alternative spellings) and some pages with a lot of links to detail pages. To me this is all pretty natural stuff, but maybe there is something that is seen as "bad practice". I think for a lot of people in that situation it would be good if a page could be "flagged" as a problem page so the webmaster can assess it.
By the way, thanks to GG and all the excellent people on this forum. I'm quite new to it and the subject and I have been enjoying all the knowledge here! Thumbs up!
"Q: Are there penalties that last forever? And what would be a possibility to remove these penalties from one's website?"
My Q. is: What are the minimum and maximum "penalty times" that are given out for all other offences (excluding the Google death penalty)? I mean, will I get "the chair" for linking to a bad neighborhood?
Q. Are all offences (excluding the Google death penalty) curable just by undoing whatever you did to get banned, and if so, is there a set time frame for each type of ban, or are they all the same (as far as the time you will be out of Google)?
This is just a shot in the dark, but are there any plans (in the near future) to have a phone number to help webmasters with their banned sites, missing-from-the-index sites, can't-get-Googlebot-to-crawl sites, etc.? Even if this were a $100-a-call service, you would have no problem getting takers. (Can anyone second that for me? :)
Lastly, thanks for your time and understanding; you sure do represent Google in the right way (even though everything you say is off the record :).
>I'm not positive that I'm a huge fan of the theming arguments that people have made--some of the most useful links I've seen are from "off-topic" sites--but I would definitely agree that it helps users to link to useful, relevant, related sites. So I could see where someday we might [adjust] our scoring to reflect that in some part. If you see tons of links flowing into a site and not a single link to the rest of the web, then as a user I might scratch my head a little bit.
I consider the above significant, as the above by GG seems to suggest that Google doesn't use any theming today when it comes to links between sites. This is what I had assumed by looking at SERPs. And, I also tend not to be a fan of the theming arguments that people have made. The idea has some problems in practice.
|How do you know when you're being penalised? |
If you don't know there is a problem (or a bad tactic being used) then you really have no gauge to say: yes I am penalised, or no I am not.
The first general indications are > much lower visitation and normally much lower web-related conversions.
A person who knows of PR can normally tell quite easily... if you have at least one link to you and that link is a PR2 or higher (the lower values would need fewer links sharing in the passed PR). If nothing registered on the "link to" page then something is wrong (a bad time to use this gauge, though, since the update hasn't really completed).
|Does PR0 automatically mean you are penalised or does it mean you just naturally have a low PR? |
The latter > most sites where you see a Toolbar PR0 have a little PR, just not enough to register on the toolbar as PR1.
|In the latter case, would it be a good idea to introduce a separate "status" (e.g. "red" means you should review that page and find out what's wrong with it)? |
Possibly... in defence of the current trends, you get far superior measurements and gauges in SEM strategies than in any other, more traditional strategies.
e.g. have you ever advertised in a newspaper? > You paid for potential circulation, not actual eyeballs. An ad in the newspaper might set you back $1000 for a good-size, once-only ad, with a circulation of half a million. On a bad day > 60% of those newspapers are never read, just recycled, but you paid for them > and didn't complain either, since you never knew...
You were quite happy being ignorant of that loss... see any parallel?
Simple understanding goes a long way to gauging outcomes.
Kudos to GoogleGuy for this answer-a-thon.
Previous threads on this topic:
<< I'm not positive that I'm a huge fan of the theming arguments that people have made--some of the most useful links I've seen are from "off-topic" sites--but I would definitely agree that it helps users to link to useful, relevant, related sites. So I could see where someday we might [adjust] our scoring to reflect that in some part. If you see tons of links flowing into a site and not a single link to the rest of the web, then as a user I might scratch my head a little bit. >>
This definitely implies, as most of us already knew, that theming is currently NOT being used in a significant way.
Thanks for the interesting information. One thing you didn't touch on in the area of spamming, and which I'm a little concerned about, is a statement I read the other day concerning the use of transparent images with links.
I've been looking for the URL to clear this up .. but I forget.
Anyway, it was on Google, and it mentioned something about Google considering it spam if you have a GIF image that is transparent and wrapped in a URL. How is this going to affect site-tracking software, which often uses non-human-readable images and links to track pages?
Another thing is that you (and others) talk a lot about PAGE RANK, but for the life of me I can't find where you locate your current PageRank in Google. Do you manually count the pages ahead of your URL? Surely not?
I think your query was partly covered by the following thread
Hello Kiwi001, welcome to WebmasterWorld. You will find your answer regarding PageRank over here:
WebmasterWorld Google Knowledgebase V2 [webmasterworld.com]
Thanks for the inside information GoogleGuy. This kind of thread will help reduce the misguided methods and the conspiracy theories.
|Q: Are there penalties that last forever? And what would be a possibility to remove these penalties from one's website? |
A: There are things that need a manual review before they're lifted. If a webmaster is pretty sure that they did something wrong, they can mail to webmaster at google.com with the subject line "reinclusion request." It helps to describe what you think happened, and what you changed on the site to make sure that everything is in good shape now.
A site can harm itself but can a penalised site harm sites it links to in any way?
What is Google's view of adult websites? Do they tend to get generally lower ranking than other websites?
I have done mostly normal websites, but I am working with an adult website now; is there anything I should keep in mind to follow your rules?
GoogleGuy, first off, thanks.
At first I was a little disappointed in the choice of questions that you answered, especially the lack of an answer on the huge "is freshbot now deepbot" thread, but until we get a rep from MSN and Yahoo in here, you are proving that Google is the only SE that has any communication at all with webmasters.
Here is what I have learned.
1. Dmoz will continue to be used for the Google Directory
2. Theming means squat in terms of ranking.
3. There will be, at some point, another update.
4. You have an office with a window.
Thanks guys, sorry for posting those Q's of mine in the wrong area, I appreciate the correct links.
A big thank you to GoogleGuy for indulging us all to such an extent; this has obviously taken some time. I'll add to mfishy's list:
There seems to be no long term reason for a home page to be buried for particular search terms.
High PR pages are natural (not assigned by hand).
Things should become clearer after the next update.
GoogleGuy even penalises ex college buddies for hidden text.
|Web Footed Newbie|
Thanks GG for the generous time and good answers, especially ODP and spam.
In msg #12:
"The really nasty part is that unknown to their clients, the SEO also inserted 7-8 hidden links back to the SEO, so roughly half the PageRank that each customer had earned was getting routed to the SEO!"
No, I am not defending this guy. The bold type above is what I am interested in.
Does this mean that if I have good PR and link to another site, I am sharing my PR with that site? Or more importantly, how much PR deduction is there for 1 link? 10 links? I am not selfish with linking (I currently have over 100 outbound links to relevant content). But the bold type above causes me to ask: do I need to be selfish with my linking?
I take my suffrage seriously: the question of "voting" for a site with a link from my page to theirs is important to the content I provide my users! I have no problem sharing, if the effect on PR is very small. The goal is to offer as much info to my constituents as possible (good content & good links).
Again, thanks for the answers! WFN :)
Yeah agree with you ciml, on all those negative posts about Google I said Googleguy is an example of why Google is the best. Which other SE has a representative posting on webmaster forums trying to help us out.
|Q: Does Google like honest SEO's or would you prefer there not be any? |
Thanks for answering my question. As Google's algos get better and have less Dominic-type downtime, you will help discourage less-than-honest SEOs from even existing. But I guess you know that.
|Lately, I also see more SEOs broadening their offerings by managing PPC for clients as well. |
Mainly because Google PPC works. We will do whatever is best for the client, and PPC has proven to be a good thing right now (especially with professional management). Keep up the good work!
Top Notch GoogleGuy
Some people here seem to have thought you may have been throwing them a googly or two.
For those not accustomed to the UK and cricket - it's a leg-spinner's "wrong 'un". You have just "dismissed" those disbelievers with your answers.
Firstly - thanks Brett for choosing my question as one of the questions to submit - and GoogleGuy - thanks for your answer (see #9).
So GG - my idea for implementation of this two tier spam reporting system is that 'another tier' is created as a formal process - with two way communication - open to spamreportprocesspartners eg. by invitation. Anyone can 'report a competitor' under the current URL focussed system - this new system is a 'process' report.
Google then gets the experience/expertise of these selected SEO's to gain advantage of the 'spam method' - rather than 'he has... his site ranked ....' type reports.
I envisage that the 'method' report process will focus on 'how', and would include a few 'sample' sites. Being non-public, this gives Google 'strategy stealth', 'external experience' and 'implementation surprise', and a chance to roll detection of 'methods' up into a filter and deploy it, rather than multiple separate, individually hand-carved sticks to beat up individually selected cheats who break the rules.
|Ethical SEO firms report deceptive sites that violate Google's spam guidelines |
Reports which target the full disclosure of the spam method, with 'example' sites, under a new process. By communicating with, and focusing on, professional, ethical SEOs as a resource, Google can use our skills to make the filter changes it needs to be better than the rest.
If Google doesn't interact 'both ways', this won't work. GG - you've seen my reports marked for your attention. Clients (mostly) pay me to do these reports as part of a site analysis project; if they don't, how could I afford to spend hours describing to Google how and why spammers break Google's published rules? Hey - they are your rules!
Why do I, or my clients, care? Because ethical people, playing within the published rules and guidelines of business, NEED to believe there is a level playing field. And if there isn't - well, watch out - because my Sherman tank is bigger than your knife. And that's where this is headed.
The current process makes it look like 'schoolyard tattletales': 'he did this... and his site ranks....'
Google needs to make a few hard choices. It either wants to clean up and enforce the published rules, or have rampant cloaking. Yes, these are the two extremes, and my personal belief is most people want to 'play fair'. But hey, if cloaking is, e.g., an unpenalised offence, and is undetectable and no one wants to detect it, I know heaps of MDs who'd love to have really cool non-W3C-compliant, Flash-only sites that will also rank really highly in search engines.
I think this proposed reporting process will also provide Google all the advantages of DMOZ - i.e. review at a high conceptual level, i.e. human review - at low cost. Let's talk.
(I'd offer to buy you lunch - but as you are in an office at the Mountain View 'Plex - with a window overlooking the 'Google sign' - and as I'm in Sydney - that's probably not possible... unless .. GG - you're really named Kate or Jane ... and can see Darling Harbour.... I had no idea.......)
"A: I think shaadi asked this question. So shaadi, I think if you check the site that you reported for hidden text, you'll find that it's in the penalty box."
GoogleGuy or any one else that may know....
Is there such a 'penalty box'? If my site has been penalised, is there somewhere I can go where these sites are listed, or something?
|Our hidden text detection recently found hidden text on the page of someone I knew from college. That page got the same treatment as any other page. When the white-on-white text was removed, the page came back just fine and everyone was happy. |
As much as I hate hidden text, I'm wondering how strict this filter is. If it's on the home page or in the footer of every page of the site and there's just a ton of hidden text, I can understand a site penalty. But in forums I've often seen people put hidden text in the signature just to be cute. And if they are frequent posters, there could literally be a TON of it throughout a forum. It would be a shame to penalize a site for this.
Or when the web just started out, a lot of us had those cutesy home pages where we tried every html trick in the book. The old days sure were exciting and fun. I remember testing every ugly html trick in a sort of tutorial...the "blink" tag, hiding text for riddles, etc..
So my follow up question is, for inside pages, when there is hidden text in signatures of a forum, or if someone has a quiz and hides the answers in hidden text, does google penalize the page or simply invalidate the hidden text as a keyword in the search?
P.S. Another possibility would be to test the hidden text for how competitive the keywords inside are, or how related they are to each other.
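For what it's worth, the crudest case GG described (white-on-white text) is easy to sketch as a check. The function below is a hypothetical illustration, not Google's actual filter: it only flags inline <font> colors that match an assumed page background, while a real detector would also have to resolve external CSS, near-matching colors, tiny fonts and off-screen positioning, and then make exactly the page-vs-site judgment call the question above raises.

```python
import re

def has_white_on_white(html: str, background: str = "#ffffff") -> bool:
    """Flag inline-styled text whose color matches the page background.

    A deliberately naive check: it scans <font color=...> attributes only,
    and treats "white" or the given background hex as hidden text.
    """
    for color in re.findall(r'<font[^>]*color=["\']?(#?\w+)', html, re.I):
        if color.lower() in (background, "white"):
            return True
    return False

spam = '<body bgcolor="#ffffff"><font color="#ffffff">cheap widgets</font></body>'
clean = '<body bgcolor="#ffffff"><font color="#000000">Welcome!</font></body>'
print(has_white_on_white(spam), has_white_on_white(clean))  # True False
```

A per-page check like this also shows why the forum-signature question matters: the filter fires on the page containing the text, and any escalation to a site-wide penalty is a separate policy decision.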
WebFootedNewbie: What he is saying is that if your site has a PR4, and you have 10 links out plus 10 hidden SEO links, your PR voting is being split by 20 links (10 you see, 10 you don't). You are not LOSING PR because of the links, you are just wasting 50% of your voting power!
PR doesn't "go away to others", you vote for other sites by linking to them.
That is my understanding after a year of this place! :-)
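RawAlex's arithmetic can be made concrete with the published PageRank formula, in which a page's vote is divided evenly across all of its outbound links. This is a sketch for illustration only: toolbar PR is on a logarithmic scale and Google's internal values differ, but the even split is the point.

```python
DAMPING = 0.85  # damping factor from the original PageRank formulation

def vote_per_link(page_pr: float, visible_links: int, hidden_links: int = 0) -> float:
    """PR passed through each link: the page's vote splits evenly over ALL links,
    including any hidden ones the webmaster doesn't know about."""
    total = visible_links + hidden_links
    return (DAMPING * page_pr) / total if total else 0.0

# 10 visible links alone vs. 10 visible plus 10 hidden SEO links:
clean = vote_per_link(4.0, 10)        # roughly 0.34 per link
diluted = vote_per_link(4.0, 10, 10)  # roughly 0.17 per link: half the voting power
print(clean, diluted)
```

Note that the linking page's own PR is unchanged in this model; only the vote each outbound link carries is diluted, which is exactly the "wasting 50% of your voting power" point.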
|4. You have an office with a window. -mfishy |
Actually, GG said
|Although once I was looking out a window and I saw something neat |
Could have been the lobby window, so I think we're back to square one :)
|Web Footed Newbie|
Thanks, RawAlex for that, I thought I was a bit confused!
Thanks GoogleGuy, great input. However I have a question on your take of spam.
|Our ultimate goal is to improve quality using only automated algorithms. Those algorithms may take longer to get right, but the nice thing is that when they're done, they can often shut down an entire type of spam. |
Since these algorithms sometimes take very long to tweak, why should I not join the spammers? Let's say that my clean site is being pushed down by sites that are using spam (1x1 pixels, hidden text, external CSS files, etc.). I could build a small website using the spam techniques that work for the competing spamming websites. I would then get the ranking and traffic that the spammers are enjoying. When Google finally catches the spam method, my small site and all the spammers are removed, and my clean site bounces back.
I am not advocating spam and do not want to play these tricks. But based on your answer, this seems like a way that someone with no ethics could take advantage of the system.
>Could have been the lobby window, so I think we're back to square one
Good grief. People here really do dissect everything GG says closely. ;)
Hmm. That's a great idea. It will give G more data to work with, give you some interim traffic and once you are penalized you can just link to your competitors in a "friendly" gesture.
SEOs are evil heh heh heh.
Thank you Googleguy for enhancing our enlightenment here.
In connection with spam reporting and the development of algorithmic solutions, a couple of areas not touched upon, and which occupy a great deal of WebmasterWorld bandwidth, are 1) cross-linking and 2) duplicate content.
I'm wondering if you would be willing to entertain questions here, or perhaps in another thread, regarding those two rather perplexing topics. Particularly, are they a hot topic within Google as far as spam techniques are concerned, and are algorithmic changes being developed to address these areas?
No one seems to know, for instance, what constitutes "excessive cross-linking", leaving hidden links aside.