And when I speak of theming I am usually talking about lexical theming on-page: something like what is being done with Google Sets, but not quite.
You have to be able to understand the theme of a page, and the words and grammar that are appropriate for that theme, before you can compare the theme and the language of one page to another.
For example, if you enter "ford", "chevrolet", "dodge" into Google Sets, you get a list of other manufacturers. But there are a lot of other related words that could give you a good idea of how on-theme a page might be lexically.
Those three names can represent many different things that people might be looking for, and theming might be used to make sure that the first page of the SERPs serves up a selection of differently themed pages so that the searcher has a choice.
Give them a couple of pages with the "truck transfer-case off-road" results, some pages of car parts, and some pages of dealers and collectors.
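For illustration only, here is one naive way such lexical theming could be scored (a hypothetical sketch, not anything Google is known to do): treat a theme as a seed vocabulary and measure a page's word overlap with it.

```python
def theme_score(page_words, theme_vocab):
    """Jaccard overlap between a page's vocabulary and a theme vocabulary:
    |intersection| / |union|, from 0 (no overlap) to 1 (identical)."""
    page = {w.lower() for w in page_words}
    theme = {w.lower() for w in theme_vocab}
    if not page or not theme:
        return 0.0
    return len(page & theme) / len(page | theme)

# Hypothetical seed vocabulary for the auto theme discussed above.
auto_theme = {"ford", "chevrolet", "dodge", "truck", "transfer-case",
              "off-road", "dealer", "parts"}

page = "Ford and Dodge truck parts from a local dealer".split()
print(round(theme_score(page, auto_theme), 2))  # 0.42
```

A real system would need far richer signals (phrase co-occurrence, grammar, term weighting), but even this crude score separates an on-theme parts page from an off-theme one.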
GG wrote: ...Lots of web designers don’t think about how search engines will see a site (lots of session ID’s, or framesets, or dynamic urls, etc.). I think one of search engines' big jobs will be indexing a site intelligently even if the site wasn’t designed with search engines in mind. Our bots do a pretty good job, but it would always be nice to do more so that people (users and web designers) don’t have to think as much about search engines and how they work.
This raises the question > Is Google getting more involved in "web standards", e.g. by working with, say, the W3C?
Part of the problem is that the current guidelines say "design for the visitor" but never really hint at what designing for the visitor actually means... therefore everyone who reads this dialogue draws their own interpretation, such as:
OK, lots of session IDs, framesets, and dynamic URLs; we have now designed for the user! Obviously not what Google had in mind, but nonetheless the outcome of vague guidance.
This is my only real pet peeve with G.
Things I'd like to know:
- How do you know when you're being penalised?
- Does PR0 automatically mean you are penalised, or can it mean you just naturally have a low PR? In the latter case, would it be a good idea to introduce a separate "status" (e.g. "red" meaning you should review that page and find out what's wrong with it)?
I know GG says that webmasters will usually know when they are doing something wrong, but I see a lot of very paranoid people here, and with so much discussion of things like in- and outbound links, mirror domain names, etc., it is sometimes not that obvious.
It looks like a lot of people are often as unsure as I am. For example, I have a PR3 on my homepage, a PR1 on the search result page (for no reason I can explain) and PR0 on every other page (almost no gray bars). I do have mirror domains (alternative spellings) and some pages with a lot of links to detail pages. To me this is all pretty natural stuff, but maybe something is being seen as "bad practice". I think for a lot of people in that situation it would be good if a page could be "flagged" as a problem page so the webmaster can assess it.
By the way, thanks to GG and all the excellent people on this forum. I'm quite new to it and the subject and I have been enjoying all the knowledge here! Thumbs up!
My Q. is: what are the minimum and maximum "penalty times" given out for all other offences (excluding the Google death penalty)? I mean, will I get "the chair" for linking to a bad neighborhood?
Q. Are all offences (excluding the Google death penalty) curable just by undoing whatever you did to get banned? And if so, is there a set time frame for each type of ban, or are they all the same (as far as how long you will be out of Google)?
This is just a shot in the dark, but are there any plans (in the near future) for a phone number to help webmasters with their banned sites, sites missing from the index, sites Googlebot won't crawl, etc.? Even if this were a $100-a-call service, you would have no problem getting takers. (Can anyone second that for me? :)
Lastly, thanks for your time and understanding; you sure do represent Google in the right way (even though everything you say is off the record :).
teeceo.
I consider the above significant, as the above by GG seems to suggest that Google doesn't use any theming today when it comes to links between sites. This is what I had assumed by looking at SERPs. And, I also tend not to be a fan of the theming arguments that people have made. The idea has some problems in practice.
How do you know when you're being penalised?
If you don't know there is a problem (or that a bad tactic is being used) then you really have no gauge to say: yes I am penalized, or no I am not.
The first general indications are > much lower visitation and normally much lower web-related conversions.
A person who knows PR can normally tell quite easily... if you have at least one link to you from a PR2 or higher page (lower values would need more links sharing in the passed PR), and nothing registers on the linked-to page, then something is wrong (a bad time to use this gauge, though, since the update hasn't really completed).
Does PR0 automatically mean you are penalised, or does it mean you just naturally have a low PR?
The latter > most sites on which you see a toolbar PR0 have a little PR, just not enough to register on the toolbar as PR1.
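The PR0-vs-a-little-PR point can be illustrated under the widely guessed (never confirmed) assumption that the toolbar bar is a logarithmic bucketing of raw PageRank; the base and cutoff below are purely hypothetical:

```python
import math

def toolbar_pr(raw_pr, base=8.0):
    """Map raw PageRank to a 0-10 toolbar bar, assuming a logarithmic
    scale. The base and the cutoff are illustrative guesses only."""
    if raw_pr < 1.0:
        return 0  # real but small PR still displays as toolbar PR0
    return min(10, int(math.log(raw_pr, base)) + 1)

# A page with a little raw PR shows the same PR0 as a penalised page,
# so the toolbar bar alone cannot distinguish the two cases.
print(toolbar_pr(0.4), toolbar_pr(10.0))  # 0 2
```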
In the latter case, would it be a good idea to introduce a separate "status" (e.g. "red" means you should review that page and find out what's wrong with it)?
Possibly... though in defence of the current situation, you get far superior measurements and gauges with SEM strategies than with any more traditional strategy.
e.g. ever advertised in a newspaper? You paid for potential circulation, not actual eyeballs. A good-sized one-off ad might set you back $1000, with a circulation of half a million. On a bad day > 60% of those newspapers are never read, just recycled, but you paid for them anyway > and you didn't complain either, because you never knew...
You were quite happy with being ignorant to that loss... see any parallel?
Simple understanding goes a long way to gauging outcomes.
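The newspaper comparison above comes down to a quick cost-per-reader calculation (using the hypothetical figures from the example):

```python
ad_cost = 1000.0        # one-off ad price from the example above
circulation = 500_000   # printed copies
read_rate = 0.40        # on a bad day, 60% are recycled unread

cost_per_copy = ad_cost / circulation                  # what you paid per printed copy
cost_per_reader = ad_cost / (circulation * read_rate)  # what you really paid per reader
print(cost_per_copy, cost_per_reader)  # 0.002 0.005
```

You paid 2.5x more per actual reader than per printed copy and never saw that loss measured, which is the parallel being drawn with SEM's superior measurability.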
Google does consider cloaking to be outside our guidelines. Truthfully, the use of cloaking seems to be in decline. I’ve seen several SEOs serve up pages that do JavaScript redirects or other types of redirects, but it’s getting to be pretty rare to see actual textbook cases of cloaking.
I agree: at least a good portion of problem results (pages whose content does not match their Google description) make use of JavaScript and sneaky client-side redirects as "poor man's cloaking".
Is there any hope that Google might start parsing JavaScript anytime soon in order to get rid of this particular type of problem?
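As a rough illustration of what "parsing JavaScript" for this purpose might involve at the simplest level, a crawler could pattern-match the common redirect idioms. This is a hypothetical sketch; obfuscated scripts would defeat it, which is exactly why real detection is hard:

```python
import re

# Common client-side redirect idioms (illustrative, not exhaustive).
REDIRECT_PATTERNS = [
    re.compile(r"(?:window|document|self|top)\.location(?:\.href)?\s*=", re.I),
    re.compile(r"location\.replace\s*\(", re.I),
    re.compile(r'<meta[^>]+http-equiv\s*=\s*["\']?refresh', re.I),
]

def looks_like_redirect(html):
    """True if the page source contains a likely client-side redirect."""
    return any(p.search(html) for p in REDIRECT_PATTERNS)

print(looks_like_redirect(
    '<script>window.location.href = "http://example.com/";</script>'))  # True
```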
This definitely implies, as most of us already knew, that theming is currently NOT being used in a significant way.
Thanks for the interesting information. One thing you didn't touch on in the area of spamming, and that I'm a little concerned about, is a statement I read the other day concerning the use of transparent images with links.
I've been looking for the URL to clear this up .. but I forget.
Anyway, it was on Google, and it mentioned something about Google considering it spam if you have a transparent GIF image wrapped with a URL. How is this going to affect site-tracking software, which often uses non-human-readable images and links to track pages?
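For what it's worth, the markup pattern in question, a link wrapping a tiny (often transparent 1x1) image, is trivially detectable, which is presumably why tracking-software users are worried. A hypothetical detector:

```python
import re

# Anchor tags that wrap a single <img> (case-insensitive, illustrative).
IMG_LINK_RE = re.compile(r'<a\s[^>]*href[^>]*>\s*<img\s[^>]*>\s*</a>', re.I)
TINY_RE = re.compile(r'(?:width|height)\s*=\s*["\']?1\b', re.I)

def tiny_linked_images(html):
    """Return link-wrapped images declared 1 pixel wide and 1 pixel high.
    Note that the markup alone cannot distinguish an innocent tracking
    pixel from a spam link; that is the questioner's concern."""
    return [m for m in IMG_LINK_RE.findall(html)
            if len(TINY_RE.findall(m)) == 2]
```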
Another thing is that you (and others) talk a lot about PageRank, but I for the life of me can't find where to see your current PageRank in Google. Do you manually count the pages ahead of your URL? Surely not?
Q: Are there penalties that last forever? And what would be a way to remove these penalties from one's website?
A: There are things that need a manual review before they’re lifted. If a webmaster is pretty sure that they did something wrong, they can mail to webmaster at google.com with the subject line “reinclusion request.” It helps to describe what you think happened, and what you changed on the site to make sure that everything is in good shape now.
A site can harm itself but can a penalised site harm sites it links to in any way?
[edited by: kapow at 12:36 pm (utc) on June 12, 2003]
At first I was a little disappointed in the choice of questions you answered, especially the lack of an answer to the huge "is freshbot now deepbot" thread. But until we get a rep from MSN and Yahoo in here, you are proving that Google is the only SE that has any communication at all with webmasters.
Here is what I have learned.
1. Dmoz will continue to be used for the Google Directory
2. Theming means squat in terms of ranking.
3. There will be, at some point, another update.
4. You have an office with a window.
thanks again
mfishy
There seems to be no long term reason for a home page to be buried for particular search terms.
High PR pages are natural (not assigned by hand).
Things should become clearer after the next update.
GoogleGuy even penalises ex college buddies for hidden text.
In msg #12:
"The really nasty part is that unknown to their clients, the SEO also inserted 7-8 hidden links back to the SEO, so roughly half the PageRank that each customer had earned was getting routed to the SEO!"
No, I am not defending this guy. The bold type above is what I am interested in.
Does this mean that if I have good PR and link to another site, I am sharing my PR with that site? Or more importantly, how much PR deduction is there for 1 link? 10 links? I am not selfish with linking (I currently have over 100 outbound links to related content). But the bold type above makes me ask: do I need to be selfish with my linking?
I take my suffrage seriously: the question of "voting" for a site with a link from my page to theirs is important to the content I provide my users! I have no problem sharing, if the effect on PR is very small. The goal is to offer as much info to my constituents as possible (good content & good links).
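For context on the "sharing" question: in the originally published PageRank formula (Brin and Page, 1998), linking out does not subtract from your own PR; rather, the value you pass is divided evenly among all your outbound links, so each extra link dilutes what every other link receives. A sketch under that published model (Google's live system surely differs):

```python
def pr_passed_per_link(page_pr, outbound_links, damping=0.85):
    """Per the published PageRank formula, a page contributes
    damping * PR / C to each of its C outbound links. The page's
    own PR is not reduced by linking out."""
    return damping * page_pr / outbound_links

# The SEO-client example from msg #12: 7 hidden SEO links alongside
# 8 legitimate ones routes 7/15 (roughly half) of the passed value
# to the SEO.
hidden, visible = 7, 8
print(round(hidden / (hidden + visible), 2))  # 0.47
```

So with 100 outbound links, each link passes 1/100th of your distributable value; under this model, being "selfish" only changes how much each individual link is worth to its target, not your own page's rank.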
Again, thanks for the answers! WFN :)
Q: Does Google like honest SEOs, or would you prefer there not be any?
Thanks for answering my question. As Google's algos get better and have less Dominic-type downtime, you will help discourage less-than-honest SEOs from even existing. But I guess you know that.
Lately, I also see more SEOs broadening their offerings by managing PPC for clients as well.
Mainly because Google PPC works. We will do whatever is best for the client, and PPC has proven to be a good thing right now (especially with professional management). Keep up the good work!
Firstly - thanks Brett for choosing my question as one of the questions to submit - and GoogleGuy - thanks for your answer (see #9).
So GG, my idea for implementing this two-tier spam reporting system is that another tier is created as a formal process, with two-way communication, open to spam-report process partners, e.g. by invitation. Anyone can 'report a competitor' under the current URL-focused system; this new system is a 'process' report.
This proposed 'uncover/disclose the method' system is a separate two-way process, where selected participating SEOs are told by Google (or tell Google) that, e.g., this week or month 'JavaScript clown methods' will be the target.
Google then gets the experience and expertise of these selected SEOs to gain the advantage over the 'spam method', rather than receiving 'he has his site ranked...' type reports.
I envisage that the 'method' report process will focus on 'how', and would include a few 'sample' sites. Being non-public, this gives Google 'strategy stealth', 'external experience' and 'implementation surprise', plus a chance to roll detection of 'methods' up into a filter and deploy it, rather than wielding multiple separate, individually hand-carved sticks to beat up individually selected cheats who break the rules.
So then, e.g., all the JavaScript-redirector clowns, who provide distorted results irrespective of the search terms, will cop a penalty, because a filter to stop the 'process' will be discovered. We can help to solve the compliance problems at the symptom level.
Ethical SEO firms report deceptive sites that violate Google's spam guidelines
Reports that target full disclosure of the spam method, with 'example' sites, under a new process. By communicating with professional, ethical SEOs and focusing us as a resource, Google can use our skills to make the filter changes it needs to be better than the rest.
If Google doesn't interact 'both ways', this won't work. GG, you've seen my reports marked for your attention. Clients (mostly) pay me to do these reports as part of a site analysis project; if they don't, how could I afford to spend hours describing to Google how and why spammers break Google's published rules? Hey, they are your rules!
Why do I, or my clients, care? Because ethical people, playing within the published rules and guidelines of business, NEED to believe there is a level playing field. And if there isn't, well, watch out, because my Sherman tank is bigger than your knife. And that's where this is headed.
The current process makes it look like 'schoolyard tattletales': 'he did this... and his site ranks...'
Google needs to make a few hard choices. It either wants to clean up and enforce the published rules, or have rampant cloaking. Yes, these are the two extremes, and my personal belief is that most people want to 'play fair'. But hey, if cloaking is an unpenalised offence, undetectable and with no one wanting to detect it, I know heaps of MDs who'd love to have really cool, non-W3C-compliant, Flash-only sites that also rank really highly in search engines...
I think this proposed reporting process would also give Google all the advantages of DMOZ, i.e. human review at a high conceptual level, at low cost. Let's talk.
(I'd offer to buy you lunch - but as you are in an office at the Mountain View 'Plex - with a window overlooking the 'Google sign' - and as I'm in Sydney - that's probably not possible... unless .. GG - you're really named Kate or Jane ... and can see Darling Harbour.... I had no idea.......)
: )
Chris_D
Sydney Australia
[edited by: Chris_D at 2:55 pm (utc) on June 12, 2003]
Much appreciation for your time in this forum. I am a newbie who has been given the task of SEO for the large company I work for. You mentioned JavaScript redirects and cloaking, things I am not very familiar with. However, my company's system does all external links as dynamic links (through JavaScript, not an href). Is this the same thing as a JavaScript redirect, and will it get us penalized? This is how our web development team built our system; they did it so they can easily verify every 24 hours that all external links are live. By linking dynamically, are we risking a penalty from Google?
GoogleGuy or any one else that may know....
Is there such a thing as a 'penalty box'? If my site has been penalised, is there somewhere I can go where these sites are listed, or something?
Our hidden text detection recently found hidden text on the page of someone I knew from college. That page got the same treatment as any other page. When the white-on-white text was removed, the page came back just fine and everyone was happy.
As much as I hate hidden text, I'm wondering how strict this filter is. If it's on the home page or in the footer of every page of the site and there's just a ton of hidden text, I can understand a site penalty. But in forums I've often seen people put hidden text in their signature just to be cute. And if they are frequent posters, there could literally be a TON of it throughout a forum. It would be a shame to penalize a site for this.
Or when the web just started out, a lot of us had those cutesy home pages where we tried every html trick in the book. The old days sure were exciting and fun. I remember testing every ugly html trick in a sort of tutorial...the "blink" tag, hiding text for riddles, etc..
So my follow-up question is: for inside pages, when there is hidden text in the signatures of a forum, or when someone has a quiz and hides the answers in hidden text, does Google penalize the page or simply discount the hidden text as keywords in the search?
P.S. Another possibility would be to test the hidden text for how competitive the keywords inside are, or how related they are to each other.
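To make the white-on-white discussion concrete, here is a deliberately naive sketch of what a hidden-text check might look at for inline styles (hypothetical; a real crawler would have to resolve stylesheets, inheritance, and background images):

```python
import re

STYLE_RE = re.compile(r'style\s*=\s*"([^"]*)"', re.I)

def same_color_styles(html):
    """Flag inline styles whose text color equals the declared background
    color (the classic white-on-white trick). Intent (spam vs. a forum
    signature joke or a quiz answer) cannot be read from the markup."""
    flagged = []
    for style in STYLE_RE.findall(html):
        props = {}
        for decl in style.split(";"):
            name, _, value = decl.partition(":")
            if value:
                props[name.strip().lower()] = value.strip().lower()
        bg = props.get("background-color") or props.get("background")
        if bg and props.get("color") == bg:
            flagged.append(style)
    return flagged
```

The follow-up question above, penalize the whole page versus merely discount the hidden words, would be a policy choice layered on top of detection like this.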
PR doesn't "go away" to others; you vote for other sites by linking to them.
That is my understanding after a year of this place! :-)
Alex
Our ultimate goal is to improve quality using only automated algorithms. Those algorithms may take longer to get right, but the nice thing is that when they’re done, they can often shut down an entire type of spam.
Since these algorithms sometimes take very long to tweak, why shouldn't I just join the spammers? Let's say my clean site is being pushed down by sites that are using spam (1x1 pixels, hidden text, external CSS files, etc.). I could build a small website using the spam techniques that work for the competing spamming websites. I would then get the ranking and traffic the spammers are enjoying. When Google finally catches the "spam method", my small site and all the spammers are removed, and my clean site bounces back.
I am not advocating spam and do not want to play these tricks. But based on your answer, this seems like a way that someone with no ethics could take advantage of the system.