| 6:35 am on Jun 12, 2003 (gmt 0)|
|Although once I was looking out a window and I saw something neat. A car pulled up pretty fast and stopped right outside the Googleplex. Three people piled out and gathered out front in front of the sign for our building. |
You realize this will only encourage those who are trying to determine your true identity, don't you? LOL
I can see it now, SEOs sitting outside the Plex, mapping out windows that have a direct view to the sign.
| 6:36 am on Jun 12, 2003 (gmt 0)|
For $20 I bought a pair of barber's shears and I cut my own hair on my kitchen floor. Amortised over the life of the shears I think it's going to be about 50 cents a cut. Good advice for these troubled economic times, GG!
On topic, Marcia? :)
| 6:39 am on Jun 12, 2003 (gmt 0)|
I'm sure that they're just trying to keep things on-topic, Seattle_SEM. I will be around to answer questions and make sure we get all the feedback when the next update comes, though.
DarrylParker, sounds like you have the makings of a frugal Googler. ;) Although I have to admit that I'm not quite that frugal. :)
| 6:41 am on Jun 12, 2003 (gmt 0)|
Shall I keep my schedule open in June for that feedback rendezvous, GG, or is July looking better?
Also, regarding touring the 'plex - attend SES in CA in the summer. Google had a party at the GooglePlex during that event last year.
| 6:45 am on Jun 12, 2003 (gmt 0)|
I'm looking forward to it too, Seattle_SEM. I like June a lot, but I'm not going to commit to something that I can't really promise.
I thought about mentioning the Google Dance 2002. It was a party that we threw for Search Engine Strategies attendees last year when the conference was in San Jose. But I think even then, we just danced and ate outside and didn't really do tours. :)
| 6:46 am on Jun 12, 2003 (gmt 0)|
I am a bit disappointed. Still, maybe I had set my expectations at unrealistic levels. All the answers (the serious ones) seem perfectly logical, and I am personally happy with the responses about the ODP and your taking notice of comments by level-headed WW members. Thanks for the responses :)
| 6:48 am on Jun 12, 2003 (gmt 0)|
Don't forget about the beer at the party, GoogleGuy - there musta' been 30 kegs ;-)
| 6:53 am on Jun 12, 2003 (gmt 0)|
All I know is that there was good dancing. :)
| 7:03 am on Jun 12, 2003 (gmt 0)|
About the only thing in these answers which approaches significant to me is that this seems to suggest Google isn't going to drop the ODP soon. And, I didn't consider that likely to begin with.
| 7:06 am on Jun 12, 2003 (gmt 0)|
Glad you got something useful out of it then rfgdxm1, thanks GG, very interesting stuff.
I invested in a pair of clippers and just shave it all off once a month ;)
| 7:08 am on Jun 12, 2003 (gmt 0)|
|I think the thing that *doesn't* work well is when an SEO gives bad advice, or does things well outside quality guidelines ... the clients may have temporary trouble now. |
Don't know if this is answerable GG > Off-topic to the question - on-topic to your answer.
A penalty > does it penalize the technique/tactic only or actually penalize the page and site (which are really two different things).
The best "generalized" example I can give is > tactics or techniques do not define the substance of the web site > the physical content, topics, depth, breadth and context of that site.
So does the technique/tactic become ineffective when penalized, or is the page/site actually removed or lost (for lack of a better word)?
Google's goal is "to provide the best answers to the question," therefore I would believe the technique/tactic-ineffective approach would be more beneficial to Google, the user, and even the clients of bad SEO advice.
[edited by: fathom at 7:10 am (utc) on June 12, 2003]
| 7:08 am on Jun 12, 2003 (gmt 0)|
I like the answers on the ODP. It certainly HAS improved of late, and it's good to see that Google have noticed.
For me I think Google/ODP is a great partnership. The ODP is the Google of the directory world, and despite people's frustrations sometimes, it is far above any other directory. A different class, really.
I'd still love to see it hosted by Google though in the long term. It's a search facility, and really should be looked after by a search technology company.
Thanks for taking the time out. There's some good stuff in there.
| 7:09 am on Jun 12, 2003 (gmt 0)|
Many thanks for answering me GoogleGuy :)
| 7:13 am on Jun 12, 2003 (gmt 0)|
Q: Is cloaking still seen as bad, and furthermore, will webmasters be punished for cloaking without being asked why they use it?
I have a quick question about this. I have a phpBB2 forum that puts session IDs into the URL. I used one of the "hacks" available to turn off the session ID for Googlebot, Inktomi and several other search engines. Would this be considered cloaking, since the session ID is turned off for Google but on for normal visitors? Reason I ask is that for many months, Google successfully spidered this forum. But last month, something strange happened and Google got the URL wrong for my homepage - putting it inside my forum instead! My thoughts were perhaps the hiding of the session ID string from Googlebot caused this problem.
| 7:18 am on Jun 12, 2003 (gmt 0)|
GG Thanks for taking the time. I think you switched hats only a couple of times there :) It's a step in the right direction ;)
| 7:20 am on Jun 12, 2003 (gmt 0)|
:::Would this be considered cloaking, since the session ID is turned off for google but on for normal visitors?
If visitors see the exact same thing that Google sees, then there is nothing wrong with getting rid of the session ID. Google hates session IDs, therefore it is not cloaking when you get rid of session IDs, even if the visitors still get them.
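For readers unfamiliar with the phpBB-style "hack" being discussed, it boils down to appending a session ID to internal URLs for normal visitors but omitting it for known crawlers, so the engines index clean URLs. Here is a minimal sketch of that idea; the crawler substrings and URL format are illustrative assumptions, not phpBB's actual code.

```python
# Sketch of the "hide session IDs from crawlers" technique discussed above.
# The bot list and the sid parameter name are assumptions for illustration.

CRAWLER_SUBSTRINGS = ("googlebot", "slurp", "msnbot")  # assumed list

def is_crawler(user_agent: str) -> bool:
    """Crude check: does the user-agent string mention a known crawler?"""
    ua = user_agent.lower()
    return any(bot in ua for bot in CRAWLER_SUBSTRINGS)

def append_sid(url: str, sid: str, user_agent: str) -> str:
    """Append a session ID to an internal URL, unless the client is a crawler."""
    if is_crawler(user_agent):
        return url  # crawlers get the clean, session-free URL
    sep = "&" if "?" in url else "?"
    return f"{url}{sep}sid={sid}"
```

Usage: a regular browser hitting `append_sid("index.php", "abc", "Mozilla/5.0")` would get `index.php?sid=abc`, while Googlebot would get plain `index.php`. Whether this counts as cloaking is exactly the question being debated in the thread.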
| 7:20 am on Jun 12, 2003 (gmt 0)|
I am happy to hear that Google plans to use automated algorithms to catch spam rather than by manual removal. I agree that is a fairer way. With manual removal, only a small fraction of the transgressors are removed leaving the others to benefit.
The automated approach is also more objective. For months I watched my competitors dominate the top rankings in my industry. I reported the sites but no action was taken. I finally came to the conclusion that perhaps the tactics they employed were not regarded as spam by Google, and I applied the same tactics in order to compete. Let's face it, some areas are gray and are not specifically mentioned in Google's guidelines. The irony is that I was eventually banned and they still dominate the top rankings. Had an automated algorithm been used, all offending sites would have been penalized instead of the unlucky ones that got reported.
GoogleGuy, can I plead temporary insanity? Seriously, thanks for your generosity in addressing our concerns.
| 7:25 am on Jun 12, 2003 (gmt 0)|
Good question, fathom, if I understand you correctly. One point is that it's much better to go after types of spam than individual sites. There will always be more sites that are willing to cut corners or to try tricks. Also, if you take action for an individual site, then that means more work down the line if the site needs to write in, the site needs to be reviewed again, and so on--that leads to why algorithms work better than manual action. I think the ideal is just to make techniques ineffective. Some people will waste time doing the ineffective techniques, but they wouldn't get gain on it from Google. There's an interesting second-order effect where spam doesn't do a site any good, but a competitor sees the spam on the site and assumes that it helped. That's probably an advanced topic for another time though. :)
| 7:29 am on Jun 12, 2003 (gmt 0)|
iSeeker, I would definitely check out the reinclusion question in msg #11. :) Automated algorithms are definitely the goal. It's just more robust, more scalable, and faster to reinclude sites, because they don't need a manual review anymore. It takes longer to do things with algorithms, but I think most people agree that long-term it's definitely the right way to go.
| 7:37 am on Jun 12, 2003 (gmt 0)|
>>>Thanks to whatever mod (Brett? WebGuerrilla?) set it up so I could have the thread to myself for a while. Maybe we could do this again sometime.<<<
Actually, we have a Secret Underground operating here in the Google News Forum. Very quietly and clandestinely, it has been flourishing right under the conspiracy theorists' noses without them ever having spotted it. ;)
GoogleGuy, regarding your response about the ODP and Google, and your other response about finding sites through links so that they have at least some degree of Page Rank, however small:
ODP, often mentioned in discussions about relevance for Google, is not only an authoritative link, but is also topically relevant because of the category structure. Should we be moving away from obsessing about Page Rank and moving more toward a more balanced approach of aligning ourselves with on-topic sites relevant to the theme of our own sites?
| 7:45 am on Jun 12, 2003 (gmt 0)|
I'm not positive that I'm a huge fan of the theming arguments that people have made--some of the most useful links I've seen are from "off-topic" sites--but I would definitely agree that it helps users to link to useful, relevant, related sites. So I could see where someday we might want our scoring to reflect that in some part. If you see tons of links flowing into a site and not a single link to the rest of the web, then as a user I might scratch my head a little bit.
That's just my personal take on things though. I guess my short answer would be that it's natural that we would want our scoring to reflect the real-world things that make a site useful. In an ideal world, webmasters would only worry about making great sites for users, and Google would follow that to find the best sites that users loved, and score those useful sites highly.
| 7:47 am on Jun 12, 2003 (gmt 0)|
|There's an interesting second-order effect where spam doesn't do a site any good, but a competitor sees the spam on the site and assumes that it helped. |
You're quite right there! ;) GOOD POINT!
|I'm not positive that I'm a huge fan of the theming arguments that people have made--some of the most useful links I've seen are from "off-topic" sites--but I would definitely agree that it helps users to link to useful, relevant, related sites. So I could see where someday we might want our scoring to reflect that in some part. If you see tons of links flowing into a site and not a single link to the rest of the web, then as a user I might scratch my head a little bit. |
A user wouldn't notice a ton of links "going" to a site they're on, though.
I myself am a fan of "theming" or better stated a continuing growth of common topics in content.
Looking at the reverse > one could say (or infer) GG says just get links from whom ever and where ever you can... because that linkage "would be useful to someone"! :)
[edited by: fathom at 7:57 am (utc) on June 12, 2003]
| 7:53 am on Jun 12, 2003 (gmt 0)|
Contrary to my initial thoughts this thread is turning more and more fascinating. Great seed for future thoughts. Thanks Fathom, Marcia and GG. :)
| 7:54 am on Jun 12, 2003 (gmt 0)|
fathom, you've hit on a really good reason to take action on hidden text. Some people ask "Why not just ignore it?" And the short answer is that our users and other webmasters hate running across hidden text, even if it didn't really have much impact. That's a pretty good reason to encourage folks to remove hidden text; if it's not there at all, people don't have that negative reaction.
| 8:03 am on Jun 12, 2003 (gmt 0)|
>> Absolutely. I think two important challenges for the future are discovering user intent and uncovering webmaster intent.
>> I think that there will always be a need for consultants that help site owners make their site more useful for surfers and search engines.
Reading between the lines here suggests that perhaps Google is not too far along in being able to discover "webmaster intent." Not that it is really a problem, because the whole concept of discovering a site's message in areas that are not clearly marked up with code is on the bleeding edge of search engine R&D.
Perhaps the next generation of Google algorithms would be better served by attempting to assess the usability of sites in regard to their layout consistency, etc.
| 8:04 am on Jun 12, 2003 (gmt 0)|
Do you just ban domains, or also IP addresses? If your domain is on an IP address that has hundreds of domains on it, and one of those domains does big-time spamming and the IP address gets banned, what's the best way to get unbanned? You can't include any evidence in the e-mail that you tried to get rid of anything, like hidden text links, since you did nothing wrong in the first place.
| 8:04 am on Jun 12, 2003 (gmt 0)|
I can see that, GoogleGuy; however, there are instances where hidden text does not hurt anyone and is in fact part of a useful feature. I don't know how much you are into CSS, but take a look at
> Fahrner Image Replacement (replacing headings with images)
> css popups (submenus/popups that appear when you hover the main-nav)
I really hope you are not going to put people who use these in the same boat as spammers, because there are a lot of really respectable, well-known people in the CSS/webdev community who do this. It really does not hurt anyone and is just useful.
| 8:05 am on Jun 12, 2003 (gmt 0)|
|Looking at the reverse > one could say (or infer) GG says just get links from whom ever and where ever you can... because that linkage "would be useful to someone"! |
Initially yes and then the next generation of algo change will calculate the relevancy factor :)
| 8:25 am on Jun 12, 2003 (gmt 0)|
Jesse_Smith, the things you mentioned are good reasons why IP address penalties are usually not a good idea. All the comments about manual penalties apply, plus it's easy for a bad guy to switch to a different IP. I haven't seen an IP-based penalty in a long time.
Okay, it's way past 1am my time, and I'm tuckered. Hope this was helpful to folks, and as usual I'll be around to answer questions when I can. Good night..
| 8:29 am on Jun 12, 2003 (gmt 0)|
|It's impossible to say what the future holds in this industry, but in my mind, it's a good sign that the ODP is taking steps that will continue to improve its quality. |
Good to hear something positive about the ODP from one of its biggest users!
There are positive ways Google could help the ODP improve -- and thus provide a beneficial spiral: as the ODP improves, Google's Directory will improve, and we'll all benefit.
One example would be to use your immense indexing capacity to notify the ODP of sites in the ODP that are being penalised in Google for spam techniques that break ODP guidelines, e.g. affiliates and mirrors
(with a bit more work at your end) no unique content
(Using your cache) sites that appear to no longer be what they were (maybe a domain has been resold and is now porn).
ODP editors would then have a heads-up to consider removing or recategorizing those sites.
Does Google have any plans to lend such assistance to the ODP for quality assurance?
| 8:29 am on Jun 12, 2003 (gmt 0)|
I never thought of "off-theme" as being bad, I just think that there is a lot of value to having links coming from the majority of on-theme pages on the web.
And when I speak of theming I am usually talking about lexical theming on-page. Something like what is being done with Google Sets, but not quite.
You have to be able to understand the theme of a page and the words and grammar that is appropriate for that theme before you would be able to compare the theme and the language of one page to another.
For example, if you enter "ford", "chevrolet", "dodge" into Google Sets, you get a list of other manufacturers. But there are a lot of other related words that could give you a good idea of how on-theme a page might be lexically.
Those three names can represent many different things that people might be looking for, and theming might be used to make sure that the first page of the SERPs is spitting out a selection of differently themed pages so that the searcher has a choice.
Give them a couple of pages with the "truck transfer-case off-road" results. Some pages of car parts, and some pages of dealers and collectors.
| This 152 message thread spans 6 pages |