
Google News Archive Forum

This 179 message thread spans 6 pages: < < 179 ( 1 2 3 4 [5] 6 > >     
2002, Part II
Ready for the rest of the year?

 4:54 pm on Jul 18, 2002 (gmt 0)

The first half of 2002 was pretty eventful at Google. We rolled out major new products like Enterprise Search and AdWords Select, pulled back the curtain on innovations like Google Answers and labs.google.com, partnered with some great companies, and launched tons of little things that most people--except maybe the posters here--never notice, but that really improve search. What sort of things do you want to see Google doing next? And are you ready for the rest of the year? :)



 11:26 am on Jul 20, 2002 (gmt 0)

Tweak the algo to be a bit more sensitive to the proximity of words in the search query


search for: Google searches from your web page (without "")

Sadly for Google, the number one result is AltaVista..
and the Google pages presented are not the ones I copied the exact sentence from (it was in the first sentence of [google.com...] )

I know I could find that exact page by using "Google searches from your web page",
but how many people do that?
(GoogleGuy, just in case you do not have access to the internal stats [webmasterworld.com], it's approx. 1% ;)

And while on the subject of proximity, what about a variable control bar in the toolbar that allows you to adjust the proximity factor?

GG, as always, just add the regular stock options bonus for suggestions to my sticky, thanks.
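The proximity tweak suggested above can be sketched in a few lines. This is purely illustrative, not Google's algorithm: `min_span`, `proximity_boost`, and the decay factor are all invented names and constants, assuming term positions have already been extracted from the document.

```python
from itertools import product

def min_span(positions_per_term):
    """Smallest window (in words) covering at least one occurrence of
    every query term.  positions_per_term is one sorted list of word
    positions per term.  Brute force for clarity; a real engine would
    merge sorted posting lists instead."""
    best = None
    for combo in product(*positions_per_term):
        span = max(combo) - min(combo) + 1
        if best is None or span < best:
            best = span
    return best

def proximity_boost(base_score, positions_per_term):
    """Scale a base relevance score by a factor that decays as the query
    terms drift apart: adjacent terms get the biggest boost, terms
    scattered across the page get almost none."""
    return base_score * (1.0 + 1.0 / min_span(positions_per_term))
```

With "Google" at word 0 and 10 and "searches" at word 2 and 20, the tightest window is 3 words, so an exact-phrase page would outscore one that merely contains the words somewhere.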


 11:39 am on Jul 20, 2002 (gmt 0)

I think most members here have pointed out very valuable ideas for you.

Personalized SERPs and pages seem to be coming; I heard a lot about it from different SEs in December last year, and now FAST has started.

Most of the ideas here are about making things easier for webmasters, which is great for me because I am a webmaster, and it would make me glad if you offered some of the services mentioned here. But I think you also want to know how the general user would use the service, and in my view most "normal users" won't use these services.
I don't know how it is in the rest of the world, but where I live the "normal user" won't customize their pages/SERPs even if given the chance.

I also suggest you read the back-fill [webmasterworld.com] of this thread; not much there was accomplished, and I wonder if anything in this thread will be done.

Advertising: yes, I agree with Brett that there is too much of it now, but you can pretty much get away with it, because the "normal user" doesn't care that much; they are used to pop-ups, pop-unders, email spam, etc., so by comparison it's pretty light. Probably most of them think the ads are part of the SERPs, which we SEOs/webmasters don't like (unless it's my ad). You could change this if you wished, and I think it would give a better user experience: if the user can see that an ad is an ad, they won't mistake it for a SER (search engine result), so they would still believe you have quality SERPs.

In the back-fill it's mentioned that you give a high ranking to .edu and .gov pages. I respect this, and I think .edu pages often have very valuable information. Brett mentioned that he doesn't want .pdf files etc. in the SERPs; I want them gone too, except for .pdf files from .edu domains. But when it comes to .gov, seriously, I don't care what the US government thinks about whether I should travel to Pakistan or not, or whether I should contact the US embassy in Pakistan; this is not relevant for me.
Sure, warnings might be good - if I do a search for "travel" and get some warnings, fine - but often the warnings the US government gives, my own government won't, because we haven't had any trouble with that country/city.
So PLEASE give .gov less value.
The US might think this is a bad idea, because this information is valuable to them; if so, please customize the results for US users.

The most important thing, though, and probably the hardest one to fulfill: a fresher index (I don't mean single pages which are updated every day). Today most people have found pretty much everything on the net, and when they search they often do it for fresh new information. For example, if a new mobile phone has just been launched, they will search for it, and you will probably have 0 results to report. They aren't looking for old information anymore. Sure, people still search for facts from the past, and they often find them very quickly, but most of the BIG SEs can present good results when it comes to that.


 12:05 pm on Jul 20, 2002 (gmt 0)

I also suggest you read the back-fill [webmasterworld.com] of this thread; not much there was accomplished, and I wonder if anything in this thread will be done.

Lazerzubb, good point, which WebGuerrilla also made earlier in this thread.
There is also some - as yet - unfulfilled homework here [webmasterworld.com].

Not that we expect full, exact answers from GoogleGuy, or that we could ever dream of immediate implementation, but I am sure that in the process of the nice give-and-take in these threads, GoogleGuy could elaborate a bit more on why certain suggestions or complaints (e.g. better PR0/PR-grey clarification) are too difficult to implement, downright too revealing for the competition, nice ideas but too expensive hardware-wise, or confidential forever. That would make for even nicer discussions without selling out all internal secrets. It could also stop us barking up the same tree on certain subjects.. even answering questions with counter-questions can prove to be very informative.

This is all good--I'm getting this down, keep the good suggestions coming..

and keep the interaction coming.. ;) (I know we are spoiled to have an SE rep posting here, but no harm in pushing for more discussion?)

Mark Candiotti

 1:14 pm on Jul 20, 2002 (gmt 0)

my two cents worth...

I think it would be nice that if a major change will be implemented which may penalize an existing common practice, there is sufficient notice to allow people to at least update their sites accordingly in time. All of us DO want to optimize, but many of us are not willing to cheat to get there. For example, if in the future it is decided that heavy use of Flash links or Javascript links to outside sites will be, er, "frowned upon," then let us know and we'll go another way. If it MUST be the way you do it, so be it - though we'll still reserve our right to disagree, of course.

I understand the need to be a bit murky on some of these things, but I think you'll get a better, cleaner web by being more upfront - even if we disagree. For example, most webmasters would staunchly defend cloaking as useful to protect some of our hard work, but once you said straight-up "Don't do this" - it's only OUR fault if we cross the line. Let us know in advance what the major penalties are - I think with all this information being spread around anyway, the line between what the SEO guys/girls do and what your average optimizing web-site owner does is getting pretty thin. If I hadn't stumbled upon this place, I probably would have committed about ten sins that made sense to me (even ones that didn't seem inherently sneaky!) that would have devastated my sites. Is it somehow more valuable to have people learn by losing months of income than by being told the rules of the game?

You can be a BIT vague here, of course. Qualifiers such as "too many", "heavy use of", "repeated", let us know where to tread carefully without giving away the keys to the farm. It seems you guys are leaning this way anyway what with a bit of contrition for past "purges" and the advent of yourself, Mr. Google Guy, for which we are grateful. I hope things keep moving in the direction of openness and communication.


 1:55 pm on Jul 20, 2002 (gmt 0)

Please don't discriminate against affiliate sites as another poster suggested. Many affiliate sites are full of useful content, precisely geared toward what users are looking for, and not just a bunch of same-same links all the time.

A twice-a-month update would be nice.

Also, I like the general idea of more info regarding PageRank (i.e., a place where you can enter a URL and find out whether your PageRank has not yet been calculated (e.g., for new sites), or whether you have a penalty for some reason). Maybe you could also include the length in months that the penalty will last. Eternal penalties (if they exist) do seem a bit harsh, by the way.

Thanks for listening Googleguy.


 2:00 pm on Jul 20, 2002 (gmt 0)

Dead sites.. or sites that aren't updated regularly.. "dead" meaning sites that haven't been updated in, say, 30 days.. clean up the dead skin

give sites that are updated daily a higher rank (or at least have it be one of the variables.. or is it already?)

Mark Candiotti

 5:03 pm on Jul 20, 2002 (gmt 0)


There are many, many useful, productive sites which do NOT require updating. Perhaps you need to think out of the box of your own situation a bit. A fall catalogue, for example, can be good for 3 or 4 months with no need to update as people are busy working on the next season.

Give more frequently updated sites more frequent re-indexing? Sure. Give them a higher rank? No reason for that - it doesn't make sense.



 6:07 pm on Jul 20, 2002 (gmt 0)

Dauction & Mark_Candiotti,

there is something to be said for taking the staleness of sites or pages into consideration (as long as you relate staleness to the age of incoming links), and Google already seems to use it in its criteria for more frequent Fresh crawling:

Take this simplified theoretical example:

Two identical sites, A and B, both offer exactly the same quality content.

A has been around for 4 years. B just started out.

In a perfect world both A and B would get the exact same amount of external links from other sites and rank equally well for every relevant search query.

In the real world, site A, having been around much longer, will have gotten links from, e.g., directories that have since gone completely stale (say they stopped adding any new links after the internet bubble collapsed; I have seen several high-PageRank directories like that) that site B will never be able to get anymore.

All other things being equal (both sites have equally well-informed WebmasterWorld members as SEOs), site A will always outrank site B for many search queries.

Google already likes the "age factor" of links; it was one of the Honorable Mentions in the Google programming contest [google.com]. It just has to identify pages with only stale outgoing links (no new links added) and give those links less value in the overall computed ranking.

There is also something to be said for ranking pages that continuously get new votes (links) higher, because they most probably are continuously up to date and relevant.

Note: this has nothing to do with simply "updating" your site.

IMO, Google's freshness criterion has more to do with which pages of your site have continuously gotten the most new inbound links in the recent past. That's why index pages often get the Fresh tag: the index is the obvious page for others to link to, and it gets the obligatory "home" link from every newly added subpage.

Could it also be the reason many "contact us" pages get the Fresh tag?
Search for:
"contact us" daterange:2452469-2452476 in:
[search2.cometsystems.com...] as mentioned in this thread:
[webmasterworld.com...] .

In case you are guessing, I am site B ;)
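The stale-link discount described above can be sketched as a toy scoring rule. Everything here is invented for illustration - the one-year half-life, the function names, the idea that the crawler knows when a linking page last added a new outgoing link - it is a sketch of the suggestion, not anything Google has confirmed doing.

```python
YEAR = 365 * 24 * 3600  # seconds

def link_weight(source_last_new_link, now):
    """Toy weight for one inbound link, discounting votes from 'stale'
    sources.  source_last_new_link is when the linking page last added
    ANY new outgoing link (epoch seconds); a directory frozen since the
    bubble burst counts for little."""
    staleness_years = (now - source_last_new_link) / YEAR
    return 0.5 ** staleness_years  # halve the vote per frozen year

def page_score(link_source_times, now):
    """Sum of discounted votes: under this rule, site B's handful of
    fresh links can compete with site A's pile of stale directory links."""
    return sum(link_weight(t, now) for t in link_source_times)
```

A link from a page that added something new today counts fully (weight 1.0); a link from a directory frozen for a year counts half as much.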


 6:48 pm on Jul 20, 2002 (gmt 0)

It has been discussed here that Google has been experimenting with the meta description lately. I find that the snippet is rarely accurate or useful for the pages of my site, and I would like to see the meta description used in the SERP descriptions. If there is no meta, then use a snippet.

I know the argument is that the purpose is to present the searcher with a description relevant to their search query, but it seems to me that a proper description of the page is more relevant than snippets that often fail to convey that purpose. In fact, I think snippets can be rather counterproductive, giving the illusion of relevance when the page may have no real relevance to the query beyond containing the query words somewhere.

This idea that your robot can do a better job of describing my pages than I can, I find rather presumptuous, and I have no doubt that it costs me visitors. But more than that, it is rather arrogant to rearrange the content of my pages. I work very hard to present my content just as I wish it to be presented - where does Google get the idea they can rearrange it as they wish?

As a matter of fact, for my most important keyword phrase, the snippet includes one snip that says, "5 widgets" making it seem as though my site deals with only five of these widgets, when in fact it deals with many, many important widgets. This trivializes my site to searchers, and is almost like a penalty.
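The fallback behavior this poster is asking for is simple to express. A minimal sketch, assuming the page's meta description and body text have already been extracted; the naive first-matching-sentence snippet stands in for whatever a real engine would generate:

```python
import re

def serp_description(meta_description, body_text, query, max_len=160):
    """Prefer the author's own meta description; fall back to a
    query-biased snippet only when no meta description exists."""
    if meta_description and meta_description.strip():
        return meta_description.strip()[:max_len]
    # Naive snippet: first sentence containing any query word.
    words = [w.lower() for w in query.split()]
    for sentence in re.split(r'(?<=[.!?])\s+', body_text):
        if any(w in sentence.lower() for w in words):
            return sentence.strip()[:max_len]
    return body_text.strip()[:max_len]
```

With a meta description present, the author's wording is shown verbatim; without one, the searcher at least sees a sentence that actually contains their query terms.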


 8:15 pm on Jul 20, 2002 (gmt 0)

NATURALLY the variable for updating your site regularly would include a "pass" of, say, 3-4 months for catalogue-type sites, some government pamphlet sites, etc.. that's all part of the equation..

Perhaps you need to think outside the box of your own circumstances LOL


 8:44 pm on Jul 20, 2002 (gmt 0)

Vita.. thanks for the information..

While I understand the theory that the algo takes fresh inbound links into consideration...

this goes back to the fact that a fresh-links algo can be easily abused.. timing inbound links? you bet... try this.. FLOOD your site with "new" inbound links sometime..

yes, the algo can easily be defeated.. that's why I wish they would simply do away with PR having anything to do with links pointing at you... even so-called "quality" links..
I know there are many "quality" links I can get from other webmasters, but that has nothing to do with (should have nothing to do with) where my site ranks on Google...

If little old mediocre me can, just imagine what those with "real" talent are doing...

The way it is set up now is: make as many pages of "content" as possible and get all your buddies to exchange links...

IMO.. the only "votes" that should matter are those of the consumer or information seeker.. if THEY are finding whatever it is on your site, then they will come back, and new visitors will STAY longer... these are, IMO, the MOST important variables for where you should rank... not whether OTHER webmasters link to you or not..

I am looking at it from a consumer's point of view.. they want to know what's HOT.. where everyone is shopping or seeking information, if that's what your sites offer.. NOT where "webmasters" think they should be..
see my point..

middle America loves WalMart over Kmart.. the consumer decided that WalMart would be the place to go.. there wasn't a vote amongst tens of thousands of corporate America's CEOs! (well, I suppose that could be debated! LOL)

But you see my point...

Listen.. I take heat because some don't like the fact that I run a "products" web site (not only a products web site but even an affiliate (gasp!) :).. so what...

I am not asking them to vote.. I simply want a chance for the shopper to vote.. use an algo that takes that into consideration, and everyone will place as they should...

The other theory is that somehow consumer sites need to be sent to the back of the pack, so to speak... nonsense. EVERY site is selling something.. that's what better than 99% of the internet is about.. making money.. some will claim it is a medium for the convergence of idea exchanges... yeah, OK buddy... let me finish that for you... it is a medium for the exchange of ideas.. TO MAKE MONEY! LOL

OK, I'm rattling on....


 9:21 pm on Jul 20, 2002 (gmt 0)


I would assume the Fresh criterion I mentioned earlier would take consistent new external inbound links as the overall ruling factor. That cannot be abused so easily, IMO. The odd internal page turning up (like the "contact us" example) would probably be one of the next-best PageRank pages within the site, also getting consistent new inbound links.

Do not forget this is something I think I observe; it is not necessarily the truth ;).

Also, for the moment, Freshness in my opinion means frequent spidering and a short, few-day indexing/caching of that content. It does not mean that you necessarily rank better.

I would also not be surprised if, when a search query suddenly gets used more frequently in Google (someone gets assassinated, e.g.), Google would take the top ten rankings for that query and give those pages Freshness (more frequent spidering) while the popular-search hype lasts. (If they don't do it, they should; it makes sense.)

Also, who says Google is not already using toolbar data for something like the Fresh criteria? (Your popularity-by-real-visits quest: hopefully an improved Alexa-type thing taking visitors' time on site into account, or something better than "popular means it should rank high".)

I understand your point about the affiliate site also wanting high rankings, but what would you do if you were Google?
Rank your site above Sony, if you sell Sony Walkmans as an affiliate?
Who says you could not rank high if someone searches for "Sony + Honolulu"? Or "the cheapest Sony Walkman in Honolulu"?


 10:39 pm on Jul 20, 2002 (gmt 0)

Vita.. "I understand your point about the affiliate site also wanting high rankings, but what would you do if you were Google?
Rank your site above Sony, if you sell Sony Walkmans as an affiliate?
Who says you could not rank high if someone searches for "Sony + Honolulu"? Or "the cheapest Sony Walkman in Honolulu"?"

Oh, certainly I would expect Sony to rank higher.. my concern is with "my" competitors.. other affiliates and online stores selling the same Sony products...

My perspective is: let the consumer decide where they would rather shop, rather than a bunch of webmasters trading links with each other (and thereby ranking higher).

Back to the "should Sony rank higher" question.. let me qualify that: sometimes these enormous corporate sites ARE confusing to the avg surfer... If I can "dumb down" a web site, if I dare use that term... just make their shopping venture something THEY understand... and it is "proved" that shoppers prefer my site over Sony's.. then why shouldn't it be rated higher?!!!

It is a shame Alexa sucks LOL.. kind of the right idea... just no controls over there

Thanks for the discussion..


 12:10 am on Jul 21, 2002 (gmt 0)

I agree completely, and my snippets are often completely misleading, making my site appear to be something it is not.


 5:08 am on Jul 21, 2002 (gmt 0)


I, like many others, DO love Google, as they are simply the best. However, I think their AdWords Select is a total disaster, just like their other AdWords were (sorry, I can't remember its name).
A big disappointment coming from Google.


 6:17 am on Jul 21, 2002 (gmt 0)

1) A little less advertising on the right side of the screen. The links Teoma has there are on topic and useful.

2) Google took steps to eliminate frames abuse before and wiped out a bunch of sites. They need to go further and PR0 sites that use Javascript or some other script to hide the same trick. It would knock out an unbelievable amount of spam from automakers, financial services companies, homebuilders, camping products (one popular name in that category is a real spammer), and numerous other sites in other industries. These sites make it anywhere up to PR7 and clog the SERPs using that magical little line of javascript or other script.

3) Stick with ODP and snap it up if the financial turmoil at AOL causes them to do something stupid like selling or doing away with it.

4) Maintain the same level of disclosure about what is and isn't allowed in Google.

5) Devalue the ranking contribution of domain names. Seems like it is still rather important.

6) Evaluate whether or not it is worth it to even include domains with more than x number of dashes in them. I've never seen one with more than 3 that wasn't spam or some affiliate site siphoning off traffic from a real site. Consider dropping sites listed in Yahoo! and ODP whose listings begin with !!'s or other characters used only for spamming purposes.

7) Don't take over the entire web :) Google is great, but too much of anything isn't. Humor FAST or Teoma and let them get a backfill spot somewhere.

8) Keep search results and advertising separate.

9) Keep ConAgra beef [fsis.usda.gov] out of the Google cafeteria.


 8:20 am on Jul 21, 2002 (gmt 0)

Improvements I'd like to see?

1. On the SERP, tell me how big the whole page is, not just the one element you've indexed. A 5k page may be the start of a ten minute wait while loads of JPEGs download. In fact, give me an Advanced Search setting so I can sort pages by actual size.

2. On the SERP, tell me what plug-ins a page needs. Give me an Advanced Search option to find pages that do/don't use a plug-in -- that way I'd never need to see another Flash site!

3. On Groups Advanced Search -- let me sort by date, oldest first. It's a real pig to get to the start of a theme (as opposed to a thread) when the date sort is most-recent first.

Thanks for asking!


 1:15 pm on Jul 21, 2002 (gmt 0)

One horoscope site listed in Google, among others, has a PR of 7, no links to speak of, and the only thing that comes up is one of those pages where it says to pick a horoscope and then gives a canned answer... The first page up is nice and neat, no ads.

The canned response is almost a dupe of it, with a gambling banner at the top.

That's it, folks.... no content, no links other than the gambling banner.

Relevance... some could argue that it is relevant, but I don't think so.

Others, including the astrology hits and not just horoscopes, rank VERY high, but... some are link farms, and almost all have no links or link pages..... Go figure!



 5:49 pm on Jul 21, 2002 (gmt 0)

futureski wrote:

Please don't discriminate against affiliate sites as another poster suggested. Many affiliate sites are full of useful content, precisely geared toward what users are looking for, and not just a bunch of same-same links all the time.

There's a difference between content sites with affiliate links and affiliate sites with content. An "affiliate site," as the term is generally understood, refers to a site that is basically just a collection of affiliate links. And while there's nothing wrong with such sites, they often clutter up search results and make it hard for readers to find the information they're looking for.

Hotel booking sites are perfect examples of the clutter problem. Not long ago, I saw a very slick-looking hotel site that was nothing more than a front end for the site owner's booking partner (I think the partner was TravelNow). The pages were attractively designed, and they gave the illusion of a comprehensive site, but in many cases, clicking on a city name did nothing more than call up the booking partner's search form for that city. The Web is awash in sites like that, and if Google allows them to dominate SERP listings because of clever search-engine optimization techniques, Google will go the way of AltaVista.

A related problem is the overwhelming number of e-commerce sites in some categories. Travel agencies are a good example. Let's say I'm looking for information on Silversea cruises and I search on "Silversea" or "Silversea Cruises," Silversea Cruises will probably come up first on the SERP (as it should), but the rest of the search results are likely to be dominated by similar or identical pages (which use Silversea's boilerplate text and photos) from travel agencies that sell cruises. As a result, I may miss some original pages (such as cruise reviews or articles) about Silversea Cruises unless I'm willing to wade through hundreds of search results. If I had the option of filtering out the clutter of travel-agency listings (most of which won't interest me unless I live in a given travel agency's city), Google would be a much more useful tool for me.

Side notes:

1) It's important to remember that Google's mission is to serve users, not Webmasters.

2) It's in Google's own interest to favor content sites over e-commerce sites in SERPs, because that encourages e-commerce sites to buy AdWords.


 6:03 pm on Jul 21, 2002 (gmt 0)

How about search results tweaked for the individual user? I mean, those little smiley and unhappy faces could actually come in handy here :-) If someone shows by repeated happy-face clicks that they like pages full of Flash, music, and the like because they have the bandwidth, start ranking those pages higher in their searches. If the pages that make another person happy are those with photos on them, then skew it towards them. If one person uses .docs a lot, make those appear higher for them.

Of course, then SEOs would be driven completely insane, because even if they ranked highly on their own machine, they might not on everyone else's :-) And I have no idea what technology it would take to try that.


 7:18 pm on Jul 21, 2002 (gmt 0)

Perhaps I shouldn't be too surprised that some posters are suggesting algo changes which will patently hurt other posters. After all, one man's spam is another man's basic page design. Maybe they will also be the first to squeal next time there is another indiscriminate PR0-type assault.

I hope, though, that Google will exercise good sense in this area and ensure that changes, if any, only net those who deserve it. Some of the ideas above would hit thousands upon thousands of totally innocent sites.

Anyhow, I suspect that GG is actually looking for ideas on direction and new initiatives rather than advice on tweaking an algo which is easily the best in the business. I think he really has some crackers here: Google portal tag, PDFs/docs under a different tag, payment for ad-free Google, customizable display, and so on. I'd love to see most of these.

I'd also echo some of the other points, especially about DMOZ. I fear for DMOZ and hope that Google is monitoring the situation in case it is disposed of in some damaging way by AOL.

I think the biggest battle for Google, though, remains with itself. It really does have to watch those adverts. We have seen search engines one after another drive their users away through over-commercialization (and they never learn from history). The temptation to all but print money today is obvious, and it takes strength to resist. Let's hope that Google can be the first to look beyond today, and hopefully to an even brighter future.


 8:19 pm on Jul 21, 2002 (gmt 0)

don't penalise links that are deemed bad; just IGNORE them.

that way, the algo will be even more mysterious - who will know which artificial linking practices work and which don't?

that way, the webmaster does not have to be charged with personal responsibility for the sites he/she links to. that's pretty daft. it's rubbish to say "oh, you must know it's dodgy if you look at it" - that's just short-sighted and really a fob-off.

ignoring these "extra" optimisation things would be a win-win for Google and responsible webmasters.

also, please don't make a penalty a PR3/2/1 instead of a PR0. it's either a penalty or not a penalty (unless PageRank is broken!!).



 12:10 am on Jul 22, 2002 (gmt 0)

I would like to emphatically second Mr. Dredd's advice! It seems so much more logical for "bad" links simply not to count, as opposed to penalizing the site and everyone who was ever associated with it (otherwise Google will need to change its name to gestapo.com).


 12:20 am on Jul 22, 2002 (gmt 0)

Improve "link:" searches:
1. Show the link text in bold instead of showing a generic snippet from each linking site.
2. Allow "link:" searches to be combined with other searches. Let me know why a search for "some people" leads to www.darwinawards.com.
3. Include more pages in the "link:" index (currently PR4+).

Create an "acronym search" for labs.google.com. Give higher ranking to sites that include both the acronym and a spelled-out version.


 2:11 am on Jul 22, 2002 (gmt 0)

jaytierney wrote:

I would like to emphatically second Mr. Dredd's advice! It seems so much more logical for "bad" links to simply not count as opposed to penalizing the site and everyone who was ever associated with it...

If there were no penalty for "bad" links (e.g., excessive crosslinking or whatever Google may regard as spam), then Webmasters would have no incentive to walk the straight and narrow. The SEO spam wars would escalate, and it would become harder for Google to deliver accurate, useful search results.


 3:03 am on Jul 22, 2002 (gmt 0)

Greetings all,

I would like to add to the discussion of PR0 for the benefit of GoogleGuy.
As a newbie web site creator, I created a site, registered a domain, and downloaded a search engine submission program which was full of "useful" information on how to get good SE rankings, which I dutifully followed.
I now have a pretty much useless domain which I have wasted quite a few months on, and a (for me) rather large hole in my pocket from AdWords (at over AUD $3.00 per click, minimum AUD $1.10) while waiting for my PR to appear/improve.

I have no real animosity because of this, for I have now discovered these fine forums and real information from real webmasters.

My point is that a lifelong penalty (if such a thing exists) is definitely rather harsh - not only for the experienced webmasters who should know better or who make mistakes, but for new creators of content who don't know any better.
Some sort of recovery from this "black hole" should be made available - I like the idea of some sort of warning system - get your act together before the next crawl, etc.

My only other suggestion would be currency conversion/selection for the AdWords Select program. IMHO it wasn't made clear enough that the prices are in USD, and I believe this to be one of the reasons for the high cost per click in Australia particularly.

Hehe, I hope you have read this far in this thread - it's getting quite long.

Thanks WebMaster World


 4:59 am on Jul 22, 2002 (gmt 0)

What about a personalised algo?

Have it as part of the toolbar, maybe with a groovy weighting slider or something: one setting for how important you would like PageRank to be, one setting for how important terms in the title should be, etc.

Kind of like your own personal Google algo. Of course the default Google-sanctioned algo would be by far superior, and that's what you would use most of all :)
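The slider idea reduces to a weighted sum of per-signal scores. A toy sketch, assuming the engine exposed per-signal scores like "pagerank" and "title_match" (it does not; both names are invented here):

```python
def personal_score(page_signals, weights):
    """Combine per-signal relevance scores with user-chosen slider
    weights.  page_signals maps signal name -> score in [0, 1];
    weights maps signal name -> the user's slider setting."""
    return sum(weights.get(name, 0.0) * value
               for name, value in page_signals.items())

# The same page, scored by two users with different slider settings:
page = {"pagerank": 0.9, "title_match": 0.2}
pr_fan = personal_score(page, {"pagerank": 0.8, "title_match": 0.2})
title_fan = personal_score(page, {"pagerank": 0.2, "title_match": 0.8})
```

The PageRank fan scores this page 0.76; the title fan scores it 0.34 - which is exactly the SEO headache the next sentence of the thread would predict: the same page ranks differently on every machine.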


 7:41 am on Jul 22, 2002 (gmt 0)

If there were no penalty for "bad" links (e.g., excessive crosslinking or whatever Google may regard as spam), then Webmasters would have no incentive to walk the straight and narrow

How would no penalty affect SERP results? If these bad links simply didn't count, there would be absolutely no reason to use them. In fact, it might cause some confusion and frustration for those who are trying to cheat. Removing the penalties would also eliminate the problem where a domain becomes worthless (do we really need to recycle any more good domain names? Aren't enough of them already wasted?).


 10:08 am on Jul 22, 2002 (gmt 0)

Would like to see another new search engine making some moves. I like google and take about 4,000 referrals a month from them but would like some new SEO food.

On AdWords Select, how about a little bit of infrastructure in the European markets so that agencies can get management fees for big campaign spending? At the moment there might be something being touted, but when you get through to someone in a regional office, they tell you they have nothing in place at the moment.

Sorry, but to buck the trend, I'd say it's becoming a little bit Apple-Mac cliquey at Google HQ.

Still, the communication is good between Google and SEOs, isn't it? But aren't we supposed to be manipulating their results? Figure on that one, then figure on it again.


 12:01 pm on Jul 22, 2002 (gmt 0)

yes jaytierney, exactly.

as chris_r (is it r?) has said numerous times, lots of SEO techniques JUST DON'T MAKE A DIFFERENCE.

why the hell should links be any different?

most of the time someone complains about a site with 2 title tags coming top, it's not necessarily the 2 title tags that made it come top.

MOST SEO TECHNIQUES DON'T MAKE ANY DIFFERENCE. Why should link manipulation be treated any differently?

All this link-penalty stuff has contributed, in some fashion, to people giving up on the one activity the internet is based on - LINKING AND RECIPROCAL LINKING.

I suggest Google should respect the core spirit of the internet, just like it should respect HTML coding standards.


 3:10 pm on Jul 22, 2002 (gmt 0)

I think anyone should be able to link to anyone else without worry over penalties or getting kicked out of the index.

That IS where the spirit of the web was at one time....get dug in on as many sites as possible while giving the other sites the same chance.

Are the search engines so insecure that the people who own and run them think that if a majority of sites gets thousands of links it would make them, the search engine, almost irrelevant?

DON'T, repeat, don't use links for PR or position or anything.... just buzz off and let people run their own websites.

Content should reign supreme, along with good coding. Spamming, hidden text, etc. should be the deciding factors in getting a penalty or being kicked out.

Google, give us a break! Linking is good... repeat after me: linking is good, linking is good, linking is good......

Yes, that's right, linking IS good.... so let us get on with being good citizens and linking wherever we wish without the goddess Googlebot rearing her domineering head.

The web could finally return to the way it was meant to be, a connected stream of information and resources...remember...linking is good, linking is good........

