Home / Forums Index / Google / Google News Archive

Google News Archive Forum

This 179 message thread spans 6 pages: < < 179 ( 1 2 [3] 4 5 6 > >     
2002, Part II
Ready for the rest of the year?

 4:54 pm on Jul 18, 2002 (gmt 0)

The first half of 2002 was pretty eventful at Google. We rolled out major new products like Enterprise Search and AdWords Select, pulled back the curtain on innovations like Google Answers and labs.google.com, partnered with some great companies, and launched tons of little improvements to search that most people--except for maybe the posters here--never notice. What sort of things do you want to see Google doing next? And are you ready for the rest of the year? :)



 1:44 am on Jul 19, 2002 (gmt 0)

1) Toolbar for the Mac, especially an IE version since Mac Netscape is just awful.

2) tell me how to get my site's original news stories added to the spider that your news search uses

3) continue to improve the news search; it's already almost the best news aggregator out there

4) be careful how much space on your SERPs you give to paying customers; you're starting to look like all the rest of the SEs (the ones that care more about dollars than searchers) on a few search terms

5) don't ever care more about dollars than searchers :)

6) but don't forget to keep making money!

Thanks for taking part in this Forum, GoogleGuy.

Visit Thailand

 1:52 am on Jul 19, 2002 (gmt 0)

Yes I would love to know the answer to point 2 above.


 2:22 am on Jul 19, 2002 (gmt 0)

vitaplease - Exactly correct. Your example is a tactic I have often considered :)

If a site is 100 pages then chances are that a lot of effort has been put into the theme, especially if the pages have plenty of unique content and are not generated by some meaningless dynamic system. The word 'comprehensive' comes to mind, but the 'blue widgets' page may be url www.goodcompanyname/widgets/bluewidgets.htm which at present stands little chance of having good pr and ranking unless extremely well optimised. The webmaster spends his time researching his site content, not buying silly domain names, and gets rewarded for this, not penalised. Google should give more points to a page deeper in a site that demonstrates unique informative content. This turns the pr flow concept on its head, but would weed out the spammers. To compete, you would have to produce equal content on a similar scale and the links in would be the deciding factor on ranking.

The result?

1) Serps giving specific deep page links on big informative sites.
2) A broader range of theme related sites in the top 10 results.

In your example the searcher would thus find the page of the big supplier No1. Once clicked through to this site he would find plenty of internal links to related products. No2 and 3 positions may be major competitors with plenty of content as well.
No4 could be an informed 'health and safety' site that mentions the product deep within its pages. No.5 likewise could be deep pages about 'installation and repair' from another comprehensive site. No.6 another not so big retailer etc. etc. with the small spam site no.30.
If a small site has got specialist and relevant info, then the big site should link to it, and be rewarded accordingly. Also, if a small site is genuinely useful, then it will pick up links and compete on merit with the deep internal pages of big sites.

The bottom line is: remove any gain from domain name keywords. Also let the index page pr be the same throughout a site, and then let a page be boosted if other sites have deep links to it. This makes all pages, whether in a big site or a little site, more even. It removes the bias towards index pages, making all pages within a big site compete with anyone.

Beachboy the serp url is <url snip>
Hope it works!

[edited by: WebGuerrilla at 6:16 am (utc) on July 19, 2002]
[edit reason] No specific examples please [/edit]


 2:25 am on Jul 19, 2002 (gmt 0)

GoogleGuy, what a great idea. I appreciate your willingness to answer questions, ask for information and generally be part of the community here.

I think it would be great to add some specialty sections to Google, much like Northern Light did previously with its Health, Linux, Government sections, etc. It is a great way to find specific information in one area, some mini-vortals of sorts. I think it would be a good resource for the search users.

Google is my favorite engine, has been since it began and I don't think that will ever change. Keep up the good work.


 2:27 am on Jul 19, 2002 (gmt 0)


I don't get the same results you do. Stickymail the URL of the site in question, pls.


 2:37 am on Jul 19, 2002 (gmt 0)

I just tried it on www3 www2 and google.com and you're right....
google.co.uk from here in the uk is showing different serps.
Why would that be?!


 2:39 am on Jul 19, 2002 (gmt 0)

No clue. Anybody want to speculate on that?


 2:52 am on Jul 19, 2002 (gmt 0)

Back to the original subject of this post. I'd like to see bi-monthly updates please Googleguy.


 5:38 am on Jul 19, 2002 (gmt 0)

I think the only area where Google could still improve on the competition is by adding a clustering feature. A feature like the one at Vivisimo would let Google deliver even better and more targeted results for its users.

Whether by chance or design, Google tends to give higher search return placements to large, institutional sites. I find that on a number of topics, especially health topics, many of these institutional sites have highly redundant and often outdated information. Under the current algorithms on Google, conventional, institutional views are often rewarded and propagated while non mainstream views often get pushed to the bottom of the listings.

If Google had been around in Galileo's time, he might have had a hard time getting his web page placed high in Google because he would not have a lot of links from large institutional sites with high page rank. His ideas went against the grain. So if somebody entered "orbit" into a pretend 1600s Google, the first three pages of results would most likely all have described how the sun revolves around the earth. Galileo's web page would be lucky to get position #121.

However with a clustering feature like Vivisimo's, people could look along the side and see entries for:

Orbit -
Sun around earth (19,942)
Earth around sun (1)

I think a clustering feature would be a useful addition to Google in order to group sites with similar information together and set off sites that may offer a unique perspective on a topic, even if those sites aren't the ones linked to by major institutions.

As Galileo said, "The authority of a thousand is not worth the humble reasoning of a single individual." Right now Google's algorithms tend to favor "the authority of a thousand". Clustering might be a way to give the modern-day Galileos, Wright brothers, and Columbuses of the world a fighting chance to get people to read about their ideas.
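The clustering idea sketched in this post can be illustrated in a few lines. This is a toy sketch only: the snippets, the candidate phrases, and the substring-matching rule are all invented for illustration, not how Vivisimo or Google actually cluster.

```python
from collections import defaultdict

# Toy sketch of result clustering: bucket result snippets by which candidate
# phrase they contain, then report each bucket with its count, the way the
# "Orbit" example above lays out "Sun around earth (19,942)" next to
# "Earth around sun (1)".
def cluster_counts(snippets, phrases):
    clusters = defaultdict(list)
    for snippet in snippets:
        for phrase in phrases:
            if phrase in snippet.lower():
                clusters[phrase].append(snippet)
    # show each cluster label with how many results fell into it
    return {phrase: len(members) for phrase, members in clusters.items()}
```

A real clusterer would have to induce the candidate phrases from the documents themselves rather than take them as input, which is exactly the hard part a later reply in this thread raises.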


 6:12 am on Jul 19, 2002 (gmt 0)

GoogleGuy -

I've skimmed through the suggestions so far, and don't think I've seen these (reasonably obvious) suggestions:

1) Step up on the JavaScript parsing
2) Step up on the CSS parsing

Look at it from this angle -- if the browser can parse it, so can you. That'll keep a lot of the webmasters/SEOs here on their toes ;)

But realistically, I still see too many hidden or moved CSS layers with spam content, and far too many JavaScript redirects. Hell, I could write the parser for you that would filter 99% of the JS redirects used on the net.

I understand that once you've found an offender, it may be difficult to decide upon the severity of the according penalty, and perhaps this is why you haven't addressed CSS/JS 'tricks' as much as you might have. However, it does *need* doing. Unlike other people here, I don't agree with a clear cut "you've been penalised" klaxon to sound, because people will figure out what you're penalising far too easily. But there needs to be some better penalty method in place.
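A detector of the sort this post claims is easy to write can indeed be rough-sketched with a few patterns. The regexes and the meta-refresh fallback here are illustrative assumptions about common 2002-era redirect tricks, not a claim about how any engine actually filters:

```python
import re

# Heuristic sketch of a JavaScript-redirect detector: flag pages whose
# <script> blocks assign to location (or call location.replace), or whose
# markup carries a meta refresh. Real cloakers vary the syntax endlessly;
# these patterns cover only the most common forms.
LOCATION_ASSIGN = re.compile(
    r"(?:window\.|document\.|top\.|self\.)?location(?:\.href)?\s*=", re.I)
LOCATION_REPLACE = re.compile(r"location\.replace\s*\(", re.I)
META_REFRESH = re.compile(r"<meta[^>]+http-equiv=[\"']?refresh", re.I)

def looks_like_js_redirect(html: str) -> bool:
    """Return True if the page appears to bounce the visitor on load."""
    for block in re.findall(r"<script[^>]*>(.*?)</script>", html, re.I | re.S):
        if LOCATION_ASSIGN.search(block) or LOCATION_REPLACE.search(block):
            return True
    # a meta refresh is the non-JavaScript cousin of the same trick
    return bool(META_REFRESH.search(html))
```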

Oh, and keep up the fun stuff too as on labs.google.com :)


 6:23 am on Jul 19, 2002 (gmt 0)

GG: I think the debates above about the algorithm are actually a tribute to its strength. The fact that different people are seeing so many different things as critical is surely an indication that all is relatively well in that area. If there was total agreement on the real details on here... that is what would be of concern... but there isn't. No-one knows, which of course is what you want.

Anyhow, back on topic. I mentioned 'Google the Portal' above and would just like to qualify that with respect to risk.

I believe that this is an opportunity to compete in the market space of MSN/etc (yes, I love people who will take those guys on!) and equally of course to grab a lot more revenue. But, and it is a big BUT, you must NOT risk brand/product contamination.

Google the Portal would have to be kept under a discrete tab as default and entirely separate from the main Google (apart from returns of course). This would protect the clean look/feel/etc of the main product, which is absolutely essential, as you are no doubt aware.

Generally, all things to all men? Well yes, but you have already started on that path to some degree. You just have to manage it correctly and not drop any clangers on the way. Easy then eh?


 6:23 am on Jul 19, 2002 (gmt 0)

After reading all the great suggestions in this thread, the only thing I would like to see is a post from Google that spells out what they've done or are planning to do with all the suggestions that we gave them the first time they asked this question. [webmasterworld.com]


 7:08 am on Jul 19, 2002 (gmt 0)

... not wanting to overdo this one, but I can assure you that in my area (travel), the keyword in domain name issue is a real, no doubt about it, PITA. I'll take your one 'easy for the punter' usage and raise you 100 'easy for the lazy and cynical scam artist' Napoleon.
As others say, these tend to be 1 or 2 page front-ends that share a common database and very little content. Really very little. They appear out of nowhere and DO dominate the results, or at least they can. I'm not advocating a blanket implied penalty, but the current (particularly the last 2 months) updates have over-weighted this aspect, and you'll have no chance of convincing me otherwise.

Re-reading the threads, I'm surprised that more people don't want to see more user feedback/reporting/rating. I know that there is a potential for abuse, but if the numbers are on a global scale then such manipulation would probably become irrelevant. You can gild the lily, you know.


 7:55 am on Jul 19, 2002 (gmt 0)

GoogleGuy, IMO Google's PR0 campaign took a lot of webmaster brainpower and pitted it against Google's objectives.

If webmasters could know for sure why they got penalties like this, and then have their sites fully recover by removing the offense, they would work to make sites that Google approves of rather than work to find new ways to load up the Google SERPs with sites that don't necessarily complement the Google search experience.

By now, many of the people that got the penalty are still in business and are doing well and are a whole lot smarter but maybe not in ways that Google would appreciate.

Webmasters who optimize their sites need not be Google's public enemy number one. Most of them will happily work with Google to remove any offenses if they know what they are and can have their sites recover by removing the offense. The end result would be web sites that conform to Google's idea of quality sites, and an environment of mutual support between webmasters and Google.


 9:21 am on Jul 19, 2002 (gmt 0)

>> not wanting to overdo this one<<

It's way overdone already. I'll tell you what, you might not like keywords in domains even if they do help the surfer (which they do), but I might not like:

Keywords in the meta-title, or correct keyword density, or keyword in anchortext, or keyword in description, or... etc etc

See the point? What's the difference? Whatever cocktail Google uses in its algorithm some people will have different parts of it in place, and really it's no good moaning about those that you don't happen to have yourself.

At least keywords in the domain are of help to surfers... which is a lot more than can be said for some of the above. I'll continue to use them simply because I build surfer friendly sites and I would be far from impressed if I was penalized in any way for doing so.


 9:31 am on Jul 19, 2002 (gmt 0)

I'll continue to use them simply because I build surfer friendly sites and I would be far from impressed if I was penalized in any way for doing so.


It's not about being penalised; there's just no logic in gaining a ranking benefit from it...

At least I want to spend my time usefully on one site, and not on maintaining 10 sites on the same subject just to keep up with the others...


 9:47 am on Jul 19, 2002 (gmt 0)

OK, this isn't the only point of this thread, but that's a pretty dogmatic gauntlet you're throwing down there.

You call it moaning. I call it not understanding why 2 month old sites - PR of 2 to 3, total pages typically under ten, content thin and largely mirrored, no ODP listing (and they'd never get one) - are number one above sites (plural - not just mine) of years' standing, with thousands of pages and PRs of 6 and 7.

You're not, I respectfully suggest, seeing the wood for the trees here. Friendly to surfers? In what way? If Google returns a good SERP page - as it always does - why on earth would a punter need to sift through the results to pick out the 'relevant-looking' domain names - the other SERPS are not to be trusted?

I build a surfer friendly site. I always have done. Why on earth would a keyword-laden domain name make it more so? Penalty, no; boost - why, in heaven's name? As vitaplease says, one constant, steady and updated site SHOULD be what it's about, not lots of mini-ones to keep up with the Joneses. Yours may not be, but most are.


 10:21 am on Jul 19, 2002 (gmt 0)

I didn't say I had stacks of sites... I don't and I agree that one big site is often better, but not always (there is actually a very strong case indeed for smaller focused sites in some disciplines).

I just said that in my opinion a meaningful domain name is easier for Joe Public to handle.

Let's face it, all the short ones went years ago, so we are left with names like blue-widgets-and-bricks or bluewidgetsandbricks.

I know which I like best and which others do as well. It's the one that you can read most easily. I also try to think about the real world... which is best to stick on the side of a lorry for example.

In terms of Google, whether it gives an advantage to sites like that is up to them (I don't think they do but we may disagree). The last thing it should do of course is apply any disadvantage though for guys who are simply trying to make the site meaningful in the real world by breaking the domain name up in a friendly way.

The bottom line I suppose is that I use this and you don't. It certainly doesn't make me a spammer (especially bearing in mind that I use it for the above reasons), any more than putting your keywords in your meta-tags makes you one.

I hope this explains things a bit better... I didn't mean to come across as a guy with a host of one page sites (yuk!).


 10:26 am on Jul 19, 2002 (gmt 0)

> What sort of things do you want
> to see Google doing next?

I'd like to see Google get back to what got them started. On the page stuff has crept too far for me.

Ads: 10 ads per page on some searches is too many. The top premium links are not marked clearly enough as sponsored ads; "sponsor" needs to be moved above or to the left of the premium ads. I bet 30-50% of the general populace don't recognize those as ads.

Junk in the results. I don't want pdfs, docs, or other filetypes unless I specifically ask for them. They are a nuisance to me.

Refinement Suggestions. Remove some of the ads and replace them with search refinement suggestions, aka the Google Labs thingo. Using Teoma is so much easier than using Google. Unfortunately, you can't find anything with Teoma's mini db. If they ever get the hardware end of things figured out, it would be a viable alternative.

Signup and Login. I'd like one login for all Google services. I want it to remember my prefs (Google side, not some cookie). It would also be nice if it had a "history" so that I could review previous searches.

I like the refinements made to the descriptions this last year. I find the odp listings/cats inserted into the listings helps sort out what and what not to click on.

Customized Display: ala AllTheWeb and WebmasterWorld :-)

I'd be willing to pay a monthly service fee ($10) to get the above in an ad free environment.

I'm mildly troubled by some of the stuff I've seen on Googles pages the last six months:

- If you are going to talk the talk about Google being a different company and counting the total number of initials behind employee names, then it's time to walk the walk. It doesn't matter what all the other sites on the net are doing; standards compliance is the cost of doing business on the web today. For Google not to validate 100% to currently agreed-upon standards is unprofessional.
- In Usenet, the "on focus" used on pages to force the browser window to the front, is a trick out of the porn industry. There's no room for such gimmicks on Google.
- Advanced search page. Although it has some powerful options, the numerous html errors make the page a mess in some browsers. There's nothing "advanced" about 125 html errors on a simple one-screen form page.

Someone needs to take a stand for standards compliance at Google. If Google really is a different company committed to doing business better, then there is simply no defensible argument for Google not to come into compliance with web standards. It's not the be-all and end-all, but it is important. It's time to get with the program on that issue.


 11:00 am on Jul 19, 2002 (gmt 0)

> What sort of things do you want
> to see Google doing next?
I would like Google to let us know, by email alert or by a function or even a tool maybe, when new pages APPEAR for a keyword/request (not when they are refreshed; we already have this info).
Also, I would like Google to tell us whether a result is a whole website or a page from within a site, using a different colour for example... That's what I would like :D

[edited by: Marcia at 10:13 pm (utc) on July 21, 2002]
[edit reason] notification removed per member request [/edit]


 11:23 am on Jul 19, 2002 (gmt 0)

I would like to see Google add an option to the advanced search: keyword + PR.

that would make finding links easier.



 12:06 pm on Jul 19, 2002 (gmt 0)

However with a clustering feature like Vivisimo's, people could look along the side and see entries for:

Orbit -
Sun around earth (19,942)
Earth around sun (1)

I think a clustering feature would be a useful addition to Google in order to group sites with similar information together and set off sites that may offer a unique perspective on a topic, even if those sites aren't the ones linked to by major institutions.


nice example and good point (Galileo).

It would also seem to give a chance of showing low-PR, non-frequently linked results (as many would like to see, see other postings). I guess this is Google's beginning thereof:


I am not so sure, however, that Google would be clever enough to list your Galileo example in the clustering process. Amongst others, Google's glossary seems to take the clustering choices from commonly occurring word groups in the glossary definitions, combined with frequently used multi-word search queries that are related.

Example for Orbit: "Geosynchronous orbit" does not occur in the definitions of the underlying glossary, but does when searching for "orbit" in Overture suggestion tool.

The problem is, how would Google, or for that matter - anyone else - know about Galileo's non mainstream views if it is not mentioned in a dictionary/glossary/synonym, or if no-one lists it, links to it, or searches for it?


 12:45 pm on Jul 19, 2002 (gmt 0)

SebastianX made some good suggestions. Since the advent of PR0 it seems to me that those of us that run directories are vulnerable to unintentionally, or unknowingly linking to a bad actor and then getting cut off at the knees. We get lots of submissions and it is going to slow everything to a crawl if every site has to pass a Google-purity test.

Or that a domain will change hands and the new owner will be a badguy and we get caught in the crossfire. This whole issue is going to remain a burr under the saddle until it can be resolved.

Also, if Google is keeping a history of what webmasters and SEO's are doing that seems -- creepy. It should stop.

Clustering -

Some sort of topic clustering might be very interesting.

Ads on SERP's -

I have to agree with Brett. No more ads. The SERP's are starting to get cluttered.

my.Google - I don't see it. IMO it would be a mistake to start portalizing Google too much. Keep it lean and fast. :)


 1:22 pm on Jul 19, 2002 (gmt 0)


Just a historical sidenote on the Galileo example (which is a fine idea): Galileo's ideas were very widespread in learned circles after a short time (the newly invented telescopes made clear to every astronomer that his theory was the better one), so I don't think he would have had problems with Google's algorithm if people had been allowed to express their opinion freely. Many important (high PR ;)) people like Descartes thought Galileo was right, but did not have the guts to write it (quite understandable after it sent Bruno to the stake...).

So, I think the most important thing for Google in this case is not algorithms, but good business ethics: that means, PLEASE never make an agreement like Yahoo recently did with the gov of China!
Google has a very good reputation, it is one of the few companies that people still have much sympathy for, not only because they are the best in their field, but also because their management theory does not seem to be outright predator capitalism. So, be careful not to spoil it.

Besides that, I am one of those who think the PR0 behaviour sucks. I recently got a site set to PR0, and not because I did some fancy promotion thing with 10 domains. I simply added a webshop, needed session ids, and, to make it better crawlable by bots, I modified the scripts for bots - they did not get a session id in the url (they do not shop, anyway ;)). This immediately set the page to PR0, and the page is from a brick and mortar company that has been on the net since '99.
It is one thing to realise afterwards that you made an error and got penalised, but to be penalised forever (?) for a trick that was intended to make the page crawlable is quite another. Dynamic content is quite widespread now, and so are the tricks to make sites crawlable, or server-side tailoring to user-agents like good old NS4.
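The workaround this poster describes amounts to something like the sketch below, which is presumably why the penalty fired: the bot and the browser get different links for the same page. The bot list and the URL scheme here are made up for illustration:

```python
# Sketch of the trick that earned the PR0: serve session ids in URLs to
# browsers, but give known crawlers the same URLs without the id, so every
# crawl sees stable, session-free links. Bot substrings are illustrative.
KNOWN_BOTS = ("googlebot", "slurp", "scooter", "teoma")

def internal_link(path: str, session_id: str, user_agent: str) -> str:
    """Build an internal link, appending a session id only for human visitors."""
    if any(bot in user_agent.lower() for bot in KNOWN_BOTS):
        return path  # crawlers don't shop, so they get no session
    return f"{path}?sid={session_id}"
```

From the crawler's side this is indistinguishable from deliberate cloaking, even though the intent was only to keep session ids out of the index.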


 4:02 pm on Jul 19, 2002 (gmt 0)

Here are some things I wish the toolbar would do, more as a user than as a webmaster:

1. If I'm in Google Groups and I type something into the search box and press enter, I wish it would search in Google Groups and not default back to a regular Google search.

2. I wish the previous and next result buttons worked in Google Groups too

3. More advanced. I'd like something like the "search this site" button, except it would be "search all the pages this page links to" - a sort of search set on the fly. For example, someone has compiled links to patterns for making hats. I'm looking for a crochet pattern with a turtle on it. I'd like to be able to type "crochet turtle" and only search in the links on that page. Does that make sense?


 4:39 pm on Jul 19, 2002 (gmt 0)

I've been researching serps on google around the theme of weddings in the UK. I don't think I've ever seen such a load of junk results - a stack of awful sites delivered from databases with the purpose of achieving a good Google ranking over a zillion similar search phrases.

I would have thought Google would be able to spot a site that is clearly keyword laden to the nth degree... ?

I'm beginning to warm to the idea of Google automatically banning sites that contain a number of pointers to the fact that they are being run by SEO 'experts'.


 5:11 pm on Jul 19, 2002 (gmt 0)

For Google not to validate 100% to currently agreed upons standards is unprofessional.

Most of the validation errors are tricks Google uses to make the pages load faster. I think loading quickly is more important than validating, as long as the errors don't interfere with accessibility and searchability. Unlike other search engines, Google is committed to making its front page and serps uncluttered enough to load quickly, so Google can get away with not validating.

Keep in mind that it's possible to write "bad html" that validates, and conversely, non-validating html can be just as good accessibility-wise as the same html with the errors fixed. Validation is mostly useful as a tool to make sure your code is accessible and will be interpreted the same way by multiple browsers, and Google has the resources to test those things itself.

In Usenet, the "on focus" used on pages to force the browser window to the front, is a trick out of the porn industry. There's no room for such gimmicks on Google.

This is a bug in your browser. Google focuses the right-hand frame, not the window, but some buggy browsers interpret it as "focus the window" or "focus the window and then focus the frame within the window". It's similar to how some buggy browsers bring the [google.com...] front page to the front when the page tries to focus the textbox for you, but nobody complains about that.

The Google Groups feature makes life easier for users who scroll with the keyboard and (I haven't tested this) may make it much easier for blind users to read threads. The only way to make browser makers fix the bug is to trigger it on a site. Google is doing the web a great service.


 5:28 pm on Jul 19, 2002 (gmt 0)

> it's possible to write "bad html" that validates

It would be nice if Google would write "good HTML" that validates ;). If you look closely enough at Google's HTML, much of the 'invalid HTML for speed and bandwidth' argument fades away, IMO.

Google's not worse than search engines on average, indeed it's much better than average. Google is the leader in mindshare as well as technology, so it would be good to see them setting an example.


 7:06 pm on Jul 19, 2002 (gmt 0)

First of all - thanks for taking the time to ask for feedback, not many big companies bother with this.

1 - update the index every 2 weeks/twice a month.

2- I would like for Google to remain a search engine where the little guy who is willing to
(a) work hard,
(b) learn to write standards compliant and accessible code and
(c) create quality content
- but doesn't have a large budget can still compete in SERP's with large companies with a lot of money to buy lots of domains, acquire high PR just because they are big (big company, not big web site), etc.

3 - new sites vs. old ones: (and another point to add to MHes' reciprocal links list) - some sites have more incoming links because they have been around longer and got quite a few links a few years ago. Many of the sites linking to them have not updated their links pages since. This makes it more difficult for new sites to get established. While it may not be so bad for sites that have been around longer to get a slight boost in SERP or PR (especially if you happen to own a site that's been around a while ;) ), it can make it hard for new sites. How about a section (or link to a section) on the SERP's for sites recently added to the Google index? They would appear in the regular results also (but likely way down on the list since they wouldn't have had time to get links yet), and stay in the 'New Sites' list for a month or so to give them a chance to get more visitors, who, if they like the site's content may then link to them.

4 - While sites that consists of nothing but affiliate links are bad, I would hate to see sites with some affiliate links be penalized, as someone else suggested. There are many quality content sites that rely on a little bit of money they get from Amazon, etc. just to be able to keep their sites up and running.

5 - duplicate content and redirects - I know of a site which has 2 domains; both index pages are the same, and each redirects to another page (e.g. 'home.html') on its respective domain, and those pages are identical too. Any combination of 1 to all 4 of those pages appears in the top SERP positions (for the most relevant search terms), depending upon which index update it is (it changes each time, but sometimes all 4 pages take the top 4 spots). I thought redirects and duplicate content were frowned upon by Google, but it doesn't appear that way. Of course, no one wants to have someone else steal their content and then have their site be the one Google doesn't index; still, it would be nice to see something done about this.

Again, thanks GoogleGuy for taking the time to read the posts here and thanks to Brett for providing this forum.

[edited by: Trisha at 7:11 pm (utc) on July 19, 2002]


 7:07 pm on Jul 19, 2002 (gmt 0)

Next for Google!

A world map displayed in the browser.

You click on a country or region and Google builds a list of dates from pages that have information on that country.

You click on a date and Google builds a list of proper nouns for that country and date. The proper nouns would be cities or people or whatever.

You could then click a city or person for historical information about that city or person during the selected time period.

Bingo! Google becomes an instant online cross-reference history resource.


 7:08 pm on Jul 19, 2002 (gmt 0)

  1. news articles are listed instead of the news item they are reporting about
  2. your search results should be more (or less) grammar sensitive, i.e., it should find declensions/stems of words
  3. port the Google Toolbar to Opera/Mozilla/Linux/Mac OS
  4. show some newsgroup articles on regular SERPs or add a "search all media" (similar to Amazon) and group them by "The Web", "Groups", "Images", ...
  5. Use valid HTML
  6. Beneath each listing, name the page elements, e.g., (t)ext, (i)mages, (js), (j)ava, (f)lash, (d)html
  7. offer some refine options, e.g., if someone searches for "pop-up windows", offer refine for MSIE, Netscape, Opera
  8. send that Googlebot out more often
  9. offer hints, e.g., "don't forget to visit the city library" ;)
  10. extend your News search to other countries (hint: Germany ;))
  11. punish JavaScript links (if overdone) and count them as regular links
  12. put a date next to each search result

All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved