| 10:16 pm on Nov 11, 2005 (gmt 0)|
Is Google ever going to provide an incentive for people to build sites/pages that are web standards-compliant and accessible?
|West of Willamette|
| 10:18 pm on Nov 11, 2005 (gmt 0)|
Is the new Google algo located behind the login at base.google.com? Seriously, I'd love to hear more about what Google's up to with "base".
| 10:50 pm on Nov 11, 2005 (gmt 0)|
|Why are scraper sites using our content sitting in the top spot - first page - for a search on our copyright statement, while our site is found 5 pages deep? |
-->I second this!
I think Google, Yahoo, MSN and the others need to time-stamp content or add a copyright field into the mix. If they have the ability to date links to a site and to find duplicate content on multiple domains, then they should penalize the content that arrives a second or a third time.
This will of course also devalue article distribution as a linking strategy, but content protection is important, and this update appears to have a problem with duplicate content.
Perhaps the W3C can create a new tag that states "this is copyrighted material". Or Google can place time-based priority on the meta copyright tag, promote its use, and accept it as a timing factor. This would also save them legal expenses.
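For illustration, a page-level declaration along the lines the poster suggests might look like the snippet below. The "copyright" meta name is in informal use, but the date value and any ranking or timing weight are purely hypothetical; no engine is confirmed to read either for ranking.

```html
<!-- Hypothetical sketch of the poster's proposal; search engines are
     not confirmed to use either value as a ranking or timing signal. -->
<meta name="copyright" content="Copyright 2005 Example Ltd.">
<meta name="date" content="2005-11-11">
```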
Going the legal way is extremely time consuming and expensive for all parties involved.
| 10:53 pm on Nov 11, 2005 (gmt 0)|
What is the REAL behaviour of Googlebot and the index with secured [https://*] protocol pages?
| 11:17 pm on Nov 11, 2005 (gmt 0)|
Matt is a Google employee (and shareholder?) who gives a mixture of occasional insights and Google's marketing information. Ask him how we can tell which hat he is wearing?
| 11:20 pm on Nov 11, 2005 (gmt 0)|
Please forgive me if I'm wrong on the following logic:
From my review of the Sitemaps statistics page, you (Google) seem to be attempting to open files on my website before checking the robots.txt file.
I add entries to robots.txt with Disallow precisely so that you will not attempt to open or touch them in any way.
Am I wrong about this, or are you doing these operations backasswards?
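For reference, the robots exclusion convention the poster is relying on looks like this (the paths here are hypothetical). A compliant crawler is expected to fetch /robots.txt and honour these rules before requesting any of the listed URLs:

```
User-agent: Googlebot
Disallow: /private/
Disallow: /logs/

User-agent: *
Disallow: /private/
```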
| 11:47 pm on Nov 11, 2005 (gmt 0)|
How about this... does using AdSense or AdWords on a site REALLY make a difference in positioning? Maybe he'll whisper the truth in your ear...
| 12:02 am on Nov 12, 2005 (gmt 0)|
Are reciprocal links dead?
| 12:27 am on Nov 12, 2005 (gmt 0)|
What is the largest pain for their algo engineers these days?
| 1:34 am on Nov 12, 2005 (gmt 0)|
How can we as webmasters help Google in flushing out spam? A lot of us make a living from the Internet - i.e., we are not primary suppliers of any service other than SEARCH - and we truly want to "partner" with Google in a win-win scenario for the greater good of all Internet users.
This is the mission statement I put together over a year ago. From then to now, I am the only person still working on the sites the company owns - i.e., 40 people then, down to just myself today.
XXX is an information provider on the Internet. Its intention is to provide the most informative web pages on the topics searched by the “Internet” population. We strive to answer a “search phrase” as though it were a “question”: firstly by identifying what possible questions the search phrase may be trying to ask, and then, once identified, by providing answers to all these “questions” through accurate, precise content and links on the web pages we produce.
This is our primary focus, similar to Google’s primary focus when it started in providing a search service. Google’s primary focus was never to provide a search service so that it could “cash in” on advertising; that came later as a spin-off. In other words, our primary focus is not to produce “waffle” content on web pages purely to chase advertising revenue. Our primary focus is to produce quality information and, as a spin-off, to benefit from advertising revenue. If this does not remain our primary focus, we face the very real threat of not achieving our goals; i.e., if we write garbage, it is only a matter of time before it is discarded.
| 2:06 am on Nov 12, 2005 (gmt 0)|
Are those really his teeth?
| 2:15 am on Nov 12, 2005 (gmt 0)|
A suggestion. A bright idea or what? Google supplies us with trackable links that we can put on our sites; each must be placed in a position where it can be SEEN. The link states the following:
"If you have reached this page via Google Search and this page does not answer your requirements follow this link and submit the reasons why"
Google includes in its algo a positive weight for pages that have this link compared to pages that don't.
| 2:33 am on Nov 12, 2005 (gmt 0)|
With such a mass of new products (Google Chat, Google Earth, Gmail, Google News, Google Desktop, to name a few), does Google intend to focus on being a search engine, or to become an Internet portal?
| 4:03 am on Nov 12, 2005 (gmt 0)|
Of course he is. If you read the latest post about the j3 datacentre, it's a giveaway. He forgot to talk in the third person as usual, switched to first person, and mirrored a GG post.
Back on topic. Given your time over, would you go straight to a portal with bells and whistles, or keep to your core search product, as you did in the initial years, until branching out into your many current products?
I don't think so... how about: is Brin GG?
| 4:08 am on Nov 12, 2005 (gmt 0)|
Do you ever see some borderline optimisation techniques and think, "hats off to you for initiative!"?
| 4:31 am on Nov 12, 2005 (gmt 0)|
Same old same old then.
Q. Is Google preparing for the next Black Monday by making search a secondary product?
| 9:15 am on Nov 12, 2005 (gmt 0)|
Why do only a few googlebots use the HTTP/1.1 protocol, six years after it was released? The majority are still using 1.0, which doesn't recognise a "410 Gone" response.
And when will it become standard to immediately remove from the index pages that are reported as 410 by the few bots using 1.1?
Right now, there seems to be no certain way of removing defunct pages permanently.
I also wonder what good it does Google to continue indexing "gone" pages.
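On the server side, the removal the poster wants boils down to answering requests for defunct URLs with status 410 rather than 404. A minimal Python sketch of that mapping (the paths are hypothetical; in practice this is usually done in server configuration, e.g. Apache mod_alias's `Redirect gone` directive):

```python
from http import HTTPStatus

# Hypothetical set of permanently removed pages on a site.
RETIRED_PATHS = {"/old-page.html", "/discontinued/"}

def status_for(path: str) -> int:
    """Return 410 for permanently removed pages, else 200.

    410 ("Gone") tells an HTTP/1.1 client the resource was removed
    deliberately and permanently, unlike 404, which leaves it ambiguous.
    """
    if path in RETIRED_PATHS:
        return HTTPStatus.GONE   # 410
    return HTTPStatus.OK         # 200
```

An HTTP/1.0-era client that does not know 410 specifically should still treat it as a generic 4xx client error, so serving it is safe even for older bots.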
| 7:37 pm on Nov 12, 2005 (gmt 0)|
Why does Google not follow mark-up standards or better Accessibility standards?
| 5:56 pm on Nov 13, 2005 (gmt 0)|
|How about introducing a location metatag, e.g. <meta name="location" content="UK">, for example. |
Good idea for sites only aimed at certain countries.
Related to this, what is the best way to set up sites for different countries using the same language (e.g. English) while avoiding duplicate content issues, etc?
| 6:07 pm on Nov 13, 2005 (gmt 0)|
What are the most common *unintentional* errors that they see hurting sites, and how is the pilot project going to help sites identify mistakes and grey-hat backfires?
| 9:30 am on Nov 14, 2005 (gmt 0)|
Is the so-called "inception date" from Matt's April 2005 patent a scoring factor in the current Jagger update (which would suit it well)? ;)
| 12:08 pm on Nov 14, 2005 (gmt 0)|
When should we expect to see google's first foray into hosting? If I were a betting man, I'd say the second half of 2006.
| 5:57 pm on Nov 14, 2005 (gmt 0)|
>> If I were a betting man, I'd say the second half of 2006.
Why? Every 15-year-old with a leased box is a host now. How is Google going to make money and provide service at $4 - $7 a month prices?
Even with dedicated services, the margins are way too low for Google to jump in.
| 7:20 pm on Nov 14, 2005 (gmt 0)|
Google is beginning to take faltering steps towards Microsoft's "Crush, Kill, Destroy" policy. Do you really think they are not considering this, along with becoming a domain registrar, not to mention getting into the lucrative payments business? They probably have plans to launch their own credit card, if not to purchase or start their own bank.
Google is cash-rich and has a good (if falling) reputation and a very widely known brand/logo. Diversification along all these lines is almost certain.
| 9:28 pm on Nov 14, 2005 (gmt 0)|
How are they going to use the inside information gathered by [google.com...] to influence the way they rank sites?
| 11:24 am on Nov 15, 2005 (gmt 0)|
DMOZ Clones and Duplicate Content
Whilst it appears that Google ignores links from all the DMOZ clones, the question has to be asked: why are DMOZ clones indexed at all? Other than inflating page statistics, it is utterly pointless.
| 9:15 pm on Nov 15, 2005 (gmt 0)|
I made relatively minor changes to all my web pages, and they were reindexed into the Supplemental index 8 months ago, where they remain. Traffic went down the tubes. Banishment to the Supplemental index is a harsh penalty, and the rules of engagement seem ill-defined. I suggest Google either index pages into the general index or not index them at all.
| This 87-message thread spans 3 pages. |