Is Google ever going to provide an incentive for people to build sites/pages that are web standards-compliant and accessible?
Is the new Google algo located behind the login at base.google.com? Seriously, I'd love to hear more about what Google's up to with "base".
| Why are scraper sites using our content in the top spot on the first page for a search on our copyright statement, while our own site is found five pages deep? |
-->I second this!
I think Google, Yahoo, MSN and the others need to time-stamp content or add a copyright field into the mix. If they can date links to a site and detect duplicate content across multiple domains, then they should penalize the copies that arrive second and third.
This will of course also devalue article building as a linking strategy, but content protection is important, and this update appears to have a problem with duplicate content.
Perhaps the W3C could create a new tag that marks material as copyrighted. Or G could place time-based priority on the meta copyright tag, promote its use, and accept it as a timing factor. This would also save them legal expenses.
Going the legal way is extremely time consuming and expensive for all parties involved.
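For what it's worth, there is already a conventional (though non-standard and rarely honoured) "copyright" meta name that could carry a date along the lines suggested above. A sketch; the content wording and date format here are invented for illustration, not any spec:

```html
<head>
  <!-- "copyright" is a conventional but non-standard meta name;
       the date shown is a hypothetical first-publication stamp. -->
  <meta name="copyright" content="Copyright 2005 Example Corp. First published 2005-11-14">
</head>
```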
What is the REAL behaviour of Googlebot and the index with secured (https://) protocol pages?
Matt is a Google employee (and shareholder?) who mixes occasional insights with Google's marketing information. Ask him how we can tell which hat he is wearing.
Please forgive me if I'm wrong on the following logic:
From my review of the Sitemaps statistics page, you (Google) seem to be attempting to open files on my website before checking the robots.txt file.
I add Disallow entries to robots.txt precisely so that you will not attempt to open those files or touch them in any way.
Am I wrong about this, or are you doing these operations backwards?
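To spell out the behaviour the poster expects: a well-behaved crawler fetches and parses robots.txt before requesting any other URL on the host, and only then fetches pages the rules allow. A minimal sketch using Python's standard `urllib.robotparser`; the robots.txt rules and paths below are invented for illustration:

```python
from urllib.robotparser import RobotFileParser

# Invented robots.txt content, purely for illustration.
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# A compliant crawler consults the parsed rules BEFORE requesting a URL,
# not after -- the order the poster expected.
print(parser.can_fetch("Googlebot", "/private/report.html"))  # False
print(parser.can_fetch("Googlebot", "/public/index.html"))    # True
```

Since "Googlebot" has no entry of its own in these rules, it falls back to the `User-agent: *` block, so `/private/` is off limits.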
How about this... does using AdSense or AdWords on a site REALLY make a difference in positioning? Maybe he'll whisper the truth in your ear...
Are reciprocal links dead?
What is the largest pain for their algo engineers these days?
How can we as webmasters help G flush out SPAM? A lot of us make a living from the Internet; i.e., we supply no service other than SEARCH itself, and we truly want to "partner" with G in a WIN-WIN scenario for the greater good of all internet users.
This was the mission statement I put together over a year ago. From then to now, I am the only person still working on the sites the company owns; i.e., from 40 people at that time down to just myself today.
XXX is an information provider on the Internet. Its intention is to strive to provide the most informative web pages on topics searched by the “Internet” population. We strive to answer a “search phrase” as though it were a “question”, firstly by identifying what possible questions the search phrase may be trying to ask, and then by answering all of these “questions” with accurate, precise content and links on the web pages we produce.
This is our primary focus, similar to Google’s primary focus when it started providing a search service. Google’s primary focus was never to provide search so that it could “cash in” on advertising; that came later as a spin-off. In other words, our primary focus is not to produce “waffle” content on web pages purely to chase advertising revenue. Our primary focus is to produce quality information and, as a spin-off, benefit from advertising revenue. If this does not remain our primary focus, we face the very real threat of not achieving our goals; i.e., if we write garbage, it’s only a matter of time before it is discarded.
Are those really his teeth?
A suggestion, and a bright idea or what? Google supplies us trackable links that we put on our sites, placed in a position where they can be SEEN. The link states the following:
"If you have reached this page via Google Search and this page does not answer your requirements follow this link and submit the reasons why"
G includes in its algo a positive weight for pages that carry this link compared to pages that don't.
With such a mass of new products (Google Chat, Google Earth, Gmail, Google News and Google Desktop, to name a few), does Google intend to focus on being a search engine or to become an internet portal?
Of course he is. If you read the latest post about the j3 datacentre, it's a giveaway. He forgot to talk in the third person as usual, switched to first person, and mirrored a GG post.
Back on topic. Given your time over, would you go straight to a portal with bells and whistles, or keep to your core product of search, as you did for the initial years before branching out into your many current products?
I don't think so... how about: is Brin GG?
Do you ever see some borderline optimisation techniques and think, "hats off to you for initiative"?
Same old same old then.
Q. Is Google preparing for the next Black Monday by making search a secondary product?
Why do only a few googlebots use the HTTP/1.1 protocol six years after it was released? The majority are still using 1.0 which doesn't recognise a "410" gone error.
And when will it be standard to immediately remove pages from their index that are reported as 410s by the few bots using 1.1?
Right now, it seems there is no certain way of removing defunct pages permanently.
I also wonder what good it does for google to continue indexing "gone" pages.
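To illustrate the distinction being drawn here: 410 ("Gone") is defined in HTTP/1.1 (RFC 2616) and does not appear in HTTP/1.0 (RFC 1945); unlike a 404, it tells a client the page was removed deliberately and permanently, which is exactly the signal an index needs to drop it. A minimal sketch of a server answering 410 for removed pages, using Python's standard library; the paths are invented:

```python
from http import HTTPStatus
from http.server import BaseHTTPRequestHandler

# Invented set of permanently removed paths, for illustration only.
GONE_PATHS = {"/old-page.html"}

class GoneAwareHandler(BaseHTTPRequestHandler):
    # 410 Gone is an HTTP/1.1 status code; an HTTP/1.0 client has
    # no definition for it and would treat it as a generic error.
    protocol_version = "HTTP/1.1"

    def do_GET(self):
        if self.path in GONE_PATHS:
            # Unlike 404 ("not found right now"), 410 says the page
            # was removed on purpose and will not be coming back.
            self.send_error(HTTPStatus.GONE, "This page was permanently removed")
        else:
            body = b"still here"
            self.send_response(HTTPStatus.OK)
            self.send_header("Content-Type", "text/plain")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
```

A crawler speaking only HTTP/1.0 never sees the distinction, which is presumably why the poster wants the 1.1-capable bots to act on it.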
Why does Google not follow markup standards or, better, accessibility standards?
| How about introducing a location metatag, e.g. <meta name="location" content="UK">? |
Good idea for sites only aimed at certain countries.
Related to this, what is the best way to set up sites for different countries using the same language (e.g. English) while avoiding duplicate content issues, etc?
What are the most common *unintentional* errors that they see that hurt sites and how is the pilot project going to help sites identify mistakes and grey hat backfires?
Is the so-called "inception date" from Matt's patent of April 2005 a scoring factor in the current Jagger update (which would suit it well)? ;)
When should we expect to see google's first foray into hosting? If I were a betting man, I'd say the second half of 2006.
>> If I were a betting man, I'd say the second half of 2006.
Why? Every 15-year-old with a leased box is a host now. How is Google going to make money and provide service at $4-$7 a month?
Even with dedicated services, the margins are way too low for G to jump in.
Google is beginning to take faltering steps towards Microsoft's "Crush, Kill, Destroy" policy. Do you really think they are not considering this, along with becoming a domain registrar and getting into the lucrative payments business? They probably have plans to launch their own credit card, and perhaps even to purchase or start their own bank.
Google is cash-rich, has a good (if falling) reputation and a very widely known brand/logo. Diversification along all these lines is almost certain.
How are they going to use the inside information gathered by [google.com...] to influence the way they rank sites?
DMOZ Clones and Duplicate Content
Whilst it appears that Google ignores links from all the DMOZ clones, the question has to be asked: why are DMOZ clones indexed at all? Other than inflating page statistics, it is utterly pointless.
I made relatively minor changes to all my web pages and they were reindexed into the Supplemental index 8 months ago, where they remain. Traffic went down the tubes. Banishment to the Supplemental index is a harsh penalty, and the rules of engagement seem to be ill defined. I suggest Google either index pages into the general index or not index them at all.