Why are scraper sites using our content sitting in the top spot - first page - for a search on our copyright statement, while our own site is found five pages deep?
-->I second this!
I think Google, Yahoo, MSN and the others need to time-date content or add a copyright field into the mix. If they have the ability to date links to a site and find duplicate content on multiple domains, then they should penalize the content coming in a second and a third time.
This will of course also devalue article building as a linking strategy, but content protection is important, and this update appears to have a problem with duplicate content.
Perhaps the W3C can create a new tag that states material is copyrighted. Or G can place time-based priority on the meta copyright tag, promote its use, and accept it as a timing factor. This would also save them legal expenses.
Going the legal way is extremely time consuming and expensive for all parties involved.
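Nothing like this exists today; as a sketch of the idea, a time-stamped copyright declaration might look like the following (the `copyright-date` meta name and its date format are hypothetical, not anything any engine currently reads):

```
<!-- "copyright" is a recognised meta name; "copyright-date" is a
     hypothetical extension carrying a first-publication timestamp -->
<meta name="copyright" content="XXX Ltd">
<meta name="copyright-date" content="2005-11-14">
```

An engine that trusted such a tag could treat the earliest-dated copy as the original, though the tag would be trivial to forge, which is presumably why dating via crawl discovery is preferred.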
From my review of the Sitemaps statistics page, you (Google) seem to be attempting to open files on my website before checking the robots.txt file.
I add entries to robots.txt with Disallow so you will not attempt to open them or touch them in any way.
Am I wrong on this, or are you doing these operations backasswards?
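For reference, this is the kind of entry being described - the paths here are placeholders, not taken from the original post:

```
# Block all compliant crawlers from these directories;
# a crawler should fetch and honour this before requesting any page
User-agent: *
Disallow: /private/
Disallow: /cgi-bin/
```

A compliant crawler is expected to fetch robots.txt first and skip disallowed paths entirely, which is why attempts to open those files would suggest the check is happening in the wrong order.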
This was the mission statement I put together over a year ago. From then to now, I am the only person still working on the sites the company owns - i.e. 40 people at that time, just myself today.
XXX is an information provider on the Internet. Its intention is to provide the most informative web pages on the topics searched by the Internet population. We strive to answer a “search phrase” as though it were a “question”: first by identifying what questions the search phrase may be asking, then by answering all of those questions with accurate, precise content and links on the web pages we produce.
This primary focus is similar to Google’s when it started out providing a search service - Google’s primary focus was never to provide search so that it could “cash in” on advertising; that came later as a spin-off. In other words, our primary focus is not to produce “waffle” content purely to chase advertising revenue. It is to produce quality information, with advertising revenue as a spin-off benefit. If that does not remain our primary focus, we face the very real threat of not achieving our goals - i.e. if we write garbage, it is only a matter of time before it is discarded.
G includes in its algo a positive weight for pages that have this link compared to pages that don't.
Are you GG?
Of course he is. If you read the latest post about the j3 datacentre, it's a giveaway. He forgot to talk in the third person as usual, switched to first person, and mirrored a GG post.
Back on topic. Given your time over, would you go straight to a portal with bells and whistles, or keep to your core product of search as you did for the initial years before branching out into your many current products?
I don't think so... how about: is Brin GG?
And when will it become standard to immediately remove pages from the index that are reported as 410s by the few bots using HTTP/1.1?
Right now, there seems to be no certain way of removing defunct pages permanently.
I also wonder what good it does Google to continue indexing "gone" pages.
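For context, a 410 (Gone) response - which, unlike a 404, asserts the page is permanently removed - can be served with Apache's mod_alias; the path here is just an example:

```
# Apache (mod_alias): answer requests for a removed page with 410 Gone
Redirect gone /old-page.html
```

The complaint above is that even when a bot receives this explicit 410, the page can linger in the index rather than being dropped promptly.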
How about introducing a location meta tag, e.g. <meta name="location" content="UK">?
Good idea for sites only aimed at certain countries.
Related to this, what is the best way to set up sites for different countries using the same language (e.g. English) while avoiding duplicate content issues, etc?
Why? Every 15-year-old with a leased box is a host now. How is Google going to make money and provide service at $4 - $7 a month prices?
Even with dedicated services, the margins are way too low for G to jump in.
Google is cash-rich, has a good (if falling) reputation and a very widely known brand/logo. Diversification along all these lines is almost certain.
Banishment to the Supplemental index is a harsh penalty, and the rules of engagement seem to be ill-defined.
I suggest Google either index pages into the general index or not index them at all.