| 6:32 pm on Mar 6, 2003 (gmt 0)|
You mentioned automated queries (eg, WebPosition). What exactly is that? Sounds like I want to stay away from it, but kinda hard to do if you don't know what it is!
| 6:35 pm on Mar 6, 2003 (gmt 0)|
These are pieces of software that check your positions for your keywords on the major search engines.
Google likes to keep its resources free for users and does not like bandwidth being consumed by WebPosition or similar programs.
| 7:10 pm on Mar 6, 2003 (gmt 0)|
So, what I'm getting is that it's ok to check your ranking by hand but not by software. Is that right? And what about all the sites like the Google Dance Tools that search www1, www2, and www3 of Google at the same time? Or the sites that will automatically look up all of your backlinks with multiple search engines? Do these come under automated inquiries?
| 7:44 pm on Mar 6, 2003 (gmt 0)|
>3) expired domains: Google will "soon" be filtering expired domains from its index and link calculations [no further elaboration]
I don't quite grok what the above means. An expired domain will naturally not resolve, so these are already removed from the next index because Googlebot won't be able to spider them. Or do they perhaps mean they will automatically remove them between index updates, as soon as they know a domain has expired? I'll presume the "link calculations" part means that if some page that hasn't been updated in many years has 10 links on it, and 9 are now long dead yet one still works, all the PR will be transferred to the one that does work.
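To make my own presumption concrete, here is a toy sketch (entirely my speculation, not anything Google has confirmed) of PR being split only among the outbound links that still resolve, so the surviving links each get a larger share:

```python
# Toy illustration of my guess about the "link calculations" remark:
# a page's passable PageRank is divided among its *live* outbound
# links only, so dead links stop diluting the share. The damping
# factor of 0.85 is the commonly cited value from the PageRank paper.

def pr_share_per_link(page_pr, links, is_alive, damping=0.85):
    """Return the PR passed to each outbound link that still resolves.

    page_pr  -- PageRank of the linking page
    links    -- list of outbound URLs on the page
    is_alive -- predicate saying whether a URL still resolves
    """
    live = [url for url in links if is_alive(url)]
    if not live:
        return {}
    share = damping * page_pr / len(live)
    return {url: share for url in live}

# 10 links, 9 dead: the one working link receives the full damped PR.
links = ["http://works.example"] + [f"http://dead{i}.example" for i in range(9)]
print(pr_share_per_link(1.0, links, lambda url: "dead" not in url))
```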
>4) when asked about the significance of ODP/dmoz listings to Google, Daniel replied that "links from directories that people still use" have significance to Google. [he did not expand on this. the question was about ODP specifically but he did not refer directly to ODP in his answer.]
Obviously Google will continue to count the ODP. Otherwise, they'd be disregarding links from their *own* directory, because that is just an ODP mirror. I wonder if the above means Google is planning to ignore, in the future, a lot of small directories that they don't consider important?
| 8:37 pm on Mar 6, 2003 (gmt 0)|
There are certainly enough editors of ODP who use it to consider it used.
Do you think they could be referring to directories that are never updated, or that have a large percentage of dead links?
| 8:40 pm on Mar 6, 2003 (gmt 0)|
Google must mean something like the latter BigDave. If Google thought the ODP wasn't used they wouldn't have a copy on their site.
| 12:04 am on Mar 7, 2003 (gmt 0)|
re expired domains:
This is pure speculation, but I took it to mean that they are addressing the problem of expired domains displaying, or being redirected to, irrelevant content. This wouldn't necessarily get detected in the crawl/update cycle, since there is still a site resolving at the domain, and there are still links to it.
They would have to tackle this algorithmically (I like the sound of that!). Perhaps they could compare link text/context and the actual content of the page, using domain expiration lists as a seed for this comparison. Or maybe there will be an anti-freshbot / death-bot, crawling expired-then-purchased domains looking for irrelevance?
As I said, Google didn't elaborate. This is just my own speculation.
| 9:40 pm on Mar 7, 2003 (gmt 0)|
|when asked about the significance of ODP/dmoz listings to Google, Daniel replied that "links from directories that people still use" have significance to Google. [he did not expand on this. the question was about ODP specifically but he did not refer directly to ODP in his answer.] |
That was actually my question (is it ok if I take credit?), and I am still kinda disappointed with the answer. Does that mean that if nobody is using DMOZ, we shouldn't really care about it?
| 9:49 pm on Mar 8, 2003 (gmt 0)|
RE: Blocking ranking programs.
What I got from the conference and talking with the search engine people, including the Google reps, was that you won't be blocked for using a ranking software unless you "abuse" it. I think what they meant was excessive searching on the index. The index doesn't update but once a month, so ranking your site more than once or twice could be considered abuse.
When I spoke with Daniel, he mentioned that users could sign up for the API program if they were worried about being blocked from the search engine. I didn't think many people would be interested in doing that. Plus, that still doesn't guarantee you won't get blocked.
If you're afraid of getting banned or blocked, you can use a dial-up account to rank your web site and never search for your url. Then you really don't have anything to worry about.
| 10:47 pm on Mar 8, 2003 (gmt 0)|
Fast uses the Macromedia SDK to index Flash sites. Since that basically just turns Flash into a crappy HTML page, I asked Tim Mayer about the potential for duplicate content problems for those who currently provide both an HTML and a Flash version of their site. In a nutshell, he said it could be a problem, so you should robots.txt one version of the site.
I thought this was pretty important since the duplicate content in this case is actually being generated by Fast, not the publisher.
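For what it's worth, blocking one version is just a couple of lines in robots.txt. Assuming the Flash version lives under a directory of its own (the path here is made up for the example), something like:

```
# robots.txt at the site root -- keep the HTML version crawlable
# and hide the Flash copy ("/flash/" is a hypothetical path)
User-agent: *
Disallow: /flash/
```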
| 8:12 am on Mar 9, 2003 (gmt 0)|
Kinda an aside here, but why was Boston chosen? Las Vegas would be a nice, convenient location. Great weather, too. Just my .02