Welcome to WebmasterWorld
Forum Moderators: mademetop
My problem is that some Dutch versions of the international search engines use a different algorithm, which I have to figure out myself, because SEO is rather new in the Netherlands. The same may also be a BIG issue for some other foreign SEOs here at SE-World.
One thing I can do is test the engines by putting "nonsense words" in all the tags and places I can imagine, each on a different page. That gives me a nice information bank of which tags get indexed and which do not.
Then there are still a few other things that are hard to figure out, such as "link popularity". It demands a lot of testing and comparison, I guess....
I don't assume there is a straight one-way answer to this question, but your thoughts are very much appreciated. To be more specific: how do you test the search engines in an advanced way?
Tx in Advance!
How do people know? Part of it is volume. We do SEO on a huge chunk of sites for a couple of corporations that manage web hosting site farms. That gives us lots to look at, the biggest source being log files (the reason I wrote Traxis in the first place was that there wasn't a way to deal with all those logs in the manner we needed). That type of info can tell books' worth about search engines.
For example: don't look at a keyword list, but what would you think of "construction" as a keyword? Would it pull many hits? There are a few combinations of phrases on construction that can pull 2-3k referrals a day out of AltaVista alone. Yet "construction" isn't even in the top 500 on most keyword lists. That type of info comes only out of log files. Log files, log files, log files... there is gold in them log files.
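To make the idea concrete, here is a minimal sketch of mining referral phrases out of a combined-format access log. The engine names and query-string parameters are illustrative assumptions, not a complete map of the engines of the day:

```python
# Sketch: pull search-engine referral phrases out of a combined-format
# access log. ENGINES maps each engine (assumed) to the query-string
# parameter that carries the search terms.
import re
from collections import Counter
from urllib.parse import urlparse, parse_qs

# A combined-format line carries three quoted fields: request, referrer,
# user-agent. The referrer is the second-to-last quoted string.
QUOTED_RE = re.compile(r'"([^"]*)"')
ENGINES = {"altavista": "q", "google": "q", "yahoo": "p"}

def referral_phrases(log_lines):
    counts = Counter()
    for line in log_lines:
        quoted = QUOTED_RE.findall(line)
        if len(quoted) < 2:
            continue
        ref = urlparse(quoted[-2])  # referrer field
        for engine, param in ENGINES.items():
            if engine in ref.netloc:
                for phrase in parse_qs(ref.query).get(param, []):
                    counts[phrase.lower()] += 1
    return counts
```

Run over a few weeks of logs, `counts.most_common()` surfaces exactly the kind of multi-word phrases that never show up on a generic keyword list.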
Would you routinely give info like that away? After all, SEOs only have their knowledge to peddle.
This should keep you busy for a while - and it's a pretty safe assumption that your bootstrapping troubles will be a thing of the past real soon.
For example: don't look at a keyword list, but what would you think of "construction" as a keyword?
Indeed, log files are even better than custom-generated keyword lists from companies like ".....", because you can look at your own targeted audience/segment of the market. When the title & description appeal "only" to a certain group of customers, you'll be able to generate a very specific keyword list that best fits your audience, which is gold.
Read, read, read - learn the trade from scratch.
I am often asking myself what the heck I was doing for the last couple of months....
wasting time? What did I do?
thank goodness, that's what's going on. We all have to be so darn patient in the beginning -
You may find this funny or strange, but I simply wrote it up as I went along when posting here.
Hm, if you think it's worth a domain of its own, I guess I'll drop my next scheduled recreation phase and hurry up to build one.
Brett, what's the deal with Traxis - is it for sale? Could not find it anywhere on SEW.
Semi-newbie reply. I have started using FastStats. Simple install, then just point it to your log file. They have a demo on their site at [mach5.com....] Reasonable price. For me it beats searching for an IP in my log file, although sometimes I do fall back to my raw log file for certain things.
One caution: look at the numbers very closely before you or a client make decisions based on the reports. I've found some inconsistencies and written to FastStats about it.
Unless you really know what they mean exactly by a certain term, you can get some nasty surprises. For instance, if a site has a lot of AOL traffic (or any place else that uses a bank of proxy servers), the number of uniques on a report can be a pretty meaningless number.
I had one report that showed three times as many uniques as it did page views. Right.
Turns out that AOL proxy servers make hits from dynamically changing IP addresses. Worse than that, they often make six hits for each graphic file on a page, and each one gets counted as a unique by FastStats.
All that really happened is one user found the site on AOL Search and the AOL proxies decided to stash several versions of the page, one for each of several browsers, I understand.
So I learned to be sure to keep DNS lookup turned on and check the number of users reported under domains, not on the summary.
Also I usually export the reports to a .csv file and then play with that in Excel until I'm convinced it all makes sense.
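The de-duplication logic described above can be sketched in a few lines. This is an illustration of the idea, not FastStats' actual behavior: count page requests only (graphics inflate the totals), and collapse the rotating AOL proxy hostnames into a single bucket. The `.proxy.aol.com` suffix assumes DNS lookup was on when the log was written:

```python
# Sketch of proxy-aware unique counting: ignore per-graphic hits and fold
# AOL's rotating proxy hostnames into one "visitor" bucket.
PAGE_EXTENSIONS = (".html", ".htm", "/")

def count_uniques(records):
    """records: iterable of (hostname, requested_path) tuples."""
    uniques = set()
    for host, path in records:
        if not path.lower().endswith(PAGE_EXTENSIONS):
            continue  # skip graphic files that get fetched many times per view
        if host.endswith(".proxy.aol.com"):
            host = "proxy.aol.com"  # one AOL user, many proxy addresses
        uniques.add(host)
    return len(uniques)
```

It's crude (all of AOL becomes one "unique"), but it errs on the honest side, which beats a report showing three uniques per page view.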
I too have been through all the "major" log analysis packages but have found inconsistencies in nearly all. Firstly, WebTrends seemed to be the only one that uses cookies (pre-defined or custom) to identify unique visitors. With the plethora of transparent proxies and caches out there, the fact that "Via" headers are generally not logged, and network address translation not taken into account, I can only wonder how they arrive at their "unique" visitor stats.
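For what it's worth, cookie-based counting is easy to do yourself once the server logs a visitor-ID cookie. A minimal sketch, assuming a hypothetical `vid=...` cookie appears somewhere in each log line (the cookie name is my invention, not WebTrends'):

```python
# Sketch of cookie-based unique counting: proxies and NAT rotate IP
# addresses, but a visitor-ID cookie survives them. Assumes the server
# logs a (hypothetical) "vid=..." cookie field on each line.
import re

VID_RE = re.compile(r'\bvid=([A-Za-z0-9-]+)')

def uniques_by_cookie(log_lines):
    seen = set()
    for line in log_lines:
        m = VID_RE.search(line)
        if m:
            seen.add(m.group(1))  # one visitor per cookie ID, not per IP
    return len(seen)
```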
Anyway sorry for the essay guys. Just my two bits.
P.S. Has anyone experimented with the "Meter" header as laid out in RFC 2227, Simple Hit-Metering and Usage-Limiting for HTTP?
I have to wonder if AOL is selling the data. They are not a fiscally foolish bunch, and I'm sure there's an angle somewhere that balances the books.
To tell the truth, I don't really care for either program, since they both offer such a limited view of referrals. I need something that can dig down in there deeper. I can't use my own Traxis, because when you are talking 20-100 MB of logs, there is no way a Perl script can deal with that on all but the fastest and largest machines. So, I'm still looking for something that will deal with that kind of volume.
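One note on the size problem: the memory blowup usually comes from slurping the whole log at once, and a script that streams one line at a time keeps memory bounded by the size of its tallies, not the file. A minimal sketch (the log path is hypothetical; the field index assumes common log format):

```python
# Stream a large access log line by line instead of loading it whole;
# memory stays proportional to the tally, not the 20-100 MB file.
from collections import Counter

def top_pages(log_path, n=10):
    counts = Counter()
    with open(log_path, errors="replace") as log:
        for line in log:                 # one line in memory at a time
            parts = line.split()
            if len(parts) > 6:
                counts[parts[6]] += 1    # request path in common log format
    return counts.most_common(n)
```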