General Search Engine Marketing Issues Forum

    
How to test SEs in an advanced way?
VAL@Amsterdam




msg:245775
 2:12 am on May 31, 2000 (gmt 0)

What bothers me: here at SE-World, at ACWS, Planet Ocean and other SE forums, people know a lot about search engines. But how do they know?

My problem is that some Dutch versions of international SEs are using a different algo, which I have to figure out myself, because SEO is rather new in the Netherlands. The same may also be a BIG issue for some of the other foreign SEOs here at SE-World.

One thing I can do is test the engines by putting "nonsense words" in all the tags and places I can imagine, each on a different page. That gives me a nice information bank of which tags are getting indexed and which are not.
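
For what it's worth, here is a minimal sketch of that kind of test-page generator: one page per HTML placement, each carrying a unique made-up token, so you can later search each engine for the tokens and see which placements got indexed. The placement list and token scheme are just illustrative assumptions:

```python
# Minimal sketch of the nonsense-word test described above: one page per
# HTML placement, each carrying a unique made-up token. Searching each
# engine for the tokens later shows which placements actually get indexed.
# The placement list and token scheme are illustrative assumptions.

PLACEMENTS = {
    "title":     "<title>{token}</title>",
    "meta_desc": '<meta name="description" content="{token}">',
    "meta_kw":   '<meta name="keywords" content="{token}">',
    "h1":        "<h1>{token}</h1>",
    "alt_text":  '<img src="pixel.gif" alt="{token}">',
    "comment":   "<!-- {token} -->",
    "body_text": "<p>{token}</p>",
}

HEAD_PLACEMENTS = ("title", "meta_desc", "meta_kw")

def build_page(placement, token):
    """Return a bare-bones HTML page with the token in exactly one placement."""
    snippet = PLACEMENTS[placement].format(token=token)
    head = snippet if placement in HEAD_PLACEMENTS else "<title>test page</title>"
    body = "" if placement in HEAD_PLACEMENTS else snippet
    return "<html><head>{}</head><body>{}</body></html>".format(head, body)

if __name__ == "__main__":
    for i, placement in enumerate(PLACEMENTS):
        token = "zqxvolp{:02d}woz".format(i)   # made-up word, unlikely to collide
        with open("test_{}.html".format(placement), "w") as fh:
            fh.write(build_page(placement, token))
        print(placement, "->", token)
```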

Then there are still a few other things which are hard to figure out, such as "link popularity". It demands a lot of testing and comparison, I guess....

I don't assume there is a straight one-way answer to this question, but your thoughts are very much appreciated. To be more specific: "how to test the SEs in an advanced way".

Tx in Advance!

VAL. /^-^\

 

Brett_Tabke




msg:245776
 10:33 pm on May 31, 2000 (gmt 0)

Testing has never been harder. In the past you could throw up a page and see what it did, wait for a "big bite", then clone that page a zillion times. Not anymore - just too many factors at work.

How do people know? Part of it is volume. Myself, we do SEO on a huge chunk of sites for a couple of corporations that manage webhosting site farms. That gives us lots to look at. The biggest source being log files (the reason I wrote Traxis in the first place was that there wasn't a way to deal with all those logs in the manner we needed). That type of info can tell you books' worth about search engines.

For example: don't look at a keyword list, but what would you think of "construction" as a keyword? Would it pull very many hits? There are a few combinations of phrases on construction that can pull 2-3k referrals a day out of AltaVista alone. Yet "construction" isn't even in the top 500 on most KW lists. That type of info comes only out of log files. Log files, log files, log files... there is gold in them log files.
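
A rough sketch of that kind of log mining - pulling the actual search phrases out of the referrer field of an Apache combined log and counting them. The per-engine query parameter names (q, query, p, MT) are assumptions; swap in whatever your own referrer strings actually contain:

```python
# Rough sketch of mining search phrases out of an Apache "combined" log,
# instead of trusting a generic keyword list. The per-engine query
# parameter names (q, query, p, MT) are assumptions; adjust them to
# whatever your own referrer strings actually contain.
import collections
import re
from urllib.parse import urlsplit, parse_qs

# combined format: ... "request" status bytes "referer" "user-agent"
LOG_RE = re.compile(r'"[^"]*" \d{3} \S+ "(?P<referer>[^"]*)" "[^"]*"')

QUERY_PARAMS = ("q", "query", "p", "MT")   # assumed parameter names

def search_phrase(referer):
    """Return the search phrase from a search-engine referrer URL, if any."""
    query = parse_qs(urlsplit(referer).query)
    for param in QUERY_PARAMS:
        values = query.get(param)
        if values and values[0].strip():
            return values[0].strip().lower()
    return None

def top_phrases(log_path, n=20):
    counts = collections.Counter()
    with open(log_path, errors="replace") as fh:
        for line in fh:
            match = LOG_RE.search(line)
            if match:
                phrase = search_phrase(match.group("referer"))
                if phrase:
                    counts[phrase] += 1
    return counts.most_common(n)

if __name__ == "__main__":
    for phrase, hits in top_phrases("access.log"):
        print("{:6d}  {}".format(hits, phrase))
```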

Would you routinely give info like that away? After all, SEOs only have their knowledge to peddle.

scott




msg:245777
 1:14 pm on Jun 1, 2000 (gmt 0)

I can see why log files are great, but don't you have the "ole bootstrap problem" with that? If you aren't listed high in the SEs to begin with, then how can you be getting any KWs out of your log files since nobody can find you by entering KWs in the SE?
Scott

fantomaster




msg:245778
 4:47 pm on Jun 1, 2000 (gmt 0)

scott: Well, as Brett has pointed out elsewhere over and over again, it's "promote, promote, promote" all the way. Cut up your activities into small, easy-to-handle chunks.
First comes content (no point if you haven't any, is there?), optimizing your pages for the search engines, checking your HTML code, etc.
Next, submit to the engines.
Sign up with the BL program if your site qualifies.
Participate in forums like this one.
Promote on Usenet via your sig file if you can answer (or ask) questions in areas you are either proficient in or at least interested in.
Check your rankings regularly.
Also, your linkage.
Generate lots of fresh, useful content.
Keep your blatant marketing activities on economy drive (pardon the pun); be subtle about your promotion. (Far too little subtlety around in these blaring, excitement-driven times! People will notice and will favor less dumb hysteria and more openness and honesty.)
Admit to mistakes if you make them (as you're bound to), but don't cringe, and don't give the impression of being self-assertive or self-deprecating just for the heck of it. (Everybody's tired of everybody's ego but their own...)
Remember you are free to come and go as you please on the net.
Submit your new content.
Possibly resubmit your older content, depending on ranking and various other factors. (Introduce some minor changes before you do.)
Read, read, read - learn the trade from scratch.
Test out stuff - your mileage may vary immensely from the gurus' - every web site is different, or, at least, should be.
Link to lots of useful sites not directly competing with yours.
Request reciprocal links.
Create yet more, fresh content.
Submit it.
Contribute to mailing lists.
If you can, issue a newsletter of your own. Never mind if you only have yourself, your wife and your stepmother for subscribers - put it on site and submit it to the engines. (They simply love that sort of stuff currently!)
Check your logs daily.
Learn how to recognize search engine spiders (see the sketch at the end of this post).
Don't go for cloaking unless you really know what you're doing. It's risky, but it *can* be highly effective, if you play it right.
Check out all search engine generated hits to determine: a) your ranking, b) what people are actually searching for - you may be in for a surprise or two on that score.
Look at what your competitors are doing. Don't just copy them - BETTER them!
Create more domains and interlink them all.

This should keep you busy for a while - and it's a pretty safe assumption that your boot-strapping contention will be a thing of the past real soon.
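
For the "recognize search engine spiders" item above, a quick sketch that flags log hits whose user-agent contains a known crawler name. The signatures (Googlebot, AltaVista's Scooter, Inktomi's Slurp, Alexa's ia_archiver) are just examples of well-known spiders, not an exhaustive list:

```python
# Quick sketch for the "recognize search engine spiders" item in the list
# above: flag log lines whose user-agent contains a known crawler name.
# The signatures below are examples of well-known spiders of the period
# (Googlebot, AltaVista's Scooter, Inktomi's Slurp, Alexa's ia_archiver),
# not an exhaustive list.
import re

SPIDER_SIGNATURES = {
    "Googlebot":   "Google",
    "Scooter":     "AltaVista",
    "Slurp":       "Inktomi",
    "ia_archiver": "Alexa",
}

UA_RE = re.compile(r'"([^"]*)"\s*$')   # user-agent is the last quoted field

def spider_hits(log_path):
    """Yield (engine, log line) for every hit that looks like a spider."""
    with open(log_path, errors="replace") as fh:
        for line in fh:
            match = UA_RE.search(line)
            if not match:
                continue
            user_agent = match.group(1)
            for signature, engine in SPIDER_SIGNATURES.items():
                if signature in user_agent:
                    yield engine, line.rstrip()
                    break

if __name__ == "__main__":
    for engine, line in spider_hits("access.log"):
        print(engine, "->", line[:80])
```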

VAL@Amsterdam




msg:245779
 7:34 pm on Jun 1, 2000 (gmt 0)

Great input, you all!

For example: don't look at a keyword list, but what would you think of "construction" as a keyword?

Indeed, log files are even better than custom-generated keyword lists from companies like ".....", because you can look at your own targeted audience / segment of the market. When the title & description appeal "only" to a certain group of customers, you'll be able to generate a very specific kind of keyword list which fits your audience best, and that is gold.

Read, read, read - learn the trade from scratch.

I often find myself asking what the heck I was doing for the last couple of months....
Wasting time? What did I do?
Oh...!
Thank goodness, that's what's going on. We all have to be so darn patient in the beginning -

Brett_Tabke




msg:245780
 4:47 pm on Jun 17, 2000 (gmt 0)

No kidding, that is a very good laundry list there, Ralph. I even killed a tree for it (printed it). Where are you hiding that kind of content on your own site? Take almost every line you wrote, put it on a separate page, add two paragraphs of explanation about it, and wow, that's enough for another site.

fantomaster




msg:245781
 9:15 pm on Jun 17, 2000 (gmt 0)

Thanks, Brett - as always, your comments are appreciated.

You may find this funny or strange, but I simply wrote it up as I went along when posting here.

Hm, if you think it's worth a domain of its own, I guess I'll drop my next scheduled recreation phase and hurry up to build one.

Pete




msg:245782
 2:03 pm on Jun 18, 2000 (gmt 0)

Have been looking for a decent log file analysis tool for some time. We have recently acquired our own server in the States. Moving the sites to the new server has already proved to be invaluable. When we were on a shared server managed and maintained by our host, we were having serious problems with tracking referrers.

Brett, what's the deal with Traxis - is it for sale? Could not find it anywhere on SEW.

Brett_Tabke




msg:245783
 8:40 pm on Jun 18, 2000 (gmt 0)

Traxis.

Biggest known issue: it only works with SSI. The image counter system will not work at this time.

GWJ




msg:245784
 8:30 pm on Jun 21, 2000 (gmt 0)

Pete:

Semi-newbie reply. I have started using Fast Stats. Simple install, then just point it to your log file. They have a demo on their site at [mach5.com....] Reasonable price. For me it beats searching an IP in my log file, although sometimes I do default back to my raw log file for certain things.

tedster




msg:245785
 9:24 pm on Jun 21, 2000 (gmt 0)

I've been using FastStats for the last few weeks as well. I like the filtering it offers, and it sure is cheaper than WebTrends.

One caution: look at the numbers very closely before you or a client make decisions based on the reports. I've found some inconsistencies and written to FastStats about it.

Unless you really know what they mean exactly by a certain term, you can get some nasty surprises. For instance, if a site has a lot of AOL traffic (or any place else that uses a bank of proxy servers), the number of uniques on a report can be a pretty meaningless number.

I had one report that showed three times as many uniques as it did page views. Right.

Turns out that AOL proxy servers make hits from dynamically changing IP addresses. Worse than that, they often make six hits for each graphic file on a page, and each one gets counted as a unique by FastStats.

All that really happened is one user found the site on AOL Search and the AOL proxies decided to stash several versions of the page, one for each of several browsers, I understand.

So I learned to be sure to keep DNS lookup turned on and check the number of users reported under domains, not on the summary.

Also I usually export the reports to a .csv file and then play with that in Excel until I'm convinced it all makes sense.
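
To make the proxy caveat concrete, here is a small sketch comparing uniques counted by raw IP against uniques grouped by reverse-DNS domain (the "keep DNS lookup turned on" tip). It assumes the client IP is the first field of each log line, as in the common Apache formats:

```python
# Small sketch of the caveat above: "uniques" counted by raw IP inflate
# badly behind proxy banks like AOL's, while grouping by the reverse-DNS
# domain (the "keep DNS lookup turned on" tip) is closer to reality.
# Assumes the client IP is the first field of each log line, as in the
# common Apache formats.
import socket

def reverse_domain(ip):
    """Resolve an IP and keep only the last two labels, so e.g.
    cache-dr05.proxy.aol.com and cache-dr07.proxy.aol.com both become aol.com."""
    try:
        host = socket.gethostbyaddr(ip)[0]
    except OSError:
        return ip                       # unresolvable: fall back to the bare IP
    return ".".join(host.split(".")[-2:])

def unique_counts(log_path):
    ips = set()
    with open(log_path, errors="replace") as fh:
        for line in fh:
            ips.add(line.split(" ", 1)[0])
    domains = {reverse_domain(ip) for ip in ips}   # resolve each unique IP once
    return len(ips), len(domains)

if __name__ == "__main__":
    by_ip, by_domain = unique_counts("access.log")
    print("uniques by raw IP:         ", by_ip)
    print("uniques by resolved domain:", by_domain)
```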

fantomaster




msg:245786
 11:27 pm on Jun 21, 2000 (gmt 0)

You are right, and AOL is just one blatant example. There must be dozens of ISPs resorting to this type of dynamic IP assignment now. Another major player (self-proclaimed world's #2) doing this is Deutsche Telekom.

Murrayson




msg:245787
 10:33 am on Jun 23, 2000 (gmt 0)

Hi all,
New here; I also eke out a living trying to fool the arachnids. Seems there is a wealth of knowledge right here. Just a few comments on the above message by tedster.

I too have been through all the "major" log analysis packages but have found inconsistencies in nearly all. Firstly, WebTrends seemed to be the only one that uses cookies (pre-defined or custom) to identify unique visitors. With the plethora of transparent proxies and caches out there, the fact that "Via" headers are (generally) not logged, and network address translation not being taken into account, I can only wonder at how the others get to their "unique" visitor stats.
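
As a rough illustration of the cookie approach - the cookie name and the WSGI framing here are my own assumptions, not a description of how any particular package does it - tag each browser once with a random ID and count distinct IDs instead of IPs:

```python
# Bare-bones sketch of the cookie approach to unique visitors: tag each
# browser once with a random ID and count distinct IDs instead of IP
# addresses, which proxies and NAT make unreliable. The cookie name
# "visitor_id" and the WSGI framing are illustrative assumptions, not a
# description of how any particular package does it.
import uuid
from http.cookies import SimpleCookie

def app(environ, start_response):
    cookies = SimpleCookie(environ.get("HTTP_COOKIE", ""))
    headers = [("Content-Type", "text/plain")]

    if "visitor_id" in cookies:
        visitor_id = cookies["visitor_id"].value      # returning browser
    else:
        visitor_id = uuid.uuid4().hex                 # first visit: issue an ID
        headers.append(("Set-Cookie",
                        "visitor_id={}; Path=/; Max-Age=31536000".format(visitor_id)))

    # Writing this ID next to the normal access-log line would let the
    # analyzer count distinct visitor_id values rather than distinct IPs.
    start_response("200 OK", headers)
    return ["your visitor id is {}\n".format(visitor_id).encode()]

if __name__ == "__main__":
    from wsgiref.simple_server import make_server
    make_server("", 8000, app).serve_forever()
```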

As regards the 6x pull of images, I wonder. The single major cost for ISPs is bandwidth. Proxies usually cascade, otherwise why use them? If this were the case, which it may well be, it does not make sense: they should GET a base copy and then pass that to the others to be repurposed (for other browsers). The hits, I presume, are coming from the fact that they are using client-side JavaScript to try to defeat the cache problems. All clients (including the proxies) will make the call. That's the downside of masking the script as an image.

Anyway sorry for the essay guys. Just my two bits.

P.S. Has anyone experimented with the "Meter" header as laid out in RFC 2227, Simple Hit-Metering and Usage-Limiting for HTTP?

tedster




msg:245788
 6:34 pm on Jun 23, 2000 (gmt 0)

The 6X caching puzzles me too, especially for images. And it's happening many times a day for the same files ... you'd think with the cost of bandwidth that once in a while would be enough, no matter what the reason for the hits.

I have to wonder if AOL is selling the data. They are not a fiscally foolish bunch, and I'm sure there's an angle somewhere that balances the books.


Brett_Tabke




msg:245789
 9:25 am on Jun 26, 2000 (gmt 0)

I hate to knock any logging software, but I really wonder about the value of Fast Stats. I've been trying to use it on a couple of large domains (100k a day) that I want to track in real time. As tedster said, some of the data is suspect. Comparing WebTrends and Fast Stats running over the same generic Apache log file is a bit of an eye opener. While WT will do a good job of filtering out dupes and flagging reloads, Fast Stats doesn't.

To tell the truth, I don't really care for either program, since they both offer such a limited view of referrals. I need something that can dig down in there deeper. I can't use my own Traxis, because when you are talking 20-100 meg of logs, there is no way a Perl script can deal with that on all but the fastest and largest machines. So I'm still looking for something that will deal with that kind of volume.
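
In that spirit, a bare-bones sketch of a streaming approach: read the log one line at a time so a 20-100 meg file never has to sit in memory, and drop obvious reloads (the same IP requesting the same URL again within a short window). The 30-second window and the field layout are assumptions tied to the common Apache log format:

```python
# Bare-bones sketch of a streaming filter: read the log one line at a
# time (so a 20-100 meg file never has to fit in memory) and drop obvious
# reloads, i.e. the same IP requesting the same URL again within a short
# window. The 30-second window and the field layout (common Apache log
# format) are assumptions for the sketch.
import re
from datetime import datetime

LINE_RE = re.compile(
    r'^(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] "\S+ (?P<url>\S+)')

RELOAD_WINDOW = 30   # seconds; arbitrary cutoff for calling a repeat hit a reload

def filter_reloads(log_path):
    """Yield log lines, skipping same-IP/same-URL repeats inside the window."""
    last_seen = {}                                   # (ip, url) -> last timestamp
    with open(log_path, errors="replace") as fh:
        for line in fh:
            match = LINE_RE.match(line)
            if not match:
                continue
            when = datetime.strptime(match.group("ts"), "%d/%b/%Y:%H:%M:%S %z")
            key = (match.group("ip"), match.group("url"))
            previous = last_seen.get(key)
            last_seen[key] = when
            if previous is not None and (when - previous).total_seconds() < RELOAD_WINDOW:
                continue                             # looks like a reload: drop it
            yield line.rstrip()

if __name__ == "__main__":
    kept = sum(1 for _ in filter_reloads("access.log"))
    print(kept, "hits kept after filtering reloads")
```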
