


what defines a blog?

How do bots tell apart a blog from a standard site?



9:31 pm on Nov 10, 2005 (gmt 0)

10+ Year Member

It is easy for a human visitor to identify a blog.

But what do search spiders look for to tell apart a blog from a site?

What if I have a blog as a subsection of a site?

I've installed WordPress to run a blog inside my site, and I will do all the usual blog marketing techniques everybody knows about.

But since my blog will be an internal section of my site, will it lose importance in blog-specific search engines?




1:20 pm on Nov 11, 2005 (gmt 0)

10+ Year Member

You can learn the basics about blogs through this link: en.wikipedia.org/wiki/Weblog


1:49 pm on Nov 11, 2005 (gmt 0)

10+ Year Member

Thanks etechsupport for your reply. That's a very informative article.

Anyway, I guess I didn't explain myself correctly.

Since there is no special tag that says <HEY BOT, THIS IS A BLOG!>, I was wondering how a web bot tells apart a blog from a regular site.

Also, how do search engines and blog-specific search engines treat a blog when it is just one section of a site and not the whole site? Will this blog make it into the blogosphere (or whatever you call it)?



6:26 pm on Nov 13, 2005 (gmt 0)

10+ Year Member

I'd be interested in knowing this also.



6:35 pm on Nov 13, 2005 (gmt 0)

10+ Year Member

Do they treat blogs and standard sites differently?


6:44 pm on Nov 13, 2005 (gmt 0)

10+ Year Member

I don't know if they are treated differently in ranking terms in standard search engines.

But since there is a specific Google search engine for blogs, it seems they can differentiate blogs from regular sites.

Haven't you ever heard the phrase "Google loves blogs"?

Then they must have a way to know which one is a blog.



7:27 pm on Nov 13, 2005 (gmt 0)

WebmasterWorld Senior Member encyclo is a WebmasterWorld Top Contributor of All Time 10+ Year Member

Google doesn't love blogs, but it may love the things that blogs offer - lots of fresh daily content, interlinking and descriptive anchors, descriptive filenames, many pages... Blogs also use content syndication (RSS) and trackbacks/pingbacks, which are powerful, automated linking tools. The fact that many blog packages produce good-quality, lightweight, mostly valid markup, which increases the ratio of content to structure, means that the pages are easier to spider too.

Blog-specific search engines are often looking for RSS feeds, so make sure you are syndicating at least a summary of your blog articles.
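To make that concrete, a bare-bones RSS 2.0 feed that syndicates only a summary of each post could look something like this (the blog name, URLs and dates here are placeholders, not taken from any real site):

<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <link>http://www.example.com/blog/</link>
    <description>Posts from the blog section of example.com</description>
    <item>
      <title>First post</title>
      <link>http://www.example.com/blog/first-post/</link>
      <description>A short summary of the post goes in the description, not the full text.</description>
      <pubDate>Sun, 13 Nov 2005 18:00:00 GMT</pubDate>
    </item>
  </channel>
</rss>

A blog package such as WordPress will normally build a feed like this for you; just make sure your pages also advertise it with the usual <link rel="alternate" type="application/rss+xml" href="..."> tag in the head so the feed crawlers can discover it.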


7:58 pm on Nov 13, 2005 (gmt 0)

10+ Year Member

Thanks encyclo for your reply.

Yes, I'm aware that it's the standards blogs support and the freshness of their content that Google loves. It was just a silly way of saying the same thing.

But I still don't get how Google decides what to index in its blog search engine. Offering RSS feeds doesn't mean a site is a blog. I offer RSS feeds and I don't have a blog (yet).

Anyway, I guess it is not that important, and I should focus on making my future blog and my site as good as they can be for my visitors, and on following coding standards.

Eventually, that's all that matters. The rest is pure speculation.

Thanks again,



5:28 am on Nov 14, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

If an engine is looking for blog RSS feeds, it can key off unique identifiers such as:

<!-- generator="wordpress/1.5.2" -->

at the top of a WordPress feed.
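To illustrate (just a sketch - the exact version string and channel details will vary from one install to the next), the top of such a feed looks roughly like this:

<?xml version="1.0" encoding="UTF-8"?>
<!-- generator="wordpress/1.5.2" -->
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <link>http://www.example.com/blog/</link>
    ...
  </channel>
</rss>

A crawler only has to spot that comment (or the optional <generator> element that some packages put inside the channel) to work out which blog software produced the feed.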
