
Home / Forums Index / Google / Google SEO News and Discussion
Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

This 45 message thread spans 2 pages: 45 ( [1] 2 > >     
Notepad .TXT based website to rank in Google
No frills, no spills: text-only based website to rank in Google.
AlgorithmGuy
msg:3104683 - 10:39 pm on Oct 1, 2006 (gmt 0)

I want to create a .TXT (Notepad) purely text-based website with no frills and no spills. No images and no active links.

Maybe 50 to 100 pages created in Notepad only, to rank, get PageRank etc. Inbound and outbound links. Inactive links for navigation.

Any thoughts on such a website regarding server, navigation, ranking, PageRank etc?

Thanks for any help and ideas.

[edited by: engine at 1:23 pm (utc) on Oct. 2, 2006]

 

AlgorithmGuy
msg:3105013 - 9:22 am on Oct 2, 2006 (gmt 0)

Oh, I forgot. Also, will such a notepad based website get sandboxed in google?

malaiac
msg:3105028 - 9:45 am on Oct 2, 2006 (gmt 0)

Great idea would be to keep it offline.
After all, what's all the fuss about HyperText and this so-called world wide WEB?

pmkpmk
msg:3105032 - 9:54 am on Oct 2, 2006 (gmt 0)

What do you want to prove?

Anyway, I don't think links will be followed inside a .TXT document. You might want to have a look at [webmasterworld.com ].

photopassjapan
msg:3105041 - 10:11 am on Oct 2, 2006 (gmt 0)

When i saw this on d*****point forums i thought it was a joke.
Actually now i think it's some kind of a test... for the way people react to the post?

( Was it you over there too? No... no way. Still a funny coincidence. )

No, seriously i don't get it.
But putting it in context of your other messages...

AG... you make some good points on every thread, some really important ones too that I would probably never have heard about from others, but... I really don't get what you're after anymore. o.O

Enlighten me, what's behind the idea of the amount / wording / arguments of your posts? Is it a secret? Is this some kind of a sociology test? Are you a double agent? Are you designing a new-wave server OS? What is "your" ideal web like? Mine is a free, fast, and open Yahoo directory with a free, fast and ad-less Google covering the rest of the net :-P

Seriously though.
What's this all about.

AlgorithmGuy
msg:3105065 - 10:37 am on Oct 2, 2006 (gmt 0)

Great idea would be to keep it offline.

pmkpmk,

I want to get it on line, not off line.

I don't want to create a site that contradicts or imposes on deepcrawl bots or google's algo.

Yes it will contain links to images too, but you will have to copy paste the link to see the image. I do also want images to rank in google's image search so that the text based notepad website gets its fair share of visibility in search results.

This text based site would be elixir to google. Since no code would be present. Super fast loading and textual content only. That is what the bot sees. I don't want to cloud its vision. Nor the end users.

A no frills no spills website is what I want to create.

Photopassjapan,

I'm asking for assistance, not contradictions or questioning. I'm not questioning your website, as a matter of fact I could ask you the same question as to what on earth are you trying to prove with your site.

[edited by: AlgorithmGuy at 10:41 am (utc) on Oct. 2, 2006]

[edited by: engine at 1:23 pm (utc) on Oct. 2, 2006]

pmkpmk
msg:3105071 - 10:38 am on Oct 2, 2006 (gmt 0)

Sounds like extreme-retro or an art-project (or both).

I never saw a textfile ranking though, except if you use the "filetype:txt" operator.

trillianjedi
msg:3105076 - 10:42 am on Oct 2, 2006 (gmt 0)

Nor the end users.

How will the end users navigate?

AlgorithmGuy
msg:3105079 - 10:47 am on Oct 2, 2006 (gmt 0)

I never saw a textfile ranking though, except if you use the "filetype:txt" operator.

pmkpmk,

There are many PageRank 7 or 8 text pages.

There is absolutely no difference in PageRank allocation by Google. If a .txt page is relevant, Google will return it in results.

What I want to do is to gather any knowledge available to make it work. Possibly to filter pagerank from it to where I want etc. But I need to make sure on how to get it to work and how best to get the pages to be seen with title and description.

[edited by: engine at 1:22 pm (utc) on Oct. 2, 2006]

photopassjapan
msg:3105083 - 10:47 am on Oct 2, 2006 (gmt 0)

Me? With my site?

My existence >:-)

Seriously though.
Please... pretty please this time do answer my question...
I'm curious to insanity to know what this all is about.

I'm not trying to flame you, i just want to see the whole picture.

AlgorithmGuy
msg:3105084 - 10:50 am on Oct 2, 2006 (gmt 0)

How will the end users navigate?

trillianjedi,

Copy paste only.

I also want to submit this text-only website to DMOZ, since it should meet the directory's "originality criterion" and possibly their unique content requirement as well, in order to be listed and benefit from a link from the directory.

The only thing functional or interactive would be your browser's own small-to-large text-size setting.

[edited by: AlgorithmGuy at 10:52 am (utc) on Oct. 2, 2006]

[edited by: engine at 1:22 pm (utc) on Oct. 2, 2006]

trillianjedi
msg:3105086 - 10:55 am on Oct 2, 2006 (gmt 0)

PageRank, which is based solely on link structure, will get affected in so much as you're altering standard link structure. You'll have inbound links only.

How that gets viewed by the SE's will probably come down to volume and quality of inbounds.

You will of course need to get external inbound links to every page you create.

Ranking could be affected quite considerably in the more intelligent engines. The site sounds like a nightmare to navigate around, so taking the ultimate SE end game, to rank pages in a human-like manner, they would be right to tank the pages to the bottom.

Ultimately you're building pages for SE's rather than for users - that tends to backfire eventually, although can work in the short term.

Personally I think you're wasting your time, but it would make for an interesting experiment and I'd be very interested to hear how it works out.

TJ

AlgorithmGuy
msg:3105088 - 10:57 am on Oct 2, 2006 (gmt 0)

Please... pretty please this time do answer my question...
I'm curious to insanity to know what this all is about.

Photopassjapan,

I wrote a small book about a project and I want to create a website to reflect the book as near as possible.

How I present my website is entirely up to me. But I need some help from anybody who has done this before to iron out search engine and server issues.

[edited by: engine at 1:21 pm (utc) on Oct. 2, 2006]

AlgorithmGuy
msg:3105097 - 11:04 am on Oct 2, 2006 (gmt 0)

Ranking could be affected quite considerably in the more intelligent engines. The site sounds like a nightmare to navigate around, so taking the ultimate SE end game, to rank pages in a human-like manner, they would be right to tank the pages to the bottom.

Ultimately you're building pages for SE's rather than for users - that tends to backfire eventually, although can work in the short term.

trillianjedi,

I will build it for my end users, in the process, yes, a bonus byproduct could be to rank it in search results.

Does quality of information warrant search engines to tank a website to the bottom?

I thought Google sees only what it wants to see. TEXT! Its bots don't have eyes or ears.

[edited by: engine at 1:21 pm (utc) on Oct. 2, 2006]

jetteroheller
msg:3105102 - 11:17 am on Oct 2, 2006 (gmt 0)

In the year 2000, I detected that Google sends visitors to the text files of my local search engine.

The text files were like this:

http://example.com/admin/search/a.txt
http://example.com/admin/search/b.txt
...
http://example.com/admin/search/z.txt

Each file looked like:

searchword=123456abcdef123456abdcedf (codes on what page is the word)
searchword2=123456abcdef123456abdcedf

After the incident, I moved the admin folder out of the http access area at my server.
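For comparison, the exposure above could also have been narrowed with a robots.txt rule (a sketch only, reusing the hypothetical /admin/ path from the example URLs):

```
User-agent: *
Disallow: /admin/
```

Note that robots.txt only asks well-behaved crawlers to stay out; moving the folder out of the HTTP-accessible area, as done above, is the real fix, since anything still reachable by URL can be fetched by anyone who knows the path.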

trillianjedi
msg:3105103 - 11:18 am on Oct 2, 2006 (gmt 0)

I will build it for my end users

By making navigation difficult? Are you trying to convince me, or the SE's?

Does quality of information warrant search engines to tank a website to the bottom?

In some circumstances, yes. Quality content on its own is not enough.

I thought google sees only what it wants to see. TEXT!

Why do you think all SE's want to see is text? Links are the lifeblood of the search engine - they want to see content and links.

TJ

AlgorithmGuy
msg:3105106 - 11:24 am on Oct 2, 2006 (gmt 0)

jetteroheller,

You sound like you can help.

I want my root page to be index.txt

If I make sure my host provider is aware of this, and the server is configured to respond accordingly, will all agents and crawlers be served the root text index?

Since all .txt pages will have non active links, is there a way to name a link like an anchor link?

Thanks

[edited by: engine at 1:21 pm (utc) on Oct. 2, 2006]
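For what it's worth, on a 2006-era Apache server the index.txt idea would be a small configuration change. A sketch, assuming Apache; the directives are standard mod_dir/mod_mime, but whether a shared host allows them in .htaccess is host-specific:

```apache
# Serve index.txt when a bare directory URL is requested
DirectoryIndex index.txt

# Ensure .txt goes out as text/plain so browsers show it verbatim
AddType text/plain .txt
```

With this in place, a request for / would return index.txt with a text/plain Content-Type to browsers and crawlers alike.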

AlgorithmGuy
msg:3105112 - 11:26 am on Oct 2, 2006 (gmt 0)

By making navigation difficult? Are you trying to convince me, or the SE's?

trillianjedi,

I see websites every day where I am totally confused about the navigation. Robots end up dizzy in those websites.

Mine will be easy. User friendly non active links to copy and paste.

[edited by: engine at 1:20 pm (utc) on Oct. 2, 2006]

kaled
msg:3105114 - 11:27 am on Oct 2, 2006 (gmt 0)

I will build it for my end users, in the process, yes, a bonus byproduct could be to rank it in search results.

1) You can use CSS to make it look like text and use minimal markup.
2) If you're building it for users, they won't appreciate having to cut and paste links.

So, let's be honest, if you proceed, it will be nothing more than an experiment. Presumably, if successful in SEO terms you plan to use cloaking to deliver plain text to Google while delivering rich text (html) to users. In a twisted sort of way, this might not actually break Google guidelines since the content would be essentially the same, but I'm sure opinions would differ on that.

Kaled.
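kaled's first point could look something like this minimal page: a sketch only, with made-up filenames, showing HTML styled to read like a raw text file while keeping real, clickable links:

```html
<!DOCTYPE html>
<html>
<head>
<title>No frills, no spills</title>
<style>
  /* Mimic a browser's plain-text rendering: monospace on a plain background */
  body { font-family: monospace; background: #ffffff; color: #000000; }
  a { color: inherit; }  /* links blend in visually but stay functional */
</style>
</head>
<body>
<pre>
No frills, no spills. Just good reading.

Next page: <a href="page2.html">page2.html</a>
</pre>
</body>
</html>
```

This keeps the markup overhead tiny while giving users and crawlers working hyperlinks, which is the trade-off kaled is pointing at.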

photopassjapan
msg:3105118 - 11:28 am on Oct 2, 2006 (gmt 0)

But users are used to HTML by now... who's your target audience?
Also, if the files are in HTML extension yet you use NO HTML tags, not even the document header, it will look the same, no? Oh wait... no it doesn't.

The font type will be different and the background will be the default of the browser for HTML pages, not grey, right?

What kind of a project is it?
Where you can be this sure people will actually LIKE the txt site compared to a very simple HTML site with nothing but text ( and perhaps links )?

Naturalists on the net...?
What the net would be like in the world of "1984" or "Brazil"?

I like the idea somewhere... it's... a great utopia.
Perhaps a new trend.

AlgorithmGuy
msg:3105124 - 11:35 am on Oct 2, 2006 (gmt 0)

Presumably, if successful in SEO terms you plan to use cloaking to deliver plain text to Google while delivering rich text (html) to users

Kaled,

Yes, you are correct. I would consider all options. I will also consider doing temporary redirects from it, targeting unethical webmasters who may try to duplicate the site. Via server-side and proxies. Scripts being in the text pages pointing to them.

I don't want this notepad site to be affected by canonical issues.

I would not call it cloaking however. More like original content.

If my end users are happy, why not exploit all possibilities too.

Based on my discussions with top DMOZ editors, my proposed site will far exceed their editorial guidelines.

[edited by: AlgorithmGuy at 11:36 am (utc) on Oct. 2, 2006]

[edited by: engine at 1:20 pm (utc) on Oct. 2, 2006]

AlgorithmGuy
msg:3105134 - 11:42 am on Oct 2, 2006 (gmt 0)

The font type will be different and the background will be the default of the browser for HTML pages, not grey, right?

Photopassjapan,

If you call up Google's robots.txt, or any .txt page, there is absolutely no code.

Whichever browser you use, only the Notepad-created site's contents will be revealed.

It is super efficient when compared to a html page.

Besides, you might use empty pages and dynamically produce content into them upon request. Mine will be "what you see is what you get". No frills and no spills. Just good reading.

[edited by: engine at 1:20 pm (utc) on Oct. 2, 2006]

AlgorithmGuy
msg:3105144 - 11:48 am on Oct 2, 2006 (gmt 0)

1) You can use CSS to make it look like text and use minimal markup.

Kaled,

This is interesting.

But I don't want to use html at all. Just .txt extensions. Website will be produced in notepad. If it is possible to influence a page aesthetically via css, then obviously I would consider it.

I can see what you mean, since the address of the CSS file could be contained in the .txt file. I'm not sure if that will work though. Or indeed a floating image layered on the .txt page. The pages will have only .txt extensions.

[edited by: AlgorithmGuy at 11:48 am (utc) on Oct. 2, 2006]

[edited by: engine at 1:19 pm (utc) on Oct. 2, 2006]

Patrick Taylor
msg:3105151 - 11:56 am on Oct 2, 2006 (gmt 0)

As far as I know, if you type markup into a .txt file, all you will see is the typed markup. It won't function as markup. So no stylesheet, no title, no headings, no hyperlinks.

I can't see this being much of a success.
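Patrick's point comes down to the MIME type: a .txt file is conventionally served as text/plain, so browsers display any markup typed into it literally rather than rendering it. Python's standard mimetypes module mirrors the usual server extension-to-type mapping:

```python
import mimetypes

# .txt maps to text/plain: browsers show tags literally, so no CSS,
# no title, no headings, no hyperlinks will function.
print(mimetypes.guess_type("index.txt")[0])   # text/plain

# .html maps to text/html: the same bytes would be parsed as markup.
print(mimetypes.guess_type("index.html")[0])  # text/html
```

So the stylesheet reference AlgorithmGuy is considering would simply appear on screen as text, exactly as Patrick says.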

photopassjapan
msg:3105154 - 11:59 am on Oct 2, 2006 (gmt 0)

<---- still doesn't get it though

But can a server be set to default the index to a text file? I mean aren't all servers made to serve the index with an HTML response for the directory index even if the extension is different?

Besides you could have a 301 redirect to the txt file on an HTML index page. Sooner or later Google will pick it up as THE index in my opinion.

As for the
good reading
part... that's a matter of design, which can be dealt with by much more effective... make that... more widely accepted methods.

A fun experiment though, but apart from the SE impact I don't get the motive. Back to retro teletext?

You could also start lobbying for .txt domains.

AG, as a person who is so insightful on servers, sites and SEs ( you have proven this on other threads ) you probably know that this isn't possible unless you have your own custom made server OS. And perhaps a custom browser? If there was a method to have pages with a content vs. code ratio of 100% i think we'd have seen it somewhere ranking as top 10.

I'm not sure but my guess is that it's a major factor. As is the text vs. text with links ratio. Or at least it should be.
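photopassjapan's 301 idea above could be sketched in one Apache mod_alias line (an assumption that the host runs Apache and permits .htaccess; the filenames are hypothetical):

```apache
# Permanently redirect the HTML index to the plain-text one,
# so engines eventually treat index.txt as THE index
Redirect permanent /index.html /index.txt
```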

trillianjedi
msg:3105155 - 12:00 pm on Oct 2, 2006 (gmt 0)

User friendly non active links to copy and paste.

That's not as user-friendly as the user just being able to click on it, though, is it?

And how does a blind or partially-sighted person with a screen reader get through your navigation?

Links are a fundamental fabric of the web, used by search engines spiders and users alike. You're looking to break something which, in design accessibility/usability terms, shouldn't be broken.

It'll be an interesting experiment.

TJ

AlgorithmGuy
msg:3105168 - 12:07 pm on Oct 2, 2006 (gmt 0)

But users are used to HTML by now... who's your target audience?

Photopassjapan,

My target audience will be readers. With nimble fingers quick on keyboards.

Especially those looking for the content that relates to my proposed website. If my site contains exactly what they are looking for, I'd want it to rank accordingly.

Search engines are supposed to do just that. End users might look for it at DMOZ, I'd want it to be listed there too. Especially for the benefit of a link from DMOZ. That is very important and I am confident that when I present the website for inclusion, the editor will pass the site for listing.

This way, at least a title and description recognised by google will have propagated the txt based site externally. And from an ideal source.

[edited by: AlgorithmGuy at 12:07 pm (utc) on Oct. 2, 2006]

[edited by: engine at 1:18 pm (utc) on Oct. 2, 2006]

AlgorithmGuy
msg:3105177 - 12:13 pm on Oct 2, 2006 (gmt 0)

And how does a blind/partially-sited person with a screen reader get through your navigation?

trillianjedi,

I can assure you, no partially sighted or fully sighted person will see the site until "after" blind algos and blind crawlers have determined where in the search results the site will be listed.

I don't want to hinder the site's destiny in Google.

If it ranks highly, I will consider accessibility requirements. Possibly even the LONGDESC attribute.

[edited by: engine at 1:18 pm (utc) on Oct. 2, 2006]

trillianjedi
msg:3105188 - 12:17 pm on Oct 2, 2006 (gmt 0)

OK, so basically the SE's become your navigation?

Well, fair enough, I'll be interested to hear the results of your experiment.

TJ

AlgorithmGuy
msg:3105189 - 12:19 pm on Oct 2, 2006 (gmt 0)

I can't see this being much of a success.

Patrick,

Success will be determined by useful content for my end users. If their vote is positive and they found what they were looking for, then it will be a great success.

If ranking it highly is possible, that is a bonus byproduct only.

[edited by: engine at 1:17 pm (utc) on Oct. 2, 2006]


All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved