How to index 10,000 pages - blocked for 3 years by robots.txt
venunath (5+ Year Member)
Msg#: 4029331 posted 7:25 am on Nov 21, 2009 (gmt 0)

I have 3 old domains with about 10,000 web pages.

For a few reasons I used robots.txt to disallow the search engines for six months, so they would not index the pages. I have now removed the disallow rule.

Now I want these 10,000 pages indexed by Google and the other search engines. I have submitted the site through Google Webmaster Tools and am doing directory submissions, press releases and SMO.

Can anyone suggest how to get the pages indexed ASAP?
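For reference, a site-wide block and its removal typically look like this in robots.txt (a sketch only - the actual rules used here aren't quoted in the thread):

# Old robots.txt - everything blocked:
User-agent: *
Disallow: /

# Current robots.txt - nothing blocked (an empty Disallow allows all crawling):
User-agent: *
Disallow: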

 

tedster (WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member)
Msg#: 4029331 posted 4:27 pm on Nov 21, 2009 (gmt 0)

One good step is to create an XML Sitemap and show its location in your robots.txt file. All the search engines will respond to this.
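For example, with the block removed, the whole robots.txt could be as simple as this (a sketch, assuming the sitemap sits at the site root):

User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap.xml

The Sitemap: line is an independent directive, so it can appear anywhere in the file and can be repeated if there is more than one sitemap.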

venunath (5+ Year Member)
Msg#: 4029331 posted 3:37 am on Nov 23, 2009 (gmt 0)

I have created an XML sitemap and referenced it in robots.txt like this:

Sitemap: http://www.example.com/sitemap.xml

Is this correct? Any other suggestions?

[edited by: makemetop at 11:10 am (utc) on Dec. 22, 2009]
[edit reason] Changed to example.com and delinked [/edit]

venunath (5+ Year Member)
Msg#: 4029331 posted 6:47 am on Nov 23, 2009 (gmt 0)

I have added Flash to every page. Could this be one reason those pages are not being indexed?
Can anyone help me?

phranque (WebmasterWorld Administrator, Top Contributor of All Time, 10+ Year Member, Top Contributors of the Month)
Msg#: 4029331 posted 12:46 pm on Nov 24, 2009 (gmt 0)

Is that 10,000 pages built in Flash, or 10,000 URLs of HTML documents being served?
If you have replaced your textual content with Flash, that will cause a problem.
You should use SWFObject to provide alternate content for non-Flash-enabled user agents such as search engine crawlers.
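A minimal sketch of the SWFObject 2.x "dynamic publishing" pattern (the file names and container id here are hypothetical). The plain HTML inside the container is the alternate content that crawlers and non-Flash visitors see; the script swaps in the Flash movie only when Flash 9+ is detected:

<div id="site-header">
  <!-- Alternate content: real HTML text and links for crawlers and non-Flash users -->
  <h1>Example Site</h1>
  <p>Tagline and category navigation as plain text links.</p>
</div>

<script type="text/javascript" src="swfobject.js"></script>
<script type="text/javascript">
  // Replaces the div's contents with header.swf only if Flash 9.0.0+ is available;
  // otherwise the HTML above is left untouched.
  swfobject.embedSWF("header.swf", "site-header", "960", "120", "9.0.0");
</script>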

venunath (5+ Year Member)
Msg#: 4029331 posted 8:56 am on Dec 22, 2009 (gmt 0)

Hi,

My files are located like this:

http://www.example.com/category/subcagtegory/abac.html

My website is laid out this way. How do I get the pages indexed? Are there any tips for this?

[edited by: makemetop at 11:08 am (utc) on Dec. 22, 2009]
[edit reason] Changed to example.com and delinked [/edit]

makemetop
Msg#: 4029331 posted 11:12 am on Dec 22, 2009 (gmt 0)

Make sure all directories and subdirectories are linked to from your sitemap. As long as all sections of your site are linked to, they should be indexed OK.

Of course, there is a big difference between being indexed and actually ranking for anything - but indexing is an important first step.
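A minimal sitemap.xml sketch covering URLs in different subdirectories, borrowing the example.com paths used in this thread (the entries are illustrative, not the poster's real URLs):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
  </url>
  <url>
    <loc>http://www.example.com/category/</loc>
  </url>
  <url>
    <loc>http://www.example.com/category/subcategory/abac.html</loc>
  </url>
  <!-- ...one <url> entry per page, up to 50,000 URLs per sitemap file -->
</urlset>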

venunath (5+ Year Member)
Msg#: 4029331 posted 12:20 pm on Dec 22, 2009 (gmt 0)

Hi,

Do you mean I should list the pages properly in the sitemap.html file, or include them in the sitemap.xml file?

Can you explain briefly?

makemetop
Msg#: 4029331 posted 1:32 pm on Dec 22, 2009 (gmt 0)

I would do both, if you have both.

Certainly you should at least have an HTML version for all spiders.
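A bare-bones HTML sitemap is just an ordinary crawlable page of links; a hypothetical sketch:

<!-- sitemap.html: plain links to every section and page -->
<html>
  <head><title>Site Map</title></head>
  <body>
    <h1>Site Map</h1>
    <ul>
      <li><a href="/category/">Category</a>
        <ul>
          <li><a href="/category/subcategory/abac.html">Example page</a></li>
        </ul>
      </li>
      <!-- ...repeat for each category and page, or split across several pages for 10,000 URLs -->
    </ul>
  </body>
</html>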

rj87uk (WebmasterWorld Senior Member, 10+ Year Member)
Msg#: 4029331 posted 2:42 pm on Dec 22, 2009 (gmt 0)

Not only for search engines but for usability too - not everyone has Flash enabled, and people with a reading disability would have trouble enlarging the text on a Flash-only website.

venunath (5+ Year Member)
Msg#: 4029331 posted 5:02 am on Dec 23, 2009 (gmt 0)

I have a content website and I am using Flash in the header of all 10k pages. Will this affect the indexing of my pages?

If so, what should I do? How do I get those pages indexed?

suresh
Msg#: 4029331 posted 7:49 am on Dec 23, 2009 (gmt 0)

For fast indexing, submit your sitemap.xml in Webmaster Tools after creating it. Not only will indexing be faster, but the tool will also report any issues or errors it finds in the sitemap.xml. Also try to get some links from good, frequently crawled sites, because that helps a lot in getting your site crawled by the search engine spiders.

venunath (5+ Year Member)
Msg#: 4029331 posted 7:57 am on Dec 23, 2009 (gmt 0)

Hi Suresh,

As I previously mentioned in my first post:

"I have 3 old domains with about 10,000 web pages.

For a few reasons I used robots.txt to disallow the search engines for six months, so they would not index the pages. I have now removed the disallow rule.

Now I want these 10,000 pages indexed by Google and the other search engines. I have submitted the site through Google Webmaster Tools and am doing directory submissions, press releases and SMO.

Can anyone suggest how to get the pages indexed ASAP?"

venunath (5+ Year Member)
Msg#: 4029331 posted 11:57 am on Dec 29, 2009 (gmt 0)

Can anyone suggest how to get the 10,000 pages indexed?

phranque (WebmasterWorld Administrator, Top Contributor of All Time, 10+ Year Member, Top Contributors of the Month)
Msg#: 4029331 posted 8:06 am on Dec 30, 2009 (gmt 0)

Is GWT showing any problems related to crawling your site?
Have you looked at your access logs to see how Googlebot is crawling your URLs? (See the sketch below.)
Do you have inbound links from URLs that are already indexed?
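A rough, hypothetical sketch of that log check in Python, assuming a combined-format log file named access.log (adjust the path and user-agent test to your setup):

# count_googlebot_hits.py - tally Googlebot requests per URL from an access log
from collections import Counter

hits = Counter()
with open("access.log", encoding="utf-8", errors="replace") as log:
    for line in log:
        if "Googlebot" not in line:      # crude user-agent filter
            continue
        parts = line.split('"')
        if len(parts) < 2:
            continue
        request = parts[1].split()       # e.g. ['GET', '/category/page.html', 'HTTP/1.1']
        if len(request) >= 2:
            hits[request[1]] += 1

for url, count in hits.most_common(20):
    print(f"{count:6d}  {url}")

A user-agent match alone can be spoofed; verifying the requesting IPs with a reverse DNS lookup confirms the hits really come from Google, but for a rough picture of crawl activity this is usually enough.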

venunath (5+ Year Member)
Msg#: 4029331 posted 10:32 am on Dec 30, 2009 (gmt 0)

GWT is not showing any errors. Yes, I have 180 inbound links in total, and internal links as well.
