
Google Add URL

Submitting 500000 pages, do I have to worry?


leifwessman

11:56 am on Mar 19, 2003 (gmt 0)

10+ Year Member




From what I've read, submitting my page once or several times shouldn't affect PR, since it could just as well be a competitor trying to set me up.

My case, however, is that I have lots of pages that Google doesn't know about. Each page is about a product, and the only way to find it is through our search engine. Therefore I'm thinking of writing a program that submits every URL to Google using the Add URL page.

Does this violate any of Google's rules? How many pages should I submit per minute?

Leif

born2drv

12:09 pm on Mar 19, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Good luck :)

Google doesn't index that many pages on any site that I know about. If you can't get them into Google's database by creating link maps and a regular link structure on your site, then I wouldn't bother submitting them at all; it can only cause you grief.

Marcia

12:10 pm on Mar 19, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



leif, 500,000 pages auto-submitted by a program is not only against Google's wishes, it's totally unnecessary. If those pages can't be found by crawling links, they won't get PR and can't rank for anything anyway.

Google has made it clear they don't want their servers hammered; even using automated rank-checking software is technically a TOS violation. *Any* site can block an IP, and in this case 500K submissions from one source aren't likely to be cordially received.

I wouldn't do it - I'd just let Googlebot do her thing; there's no way you'll be able to force deep-crawling.

borisbaloney

12:20 pm on Mar 19, 2003 (gmt 0)

10+ Year Member



500,000 pages to submit / 50 programmers / 10 pages per day each = 1,000 days.

So you mean to say that you have all of these pages and haven't submitted any of them in almost three years?

Am I being harsh, or does this sound suspect to anyone else?

leifwessman

12:31 pm on Mar 19, 2003 (gmt 0)

10+ Year Member




I hear what you're saying. I still have some comments:

born2drv: Amazon has lots of pages [google.com] indexed

If those pages can't be found by crawling links they won't get PR and can't rank for anything anyway.

So, if I'm creating a new website with only one page, I shouldn't submit it since it doesn't have any links to it? Why does Google have an Add URL page at all then?

My question "do I have to worry?" is still not answered. Does anyone know?

Marcia

12:32 pm on Mar 19, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



boris, it sounds to me like those pages don't exist until there's a search done - that they're dynamically generated. If that's the case and they can't be reached by a link at all it's wasted risk to auto-submit.

People have gotten into trouble even for using rank-checking software, which probably generates nowhere near that kind of load.

leifwessman

12:36 pm on Mar 19, 2003 (gmt 0)

10+ Year Member



boris, it sounds to me like those pages don't exist until there's a search done - that they're dynamically generated. If that's the case and they can't be reached by a link at all it's wasted risk to auto-submit

Yes, they are dynamic, but they exist without a search engine! Today I have links from other websites to maybe 100 of the pages.

Creating a sitemap only for Google doesn't make any sense at all. People who find this sitemap (maybe by using Google) will get confused. The sitemap will be enormous and will have to be split lots of times...

Marketing Guy

12:43 pm on Mar 19, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Creating a sitemap makes more sense than submitting 500k pages! :)

Do you want the pages to be indexed by Google?

Then create a sitemap.

Split it down into a directory style navigation.

Chances are that doing this for a 500k-page site will increase your traffic from the mapped pages too: lots of niche keywords on each page! ;)

Scott

born2drv

12:44 pm on Mar 19, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



amazon.com also has a PageRank of 8 or 9; what do you have? :)

Maybe someone can advise you better, but from what I remember the number of pages you can get indexed is somehow proportional to the PageRank of your site.

I think this was the basis of Everyman's complaints that PR favored big business and they were the only ones allowed to have such high amounts of indexed pages.

If your site, for example, has a PR4 on the main page and little or no inbound linking for the other pages, there's no way a PR4 home page will be sufficient to pass link value to 500,000 pages. Most of them will be PR0, so it's not even worth Google's time to index them. But when you have a PR8-9, you can get them indexed, because even the extremely deep content will still be PR1-3 or whatever.
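That dilution argument can be put in rough numbers using the per-link term of the published PageRank formula, PR(p) = (1-d) + d · Σ PR(q)/outlinks(q). This is only a toy sketch: the raw PR values and damping factor below are illustrative, and toolbar PR is roughly logarithmic, so none of these numbers map to real Google values.

```python
# Toy sketch of how link value dilutes across many outlinks, using the
# per-link term d * PR(q) / outlinks(q) from the classic PageRank paper.
# The raw PR values here are made up purely for illustration.

def pr_passed_per_link(page_pr, outlinks, damping=0.85):
    """PR contribution each outlink receives from a single page."""
    return damping * page_pr / outlinks

# A page splitting its value over 100 links passes 100x more per link
# than the same page splitting it over 10,000 links.
per_link_100 = pr_passed_per_link(1000.0, 100)      # 8.5
per_link_10k = pr_passed_per_link(1000.0, 10_000)   # ~0.085
```

However many real units a PR4 home page holds, dividing it among half a million descendants leaves each one with effectively nothing, which is born2drv's point.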

Birdman

12:47 pm on Mar 19, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>>>My question "do I have to worry?" is still not answered. Does anyone know?

YES!

ciml

1:53 pm on Mar 19, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



> do I have to worry?

I would hope that it wouldn't have any effect on the site (as it could be a competitor), but your ISP connection might find itself blocked from Google (they don't like being accessed by tools, and you're not going to submit them all by hand!).

> Why does Google have an Add Url page at all then?

I suspect that the only use for it is to stop Google support from being asked many times each day why there isn't one.

Google follows links. IMO, you should consider a robot- and human-friendly site architecture that opens up your content via well-organised categories. That requires effort, of course. You may wish to consider an agency/affiliate deal with someone who's good at information architecture.

rogerd

2:02 pm on Mar 19, 2003 (gmt 0)

WebmasterWorld Administrator 10+ Year Member



I suspect that the only use for it is to stop Google support from being asked many times each day why there isn't one.

I agree, ciml - I think it's a placebo. (Kind of like the "close door" button on the elevator - it gives the Type A individual something to do while the elevator decides when to close the door. ;))

linkshark

2:04 pm on Mar 19, 2003 (gmt 0)

10+ Year Member



If you want to get search results indexed, use site maps (plural).

Break them into themes or product sections. Make a 100-200 link site map per section. If there are more than 100-200 links in a section, just add a " << Previous Next >> ".

If you have a site that big, you probably have a data file, which makes it easy to generate these site maps/indexes with something like Webmerge. At 200 links per page that's only 2,500 site maps. No big deal.

I have several sites of 100K+ pages that are basically search results (all driven by a search engine). Google ate every page of all of them. All of the URLs are dynamic:
mysite.com/script.cgi?q=my+search+terms

Just keep it around 100-200 links per page and you should be all set.

Also if you log search terms entered by your visitors, keep a link list of the top 50-100 searches on your site and change it every month or whatever.
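That recipe (chunks of 100-200 links, each page carrying Previous/Next navigation) is easy to script. A minimal sketch in Python, where the URL list, file names, and bare-bones HTML template are all hypothetical placeholders:

```python
# Sketch of the sitemap recipe above: split a big product-URL list into
# pages of at most 200 links, each linked to its neighbours with
# Previous/Next. File names like "sitemapN.html" are made up here.

def build_sitemap_pages(product_urls, links_per_page=200):
    """Return one HTML string per sitemap page."""
    chunks = [product_urls[i:i + links_per_page]
              for i in range(0, len(product_urls), links_per_page)]
    pages = []
    for n, chunk in enumerate(chunks):
        links = "\n".join(f'<a href="{u}">{u}</a>' for u in chunk)
        nav = []
        if n > 0:
            nav.append(f'<a href="sitemap{n - 1}.html">&laquo; Previous</a>')
        if n < len(chunks) - 1:
            nav.append(f'<a href="sitemap{n + 1}.html">Next &raquo;</a>')
        pages.append(f"<html><body>\n{links}\n{' '.join(nav)}\n</body></html>")
    return pages
```

At 200 links per page, 500,000 URLs come out to exactly the 2,500 sitemap pages estimated above.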

leifwessman

3:12 pm on Mar 19, 2003 (gmt 0)

10+ Year Member



OK! I've created a "sitemap" instead: 100 links to products and Next/Prev on every page, with one link to the sitemap from the front page.

Thank you for your help!

Leif

rogerd

3:22 pm on Mar 19, 2003 (gmt 0)

WebmasterWorld Administrator 10+ Year Member



One suggestion, Leif: if I'm understanding you, it sounds like bots will have to go sequentially through the link pages. Instead of a purely linear list (Page 1 links to Page 2, Page 2 links to Page 3, etc.), I'd recommend providing a hierarchical means of access, i.e., start with major product categories, each page of which links to smaller categories, etc.
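The difference is easy to quantify: clicks to reach the deepest page grow linearly with page count in a Next-chain, but only logarithmically in a category tree. A quick sketch, using the 100-links-per-page rule of thumb from this thread (the figures are illustrative, not a statement about actual crawler behaviour):

```python
import math

def linear_depth(total_pages, links_per_page=100):
    """Clicks from the front page to the last page of a Next/Next chain."""
    return math.ceil(total_pages / links_per_page)

def tree_depth(total_pages, links_per_page=100):
    """Levels needed if every page fans out into up to 100 sub-pages."""
    return math.ceil(math.log(total_pages, links_per_page))

linear_depth(500_000)  # 5,000 clicks deep at the end of the chain
tree_depth(500_000)    # 3 levels: category -> subcategory -> product
```

Three clicks from the front page versus five thousand is the whole argument for the hierarchy.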

jranes

4:08 pm on Mar 19, 2003 (gmt 0)

10+ Year Member



The main thing is that submitting the links will at best get you nothing and at worst get you banned. Yes, you should worry.

oLeon

5:21 pm on Mar 19, 2003 (gmt 0)

10+ Year Member



I never knew that submitting to G could be successful

:-)

Don't do that!
It'll only get your site, or even your ISP, banned from G.

Jesse_Smith

12:48 am on Mar 20, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



:::Google doesn't index that many pages on any site that I know about.

Um, Yahoo has 4.5 million pages indexed. [google.com]