I have almost nailed down the site structure, and it will be something to this effect:
domain.com/brand/state/city/ with, of course, each directory having its own index page targeted at the brand and its local outlets.
Based on the numbers I have run with all cities in the US, the directory structure alone will create close to 2 million pages of navigation content. Even if I cut the cities down by three-quarters to cover only semi-decent-sized cities, it will still be half a million pages, or with really deep cuts maybe 100K pages. This is before even adding the reviews into the mix.
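To put rough numbers on that (ballpark figures only, since the exact brand and city counts will vary): around 100 brands times roughly 20,000 US cities and towns already works out to about 2,000,000 brand/state/city pages, before counting the state and brand index pages above them.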
I could launch tomorrow with that many pages, but I just don't know. I have thought about setting up the system so it only shows links to brands, cities, and states when an actual listing has been added to the database. I bought a list from a data source, but the listings need to be scrubbed before each goes live. Scrubbing the listings into the database manually is going to take a long time anyway; I figure 100 new listings a day, depending on how many hours I put in.
Going in this direction, I could work on scrubbing a single city at a time so as not to add extra directory pages until that city is done. I would assume 100 pages a day would look a lot more natural than 100K-2 million from the start.
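For anyone wondering how the "only link what's live" part would work, here is a minimal sketch of a directory index page; the table and column names (listings, brand, state, city, status) are just placeholders, not my real schema:

<?php
// City index page: only link cities that already have at least one
// scrubbed ("live") listing, so empty directory pages never get linked.
$db = new mysqli('localhost', 'user', 'pass', 'directory');

$brand = isset($_GET['brand']) ? $_GET['brand'] : '';
$state = isset($_GET['state']) ? $_GET['state'] : '';

$stmt = $db->prepare(
    "SELECT DISTINCT city FROM listings
     WHERE brand = ? AND state = ? AND status = 'live'
     ORDER BY city"
);
$stmt->bind_param('ss', $brand, $state);
$stmt->execute();
$stmt->bind_result($city);

while ($stmt->fetch()) {
    echo '<a href="/' . rawurlencode($brand) . '/' . rawurlencode($state) . '/'
       . rawurlencode($city) . '/">' . htmlspecialchars($city) . "</a><br>\n";
}
?>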
So, what are others' opinions on this? Is all this worry for nothing?
You will have to design and write for people instead of the engines.
Jermey, check what you wrote a couple of weeks ago, [webmasterworld.com...]
I take it you have changed your mind?
The question I was answering was
how to optimize if the results will be always different from user to user?
The big word being "if". I do not believe the SERPs are at the point where the only person you should be writing for is the user. They are still predictable based on certain principles. Give it a few years and that may be true, but for now, both what the user and the search engine expect have to be taken into consideration.
I could launch tomorrow with that many pages
How many pages, 100,000 or 2,000,000?
I have thought about setting up the system so it only shows links to brands, cities, and states when an actual listing has been added
This sounds like a good idea, especially if you start with the 100,000 and add a bunch of pages each day.
Can you make an obvious notice that new content is added daily, to encourage folks to return and see what's new?
I'm assuming that on a site this size you'll have an internal search. If so, can you coordinate the new content with the unsuccessful search results? Can you automate a message on the results page thanking the person for the inquiry and telling them the content they are looking for is coming soon, or something similar?
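Something simple along these lines would do it; this is only a sketch, and the search function and log file name are made up for the example:

<?php
// Search results page: when an internal search comes back empty,
// thank the visitor and log the query so you know what to add next.
function run_listing_search($q) {
    // Placeholder: query the listings table here.
    return array();
}

$q       = isset($_GET['q']) ? trim($_GET['q']) : '';
$results = run_listing_search($q);

if ($q !== '' && count($results) === 0) {
    echo '<p>Thanks for your search. Listings for "' . htmlspecialchars($q)
       . '" are being added soon - please check back.</p>';

    // Log the miss so the most-requested areas get scrubbed first.
    file_put_contents('missed-searches.log', date('c') . "\t" . $q . "\n", FILE_APPEND);
}
?>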
I think it was a mistake for me to wait to launch until the site was "ready". I feel I should have launched with a bare minimum and built slowly; I'd be in much better shape at the moment, I'm sure. With 2 million pages, you really should have launched a long time ago, I think.
Google likes to see action, while Yahoo moves at the speed of molasses. I won't be waiting to launch again until everything is perfect; after all, these are websites, not the space shuttle.
Can you automate a message on the results page thanking the person for the inquiry and telling them the content they are looking for is coming soon, or something similar?
Very good idea. It actually just gave me another idea. The main way people will find locations is via a zip code (maybe city) search. I can load 100% of the data into the search function and write a script to release 100 listings a day to the directory pages. The directory structure is actually htaccess-created, so the SEs can see domain.com/dir/dir, but people doing the search will land on domain.com/listing.php?var=2452, which I can deny in the robots file.
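Roughly what I'm picturing for the release script, run once a day from cron; the status values and column names are just what I'd call them, not final:

<?php
// release-daily.php: flip another 100 scrubbed listings from
// 'hidden' (search only) to 'published' (shown on directory pages)
// so the site grows at a natural-looking pace.
$db = new mysqli('localhost', 'user', 'pass', 'directory');

$db->query(
    "UPDATE listings
     SET status = 'published'
     WHERE status = 'hidden'
     ORDER BY id
     LIMIT 100"
);

echo $db->affected_rows . " listings released\n";
?>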
I'd love to know if this works because I might be in a similar position soon.
The directory structure is actually htaccess-created, so the SEs can see domain.com/dir/dir, but people doing the search will land on domain.com/listing.php?var=2452, which I can deny in the robots file.
This is a mistake. People will make links to the URLs they see, and since you disallow spiders access to those pages, you will not get any ranking benefit from those links. Pages should only be visible under one address.
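If you keep the listing.php address at all, one common fix is to 301 it to the single directory URL instead of blocking it in robots.txt, so any links people make still count. A rough sketch for the top of listing.php, with the lookup columns assumed:

<?php
// Redirect the query-string address to the one directory URL.
$id = isset($_GET['var']) ? (int) $_GET['var'] : 0;

$db   = new mysqli('localhost', 'user', 'pass', 'directory');
$stmt = $db->prepare("SELECT brand, state, city FROM listings WHERE id = ?");
$stmt->bind_param('i', $id);
$stmt->execute();
$stmt->bind_result($brand, $state, $city);

if ($stmt->fetch()) {
    header('HTTP/1.1 301 Moved Permanently');
    header('Location: /' . rawurlencode($brand) . '/' . rawurlencode($state)
         . '/' . rawurlencode($city) . '/');
    exit;
}
?>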
Your worry is very real. I think attempting to launch 2 million pages at one time is suicide.
Especially if they're just empty vessels waiting to be filled by users.
Google will index the pages as soon as the noindex tag comes off. I did this with a link directory that had many categories but few links, and it worked great. As soon as links were added to a category, the noindex tag was changed to index, and Google indexed the page two weeks later.
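That toggle can be automated, too. A quick sketch of what I mean; the table and column names are made up:

<?php
// In the category page <head>: stay noindex until the category
// actually has links in it, then switch to index automatically.
$db = new mysqli('localhost', 'user', 'pass', 'directory');

$categoryId = isset($_GET['cat']) ? (int) $_GET['cat'] : 0;

$stmt = $db->prepare("SELECT COUNT(*) FROM links WHERE category_id = ?");
$stmt->bind_param('i', $categoryId);
$stmt->execute();
$stmt->bind_result($count);
$stmt->fetch();

$robots = ($count > 0) ? 'index,follow' : 'noindex,follow';
echo '<meta name="robots" content="' . $robots . '">' . "\n";
?>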
Why don't you start with 10 of the biggest cities, see how those pages work out, and then roll it out accordingly? That way you can see how effective the site is in terms of converting customers.
How much better would the web be for users without databases?