
Forum Moderators: mademetop


Optimizing a larger site



1:54 pm on Jan 14, 2003 (gmt 0)

10+ Year Member

Hi everyone!

We have just started talking to a company about optimizing their new Web site. The site has a very narrow business-to-business focus in the financial world.

The site is a membership-driven portal that primarily sells information to individuals who work for large companies (100+ people in Accounts Payable).

This would be the largest (or broadest) site that we've optimized. We've had very good success with our smaller sites (20-100 pages) while focusing on only 15-25 keywords.

Is there anything in particular that you need to "do" or watch for when optimizing a larger site? The problem w/ this site is that ALL of the content is password protected. About the only thing you can see w/o "registering" is a contact popup window and the about popup window.

Behind the password they will have HUNDREDS and HUNDREDS of pages of good stuff, all coming out of the database.

Any ideas?


2:04 pm on Jan 14, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

I have heard of this being gotten around, and I have come across sites that appear on Google yet require passwords.

Could you do a little agent detecting: if it's Google, allow it in; if it's a user, send them to the password page? Or have it the other way around, so that you let all spiders in but normal viewers get the password prompt?
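A minimal sketch of the agent-detection idea described above (hypothetical function names, and a far-from-complete list of spider signatures). Keep in mind that later replies in this thread point out why this is a bad idea: User-Agent strings are trivially spoofed, and serving spiders different content than visitors is cloaking.

```python
def is_search_spider(user_agent):
    """Naive check for common crawler signatures in the User-Agent header."""
    spiders = ("googlebot", "slurp", "msnbot", "teoma")
    ua = user_agent.lower()
    return any(name in ua for name in spiders)

def choose_page(user_agent):
    # Spiders get the full content; everyone else is sent to the password page.
    return "full_article.html" if is_search_spider(user_agent) else "login.html"
```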


2:05 pm on Jan 14, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

The problem w/ this site is that ALL of the content is password proteched.

Clearly, you need public pages in order to rank.

I have several of these sorts of clients, and what I usually have them do is make their white papers and glossaries public.


2:07 pm on Jan 14, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

Hardly much point in optimising behind the passwords, as spiders will be blocked from indexing.


2:07 pm on Jan 14, 2003 (gmt 0)

WebmasterWorld Senior Member sem4u is a WebmasterWorld Top Contributor of All Time 10+ Year Member

Well Google doesn't know the passwords so it cannot access the data!

Why not produce summaries of the documents that can be indexed by the search engines?


2:11 pm on Jan 14, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

This has definitely been discussed in the last six months. Had a look but to no avail. Perhaps a mod with their crazy searching ability can step in.

Unless I was dreaming (don't think I was, though :))


3:27 pm on Jan 14, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

The problem with the first suggestion here (to allow Googlebot in without a password) is two-fold. First, I could view your site by simply identifying myself as "Googlebot". Second, and probably more importantly, by allowing Googlebot in you're cloaking: displaying different content to different browsers. You'll surely get hit with a penalty eventually (probably the first time a bored person searches, finds your site, clicks, and gets denied).

As everyone else has said, your best bet is to create a "free side" of the site.

I suppose the easiest way is to create a routine that determines if the user is logged in and has sufficient access. (i.e. you may require them to pay to keep access, or reply to an e-mail before full access is granted, whatever.) IF access is granted, then display the page as it is. IF the user is not logged in or access is insufficient, then show the content title and the first X words or X characters or the first paragraph, or whatever. Also, if not logged in, provide a "Sign Up or Log In to view the whole article" link.
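The routine described above might look something like this. This is a minimal sketch in Python; the field names, the `access_level` scheme, and the `render_article` function are all hypothetical, and the real site could just as easily be ASP, .NET, or PHP.

```python
def render_article(article, user):
    """Return the full article for members with access, a teaser otherwise.

    article: dict with "title", "body", and "required_level" keys (hypothetical schema)
    user:    dict with an "access_level" key, or None if not logged in
    """
    if user is not None and user.get("access_level", 0) >= article["required_level"]:
        # Access granted: display the page as it is.
        return article["title"] + "\n\n" + article["body"]
    # Not logged in, or insufficient access: title plus first paragraph only.
    first_paragraph = article["body"].split("\n", 1)[0]
    return (article["title"] + "\n\n" + first_paragraph
            + "\n\n[Sign Up or Log In to view the whole article]")
```

The teaser path is what the spiders (and logged-out visitors) see, so that page is what gets indexed.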

The easiest way to limit the length of the article, I guess, would be to pull the recordset and assign it a variable named "article". IF NOT LOGGED IN, include "article-parse.inc". That file contains a routine that sets "article" equal to the shortened version. That's easy: for words, read it one character at a time and count spaces; for paragraphs, look for the first line feed character; etc.
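The shortening routine itself is a few lines. A sketch in Python (the equivalent of the hypothetical "article-parse.inc" include described above), using built-in string splitting rather than counting characters by hand:

```python
def first_words(text, n):
    """Shorten text to its first n whitespace-separated words."""
    words = text.split()
    if len(words) <= n:
        return text
    return " ".join(words[:n]) + "..."

def first_paragraph(text):
    """Return everything up to the first line feed character."""
    return text.split("\n", 1)[0]
```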

Not sure what you've got this site built on (ASP, .NET, PHP, or whatever), but I hope this is some generic help that will get you going in the right direction. Let us know how it goes!



3:42 pm on Jan 14, 2003 (gmt 0)


Why bother optimising it at all?

There's nothing there for the average surfer - and no sales pitch to convert them.

Does the site really need optimising? Not the code or the content, but the business model. Does that need search engine referrals?

It sounds like you need a highly targeted marketing campaign not SEO.



4:48 pm on Jan 14, 2003 (gmt 0)

10+ Year Member

I came across some pages while doing a search in Google yesterday; when I tried clicking on the link, the site said I needed to register and log in. Fortunately for me, I was able to click on the cache link and still get to the content. I don't know how they set it up. There was no way I would have registered to use the site, though.


8:59 pm on Jan 14, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

Trisha, they're using cloaking - which is a big no-no. If - er, WHEN - they get caught, you'll see them vanish from your SERPS. I agree that most people wouldn't subscribe if someone is masking something.

There are lots of fairly big sites out there that are being very successful at the "teaser" format, though - the New York Times, for example, does exactly what I described in my first post.



9:29 pm on Jan 14, 2003 (gmt 0)

10+ Year Member

Well, after my conference call w/ them today they didn't seem too happy w/ my advice - i.e. make some of your content free, don't display every bit of content you do make public in a database-driven template where you can't change the META data (just Titles/Descriptions), etc.

The kinds of keywords for which they want to be optimized are VERY broad and there are some HUGE sites out there w/ tons of content and hundreds of inbound links that are in the top 10 positions for many of them.

We'll see what happens! Thanks for the great input.


9:54 pm on Jan 14, 2003 (gmt 0)

10+ Year Member

It didn't even occur to me that they might be cloaking, as that is something I've never done or known much about. (I guess I didn't read Grumpus' first post carefully enough either!)

What I still don't understand though is, if they are going to cloak like that, why don't they just tell Googlebot not to cache their site? That way it wouldn't be so obvious and they would be less likely to get reported. And no one could do what I did and still access the page anyway. Or do they just not know that much about what they are doing and don't realize there is a way to keep Google from caching pages? Maybe they don't even realize they are doing something that could be considered spam?
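For reference, the mechanism Trisha is describing does exist: Google honors a "noarchive" robots meta tag, which removes the "Cached" link from its results for that page.

```html
<!-- Keep all engines from showing a cached copy of this page -->
<meta name="robots" content="noarchive">

<!-- Or target Google's crawler specifically -->
<meta name="googlebot" content="noarchive">
```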

RaraAvis - good luck! I think Grumpus' "teaser" format sounds like the best approach, if you can convince them of it.


8:35 am on Jan 15, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member


Even if they don't get caught cloaking by some sort of automated system, it is almost certain that a competitor will use something like the spam report


which has a tick box for cloaking.

Originally I was not suggesting cloaking, or not as I saw it. That said, I am sure there was a member here who had gotten round it OK. Perhaps they were just lucky up to that point, but that is all I can remember.



12:26 pm on Jan 15, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

RaraAvis - you're in a tough situation, because the people you are working for know a LITTLE and act like they know what they are talking about. You're not EVER going to have a broad range of keywords bringing people in to a single page. If these people have hundreds of keywords/terms they want to optimize for, then they need hundreds of pages that deal with each of those terms. I can't think of a better way than to slap a teaser up. A teaser for each term/concept they want to optimize for is going to bring traffic to that page, and then it's just a matter of converting that traffic to a sale.

My site (larger than the one you're talking about) gets little or no search engine "entry" traffic on the home page. (Bookmarks and type-ins are common, though). That front page isn't optimized at all. It's about everything under the sun with teasers for this, that, the other thing. It's those deep pages that deal with the specific topics in question that bring the traffic. (I know it's a bad word, but, in essence, every page on your site ends up being a focused doorway page to the entire site).

If you want to convince them, take them to google and type in a search term (be it one that they want to optimize for or anything at random). If there is a small and targeted site, you'll likely see the homepage of that site showing in the SERPS, but you're not going to get the front page of any general information site - Google is going to send you right to the page with that information on it. On a larger site, unless you type in the name of the site itself, I bet you'd be hard pressed to find a search term that would actually bring you to the front page. (Try Amazon, for example - find a link in Google to the front page without using the word Amazon).

In the end, any given page can only be "hot" for a limited number of words and terms. Try to get more variety in there and it'll go down in the serps for the original terms. The only way to get a lot of terms hot is to have a lot of pages. The only way to get those pages ranked well is to put words on them. The only way to put words on them is to - well, put words on them. If these sots don't want to put words on these pages, then you're pretty much up Serp's Creek.

And you can tell 'em I said so.



1:59 pm on Jan 15, 2003 (gmt 0)

10+ Year Member


That's exactly what I'm going to do! "Listen folks, Grumpus said you're just up the creek!" :)


