We have just started talking to a company about optimizing their new Web site. The site has a very narrow focus: financial services for businesses.
The site is a membership-driven portal that primarily sells information to individuals who work for large companies (100+ people in Accounts Payable).
This would be the largest (or broadest) site that we've optimized. We've had very good success with our smaller sites (20-100 pages) while focusing on only 15-25 keywords.
Is there anything in particular that you need to "do" or watch for when optimizing a larger site? The problem w/ this site is that ALL of the content is password protected. About the only thing you can see w/o "registering" is a contact popup window and the about popup window.
Behind the password they will have HUNDREDS and HUNDREDS of pages of good stuff, all coming out of the database.
Could you do a little agent detecting and, if it's Google, allow it in, but if it's a user, send them to the password page? Or have it the other way around, so that you let all spiders in but normal viewers get the password prompt?
"The problem w/ this site is that ALL of the content is password protected."
Clearly, you need public pages in order to rank.
I have several of these sorts of clients, and what I usually have them do is make white papers and glossaries public.
As everyone else has said, your best bet is to create a "free side" of the site.
I suppose the easiest way is to create a routine that determines if the user is logged in and has sufficient access. (i.e. You may require them to pay to keep access, or reply to an e-mail before full access is granted, whatever). IF access is granted, then display the page as it is. IF the user is not logged in or access is insufficient, then show the content title and the first X words or X characters or the first paragraph, or whatever. Also, if not logged in, provide a "Sign Up or Log In to view the whole article" link.
The easiest way to limit the length of the article, I guess, would be to pull the recordset and assign it a variable named "article". IF NOT LOGGED IN, include "article-parse.inc". That file contains a routine that sets "article" equal to the shortened version. That's easy: for words, read it one character at a time and count spaces; for paragraphs, look for the first line feed character; etc.
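Here's a rough sketch of that routine, in Python for illustration since the thread doesn't settle on a language (the function names, the 50-word default, and the `/login` URL are my own assumptions, not anything from the site in question):

```python
def teaser(article: str, max_words: int = 50) -> str:
    """Shorten an article for visitors who aren't logged in:
    keep only the first paragraph (up to the first line feed),
    then cap it at max_words words."""
    first_break = article.find("\n")
    if first_break != -1:
        article = article[:first_break]
    words = article.split()
    if len(words) <= max_words:
        return article
    return " ".join(words[:max_words]) + "..."

def render_article(article: str, logged_in: bool, has_access: bool) -> str:
    # IF access is granted, display the page as it is.
    if logged_in and has_access:
        return article
    # Otherwise show the teaser plus a sign-up / log-in prompt.
    return (teaser(article)
            + '\n<a href="/login">Sign Up or Log In to view the whole article</a>')
```

The same if/else structure translates directly to ASP or PHP; the point is just that the full text never reaches the browser for anonymous visitors, while every article page still has real, indexable teaser content on it.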
Not sure what you've got this site built on (ASP, .NET, PHP, or whatever), but I hope this is some generic help that will get you going in the right direction. Let us know how it goes!
Why bother optimising it at all?
There's nothing there for the average surfer - and no sales pitch to convert them.
Does the site really need optimising? Not the code or the content, but the business model. Does that need search engine referrals?
It sounds like you need a highly targeted marketing campaign, not SEO.
There are lots of fairly big sites out there that are being very successful at the "teaser" format, though - the New York Times, for example, does exactly what I described in my first post.
The kinds of keywords for which they want to be optimized are VERY broad and there are some HUGE sites out there w/ tons of content and hundreds of inbound links that are in the top 10 positions for many of them.
We'll see what happens! Thanks for the great input.
What I still don't understand though is, if they are going to cloak like that, why don't they just tell Googlebot not to cache their site? That way it wouldn't be so obvious and they would be less likely to get reported. And no one could do what I did and still access the page anyway. Or do they just not know that much about what they are doing and don't realize there is a way to keep Google from caching pages? Maybe they don't even realize they are doing something that could be considered spam?
RaraAvis - good luck! I think Grumpus' "teaser" format sounds like the best approach, if you can convince them of it.
Even if they don't get caught cloaking by some sort of automated system, it is almost certain that a competitor will turn them in using something like the spam report, which has a tick box for cloaking.
Originally I was not suggesting cloaking, or at least I didn't see it as cloaking. That said, I am sure there was a member here who had gotten round it OK. Perhaps they were just lucky up to that point, but that is all I can remember.
My site (larger than the one you're talking about) gets little or no search engine "entry" traffic on the home page. (Bookmarks and type-ins are common, though). That front page isn't optimized at all. It's about everything under the sun with teasers for this, that, the other thing. It's those deep pages that deal with the specific topics in question that bring the traffic. (I know it's a bad word, but, in essence, every page on your site ends up being a focused doorway page to the entire site).
If you want to convince them, take them to google and type in a search term (be it one that they want to optimize for or anything at random). If there is a small and targeted site, you'll likely see the homepage of that site showing in the SERPS, but you're not going to get the front page of any general information site - Google is going to send you right to the page with that information on it. On a larger site, unless you type in the name of the site itself, I bet you'd be hard pressed to find a search term that would actually bring you to the front page. (Try Amazon, for example - find a link in Google to the front page without using the word Amazon).
In the end, any given page can only be "hot" for a limited number of words and terms. Try to get more variety in there and it'll go down in the serps for the original terms. The only way to get a lot of terms hot is to have a lot of pages. The only way to get those pages ranked well is to put words on them. The only way to put words on them is to - well, put words on them. If these sots don't want to put words on these pages, then you're pretty much up Serp's Creek.
And you can tell 'em I said so.