Forum Moderators: Robert Charlton & goodroi
My problem is this: that first page is huge. There is a "comments" section which hasn't really been cleaned up, so there are over 500 comments right there on the main page. As tempted as I am to display only the 20 or 50 most recent ones and create a "Next" paging mechanism for the rest, I fear that could hurt my rankings. Currently I feel like I'm on top, and I don't want to "fix what ain't broke", so to speak. But that main page is nearly 1 MB. I'm not concerned with bandwidth; I'm concerned with user experience. But I fear that moving some comments (which trigger some keyword searches) might hurt my ranking, or at least the number of visitors driven to the site.
Any thoughts?
I'd say use your analytics to see how much traffic is coming in on which keywords.
Analytics is a bit buggy. According to Analytics, X amount of people arrived at one of my [extremely] new sites via a certain keyword. That "certain keyword" was, according to Analytics, at #7... In reality, that "certain keyword" wasn't even in the top 720!
I think I know why it is called analytics now, because of the results it can produce.
Although I know it's considered a very serious violation, I was thinking about employing cloaking on the main page. To Google, I would present all the comments; to non-Google visitors, perhaps only the 20 or 50 most recent, with a paging mechanism. I'm not really trying to "game" the search engine, since all the content is certainly visible on the site, just not all in one place. What are the odds of getting booted from the rankings? And is there an updated source of Googlebot IP ranges? I've never done this before, so are there any other pitfalls I should look out for?
But I fear that moving some comments... might hurt my ranking
Usually someone replies to say that they changed their site without any loss of ranking, therefore there must be other factors involved -- which may or may not be the case.
But given how many people have reported this sort of situation, I'm inclined to think where there's smoke there's fire. Which is to say, if you change your site, you may find a shift in your positions, and it may not be for the better.
So take tedster's advice and be sure to keep your pages optimized for those keywords that seem to deliver the best traffic, based on your study of your server logs. But also, be prepared for whatever happens, because if the reports on this board are any guide, you are in fact taking your chances.
But also, it seems to me that if you make the page experience for your visitors more user friendly, that may outweigh a lot of other factors. So even if you take a temporary hit with Google but then bounce back -- this time with a better-designed site -- it will be worthwhile.
The fear is, of course.... how long before the bounceback?
Something as simple as reducing white space in your source might save thousands of characters.
Look for ways to use CSS more efficiently. Is there style / formatting information that could be moved to an external CSS file? Wise use of contextual selectors can often reduce the number of classes you need.
That might buy you some time while you decide on more drastic surgery.
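To give a rough idea of the whitespace point above, here is a hypothetical sketch of stripping the space between tags in an HTML string. It's an illustration only, not a production minifier (a real one would need to leave pre, textarea, and inline scripts alone):

```javascript
// Collapse inter-tag whitespace in an HTML string.
// NOTE: naive sketch for illustration -- it would mangle <pre>,
// <textarea> and inline scripts, so don't use it as-is.
function stripInterTagWhitespace(html) {
  return html
    .replace(/>\s+</g, '><')  // drop whitespace between adjacent tags
    .replace(/\s{2,}/g, ' ')  // collapse remaining runs of whitespace
    .trim();
}

const page = '<div>\n  <p>First comment</p>\n\n  <p>Second   comment</p>\n</div>';
console.log(stripInterTagWhitespace(page));
// → <div><p>First comment</p><p>Second comment</p></div>
```

On a page with hundreds of comments, each wrapped in indented markup, those saved characters add up quickly.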
Although I know it's considered a very serious violation, I was thinking about employing cloaking on the main page
What a lot of site owners don't seem to consider is risk/reward. Via web analytics (as tedster has pointed you to) and other data, you can get a pretty good idea of what's at stake, and compare that to what your expected return is. That's a lot easier with a website than many other routes to market, because there's so much more data available.
Black hat tricks are fire - and you can get burnt. Include that in how you quantify risk and reward. Be aware that there are viable alternatives out there, that can deliver a similar reward, with a greatly decreased risk. Usually, that will be a slower burn.
Most online companies make a lot of sales based on their brand/company name, which means it's not an acceptable risk to get "booted" at all.
The more paragraphs, the better; the more words you use, the better; and the closer you come to maintaining your keyword frequency percentages, the better (from the perspective of maintaining rankings and traffic for the full breadth of your long-tail keywords). But don't fret over it too much; make sure the copy is of value to your visitors. If you're up to it, also try a frequency counter that counts two- and three-word phrases within your comments, and use those phrases in the copy you write as well.
With that, you get to eliminate your comments and still maintain relevant content on your home page. Of course, with the goal being simply to preserve your rankings while tidying up your homepage, you would know better than to alter your title tag in the midst of the process and you'd know better than to eliminate or substantially alter any other text you have on the page.
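The frequency-counter idea above could be sketched as follows. This is a hypothetical helper (the function name and regex tokenizer are my own, and it assumes the comments are already plain text):

```javascript
// Count how often each two- and three-word phrase appears in a block
// of text, so the most common phrases can be reused in replacement copy.
function phraseFrequencies(text, sizes = [2, 3]) {
  const words = text.toLowerCase().match(/[a-z0-9']+/g) || [];
  const counts = {};
  for (const n of sizes) {
    for (let i = 0; i + n <= words.length; i++) {
      const phrase = words.slice(i, i + n).join(' ');
      counts[phrase] = (counts[phrase] || 0) + 1;
    }
  }
  return counts;
}

const freqs = phraseFrequencies('great widget review, this widget review helped');
console.log(freqs['widget review']); // → 2
```

Sorting the result by count would surface the phrases your comments actually rank for, which is what you'd want to carry over into the new copy.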
[edited by: tedster at 12:46 am (utc) on Oct. 22, 2009]
I'm surprised Google doesn't penalize such a large page or view it as keyword stuffing. I suspect other search engines do, as it doesn't rank nearly as well on Bing or Yahoo.
Anyway, I do like the idea of cloaking. Perhaps a loophole could be to display a subset of comments to browsers that accept cookies (most visitors), and to display ALL comments to browsers that don't (since, AFAIK, googlebot doesn't accept cookies - is this true?). Would that still be considered cloaking?
Perhaps a loophole could be to display a subset of comments to browsers that accept cookies (most visitors), and to display ALL comments to browsers that don't (since, AFAIK, googlebot doesn't accept cookies - is this true?). Would that still be considered cloaking?
Googlebot will not accept cookies, but Google sends other automated traffic that downloads scripts, images and so on, and that traffic does accept cookies. Your risk is that they algorithmically detect the difference between the two pages, or, if you show up for competitive keywords, that a human evaluator will flag the site as spam.
How you achieve it technologically is only going to have a small impact on risk, unless you do anything particularly blatant.
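For what it's worth, the cookie branching itself is trivial; the risk discussed above is in the policy, not the code. A hypothetical sketch (the `seen` cookie name and `commentsToServe` helper are made up for illustration, and the parsing assumes a raw Cookie request header):

```javascript
// Decide which comments to serve based on whether the client returned
// a previously set cookie. Hypothetical sketch only -- as noted above,
// serving different content this way is still cloaking in Google's eyes.
function commentsToServe(cookieHeader, allComments, pageSize = 20) {
  const hasCookie = /(?:^|;\s*)seen=1(?:;|$)/.test(cookieHeader || '');
  // Cookie-accepting clients (most browsers) get one page;
  // cookieless clients (including Googlebot) get everything.
  return hasCookie ? allComments.slice(0, pageSize) : allComments;
}

const comments = Array.from({ length: 500 }, (_, i) => `comment ${i + 1}`);
console.log(commentsToServe('seen=1', comments).length); // → 20
console.log(commentsToServe('', comments).length);       // → 500
```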
To be honest, you should probably just use a JavaScript "expand" style of link to hide an excessive number of comments. That kind of thing is in wide use on numerous websites.
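The key difference from cloaking is that every comment still ships in the HTML; JavaScript and CSS only hide the overflow until the visitor asks for it. A hypothetical sketch of the server-side rendering (the class names and `renderComments` helper are made up):

```javascript
// Render ALL comments into the markup (so crawlers see everything),
// but tag everything past the preview with a "collapsed" class that
// CSS hides until a "show all" link removes it client-side.
function renderComments(comments, previewCount = 20) {
  return comments
    .map((c, i) =>
      `<p class="comment${i >= previewCount ? ' collapsed' : ''}">${c}</p>`)
    .join('\n');
}

const html = renderComments(['first', 'second', 'third'], 2);
console.log(html.includes('third')); // → true: still in the delivered markup
```

The "show all" link then just strips the `collapsed` class from the container; since Googlebot and visitors receive identical source, there is no user-agent or cookie branching involved.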
That said, for a page with that much content, you'd likely see a better increase in visitors by splitting it out into individual pages that could be more targeted.
Tabs? Sliding windows? They're used on basically every major news/content portal for the very purpose of uncluttering the page while keeping the content in place.