Forum Moderators: Robert Charlton & goodroi


Maintaining #1 ranking while revising the page content

         

nicu

7:20 pm on Oct 17, 2009 (gmt 0)

10+ Year Member



One of my sites ranks on Page 1 for quite a few relevant phrases. It gets 5-8k unique visitors per day, mostly from non-SE sources, but Google still provides a good amount of traffic. The site itself is informational in nature, and most of the content lies right there on the first page.

My problem is this: that first page is huge. There is a "comments" section, which hasn't really been cleaned up, so there are over 500 comments right there on the main page. As tempted as I am to display only the 20 or 50 most recent ones and create a "Next" paging mechanism for the rest, I fear that could hurt my rankings. Currently I feel like I'm on top, and I don't want to "fix what ain't broke," so to speak. But that main page is nearly 1 MB. I'm not concerned with bandwidth; I'm concerned with user experience. But I fear that moving some comments (which trigger some keyword searches) might hurt my ranking, or at least the number of visitors driven to the site.

Any thoughts?

tedster

12:58 am on Oct 18, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I'd say use your analytics to see how much traffic is coming in on which keywords - then use that information to inform your content surgery.

Lame_Wolf

6:45 am on Oct 18, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I'd say use your analytics to see how much traffic is coming in on which keywords

Analytics is a bit buggy. According to analytics, X amount of people arrived at one of my [extremely] new sites via a certain keyword. That "certain keyword" was - according to analytics - ranked #7... In reality, that "certain keyword" wasn't even in the top 720!

I think I know why it is called analytics now, because of the results it can produce.

tedster

7:08 am on Oct 18, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I'm talking about using server analytics (and it certainly doesn't need to be Google Analytics) to understand the search terms used by your actual search traffic - not making judgments from any supposed ranking position.
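As a rough illustration of what tedster describes, a short script can tally the search phrases found in a server access log's Google referrers. This sketch assumes the Apache/NCSA combined log format (referrer is the second-to-last quoted field) and the era's "q=" query parameter; adjust the regex if your log layout differs.

```python
import re
from collections import Counter
from urllib.parse import urlparse, parse_qs

def search_terms(log_lines):
    """Tally Google search phrases found in access-log referrers.

    Assumes combined log format, where the referrer is the
    second-to-last quoted field, followed by the user agent.
    """
    counts = Counter()
    tail = re.compile(r'"([^"]*)" "[^"]*"\s*$')  # referrer, then user agent
    for line in log_lines:
        m = tail.search(line)
        if not m:
            continue
        ref = urlparse(m.group(1))
        if 'google.' in ref.netloc:
            for phrase in parse_qs(ref.query).get('q', []):
                counts[phrase.lower()] += 1
    return counts
```

Sorting the resulting counter with `most_common()` shows which phrases actually drive traffic, which is the data to protect during any "content surgery."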

Lame_Wolf

7:55 am on Oct 18, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Thanks for clearing that up, tedster.

nicu

7:35 pm on Oct 21, 2009 (gmt 0)

10+ Year Member



Thank you for the replies. I wanted to give consideration to your suggestions before I responded. I think I've come up with another solution, and I wanted to hear your opinions.

Although I know it's considered a very serious violation, I was thinking about employing cloaking on the main page. To Google, I would present all the comments; to non-Google visitors, perhaps only the 20 or 50 most recent, with a paging mechanism. Although this is cloaking, I'm not really trying to "game" the search engine, since all the content is certainly visible on the site, just not all in one place. What are the odds of getting booted from the rankings? And is there an updated source of Googlebot IP ranges? I've never done this before, so are there any other pitfalls I should look out for?

tedster

7:42 pm on Oct 21, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



If you do that and are successful, then Google will send traffic to the page based on ALL the content, but the visitor will sometimes not see their search words on the page they get served after a click in the results. That's pretty much a guaranteed visitor bounce, I think.

bwnbwn

7:54 pm on Oct 21, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I wonder what your bounce rate is, or the number of page views, when a visitor hits such a large page. Bounce rate, page views, and time spent on site will tell you much about the stickiness of the site.

Reno

9:19 pm on Oct 21, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



But I fear that moving some comments... might hurt my ranking

I've noticed over the years that one of the primary topics in this forum goes something like this: "I made some changes to my site and lost my rankings".

Usually someone replies to say that they changed their site without any loss of ranking, therefore there must be other factors involved -- which may or may not be the case.

But given how many people have reported this sort of situation, I'm inclined to think where there's smoke there's fire. Which is to say, if you change your site, you may find a shift in your positions, and it may not be for the better.

So take tedster's advice and be sure to keep your pages optimized for those keywords that seem to deliver the best traffic, based on your study of your server logs. But also, be prepared for whatever happens, because if the reports on this board are any guide, you are in fact taking your chances.

But also, it seems to me that if you make the page experience for your visitors more user friendly, that may outweigh a lot of other factors. So even if you take a temporary hit with Google but then bounce back -- this time with a better designed site -- then it will be worthwhile.

The fear is, of course.... how long before the bounceback?


buckworks

10:25 pm on Oct 21, 2009 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Consider whether there are ways you could streamline that page a bit without removing any content just yet.

Something as simple as reducing white space in your source might save thousands of characters.

Look for ways to use CSS more efficiently. Is there style / formatting information that could be moved to an external CSS file? Wise use of contextual selectors can often reduce the number of classes you need.

That might buy you some time while you decide on more drastic surgery.

Receptional Andy

10:39 pm on Oct 21, 2009 (gmt 0)



Although I know it's considered a very serious violation, I was thinking about employing cloaking on the main page

What a lot of site owners don't seem to consider is risk/reward. Via web analytics (as tedster has pointed you to) and other data, you can get a pretty good idea of what's at stake, and compare that to what your expected return is. That's a lot easier with a website than many other routes to market, because there's so much more data available.

Black hat tricks are fire - and you can get burnt. Include that in how you quantify risk and reward. Be aware that there are viable alternatives out there that can deliver a similar reward with greatly decreased risk. Usually, that will be a slower burn.

Most online companies make a lot of sales based on their brand/company name, which means it's not an acceptable risk to get "booted" at all.

tedster

11:11 pm on Oct 21, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



How about retaining a selection of comments (based on your traffic analysis) and calling that section "Key Comments"? Then link to "Full Comments" at the bottom.

Metapilot

12:38 am on Oct 22, 2009 (gmt 0)

10+ Year Member



Without having analytics to go off of, you could do this: copy and paste all of the text from those comments into a .txt file (it doesn't have to be orderly whatsoever), then paste that into a word frequency counter and sort by frequency. This will give you a list of the most used words in your comments. Delete all of the non-thematic words and note the top 25 to 50 percent of those that are left - as well as their frequency. Then write a couple of paragraphs of quality copy incorporating them, keeping an eye towards maintaining frequency percentages.

The more paragraphs, the better; the more words you use, the better; and the closer you come to maintaining frequency percentages the better (from the perspective of maintaining rankings and maintaining traffic for the full breadth of your long tail keywords) but don't fret over it too much -- make sure the copy is of value to your visitors, though. If you're up to it, also try a frequency counter that counts the frequency of two and three word phrases within your comments and use those phrases in the copy you write, as well.

With that, you get to eliminate your comments and still maintain relevant content on your home page. Of course, with the goal being simply to preserve your rankings while tidying up your homepage, you would know better than to alter your title tag in the midst of the process and you'd know better than to eliminate or substantially alter any other text you have on the page.
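The frequency-counting step above can be sketched in a few lines. The stopword list here is a deliberately tiny, illustrative stand-in for the "non-thematic words" Metapilot mentions; a real list would be much longer.

```python
import re
from collections import Counter

# A tiny illustrative stopword list; a real one would be much longer.
STOPWORDS = {'a', 'an', 'and', 'are', 'for', 'in', 'is', 'it', 'of',
             'on', 'or', 'that', 'the', 'this', 'to', 'was', 'with'}

def phrase_frequencies(text, n=1):
    """Count n-word phrases, skipping the non-thematic words above.

    n=1 counts single words; n=2 or n=3 counts two- and three-word
    phrases, as suggested in the post. Note that stopwords are removed
    before phrases are formed, so "cleaning the widget" yields the
    pair ('cleaning', 'widget').
    """
    words = [w for w in re.findall(r"[a-z']+", text.lower())
             if w not in STOPWORDS]
    return Counter(tuple(words[i:i + n]) for i in range(len(words) - n + 1))
```

Running it with n=1, then n=2 and n=3, and calling `most_common()` on each result gives the ranked word and phrase lists to write the replacement copy around.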

[edited by: tedster at 12:46 am (utc) on Oct. 22, 2009]

nicu

7:20 pm on Oct 22, 2009 (gmt 0)

10+ Year Member



tedster, to solve the problem of the visitor's search terms not appearing, I could dynamically include the comment(s) that contain the search terms by parsing the referring URL and comparing that to the database of comments. That could solve the issue of having a guaranteed bounce.
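A rough sketch of that referrer-matching idea. The function and parameter names are hypothetical; `comments` stands in for a database query, and the "q=" parameter reflects Google referrers of the time.

```python
from urllib.parse import urlparse, parse_qs

def comments_for_referrer(referrer, comments, limit=3):
    """Return up to `limit` comments mentioning the visitor's search terms.

    `referrer` is the HTTP Referer header; `comments` is a list of
    comment strings. Matching comments could then be shown above the
    most recent ones, so the visitor sees the text their search hit.
    """
    query = parse_qs(urlparse(referrer).query).get('q', [''])[0]
    terms = set(query.lower().split())
    if not terms:
        return []
    hits = [c for c in comments if terms & set(c.lower().split())]
    return hits[:limit]
```

Note this only addresses the bounce problem, not the cloaking risk itself: the page served still differs from the one Google indexed.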

I'm surprised Google doesn't penalize such a large page or view it as keyword stuffing. I suspect other search engines do, as it doesn't rank nearly as well on Bing or Yahoo.

Anyway, I do like the idea of cloaking. Perhaps a loophole could be to display a subset of comments to browsers that accept cookies (most visitors), and to display ALL comments to browsers that don't (since, AFAIK, googlebot doesn't accept cookies - is this true?). Would that still be considered cloaking?

Receptional Andy

7:39 pm on Oct 22, 2009 (gmt 0)



Perhaps a loophole could be to display a subset of comments to browsers that accept cookies (most visitors), and to display ALL comments to browsers that don't (since, AFAIK, googlebot doesn't accept cookies - is this true?). Would that still be considered cloaking?

Googlebot will not accept cookies, but Google sends other automated traffic that downloads scripts, images and so on, and that traffic does accept cookies. Your risk is that they algorithmically detect the difference between the two pages, or, if you show up for competitive keywords, that a human evaluator will flag the site as spam.

How you achieve it technologically is only going to have a small impact on risk, unless you do anything particularly blatant.

To be honest, you should probably just use a javascript "expand" style of link to hide an excessive number of comments. That kind of thing is in wide use on numerous websites.

That said, for a page with that much content, you'd likely see a better increase in visitors by splitting it out into individual pages that could be more targeted.

darkyl

7:47 pm on Oct 22, 2009 (gmt 0)

10+ Year Member



If your concern isn't bandwidth but just user experience, I'd suggest trying to reduce page weight by optimizing white space, images, and code (as someone else said), and then visually "hiding" content while keeping it on the page, without using dangerous tactics.

Tabs? Sliding windows? They're used on basically every major news/content portal for the very purpose of uncluttering the page while keeping the content in place.