Forum Moderators: mademetop
Are there any techniques to minimize the impact I've heard redirects can have on rankings? Currently we plan to use a full, basic page of MSIE-friendly code with a JS detect in the HEAD. The redirect for Netscapers will be a "replace" so that the back button doesn't get confused.
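For anyone trying the same thing, here's a minimal sketch of that detect-and-replace approach. The detection itself is just a userAgent check; the actual redirect call is shown in a comment since it only runs in a browser. The file name "nsindex.html" is a placeholder, not anything from the posts above.

```javascript
// Returns true when the userAgent string looks like Internet Explorer.
// MSIE browsers identify themselves with "MSIE" in the userAgent.
function isMSIE(userAgent) {
  return userAgent.indexOf("MSIE") !== -1;
}

// In the HEAD of the MSIE-friendly page, the script would run
// something like this. replace() swaps the current history entry
// instead of adding one, so the back button isn't confused:
//
//   if (!isMSIE(navigator.userAgent)) {
//     window.location.replace("nsindex.html"); // placeholder filename
//   }
```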
Will this look like SPAM to the SEs?
I'm coming to the conclusion that we must go away from any automatic redirects, even though click pages will look pretty "geeky" on this particular site. We really prefer the mechanics to be invisible as much as possible.
I've got another question about the same issue. The content of the two versions of the site - MSIE and Netscape - is obviously very similar. Would it be best to use robots.txt and meta robots tags to keep the search engine spiders away from the set of near-duplicate pages?
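For reference, both mechanisms might look something like this - assuming the Netscape mirror lives in its own directory (the "/ns/" path is just a placeholder):

```
# robots.txt at the site root - keep all spiders out of the mirror
User-agent: *
Disallow: /ns/
```

And belt-and-braces, in the HEAD of each near-duplicate page:

```
<META NAME="robots" CONTENT="noindex,nofollow">
```

The robots.txt rule stops well-behaved spiders from fetching the pages at all; the meta tag covers any page a spider reaches anyway.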
Or is there a possibility of changing page titles and meta tags to gain an advantage and enhance the theme?
You may have just given us a clue to a small part of the AV problems many have been having. Once you cleaned out the redirects, did you have to get AV to take a special look at your site, or did you just resubmit?
We're now building in favor of click-throughs, but have several hundred pages remaining with JS or meta redirects. Guess I'll be busy!!!
In our new approach we're building the page that is clicked through to in a frame (header type). This avoids the appearance of lots of pages that seem to do nothing but leave the site - and the spiders usually don't go past the frame page. By having a dozen or so of those frame pages on the site, we also avoid lots of pages with only one destination on or off the site.
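A sketch of what such a "header type" frame page could look like - the file names and URL here are placeholders, not details from the post above:

```
<!-- Thin header frame stays on the site while the destination
     loads in the main frame below it -->
<FRAMESET ROWS="60,*">
  <FRAME SRC="header.htm" NAME="header" SCROLLING="no">
  <FRAME SRC="http://www.example.com/" NAME="main">
</FRAMESET>
<NOFRAMES>
  <A HREF="http://www.example.com/">Visit the site</A>
</NOFRAMES>
```

Since most spiders of the day didn't follow frame SRC attributes, the destination stays effectively out of their path, as described above.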
Even on my site, AV has spidered some of my draft pages, and found some specific, standalone home pages devised for directories. These pages look just like my index.htm page but they allow me to closely monitor entries from the directories. Unfortunately, AV has spidered and indexed some of these pages (although it should not) and I'm holding my breath in case I have a problem of being booted out for spamming, albeit unintentionally.
There are other people here with far greater knowledge of the technical aspects of redirects who may know of a working solution. Unfortunately, my own conclusion is to avoid any redirect, even though redirects can make a site work well from the user's point of view. Good luck.
I experimented with putting links (normal and hidden) as high up the page as possible and setting refreshes to delay for greater than 10 seconds, to see if that helped. Actually, it did. I can't remember which engines it solved problems with, but I decided to remove the scripts and meta refresh to solve the problem of indexing.
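For anyone else trying that experiment before giving up on refreshes entirely, the combination described above would look roughly like this - the 12-second delay and "newpage.htm" name are illustrative placeholders:

```
<!-- In the HEAD: refresh delayed past the ~10-second mark -->
<META HTTP-EQUIV="refresh" CONTENT="12;URL=newpage.htm">

<!-- Near the top of the BODY: an ordinary link as the fallback,
     high enough that spiders and impatient visitors both find it -->
<A HREF="newpage.htm">Click here if you are not redirected</A>
```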
Once the site was tidied up, I resubmitted the key pages to the SEs - all of them. Not the directories, of course (unless they had not listed the site). It took many months to get indexed, but in the end - success!
Good luck - keep us informed.
We've decided we do need to have two versions of the site, but one of them will be kept isolated -- totally away from the spiders (we hope!). We plan to do the browser testing and redirects by calling .js files, so the actual redirect code will not be on the page.
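To illustrate that setup - with "detect.js" and "nsindex.html" as placeholder names, not our actual files - the page itself would carry only a reference to the external script:

```
<!-- In the HEAD: the redirect logic lives in an external file,
     so it never appears in the HTML the spider fetches -->
<SCRIPT SRC="detect.js" TYPE="text/javascript"></SCRIPT>
```

while detect.js holds the browser test and the redirect:

```
// detect.js (placeholder name): redirect non-MSIE browsers
// to the isolated version of the site
if (navigator.userAgent.indexOf("MSIE") == -1) {
    window.location.replace("nsindex.html");
}
```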
Without the input from this forum, our design crew would never have thought of this solution. In fact, we never would have known we were headed right into a problem until it blew up in our face.
In fact, we decided to move as much JS and CSS as possible off the page and into separate files. Besides keeping the redirects off the HTML page, this will also help the ratio of text to code.
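After that move, the HEAD of each page would be reduced to a couple of references to shared files (the names "site.css" and "site.js" are placeholders):

```
<LINK REL="stylesheet" TYPE="text/css" HREF="site.css">
<SCRIPT SRC="site.js" TYPE="text/javascript"></SCRIPT>
```

Every inline style block and script moved out this way is that much less code on the page, which is where the improved text-to-code ratio mentioned above comes from.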
I'll give a report on our results, whenever there is something new to report, and thanks again.