I'd be happy for Google not to index this bit of content, as well as not display it, if that makes a difference.
The extra HTTP request can actually speed up your site if you serve the iframe from a subdomain, because it forces the browser to open another connection while loading your page, on top of the standard two connections per hostname. And if you cache the iframe's response, the request won't be repeated every time it's included on the page... Whether this applies, of course, depends on your overall system.
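As a rough sketch of the caching half of that (the function name and max-age here are hypothetical choices of mine, not anything from the thread), the iframed fragment can be served with cache headers so repeat page views reuse the browser's cached copy instead of re-requesting it:

```python
# Hypothetical sketch: response headers for the iframed fragment so
# browsers cache it and skip the extra request on repeat page views.
def fragment_headers(max_age_seconds=3600):
    return {
        "Content-Type": "text/html; charset=utf-8",
        # Cache for an hour; tune this to how often the fragment changes.
        "Cache-Control": "public, max-age=%d" % max_age_seconds,
    }

print(fragment_headers()["Cache-Control"])
```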
As for the complexity, I've always been able to copy and paste my DB code from the main page into the iframe page and just edit it, so moving portions of the page off into their own files has never been much of a challenge for me, but maybe your situation is different.
As far as an image goes, the only drawback I can think of is load time, but a simple, well-compressed image takes almost no time to load... As long as the image isn't complex it can be kept fairly small, and you can put the text in the alt attribute so no one ever misses it.
I actually thought these were all fairly usable solutions. I don't use the image approach, but I do use both of the other two, so it's interesting to hear someone say I'm using horrible solutions and that it should be obvious to me they're horrible, when I've never thought of them that way...
Fascinating, thanks for the reply.
[edited by: tedster at 9:32 pm (utc) on Jan. 26, 2010]
I agree with the K-I-S-S principle, and I also try to minimize the number of HTTP requests. Sometimes I make an exception when there's an overriding priority to address; there's no mandate to always comply with those kinds of guidelines.
I've got a link to one of my sites from the Apache Forum that someone posted a year or so ago, and I still get visits from it almost daily, so some of these threads do get read long into the future. If you think the standard approaches used by all of us who do what you're asking about are horrible, I think it's good for everyone, both those who made suggestions and those who read this later, to know why you're calling them horrible, because quite a few of us have been around for quite a while and use the solutions we presented...
It obviously doesn't mean you need to use or even like the solutions presented, but if you're going to call them horrible, sharing some of your knowledge with the rest of us would be cool too, so the 'why' behind the statements is appreciated.
There is nothing wrong with this solution, and any claim that it is inadequate comes from a lack of understanding or some hyper-l33t mentality.
Anyone surfing with JS turned off is used to things not working and isn't going to be upset that your site works that way... many sites do.
You mention that these solutions all take extra work and give you more to maintain. Well, that's true of any new logic you want to implement. If it's too much work for you, then just don't do it, but don't slag the solutions for requiring some work.
You can iframe everything but the <head> section of a page if you know what you're doing, and if you really know what you're doing you can use mod_rewrite or PHP to pull the original file/site into the iframe to generate the thread. So there's really more knowledge than work involved, IMO.
[edited by: tedster at 9:47 pm (utc) on Jan. 26, 2010]
[edit reason] edited some off-topic content [/edit]
How would you achieve that with an iframe without needing JS to resize the columns so everything lines up?
Use the same table twice, with a width=100% iframe...
In the iframe, set the table to display the header; either don't fetch the table's body data at all or, if you do, set display: none on the rest of the columns... Then in the main HTML, leave the header text out of the output. It would take a few minutes to get the spacing right, but an iframe within a div, with borders off on both, should keep everything the same size...
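A minimal sketch of that layout (the file name, widths, and styling below are hypothetical illustrations, not anything specific from this thread):

```html
<!-- Main page: the iframe holds only the table's header row; the table
     below holds only body rows. Borderless div + iframe keeps the two
     visually joined so no JS resizing is needed. -->
<div style="border: 0; margin: 0; padding: 0;">
  <iframe src="/thread-head.html" width="100%" frameborder="0"
          scrolling="no" style="border: 0; height: 2em;">
  </iframe>
</div>
<table width="100%" cellspacing="0">
  <!-- Column widths must match the table in thread-head.html exactly. -->
  <tr><td width="75%">First reply...</td><td width="25%">Member</td></tr>
</table>
```

Here `/thread-head.html` would output the same table markup with only the header row visible, so the columns line up with the main page's body rows.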
[edited by: tedster at 9:48 pm (utc) on Jan. 26, 2010]
[edit reason] edited some off-topic content [/edit]
If the pages are dynamically generated, then detect Google's IP range and wrap an IF statement around that block of text. That's what I do.
I say Google's IP range rather than user-agent because they sneak up on sites in disguise.
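A minimal sketch of that check in Python (the range below is just one illustrative Googlebot-associated block; verify current ranges against Google's own published crawler lists before relying on anything like this):

```python
import ipaddress

# Hypothetical list: one historically Googlebot-associated block.
# In practice, populate this from Google's published crawler IP ranges.
CRAWLER_NETS = [ipaddress.ip_network("66.249.64.0/19")]

def is_crawler_ip(remote_addr):
    """True if the visitor's IP falls inside a known crawler range."""
    addr = ipaddress.ip_address(remote_addr)
    return any(addr in net for net in CRAWLER_NETS)

def render_optional_block(remote_addr, html):
    """Drop the block for crawler IPs; serve it to everyone else."""
    return "" if is_crawler_ip(remote_addr) else html

print(render_optional_block("192.0.2.1", "<p>not for bots</p>"))
```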
Actually that seems to be the theory, and maybe it's more webmasters now or something, so people think that's the way it is. But the numbers I've seen show the percentage of browsers without JS dropping steadily in year-over-year comparisons. At one time it was as high as 20% or 30% in the data I saw, but it's nowhere near that now according to the same stats. If I remember correctly the data was from w3schools, which is a techy site, but maybe more people enable JS for them? IDK, but the numbers I saw reported are dropping steadily... I think there was another site reporting a similar drop, with aggregated data from a huge number of websites, too. I'll try to find the sources again, because I found the numbers interesting; they're posted in the supporters forum as part of a JS discussion from not too long before Oct. 1, 2009.
The PHP if statement is an option too, but I'm always hesitant to go that route because of the IP detection involved; I keep thinking, "But what if I miss one?" (LOL). I've also heard of some spiders using different UAs/IPs, and cloaking's not really my thing. IMO it shouldn't really matter when the change is only slight, since it could just be 'an update' to the page between visits, but I do recommend checking out the cloaking forum for more information...
There's a few reasons for my position:
1.) Most sites use JS in some way and the people with noscript or no JS are used to things not working.
2.) I keep PHP stats as well as JS stats, and after removing bots from the PHP numbers, the two aren't significantly different, even on sites where JS is only used for stats.
3.) The functionality JS (AJAX) allows for puts static sites to shame, and I like to build cool sites.
4.) If people won't turn JS on for one of my sites, then it's not the site for them. I honestly don't have time to do everything twice, and if they don't want to run JS there are other sites that are probably more accommodating; it's cool if they want to visit those instead of mine. My sites don't cater to everyone, but once you've used one you probably won't like the static sites, which aren't anywhere near as easy to use, don't have the functionality, and really aren't anywhere near as cool in what they do and how they do it...
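The comparison in point 2 can be sketched like this (the bot markers and log shape are my own simplified assumptions, not how the poster's stats actually work):

```python
# Hypothetical sketch: compare server-side hit counts (PHP-style logs)
# with JS-beacon counts, after filtering obvious bots by user-agent.
BOT_MARKERS = ("bot", "spider", "crawl")  # crude UA heuristics

def human_hits(user_agents):
    """Count log entries whose user-agent doesn't look like a bot."""
    return sum(1 for ua in user_agents
               if not any(m in ua.lower() for m in BOT_MARKERS))

server_log = ["Mozilla/5.0 (Windows NT 6.1)",
              "Googlebot/2.1 (+http://www.google.com/bot.html)",
              "Mozilla/5.0 (Macintosh)"]
js_tracker_hits = 2  # hits the JS-based tracker recorded

# If the two counts are close, few visitors are browsing without JS.
print(human_hits(server_log), js_tracker_hits)
```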
Anyway, if you build sites for others, have the time, or want to absolutely maximize traffic, then you probably need to make sure everything works without JS, but personally I don't do it...
You are correct that JS has been enabled more often over the years, according to W3, but W3's JS stats stop at Jan 2008 (5% with JS off), so they're not much use really, since NoScript has mainly been taken up since then, I think. Even so, 5% is a LOT of people.
Of the three or four sites I checked, W3 reports the highest Firefox share, at around 46%; other sites show about 20% to 32% (several result sources are shown on Wikipedia).
There has also been a significant increase in internet traffic since that date, so the people behind that 5% may now be several times the number they were then. :)
Also, IE doesn't offer the easy control FF does, so IE users are far less likely to turn off JS; the stats should probably be reviewed in that light: how many FF users (who are probably more security-conscious anyway) turn it off would be a more realistic measure than how many users in total.
And I can't see many of the 70,000,000 downloaders turning off the default of "block scripts". OK, that's downloads (presumably new ones), not users, but it's still an impressive number.
I guess some more of my thoughts are:
Niche, target audience, PHP vs. JS stat differences, the trust you can generate via the site itself, and other factors should all weigh in the decision...
Also, as I said, I usually make sure other people's sites work for everyone, but some of my own sites carry products you can't find anywhere else, and if you want one, you'll have to turn JS on for me. So I think 'uniqueness' or 'need/want to use' plays as much of a role in what you can do as the JS / no-JS numbers.
Really, whether someone who surfs with JS off will turn it on has quite a bit to do with the niche, the site itself, and the 'message' the site communicates to visitors...
The biggest thing, IMO, is that you're not significantly changing the subject, theme, or content, and you're not trying to rank for 'cat' and then displaying 'dog'.
Reason Number 42 I require JS on some sites:
I don't cater to people who won't run JS, because you can do so much more cool stuff with it than without. I refuse to take away from most users' experience, or to do the same work twice, for the sake of a few...
The preceding was the closing of a post about making a preview page with editable text, where one of the people involved surfs without JS... They're really the ones who are losing out, because there is so much you can do with JS that you cannot do without it (or cannot accomplish as a coder with the same simplicity).
Of course, Reason Number 43 is that I'm as stubborn as the people who say I have to make everything accessible both ways, and as the people who refuse to turn it on...
Think about it this way: if you have to do the same work twice, and it takes roughly twice as long to provide the same functionality, it's like building a whole site for the 12% (or so) of visitors without JS. Does that make practical business sense if you place a value on your time? Would you build a site only a small percentage of visitors can use? I can't see catering to the few when I could build another site in the same amount of time, or increase functionality for the majority who browse with JS on. Why would I take time away from improving the experience and the site for many, in order to serve a few? If people want to see some sites and use them the way they're intended, they'll have to put their stubbornness to rest, IMO.