Forum Moderators: Robert Charlton & goodroi


Is it possible/allowed to hide part of a page from Google?


downhiller80

2:47 am on Jan 26, 2010 (gmt 0)

10+ Year Member



I have a legitimate reason for wanting Google not to use part of my page in its snippets in the SERPs, a reason that would benefit my users. As far as I know there's no way of doing this, is there? Shame you can't just put a rel="ignore" or something on any HTML tag. Or am I missing a trick?

I'd be happy for google to not index this bit of content as well as not display it, if that makes a difference.

Cheers

TheMadScientist

10:41 pm on Jan 28, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



On some of them, no... I build and run sites as a business, and there are times when I have to decide where my time is most effective WRT the bottom line, the same as any business, because I have to pay my bills. So, unfortunately, there are some sacrifices I have to make there. What surprises me is the contradictions in posts around the forums here (not yours specifically): some say you can't keep your content from being scraped if someone wants it, even with AJAX, which means there must be an automated way to access it; yet others post that if you don't program the same site for everyone, you're wrong. So which is it? Can scrapers access everything or not? And if they can, why aren't screen readers programmed that way once, rather than the rest of us all having to do the same work twice?

Of course I have sites that are noindexed, so I'm making them unavailable for finding by everyone who uses search engines, which probably isn't fair either...

IMO there are some things not everyone can do, like play paintball or snowboard, and there are some sites just not built for everyone to access. One of the main ones I'm talking about caters to people involved in one of those sports, which means those with screen readers probably have no interest in the first place. And although you cannot buy anything from the site with a screen reader or without JS, there are items where we keep only 10% of the sale price and the rest of the after-tax profit is donated to cancer research and support for wounded troops and their families, which is way more than most here could claim they are doing...

TheMadScientist

11:42 pm on Jan 28, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Here's another thought on the whole JS and inaccessibility to screen readers:

Why is it NO ONE complains about password protection, if accessibility to all information is an entitlement for anyone on the web? (I'm not saying that's your point specifically; it's more of a general question.) Some sites charge, some just make you register, and on some (like here) you can't use a free e-mail address... MOST people CAN use JS, and a few choose not to.

The price of admission on some of my sites is: Run JavaScript

You don't have to create an account.
You don't have to sign in.
You don't have to fill out a form.
You don't have to give me your e-mail address or any personal information.
All you have to do is turn JavaScript on... IMO that's a small fee for access to the functionality of a site you want to visit, compared to many. And if you can't, and don't have a friend who can visit and tell you what it says, then you can't afford to be a visitor, I guess.
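A site taking this approach would usually at least tell non-JS visitors what the deal is, for example with a plain <noscript> fallback (illustrative markup only, not from any of the sites discussed here):

```html
<noscript>
  <!-- Shown only when the browser has JavaScript disabled -->
  <p>This site requires JavaScript. Please enable it in your
     browser settings to use the site.</p>
</noscript>
```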

Facebook doesn't 'gracefully degrade': it not only requires an account, it requires the use of JS... People who use screen readers can't access Facebook, and I don't see anyone up in arms screaming 'not fair'. So I don't have a problem building some niche sites where JS is required, and I don't think that's thoughtless or cold or anything else, because not everything on the Internet is accessible to everyone, much like the second floor of an apartment complex without elevators is inaccessible to wheelchairs... It's life... Not everyone can go everywhere.

downhiller80

12:21 am on Jan 29, 2010 (gmt 0)

10+ Year Member



There's no way a site as big as Facebook can get away with ignoring blind and other disabled users:

[insidefacebook.com...]

[edited by: tedster at 1:08 am (utc) on Jan. 29, 2010]

TheMadScientist

12:27 am on Jan 29, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Cool, I used a bad example... Thanks for pointing it out.

[edited by: tedster at 1:06 am (utc) on Jan. 29, 2010]

tangor

12:44 am on Jan 29, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Some sites might fall foul of (government-mandated) accessibility requirements if there is a blanket requirement for JS. Depends on the site's information, use, etc. Merely an observation.

I don't code for IE6 these days because of its low usage percentage; however, all of my sites will run in any browser, regardless of version... I tend to avoid the bleeding edge of technology. :)

ppc_newbie

7:40 am on Jan 29, 2010 (gmt 0)

10+ Year Member



An option I haven't seen mentioned yet: making the text an image.

PHP can return text as an image very nicely. For that matter, if the requester is a bot, return an image containing the text; otherwise, return the text.
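Roughly, that idea sketched in Python rather than PHP (the bot signatures and function names here are made up for illustration; real bot detection and the actual image rendering are more involved, and note that serving bots different content than users is what Google calls cloaking, which carries penalty risk):

```python
# Illustrative sketch: serve the text normally, but serve it as an
# image when the requester's user-agent looks like a known crawler.
BOT_SIGNATURES = ("googlebot", "bingbot", "slurp")

def looks_like_bot(user_agent: str) -> bool:
    """Crude user-agent substring check; real detection is harder."""
    ua = user_agent.lower()
    return any(sig in ua for sig in BOT_SIGNATURES)

def serve_snippet(user_agent: str, text: str) -> dict:
    """Decide what a server-side handler would return for this text."""
    if looks_like_bot(user_agent):
        # A real implementation would rasterize `text` with an image
        # library (GD in PHP, Pillow in Python) and return the bytes;
        # here we only return the content-type decision.
        return {"content_type": "image/png", "body": "<rendered image>"}
    return {"content_type": "text/html", "body": text}
```

The point is just the branch: the same URL answers with `image/png` for a crawler and plain text for everyone else.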

The poster was also worried about resizing things for users. Why bother changing things for the bot? It probably isn't going to care much about alignment.

TheMadScientist

7:55 pm on Jan 29, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Alright, downhiller80. I'm thinking about making the sites I work on in AJAX work for screen readers until the screen readers work for everyone, because it's cool to do and #*$!, this is still America, isn't it? I think I'll make 'em 'more readable', just because it's cool... Thanks for sharing your opinion (idea). Much appreciated!

Edited: Verbiage (or something similar) LOL

downhiller80

3:52 am on Jan 30, 2010 (gmt 0)

10+ Year Member



So I decided this evening to add a "noindex" to a few of my pages, as they're kinda duplicate content. I googled it just to remind myself of the syntax, as I don't use it often, and stumbled upon this:

[en.wikipedia.org...]

Obviously not supported by google yet, and quite possibly/probably never will be, but I was amused to see it.

tedster

5:04 am on Jan 30, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Google does support noindex in the robots meta tag. Googlebot still needs to spider the page in order to read the meta tag, so it's different from a robots.txt disallow rule. But the page content will not be in the index. It won't solve your "part of the page" goal, but it is a worthwhile tool to have in your kit.

Matt Cutts: So, with robots.txt for good reasons we've shown the reference even if we can't crawl it, whereas if we crawl a page and find a Meta tag that says NoIndex, we won't even return that page.

Interview with Eric Enge [stonetemple.com]
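For reference, the robots meta tag tedster is describing goes in the page's <head> and looks like this (standard syntax; the "follow" part is optional, since following links is the default):

```html
<head>
  <!-- Keep this whole page out of the index, but still follow its links -->
  <meta name="robots" content="noindex, follow">
</head>
```

As tedster says, this is page-level only: it removes the entire page from the index, not just a portion of it.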

[edited by: tedster at 5:08 am (utc) on Jan. 30, 2010]

TheMadScientist

5:07 am on Jan 30, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I think he was referring to this part of the page:

The Russian search engines Yandex and Rambler introduced a new tag which prevents indexing only of the content between the tags, not the whole Web page:

<body>
Do index this text block.
<noindex>Don't index this text block</noindex>
</body>

Maybe it would make it too easy to keep content from being indexed and returned as 'the one right answer' or something, if Google supported it? I'm not sure why else they wouldn't adopt it, rather than us needing to change or replace robots.txt as some have suggested in other threads... (And yeah, it's a bit humorous to me too.)

tedster

5:10 am on Jan 30, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Ah, so - I'm sure you're right.

TheMadScientist

5:20 am on Jan 30, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Of course, maybe I'm not giving the Russian SEs enough credit, and they're really more advanced than Google, and that's why Google can't do the same thing? It really has to be that Google can't, or doesn't want to give us that much control if they can, right? There don't seem to be too many other possible answers, so maybe Google's not as advanced as we think, and they really can't do what some other search engines are doing WRT letting us tell them not to index portions of a page.

It seems like something they'd do if they could, assuming they don't want us to 'hide' information: if we could just tell them to 'noindex' a portion, we could be more transparent even when there are parts of pages we don't want returned in the results, couldn't we? I think Yahoo! does something similar to the Russian engines, don't they? IDK. It really doesn't make much sense for them not to do it if they can and they want us to be transparent about our content, so maybe they can't... Whatever the reason Google doesn't, I've got to applaud Yandex and Rambler for giving site owners control over the indexing of their content. That's cool to do...

This 42-message thread spans 2 pages.