Forum Moderators: Robert Charlton & goodroi


Locking Down Sites For Clients' Use Only

Pros & Cons Plus Unexpected Occurrences


RedBar

12:34 pm on Oct 18, 2023 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



As an international company of 50+ years' standing, we are now deciding whether to lock down our company widget sites by the end of the year. The www is no longer driving new bulk B2B commercial enquiries; however, we do realise that most of our existing customers use the sites for research, reference and especially technical purposes.

Issuing user names and passwords, or advising customers to create their own, is not a problem. My question to anyone who has done this: did you experience any unforeseen issues?

This lockdown is a direct result of G's recent shenanigans. Besides, why should we give away our knowledge and experience simply for AI to scrape and use, with no recompense and no source acknowledgement?

Obviously this is something new to us; however, it is a situation we are being forced into by companies that take for free yet, insofar as we are concerned, give nothing in return.

Opinions may vary however it would be interesting to see others' views.

superclown2

5:28 pm on Oct 18, 2023 (gmt 0)



Why not? It's your website to do with as you wish. I've done it before with private club websites without any real problems. Then again you could always use robots.txt or even .htaccess (a very blunt instrument) to control usage.
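To illustrate the difference between the two (the paths and address range below are placeholders): robots.txt is only a request that well-behaved crawlers may honour, while an .htaccess rule actually refuses the request at the server.

```
# robots.txt — a polite request; only compliant crawlers obey it
User-agent: *
Disallow: /

# .htaccess (Apache 2.4) — actually refuses requests from outside an allow-list
Require ip 203.0.113.0/24
```

Neither on its own is a security measure; the .htaccess rule enforces access, the robots.txt entry merely asks.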

RedBar

6:27 pm on Oct 18, 2023 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



However, not everyone honours those instructions. At the moment G does, but who's to say what might happen if it finds it doesn't have the answer(s)?

superclown2

7:18 pm on Oct 18, 2023 (gmt 0)



However, not everyone honours those instructions. At the moment G does, but who's to say what might happen if it finds it doesn't have the answer(s)?


Sure. Nothing is completely secure on the Web. I've blocked 'search engines' in the past (remember Mamma? Dogpile? Massive spidering taking up my tiny bandwidth allocation in those days, and zero business); they changed their IP addresses and were back in again. You could password-protect the site, but some joker with Kali Linux would probably crack it before long. I suppose it all depends on how far you want to go.
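For what it's worth, that kind of IP-based block can be sketched like this in Apache 2.4 (.htaccess or server config; the address range is a placeholder), and it fails exactly as described the moment the crawler moves to a new range:

```
# Allow everyone except a known crawler range (placeholder addresses)
<RequireAll>
    Require all granted
    Require not ip 198.51.100.0/24
</RequireAll>
```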

lucy24

8:33 pm on Oct 18, 2023 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



you could always use robots.txt or even .htaccess

Apples and oranges. Malign robots don't honor robots.txt, and it's counterproductive in the case of search engines: if they can't crawl a URL, they won't see a noindex tag.

This sounds more like a case for an .htpasswd file. (Tangent: Is it your own server? If so, you'd be using a <Directory> section in the config file rather than htaccess. I don't know if there is a config-level equivalent to .htpasswd, but someone else will.) You just need to give some thought to how new customers could get to your site, since you presumably have no objection to making additional sales. Even if it isn't an expanding industry, existing customers will sometimes come in from unfamiliar IPs.
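As a sketch of the .htpasswd route (Apache 2.4; the file paths, realm name, and user name are placeholders, and the password file should live outside the web root):

```
# Create the password file once, then add users, with the htpasswd utility:
#   htpasswd -c /etc/apache2/.htpasswd alice
#
# Then, in .htaccess for the directory to protect:
AuthType Basic
AuthName "Customers only"
AuthUserFile /etc/apache2/.htpasswd
Require valid-user
```

On your own server the same Auth* directives can go inside a <Directory> section of the main config, still pointing at a password file created with htpasswd, so there is no separate config-level password mechanism to learn.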