Forum Moderators: open

Email Revalidation Required - System Login Changes Coming

         

Brett_Tabke

2:55 pm on Sep 1, 2025 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month Best Post Of The Month



We are working on the login/registration routines for the board over the next month:

Some updates will require you to:
  1. Revalidate your email address. Please check the email listed on your profile.
  2. Passcodes: login will use an "email me a passcode" flow.
  3. Passwords will be eliminated. A recovery email address will be added.
  4. Cookies will be required to view the forums.
  5. Javascript will be required to view the forums.
  6. CloudFlare will be tested.
  7. Anyone viewing more than 10,000 pages per month who is not a senior member will be required to subscribe.


This is all in response to abusive behavior by some bots, people, and users.

I just finished looking at the logs for August and there were 95 million - yes, 95 million - page views. That is clearly impacting system performance and at times making the board almost unusable for actual users.

I have always tried to keep the system wide open so users can browse however they choose, but the situation has become untenable and I feel I have to address it aggressively.
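For anyone wanting to spot their own heavy hitters, the kind of tally Brett describes can be sketched from a standard access log. This is a minimal illustration, not WebmasterWorld's actual tooling; the 10,000/month threshold comes from point 7 above, and everything else (field layout, function names) is assumed:

```python
# Minimal sketch: count page views per client IP in a Combined Log Format
# access log and report anything over a monthly threshold. Illustrative only.
from collections import Counter

THRESHOLD = 10_000  # page views per month, per point 7 above

def heavy_hitters(log_lines, threshold=THRESHOLD):
    """Return {ip: count} for clients whose view count exceeds the threshold."""
    # The client address is the first whitespace-delimited field in
    # common/combined log formats.
    hits = Counter(line.split(" ", 1)[0] for line in log_lines if line.strip())
    return {ip: n for ip, n in hits.items() if n > threshold}
```

Real logs would also need bot/asset filtering, but the shape of the job is just a counter over the first log field.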

lucy24

4:18 pm on Sep 1, 2025 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



:: vague mental association with bus stops removing all seats, because Reasons ::

graeme_p

4:43 pm on Sep 1, 2025 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I really hate 2. Less secure and a nuisance. Not keen on 5 and 6 either.

Who views 10k pages a month! I can understand blocking that!

Have you considered Anubis for bot blocking?

engine

4:45 pm on Sep 1, 2025 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Great news, Brett!
I'll look out for the notifications.

Brett_Tabke

5:19 pm on Sep 1, 2025 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month Best Post Of The Month



>Who views 10k pages a month! I can understand blocking that!

About 10 senior members and about 50 others. It is the others I am concerned about.
If Google is updating, it is not uncommon for some senior members to view the thread 500 times a day.

not2easy

5:42 pm on Sep 1, 2025 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



I think I viewed over 500 threads yesterday. Felt like anyway. ;)

tangor

5:46 pm on Sep 1, 2025 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Might change how I access WW. I usually log in, read what's new, check a few threads (my last twenty or so for changes) and then log out. Might log in two-three times a day during the week. Needing a passcode each time might suggest logging in and STAYING logged in. Will have to think about that, as I like to keep things lean and clean, regardless of how much horsepower my device of choice might have.

HOWEVER! Do what is necessary and needed to face the miscreants, scrapers, bots, and bad actors. 95M is a heck of a number considering the ACTUAL "posted traffic" per day! My bot finger gets itchy when 95,000 per month hits my hobby site!

thecoalman

10:23 pm on Sep 1, 2025 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I've been using CF for about 8 years now. If you want to fully implement the DDoS protection, firewall ports 80 and 443 to everything except Cloudflare IPs, isolate HTTP traffic on its own IP (e.g. no email server on the same IP), and make sure any outgoing requests triggered by users can't occur. The last one is probably not an issue here, but phpBB forum software has a few features that will cause this, like remote avatar upload.

You may want to look at disabling the SpeedBrain feature if it's enabled; I refer to it as SpeedDummy. I don't know if this has been fixed, but I have no intention of finding out. It was effectively causing double requests, so, for example, unread anchors were marked read before the user ever arrived on the page. It was also causing a lot of issues with other web applications like WordPress. I spent three days trying to figure out what was going on; there is no indication in the browser console that this activity is occurring.
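For other forum owners hitting the same thing: speculative prefetchers of this kind generally send a `Sec-Purpose: prefetch` request header (older mechanisms sent `Purpose: prefetch`), so one server-side defence is to skip read-tracking side effects when it is present. A rough, framework-agnostic sketch; the handler and callback names are made up for illustration:

```python
# Sketch: avoid letting speculative prefetches trigger "mark thread read".
# Header names follow the Speculation Rules convention; serve_thread is
# an illustrative stand-in for a real request handler.
def is_prefetch(headers: dict) -> bool:
    """True if the request looks like a speculative prefetch, not a real visit."""
    purpose = headers.get("Sec-Purpose", headers.get("Purpose", ""))
    return "prefetch" in purpose.lower()

def serve_thread(headers: dict, mark_read) -> str:
    # Only record the view for genuine navigations.
    if not is_prefetch(headers):
        mark_read()
    return "<thread html>"
```

The page itself is still served either way; only the side effect is gated.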

I wrote an article on this for phpBB; here are the rules I suggest. Some of it is phpBB-specific but can be adjusted as needed.

Cloudflare's Automated Tools
Go to Security >> Settings. There are various tools here, the one you are most interested in is "Bot Fight Mode". This will automatically block some of the most aggressive bot traffic Cloudflare has identified as malicious. Optionally, you can also enable some of the AI blocking tools.


Cloudflare Custom Rules
Go to Security >> Security Rules >> Create new Rule >> New Custom Rule. CF has an easy-to-use GUI. With the free plan, you get 5 rules. Each rule can have multiple conditions but only one action. Rules are fired in order so make sure the top rules do not interfere with subsequent rules. The following actions can be applied:

Skip - This will skip further rules based on whatever you select under WAF components to skip
Block - The request is blocked
Managed Challenge - Cloudflare will choose what challenge to issue.
Interactive Challenge - CAPTCHA that requires user interaction
JSChallenge - The "Checking your browser...." page that requires no user interaction.

Rule 1 will be used for whatever you want to allow through and skip the rest of the rules. CF maintains a list of known bots that adhere to robots.txt, so you can add that if you are using robots.txt. RSS readers cannot pass the Cloudflare check; that is something else you might want to allow through if you have feeds enabled.
Field: Known Bots Operator: Equals Value: <checked>
OR
Field: URI Full Operator: Wildcard Value: https://example.com/forum/feeds/*
Action: Skip All Remaining Custom Rules
Rule 2 will be used for what you want to outright block. You can block using a variety of criteria like ASN, user agent, country, continent and many others. For this example we are blocking the "country" T1, which is used for the Tor network, and the continent of Antarctica. These are just examples; phpBB harbors no ill will toward Tor or penguins :).
Field: Country Operator: Equals Value: Tor
OR
Field: Continent Operator: Equals Value: Antarctica
Action: Block
Rule 3 is a phpBB-specific rule for phpBB's registration and login pages, to help stop spammers from registering and brute-force login attacks. phpBB has its own brute force detection, but for the convenience of users it's not that strict.
Field: URI query string Operator: Contains Value: mode=register
OR
Field: URI query string Operator: Contains Value: mode=login
Action: Managed Challenge
Rule 4 adds a rule for problematic countries or other conditions where you want to elevate the challenge. For the action, issue an Interactive Challenge, which requires the user to perform some action on screen, usually a check box. In the following example it's issued to India and China.
Field: Country Operator: Equals Value: China
OR
Field: Country Operator: Equals Value: India
Action: Interactive Challenge
Rule 5 allows you to whitelist countries and deploy a blanket policy for the rest of the world. For the action, use the JS Challenge, which is the brief "Checking your browser..." page. Countries listed here will not be challenged; add the countries where you expect the bulk of your traffic to come from. It's important to note you need to use the "Does not equal" operator with AND. In the following example the US, Canada and the UK are whitelisted.
Field: Country Operator: Does not equal Value: United States
AND
Field: Country Operator: Does not equal Value: United Kingdom
AND
Field: Country Operator: Does not equal Value: Canada
Action: JS Challenge
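For anyone who prefers Cloudflare's "Edit expression" view over the GUI, the five rules above can be sketched in the rule expression language. Field and operator names here (`cf.client.bot`, `ip.src.country`, `wildcard`, etc.) are my best recollection of Cloudflare's syntax and the example.com feed path is a placeholder, so verify against the dashboard before relying on any of it:

```
# Rule 1 - Skip remaining custom rules
(cf.client.bot) or (http.request.full_uri wildcard "https://example.com/forum/feeds/*")

# Rule 2 - Block ("T1" is the pseudo country code used for Tor)
(ip.src.country eq "T1") or (ip.src.continent eq "AN")

# Rule 3 - Managed Challenge on registration/login
(http.request.uri.query contains "mode=register") or (http.request.uri.query contains "mode=login")

# Rule 4 - Interactive Challenge
(ip.src.country eq "CN") or (ip.src.country eq "IN")

# Rule 5 - JS Challenge for everyone outside the whitelist
(ip.src.country ne "US" and ip.src.country ne "GB" and ip.src.country ne "CA")
```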

Brett_Tabke

12:27 am on Sep 2, 2025 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month Best Post Of The Month



Thanks for the tips. I will come back to it before long.

Kendo

2:10 am on Sep 2, 2025 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



That sounds like a lot of bot activity.

As for personal activity, I try to log in once a day while having a break from chores, check the crab pots and then get back to work. But I only ever see 10-20 new posts per day so anyone pulling 500 pages a day is not one of us.

thecoalman

11:44 am on Sep 2, 2025 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Assuming most of the scraper traffic is outside of the US, Canada and the UK, rule 5 by itself will eliminate most of the bot traffic with no impact on users in those countries. Even legitimate traffic from outside those countries is not impacted that much. You could just whitelist everything in one rule:

Field: Known Bots Operator: Does not equal Value: <checked>
AND
Field: URI Full Operator: Does not wildcard Value: https://example.com/forum/feeds/*
AND
Field: Country Operator: Does not equal Value: United States
AND
Field: Country Operator: Does not equal Value: United Kingdom
AND
Field: Country Operator: Does not equal Value: Canada
Action: JSChallenge

Brett_Tabke

2:03 pm on Sep 2, 2025 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month Best Post Of The Month



There is a great deal outside the US, but there is also much inside. I finally had to ban Amazon and googleusercontent.

thecoalman

8:08 pm on Sep 2, 2025 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



You would use rules 2, 3 and 4 to fine-tune for traffic inside whitelisted countries, depending on whether you wanted to block or challenge. It's also possible to create more complex rules combining AND/OR. You could:

ASN equals Amazon's
AND
user agent does not equal Duckduckgo
OR
some other condition


Note that's not the exact syntax. I only use 5 rules in my article because that is the limitation on the free plan, but it's more than enough. You get 20 with the pro plan. Each rule can only have one action but multiple conditions. I don't know if there is a limit on conditions.
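In expression form, the Amazon example might look something like the following. The ASNs (16509 and 14618 are commonly listed for Amazon's cloud ranges) and the `DuckDuckBot` user-agent substring are my own assumptions, not exact values from the post:

```
# Challenge Amazon-hosted traffic unless it identifies as DuckDuckGo's crawler
(ip.src.asnum in {16509 14618} and not http.user_agent contains "DuckDuckBot")
```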

explorador

10:35 pm on Sep 2, 2025 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Cool security measures, I can totally understand the problem.

I don't remember the title of the thread I created related to this, but in terms of key points, what's described here is something I would add to my narrative and concerns about the weight of being a webmaster nowadays. For a while now, being a webmaster hasn't been about design, code and content anymore, not exactly... because you literally feel the weight of other things that take your time and effort.

Pass codes: Login will use "email me a passcode" format to login.

I don't see anyone asking about this in detail, maybe everyone understands this perfectly and I'm the exception. Does this mean...

A) I can't stay logged in anymore? like... every time I want to log in I have to push some buttons, then receive a code, then confirm the code? like some banking apps do on smartphones? with expiring time?

or

B) I can stay logged in, it's just the first log in that will require this dynamic?

BTW, just saying, just sharing. One forum I sometimes visit closed its doors to the public; there is absolutely no search engine ranking anymore, and only registered members can browse the topics and participate, which keeps the bots away. It helps that they charge money in a crowdsourcing way to pay the fees (it's dying anyway).

tangor

1:27 pm on Sep 3, 2025 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



that they charge money in a crowdsourcing way to pay the fees (it's dying anyway).

That's sometimes the result. It can also be the start of a better future. Having something THAT AIN'T FREE, where the participants have to put skin in the game, and the "whisper campaign" of "THIS IS WHERE IT'S AT!" can make all the difference.

Might be a smaller number, but they will be loyal, dedicated, and LONG TERM in a bot-free, spam-free, collegiate environment.

There's a reason why "country clubs" still exist. :)

graeme_p

3:02 pm on Sep 3, 2025 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@explorador I interpreted it as meaning that we will log in using an emailed login link every time, which I hate doing - it's clunky. If it's only required once, it's not a problem for me.

@tangor with WW a lot of us found it through search engines when trying to solve a specific problem. New people will not find it easily.

tangor

3:27 pm on Sep 3, 2025 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@tangor with WW a lot of us found it through search engines when trying to solve a specific problem. New people will not find it easily.

Is that because AIO gets in the way?

</sardonic satire>

lucy24

3:44 pm on Sep 3, 2025 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



There's a reason why "country clubs" still exist.
I wouldn’t want to belong to a club, country or otherwise, that would have me for a member.

explorador

6:23 pm on Sep 3, 2025 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Delay.
Given the problems described at the top of the thread, what about a delay? If you browse pages #1 and #2 within X seconds, then you can't instantly access #3; the forum makes you wait a while, let's say 1 second. And if you then access page #3 and want to jump right away to #4 and then #5, the wait grows from 1 second to 3, and so on. This would limit the bots. I used something similar to protect form submissions in the past, and naturally, if you browse 300 pages, that's enough to keep you on a waiting list for the whole 24 hours.
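The escalating delay described above could be sketched like this. All the numbers (2 free pages, a 60-second window, triangular backoff giving 1s then 3s then 6s) are illustrative choices, not a spec:

```python
# Sketch of the escalating-delay idea: each page view inside a short window
# adds to a per-client penalty, so page 3 waits 1s, page 4 waits 3s, and a
# 300-page scrape accumulates hours of waiting. Numbers are illustrative.
import time
from collections import defaultdict

FREE_PAGES = 2   # pages served with no delay
WINDOW = 60.0    # seconds before a client's history resets

class EscalatingDelay:
    def __init__(self):
        self._seen = defaultdict(list)  # ip -> timestamps of recent views

    def delay_for(self, ip, now=None):
        """Seconds this client should wait before the next page is served."""
        now = time.monotonic() if now is None else now
        # Keep only views inside the window, then record this one.
        recent = [t for t in self._seen[ip] if now - t < WINDOW]
        recent.append(now)
        self._seen[ip] = recent
        over = len(recent) - FREE_PAGES
        # Once the free pages are used up: 1s, 3s, 6s, 10s... (triangular growth)
        return 0.0 if over <= 0 else over * (over + 1) / 2
```

A real deployment would enforce the delay (e.g. by deferring the response) rather than just computing it, and would need to account for shared IPs behind NAT.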

Tangor: That's sometimes the result. Can also be the start of a better future.

Yes, I agree. But there has to be community awareness of the cost of the approach. A private forum works well for protecting the infrastructure and resources (also the community), but it can lead to isolation. I know we understand this; just keep in mind I've been a long-time apprentice on this forum and I appreciate what it means. It's just: when people ask me "where did you learn that concept?" and I tell them "WebmasterWorld", their reaction is "ugh...". Why? They think it's old, not practical, and the forum interface is far from optimal by modern standards. So I'm happy with this forum, but I see people failing to see the same value here, and this hurts the community, hurts the forum, and leads to having only the same long-time forum members.

I'm fully aware that value and content weigh tons here, way more than design and presentation, but I know that; new people don't. And if I came here today knowing nothing, I wouldn't find mobile app coding languages, modern design discussions, Xcode, Expo, React Native, cool design, cool forum interaction features, etc., and it would probably drive me away.

@graeme: thanks, I hope it's a one-time feature. Because if it's a form of "always prove you are you before logging in, and check your code in your mail", I'm pretty sure that eventually, like others, this would push me away. And yes, new people will not find the content here via SEs.

Kendo

12:47 am on Sep 4, 2025 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Logging into my Google account now requires quoting a number generated by an authenticator app on my mobile. I find that a pain.

But my bank, one account that I certainly do want to see strictly monitored, gets the code signing certificate stored on my PC and logs me in automatically. This I like.

Martin Potter

11:16 pm on Sep 4, 2025 (gmt 0)

5+ Year Member Top Contributors Of The Month



Like most others here, I understand the problem and the real need for each of the projected security measures.

Although I didn't used to log in every day, I do now (almost every day). And using a password has never been a problem for me. (It was an occupational skill, acquired over the years, to memorize all the different passwords, each of them used for only one account/vault/room.)

But, for some reason, having to copy and paste an e-mailed passcode has always been an annoyance, even though it is now used more and more frequently.

My browser is Firefox, running under Linux, and I think it deletes all the cookies at the end of each session. And I am not sure that I have javascript on this machine. Have to check on that.

Obviously I will have to make some adjustments, physical and mental, to continue here.

graeme_p

1:24 pm on Sep 5, 2025 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Another thought. It would be nice to have some fallback if we lose access to an email address.

@Martin_Potter cookie deletion is a setting, and there are extensions to make it more convenient. Not sure what you mean by having JS - it's built into Firefox. Do you mean you have turned it off? If so, NoScript lets you turn it off per site or tab.

tangor

2:51 pm on Sep 5, 2025 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



My browser is Firefox, running under Linux, and I think it deletes all the cookies at the end of each session. And I am not sure that I have javascript on this machine. Have to check on that.

Javascript is included with Firefox. To see if it is enabled (default is ENABLED) use about:config in the address bar, accept the "risk", search for javascript.enabled and see what the setting is.

Under Settings, you can manage cookies and list exceptions, to retain or delete, with every browser load or session(s).

If using NoScript as an extension, just TRUST webmasterworld.com AND ajax.googleapis.com to get full functionality.

tangor

3:20 pm on Sep 5, 2025 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Heh! graeme_p and I crossed paths! Back in the day I would log in on Sunday and stay logged in until the machine rebooted for lack of memory (up to two weeks). These days the current FF runs (single tab) about 350mb of memory. Each tab adds another 150-300mb to that total. On 16gb ram machines it doesn't take long to fill up memory with other apps, work, data sorting, etc., so I started logging out after each visit to reclaim that memory (most of it; it never goes back to what it was).

If we go with the email passcode I'll probably return to "stay logged in" per day (logging out when I'm done) to avoid having to deal with the cut and paste. THAT IS, however, if that activity does NOT bog down the WW server with "always on" connections. In which case Brett might have to put time limits on individual sessions to keep the system open.*

*Reminds me of the old BBS days when I allowed a VERY GENEROUS 30 minutes over a bundle of phone lines at 56k baud so that my users had a chance during a 24 hour period to log in. Just things to consider.

mcneely

6:30 pm on Sep 6, 2025 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Not sure how all of this will work out for us here but at the end of the day you've got to do what you've got to do.

Re-validating the email every time I go to login would probably be a bit of a stretch - I guess if Facebook can do that, you can too.

As far as Amazon? I'm totally with you on that one -- I've seen more than anyone's fair share of abuse coming from Amazon's servers over the years.

Cloudflare? - Not a tool I would use personally - we're pretty autonomous around here and try as much as possible to keep everyone else's fingers out of our pie. It's a 50/50 split on hosting clients from my end that really love it or really hate it.

On the internet it's always going to be that sometimes you're hot and sometimes you're not - Cloudflare likes to lock you in on the traffic regardless of ebb and flow. Not a business model I prefer to subscribe to. Bandwidth on our hosting services is fluid, just like the internet itself, and our prices for those services are more of a suggestion than anything else. I don't ever have a problem bumping up the bandwidth on occasion if the situation calls for it - if the increase in bandwidth becomes constant, I'll suggest the client upgrade; otherwise they stay where they are on the plan they've selected.

I'm set up to dump cache and cookies every time I end the browser session so I'm guessing that WW will most likely become like Facebook in that every time I go to login I have to send them an email telling them that it's really me.

Signs of the times I suppose and good on you for wanting to take the time out in order to run a clean ship.

Oh and thank you Brett for bringing back the top topics as the WW landing page.

[edited by: mcneely at 6:40 pm (utc) on Sep 6, 2025]

mcneely

6:37 pm on Sep 6, 2025 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



If using NoScript as an extension, just TRUST webmasterworld.com AND ajax.googleapis.com to get full functionality.


I use the NoScript extension so that's most likely the route I'd go in this case

Brett_Tabke

4:39 pm on Sep 7, 2025 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month Best Post Of The Month



> some fallback if we lose access to an email address.

There will be a 'recovery' email address addition.

Martin Potter

7:10 pm on Sep 8, 2025 (gmt 0)

5+ Year Member Top Contributors Of The Month



Thanks, everyone, for those tips. And a reminder ! (I had forgotten, more frequent lately, it seems.)

The amount of help given here is very much appreciated. Thanks again to all.

jmccormac

5:47 am on Sep 14, 2025 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



That looks like a lot of potential scraper activity, Brett.
It might be worth breaking down the IP addresses in the logs and the number of requests. There has been quite an uptick in scrapers coming from mobile phone ranges; they are often used to request a single page of HTML from a single IP.

Many of the more aggressive AI maggots are scraping information-rich websites after having been blocked under their original UAs in robots.txt. Some problem data centre ranges may also need to be blocked at an IP level; that's probably easier with Cloudflare. The scrapers using ISP botnets are more of a problem.
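The breakdown suggested here can be sketched with the Python stdlib. Per-IP counts miss botnets that fetch one page per address, so aggregating by network prefix helps surface distributed scraping. The /24 granularity is an arbitrary illustrative choice, and the sketch assumes IPv4 addresses:

```python
# Sketch: aggregate request counts by network prefix so distributed scrapers
# (one page per IP across a whole range) still stand out. IPv4 only; the /24
# prefix length is an illustrative choice.
import ipaddress
from collections import Counter

def requests_by_prefix(ips, prefix=24):
    """Return a Counter mapping /prefix networks to request counts."""
    nets = Counter()
    for ip in ips:
        # strict=False lets a host address stand in for its network.
        net = ipaddress.ip_network(f"{ip}/{prefix}", strict=False)
        nets[str(net)] += 1
    return nets
```

Feeding it the per-request IP column of a log, then looking at `.most_common()`, flags ranges worth blocking even when no single address is noisy.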

It may need more analysis before moving to e-mail validation.

Regards...jmcc
