So is the problem that *all* visitors are being blocked, or rather just that your custom 403 page is not being served to visitors whose IP addresses are (correctly) being blocked?
If the latter, then note that your custom 403 error page is a... well, it's a page, and so the mod_access restrictions apply.
To allow your custom 403 error page to be served, you will have to provide an exclusion to the IP-address-based Deny. I also recommend providing an exclusion for robots.txt -- otherwise, some robots will treat a non-200 response to robots.txt as carte blanche to attempt to spider your entire website, leading to a "403 storm" that continues until they give up.
This solution requires mod_setenvif, the "Deny,Allow" mod_access ordering, and an additional "Allow from" line using the "Allow from env=<varname>" syntax:
SetEnvIf Request_URI "^/(robots\.txt|custom-403-page\.html)$" AllowAll

Order Deny,Allow

Allow from env=AllowAll

Deny from 192.168.0.12
Deny from 10.10.0.10
Here, any request from an IP address that is not denied will be allowed, as will any request for robots.txt or your custom 403 error page -- even from a denied address.
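For context, a complete .htaccess sketch might look like the following. The ErrorDocument path and the blocked addresses are just examples -- substitute your own page name and IP addresses:

```apache
# Serve this page on 403 responses (example path)
ErrorDocument 403 /custom-403-page.html

# Mark requests for robots.txt and the 403 page itself as always-allowed
SetEnvIf Request_URI "^/(robots\.txt|custom-403-page\.html)$" AllowAll

# With Deny,Allow, Deny directives are evaluated first, then Allow overrides
Order Deny,Allow
Allow from env=AllowAll

# Blocked addresses (examples)
Deny from 192.168.0.12
Deny from 10.10.0.10
```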
Be aware that in .htaccess, if more than one "Order" directive appears in the same scope, only the last one takes effect. To avoid such "Order directive collisions," it may be useful to scope the directives inside <Limit>, <LimitExcept>, <Files>, or <FilesMatch> containers -- if required.
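As a sketch of that container approach, each container below gets its own "Order" scope, so the two do not collide (the filename and address are placeholders):

```apache
# Restrict only POST requests from one address
<Limit POST>
    Order Deny,Allow
    Deny from 10.10.0.10
</Limit>

# Separately restrict access to one file, in its own scope
<Files "secret.html">
    Order Deny,Allow
    Deny from 192.168.0.12
</Files>
```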
Jim