Is there a way to configure Apache or another add-on tool to override the web application and return a non-load-intensive 404 page (static .html, etc.)?
It is rather specific to your web app.
If it uses a front-controller pattern, i.e. a single index.php processing a virtually unlimited number of SEO URLs such as /foo/bar, and there are too many different "category" (first-level) pages to list in the configuration, then you are out of luck there.
However, suppose your website handles only / (the homepage), plus some /shop/<product name> and /blog/<name-of-the-article> URLs.
You can construct a simple rule in NGINX (not an Apache man here, sorry) like the following:
    location ~ ^/(?!($|shop/|blog/)) { return 404; }
This makes the web server deliver the "not found" error directly, without invoking PHP at all, for requests which are known not to be part of your app's URL scheme.
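Since the question asks about Apache specifically, a rough mod_rewrite equivalent (an untested sketch for a vhost or .htaccess; the /shop/ and /blog/ prefixes and the error page path are placeholders for whatever your app actually uses) could be:

    RewriteEngine On
    # Let the homepage, /shop/... and /blog/... through; 404 everything else
    RewriteCond %{REQUEST_URI} !^/$
    RewriteCond %{REQUEST_URI} !^/(shop|blog)/
    RewriteRule ^ - [R=404,L]
    # Optionally serve a cheap static page for those 404s
    ErrorDocument 404 /errors/404.html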
Another technique that is efficient in this regard is the honeypot.
Typically, bots probe for vulnerable software/plugins which are not even present on your website to begin with.
You can leverage the fact that you know you don't have them, and instantly ban whoever tries to load those endpoints (e.g. see this honeypot technique for NGINX).
You can implement the same with Apache. Essentially, you'll need to list locations which do not belong to your website but are commonly probed for exploits.
E.g. you know you have a Magento website, but plenty of bots will still try to log in as if it were WordPress, so /wp-login.php is one of your honeypot locations.
Once those locations are defined in the config, you pass the matching requests through to a FastCGI script which interacts with your firewall to ban the offending client immediately.
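For illustration only (an untested sketch, not the linked article's setup), the Apache side of such a honeypot could look roughly like this, assuming mod_cgi/mod_cgid is loaded and /usr/local/lib/honeypot/ban.cgi is a placeholder path for your banning script:

    # Route a few paths that don't exist in the app, but are commonly
    # probed by bots, to a small CGI script that bans the requesting IP
    ScriptAliasMatch "^/(wp-login\.php|xmlrpc\.php|wp-admin(/.*)?)$" "/usr/local/lib/honeypot/ban.cgi"

    <Directory "/usr/local/lib/honeypot">
        Require all granted
    </Directory>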
Not only does this avoid any PHP load, it also triggers an instant ban in the firewall.
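A minimal sketch of the banning script itself, written here as a plain CGI in Python: it assumes an ipset named honeypot-ban that an existing iptables rule already drops, and a sudoers entry that lets the web server user run ipset (the set name, paths and sudo setup are all hypothetical):

    #!/usr/bin/env python3
    # Honeypot ban script (sketch): adds the requesting IP to an ipset that
    # an existing iptables rule drops, then returns a 403 to the bot.
    # Assumes something like the following is already in place:
    #   ipset create honeypot-ban hash:ip
    #   iptables -I INPUT -m set --match-set honeypot-ban src -j DROP
    # plus a sudoers entry allowing the web server user to run ipset.
    import os
    import subprocess

    ip = os.environ.get("REMOTE_ADDR", "")

    if ip:
        # -exist keeps the call idempotent if the IP is already in the set
        subprocess.run(["sudo", "ipset", "-exist", "add", "honeypot-ban", ip],
                       check=False)

    # Minimal CGI response: headers, blank line, body
    print("Status: 403 Forbidden")
    print("Content-Type: text/plain")
    print()
    print("Forbidden")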
This fires much faster than Fail2ban (which monitors logs for, e.g., repeated login failures), because the ban happens as soon as the request arrives. It can still serve as a complementary measure alongside Fail2ban, though.