Gary Illyes from Google loves posting his PSAs on Mastodon. His latest PSA is that if you are using a CDN or cloud hosting service, you should set up a routine test to ensure the service is not blocking Googlebot or other search engine crawlers that you want crawling and indexing your site.
Gary wrote on Mastodon “If you care about Googlebot/Bingbot/otherBot visiting your site and you’re using solutions like CDN or f5, you should make a habit of going through every now and then the automatically generated firewall rules generated by some of these solutions. Sometimes they perceive crawler traffic as malicious and block it.”
Generally, this is good practice for any aspect of your website or software. You should set up routine tests that you can run manually on a schedule, or that run automatically and notify you of the result. If Googlebot is blocked, you can jump on it before it becomes a larger issue. If your credit card form is broken, you can jump on it before losing out on many orders. Setting up automated or routine tests for sites and software is just good practice, and checking whether bots or people are being blocked from your website should be one of those tests.
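If you want a starting point for such a check, here is a minimal sketch in Python, assuming the requests library and a placeholder URL. It simply fetches a page with Googlebot's user-agent string and flags anything that doesn't return a 200. Keep in mind this is only a rough smoke test: many CDNs and firewalls verify crawlers by IP or reverse DNS, so a spoofed user agent may be handled differently than real Googlebot traffic.

```python
import requests  # assumes the requests library is installed

URL = "https://www.example.com/"  # hypothetical: replace with a page you care about
GOOGLEBOT_UA = (
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
)

def check_crawler_access(url: str, user_agent: str) -> bool:
    """Return True if the page responds with 200 to a Googlebot-like request."""
    resp = requests.get(url, headers={"User-Agent": user_agent}, timeout=10)
    # 403, 429, or 5xx responses are common signs of a firewall rule blocking bots
    return resp.status_code == 200

if __name__ == "__main__":
    if check_crawler_access(URL, GOOGLEBOT_UA):
        print("OK: page returned 200 for a Googlebot user agent")
    else:
        print("Warning: page did not return 200 for a Googlebot user agent")
```

You could run a script like this on a cron schedule and have it email or ping you when the check fails, alongside checking Search Console's crawl stats for a more authoritative signal.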
Forum discussion at Mastodon.