I found the reason: the robots.txt file blocks Googlebot, so this seems intentional:
https://localchurchdiscussions.com/robots.txt
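For reference, a site-wide Googlebot block in robots.txt usually looks something like this (just an illustration of the kind of rule I mean, not a verbatim copy of the live file):

```
# Blocks Google's crawler from the entire site
User-agent: Googlebot
Disallow: /
```

Removing that block (or changing `Disallow: /` to an empty `Disallow:`) would let Google crawl and index the forum again.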
Is this to save on server costs? Did you know you can adjust the crawl rate in Google Search Console?
I'd be happy to help with server costs if that's the issue.