Well, Google “crawls” sites by following links that are publicly available. That means if no page on the public side of your site links to the page, and no one links to it from another site, it’s quite unlikely Googlebot is going to stumble across it on its own.
That said, robots.txt is really your best bet to block Googlebot. Yes, there is no mandate from above that says Google has to obey your robots.txt file, but Google, and most other reputable search engines, do. You can also add a meta tag to the specific page telling robots not to index it or follow any of its links. One caveat about combining the two: if robots.txt blocks Google from crawling the page, Google will never see that meta tag, so the meta tag only does its job on pages Google is still allowed to crawl (and be aware that a robots.txt-blocked URL can still appear in results if other sites link to it).
Robots.txt info: https://support.google.com/webmasters/bin/answer.py?hl=en&answer=156449
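For example, a minimal robots.txt that asks all well-behaved crawlers to skip a single page might look like this (the `/private-page/` path is just a placeholder for your actual URL, and the file lives at the root of your site):

```
User-agent: *
Disallow: /private-page/
```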
Meta tag info: https://support.google.com/webmasters/bin/answer.py?hl=en&answer=93710
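The meta tag version goes inside the page’s `<head>` and looks like this:

```html
<meta name="robots" content="noindex, nofollow">
```

`noindex` tells the crawler not to add the page to its index, and `nofollow` tells it not to follow the links on the page.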
With respect to WordPress, there are some additional things you might want to look out for:
- If you’re using an SEO plugin that automatically publishes sitemaps, you’ll want to make sure the page isn’t listed in the sitemap. Check your plugin’s documentation for more info.
- There are a few plugins around that let you modify your robots.txt file from within WordPress. Some SEO plugins also have this functionality. Check the plugin repository to find one that suits your needs.
All that said, if you’re really concerned about the information on this page getting out to other people, your best line of defense, other than not putting it online at all, is to put the page behind a password or other authentication system. WordPress has a simple version of this built in: when editing the page, set its Visibility to “Password protected.”
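If the content lives outside WordPress as a static file, another common approach is HTTP Basic Auth at the web server. A minimal sketch assuming Apache, where `private.html` and the `.htpasswd` path are placeholders you’d swap for your own:

```
<Files "private.html">
    AuthType Basic
    AuthName "Restricted"
    AuthUserFile /full/path/to/.htpasswd
    Require valid-user
</Files>
```

You’d create the `.htpasswd` file with Apache’s `htpasswd` utility, and keep it somewhere outside your public web root.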
I hope this helps!