There's already a place for this in the HTTP protocol. I would assume that crawlers respect it, if provided, although I haven't tested to verify my expectation.
Years ago, Googlebot would send If-Modified-Since headers, and Apache would honor them.
I ran into this by chance when writing a wrapper to obfuscate e-mail addresses in mailing list archives. I didn't change the URLs, but the pages were now served by a script instead of as flat files. When it first went online, all of the robots kept crawling the files over and over. I finally made the script supply the right mtime to Apache, which then did the right thing with the incoming IMS (If-Modified-Since) header, generating a 304 and not sending out new content.
It's possible this has regressed, but I would hope it hasn't.
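The server-side behavior described above is simple to reproduce. Here's a minimal sketch of the conditional-GET logic, using a hypothetical `serve()` helper for illustration (this is not Apache's actual code): compare the resource's mtime against the client's If-Modified-Since date and answer 304 with no body when nothing has changed.

```python
from datetime import datetime, timezone
from email.utils import format_datetime, parsedate_to_datetime

def serve(mtime, if_modified_since=None):
    """Return (status, body), honoring an If-Modified-Since header.

    mtime is the resource's last-modified datetime (UTC);
    if_modified_since is the raw header value from the client, if any.
    """
    if if_modified_since:
        try:
            ims = parsedate_to_datetime(if_modified_since)
        except (TypeError, ValueError):
            ims = None  # unparseable date: ignore the header
        # HTTP dates have one-second resolution, so drop sub-second precision.
        if ims is not None and mtime.replace(microsecond=0) <= ims:
            return "304 Not Modified", b""
    return "200 OK", b"<html>...page body...</html>"

mtime = datetime(2024, 1, 1, tzinfo=timezone.utc)
print(serve(mtime, format_datetime(mtime)))  # conditional hit -> 304
print(serve(mtime, None))                    # unconditional -> 200, full body
```

The key point is the one the parent comment makes: once the script reports an accurate mtime, the 304 path costs almost nothing per crawl.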
I have a newly registered domain with only a sparse page up as the index so far. It's been getting crawled fairly regularly by Google, Baidu and Yahoo. Google and Baidu are sending If-Modified-Since (Baidu is also sending If-None-Match) and are receiving 304 Not Modified responses each time they crawl. Yahoo sends neither header and is requesting the full page every single time. This is without any explicit cache headers set on my end.
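The full round trip those crawlers are doing (first a 200 with a validator, then a conditional request answered with 304) can be demonstrated end to end with the stdlib alone. The `"abc123"` ETag value and the handler below are made up for the demo; this is a sketch of the mechanism, not any crawler's actual behavior:

```python
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

ETAG = '"abc123"'  # hypothetical validator for our one sparse page

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # If the client's validator matches, answer 304 with no body,
        # the way Google and Baidu are being answered above.
        if self.headers.get("If-None-Match") == ETAG:
            self.send_response(304)
            self.end_headers()
        else:
            body = b"sparse index page"
            self.send_response(200)
            self.send_header("ETag", ETAG)
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

def fetch(headers=None):
    conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
    conn.request("GET", "/", headers=headers or {})
    resp = conn.getresponse()
    resp.read()
    conn.close()
    return resp

first = fetch()                                             # full page, 200
second = fetch({"If-None-Match": first.getheader("ETag")})  # revalidation, 304
print(first.status, second.status)
server.shutdown()
```

A Yahoo-style client that never sends the validator would take the 200 branch every time, which matches the parent's observation.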
That is to be expected. `If-None-Match` and `ETag` are a relatively late addition to HTTP caching, and that validation is done on the server (or edge) side.
Have you tried serving your pages with `Expires` and `Cache-Control` headers? If you give them, say, a lifetime of a week, then a well-behaved client shouldn't retry before that time has gone by.
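Generating that pair of headers is a one-liner per header. Here's a small sketch, assuming the one-week lifetime suggested above (the `freshness_headers` helper is hypothetical): `Cache-Control: max-age` is the modern relative form, and `Expires` is the older absolute-date equivalent, which `Cache-Control` overrides when both are present.

```python
from datetime import datetime, timedelta, timezone
from email.utils import format_datetime

MAX_AGE = timedelta(weeks=1)  # assumed freshness lifetime

def freshness_headers(now=None):
    """Headers telling a well-behaved client not to re-fetch for a week."""
    now = now or datetime.now(timezone.utc)
    return {
        "Cache-Control": f"max-age={int(MAX_AGE.total_seconds())}",
        # Older absolute-date form; Cache-Control wins if both are present.
        "Expires": format_datetime(now + MAX_AGE, usegmt=True),
    }

print(freshness_headers(datetime(2024, 1, 1, tzinfo=timezone.utc)))
```

Whether a given crawler honors these for scheduling re-crawls is a separate question from whether it honors them for caching, so this is worth testing rather than assuming.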