One option would be to render the scene through a second camera with fog enabled, probably using a basic copy of the 3D scene. Because the fog intensity in that image effectively encodes distance from the camera, it can act as a depth field and drive how much blur gets applied per pixel, so you don't have to touch every object individually. The overhead of rendering everything twice is another matter, though.

I think this is the only way to do it, with a second camera and render plain... but it does invite some neat features as well. For weapon scopes, instead of using the second camera render to decide the blur amount, you could just use a fixed image. For fire sources, you could have particle effects that only show up in the second camera render, like heat rising from a fire, so the depth image animates and gives a shimmering effect. You could also adjust the fog parameters to increase the depth blur directly, for example blurring more when the player is hit. A rough sketch of the idea is below.
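Here's a minimal CPU-side sketch of what I mean, just to show the maths. The names (scene, fogdepth, MAX_RADIUS, fog_dof_blur) are all mine, not from any particular engine, and in practice this loop would live in a pixel shader reading two render targets rather than plain arrays. It assumes the fog-only render has been reduced to one grayscale value per pixel, where 0 means near/no fog and 1 means fully fogged (far away):

```c
/* Sketch of fog-driven depth-of-field on the CPU, assuming:
 *  - scene[]   : the normal camera render, RGB floats in [0,1]
 *  - fogdepth[]: the second camera's fog render, one float per pixel
 *                in [0,1] (0 = near / no fog, 1 = far / full fog)
 *  - MAX_RADIUS: an arbitrary cap on the blur kernel radius
 */
#define MAX_RADIUS 8

static float clampf(float v, float lo, float hi)
{
    return v < lo ? lo : (v > hi ? hi : v);
}

/* Variable-radius box blur: the fog value picks the kernel size. */
void fog_dof_blur(const float *scene, const float *fogdepth,
                  float *out, int w, int h)
{
    for (int y = 0; y < h; ++y) {
        for (int x = 0; x < w; ++x) {
            /* Fog intensity stands in for distance from the camera. */
            float fog = clampf(fogdepth[y * w + x], 0.0f, 1.0f);
            int radius = (int)(fog * MAX_RADIUS + 0.5f);

            float sum[3] = { 0.0f, 0.0f, 0.0f };
            int count = 0;

            /* Average the neighbourhood; bigger fog value = wider blur. */
            for (int dy = -radius; dy <= radius; ++dy) {
                for (int dx = -radius; dx <= radius; ++dx) {
                    int sx = x + dx, sy = y + dy;
                    if (sx < 0 || sy < 0 || sx >= w || sy >= h)
                        continue;
                    const float *p = &scene[(sy * w + sx) * 3];
                    sum[0] += p[0];
                    sum[1] += p[1];
                    sum[2] += p[2];
                    ++count;
                }
            }

            float *o = &out[(y * w + x) * 3];
            o[0] = sum[0] / count;
            o[1] = sum[1] / count;
            o[2] = sum[2] / count;
        }
    }
}
```

The scope and shimmer tricks then come for free: swap fogdepth for a fixed mask image, or let particles that only the second camera can see write into it, and this blur pass doesn't change at all.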
I'm sure Evolved's shaders sometimes use this method.

