Suppose a government has the option of adopting a policy of bailing out the major businesses in a certain market if that market suffers a major crash: the bailout limits the damage caused by a crash, and the guarantee improves confidence in the absence of one. If the firms in this market become aware of the guarantee, however, they will take on more risk, making a crash more likely. Averaging out probabilities and utilities, the result is something like this:
| Average social utility | Firms take risks | Firms do not take risks |
|------------------------|------------------|-------------------------|
| Bailout guarantee      | -2               | 5                       |
| No bailout guarantee   | -10              | 0                       |
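The dominance structure of the table above can be checked mechanically. The sketch below (the dictionary keys and names are my own labels, not from the original) verifies that the guarantee dominates for any fixed firm behaviour, and then compares the outcomes when firms correctly predict the policy:

```python
# Average social utility from the payoff table, indexed by
# (government policy, firm behaviour).
payoffs = {
    ("guarantee", "risk"): -2,
    ("guarantee", "no_risk"): 5,
    ("no_guarantee", "risk"): -10,
    ("no_guarantee", "no_risk"): 0,
}

# Holding the firms' behaviour fixed, the bailout guarantee is
# always the better policy (it dominates row-by-row).
for behaviour in ("risk", "no_risk"):
    assert payoffs[("guarantee", behaviour)] > payoffs[("no_guarantee", behaviour)]

# But if firms predict the policy, the guarantee induces risk-taking
# while a credible refusal to bail out does not.
predicted_behaviour = {"guarantee": "risk", "no_guarantee": "no_risk"}
outcomes = {
    policy: payoffs[(policy, predicted_behaviour[policy])]
    for policy in predicted_behaviour
}
print(outcomes)  # {'guarantee': -2, 'no_guarantee': 0}
```

So when the policy is predictable, the "dominated" option of refusing bailouts yields 0 rather than -2, which is exactly the tension the table encodes.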
For any given behaviour on the part of the firms (who are all assumed to act in the same way), the government does better to operate a bailout guarantee; however, if the firms can predict that the government will operate one, then society is worse off for it (an outcome of -2 rather than 0). Does this situation sound familiar? It is Newcomb's problem in a very slight disguise.

The view on Newcomb's problem which I espouse is that if it would have been useful to have pre-committed to acting in a particular way, one should act in that way even when it is not in one's short-term interest. (This, at least, is my understanding of Timeless Decision Theory: that the decision to perform action X and the decision to pre-commit to performing action X should be the same decision.) In slightly more layperson's terms, one should aim to be the kind of person who one-boxes. Philosophers as a whole lean more against this view than in favour of it, but by far the most popular response is "other", which I take to mean "don't know".