Vic N (yes, I know who he is) posted a comment on a recent item in which I raised the point that it is okay to change your mind in the face of new facts. The reference there was to the reluctance of some people to own up to their previous beliefs and be prepared to explain why they had changed.
The post from Vic referred to some research. While the published version requires a subscription, an earlier working version is available online.
It is worth quoting extensively from the paper's conclusions.
The experiments reported in this paper help us understand why factual misperceptions about politics are so persistent. We find that responses to corrections in mock news articles differ significantly according to subjects’ ideological views. As a result, the corrections fail to reduce misperceptions for the most committed participants. Even worse, they actually strengthen misperceptions among ideological subgroups in several cases. Additional results indicate that these conclusions are not specific to the Iraq war; not related to the salience of death; and not a reaction to the source of the correction.
Our results thus contribute to the literature on correcting misperceptions in three important respects. First, we provide the first direct test of corrections on factual beliefs about politics. Second, we show that corrective information in news reports may fail to reduce misperceptions and can sometimes even increase them. Finally, we establish these findings in the context of contemporary political issues that are salient to ordinary voters.
These findings seem to provide further support for the growing literature showing that citizens engage in motivated reasoning....
It would also be helpful to test additional corrections of liberal misperceptions. Currently, all of our backfire results come from conservatives – a finding that may provide support for the hypothesis that conservatives are especially dogmatic...
[F]uture work should seek to distinguish the conditions under which corrections reduce misperceptions from those under which they fail or backfire. Many citizens seem unwilling to revise their beliefs in the face of corrective information, and attempts to correct those mistaken beliefs may only make matters worse. Determining the best way to provide corrective information will advance understanding of how citizens process information and help to strengthen democratic debate and public understanding of the political process.
In this the matter is very similar to the field of "risk communication", which I became familiar with on the issue of health effects from EME. Just telling people that the science shows something is "safe" doesn't work, and can indeed backfire. The essential first step is to acknowledge their concerns before giving them the facts.
I would suggest the same issues emerge with climate change: no amount of "fact" or "science" changes the position of the "climate change deniers". However, from personal experience, you can make progress if you don't rely on the science as your only tool. Most importantly, acknowledging the denier's concern that the wrong policy could damage the economy, and framing the response as a risk-weighted assessment rather than as dogmatically "right", can achieve change that stating the "facts" alone does not.
In this I think I pick up an earlier post in which I exhorted scientists to be "humble". Interestingly, Vic N commented on that earlier post too...but now perhaps the "science" is in: to be persuasive you need to do more than restate the "facts".