			 The Planet's Technical Bubba 
				How facts backfire
			 
Interesting article that I think may be worthwhile for us to consider. It goes some way toward explaining Fox News et al., and it also explains, to a degree, how debate plays out online.
 
I bolded and blued the interesting bits.
 
	Quote: 
It’s one of the great assumptions underlying modern democracy that an informed citizenry is preferable to an uninformed one. “Whenever the people are well-informed, they can be trusted with their own government,” Thomas Jefferson wrote in 1789. This notion, carried down through the years, underlies everything from humble political pamphlets to presidential debates to the very notion of a free press. Mankind may be crooked timber, as Kant put it, uniquely susceptible to ignorance and misinformation, but it’s an article of faith that knowledge is the best remedy. If people are furnished with the facts, they will be clearer thinkers and better citizens. If they are ignorant, facts will enlighten them. If they are mistaken, facts will set them straight.
 
 In the end, truth will out. Won’t it? 
 
Maybe not. Recently, a few political scientists have begun to discover a human tendency deeply discouraging to anyone with faith in the power of information. It’s this: Facts don’t necessarily have the power to change our minds. In fact, quite the opposite. In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.
 
This bodes ill for a democracy, because most voters — the people making decisions about how the country runs — aren’t blank slates. They already have beliefs, and a set of facts lodged in their minds. The problem is that sometimes the things they think they know are objectively, provably false. And in the presence of the correct information, such people react very, very differently than the merely uninformed. Instead of changing their minds to reflect the correct information, they can entrench themselves even deeper.
 
“The general idea is that it’s absolutely threatening to admit you’re wrong,” says political scientist Brendan Nyhan, the lead researcher on the Michigan study. The phenomenon — known as “backfire” — is “a natural defense mechanism to avoid that cognitive dissonance.”
 
These findings open a long-running argument about the political ignorance of American citizens to broader questions about the interplay between the nature of human intelligence and our democratic ideals. Most of us like to believe that our opinions have been formed over time by careful, rational consideration of facts and ideas, and that the decisions based on those opinions, therefore, have the ring of soundness and intelligence. In reality, we often base our opinions on our beliefs, which can have an uneasy relationship with facts. And rather than facts driving beliefs, our beliefs can dictate the facts we choose to accept. They can cause us to twist facts so they fit better with our preconceived notions. Worst of all, they can lead us to uncritically accept bad information just because it reinforces our beliefs. This reinforcement makes us more confident we’re right, and even less likely to listen to any new information. And then we vote.
 
This effect is only heightened by the information glut, which offers — alongside an unprecedented amount of good information — endless rumors, misinformation, and questionable variations on the truth. In other words, it’s never been easier for people to be wrong, and at the same time feel more certain that they’re right.
 
“Area Man Passionate Defender Of What He Imagines Constitution To Be,” read a recent Onion headline. Like the best satire, this nasty little gem elicits a laugh, which is then promptly muffled by the queasy feeling of recognition. The last five decades of political science have definitively established that most modern-day Americans lack even a basic understanding of how their country works. In 1996, Princeton University’s Larry M. Bartels argued, “the political ignorance of the American voter is one of the best documented data in political science.”
 
On its own, this might not be a problem: People ignorant of the facts could simply choose not to vote. But instead, it appears that misinformed people often have some of the strongest political opinions. A striking recent example was a study done in the year 2000 by James Kuklinski of the University of Illinois at Urbana-Champaign, who led an influential experiment in which more than 1,000 Illinois residents were asked questions about welfare — the percentage of the federal budget spent on welfare, the number of people enrolled in the program, the percentage of enrollees who are black, and the average payout. More than half indicated that they were confident that their answers were correct — but in fact only 3 percent of the people got more than half of the questions right. Perhaps more disturbingly, the ones who were the most confident they were right were by and large the ones who knew the least about the topic. (Most of these participants expressed views that suggested a strong antiwelfare bias.)
 
Studies by other researchers have observed similar phenomena when addressing education, health care reform, immigration, affirmative action, gun control, and other issues that tend to attract strong partisan opinion. Kuklinski calls this sort of response the “I know I’m right” syndrome, and considers it a “potentially formidable problem” in a democratic system. “It implies not only that most people will resist correcting their factual beliefs,” he wrote, “but also that the very people who most need to correct them will be least likely to do so.”
 
What’s going on? How can we have things so wrong, and be so sure that we’re right? Part of the answer lies in the way our brains are wired. Generally, people tend to seek consistency. There is a substantial body of psychological research showing that people tend to interpret information with an eye toward reinforcing their preexisting views. If we believe something about the world, we are more likely to passively accept as truth any information that confirms our beliefs, and actively dismiss information that doesn’t. This is known as “motivated reasoning.” Whether or not the consistent information is accurate, we might accept it as fact, as confirmation of our beliefs. This makes us more confident in said beliefs, and even less likely to entertain facts that contradict them.
 
New research, published in the journal Political Behavior last month, suggests that once those facts — or “facts” — are internalized, they are very difficult to budge. In 2005, amid the strident calls for better media fact-checking in the wake of the Iraq war, Michigan’s Nyhan and a colleague devised an experiment in which participants were given mock news stories, each of which contained a provably false, though nonetheless widespread, claim made by a political figure: that there were WMDs found in Iraq (there weren’t), that the Bush tax cuts increased government revenues (revenues actually fell), and that the Bush administration imposed a total ban on stem cell research (only certain federal funding was restricted). Nyhan inserted a clear, direct correction after each piece of misinformation, and then measured the study participants to see if the correction took.
 
For the most part, it didn’t. The participants who self-identified as conservative believed the misinformation on WMD and taxes even more strongly after being given the correction. With those two issues, the more strongly the participant cared about the topic — a factor known as salience — the stronger the backfire. The effect was slightly different on self-identified liberals: When they read corrected stories about stem cells, the corrections didn’t backfire, but the readers did still ignore the inconvenient fact that the Bush administration’s restrictions weren’t total.

It’s unclear what is driving the behavior — it could range from simple defensiveness, to people working harder to defend their initial beliefs — but as Nyhan dryly put it, “It’s hard to be optimistic about the effectiveness of fact-checking.”
 
It would be reassuring to think that political scientists and psychologists have come up with a way to counter this problem, but that would be getting ahead of ourselves. The persistence of political misperceptions remains a young field of inquiry. “It’s very much up in the air,” says Nyhan.

But researchers are working on it. One avenue may involve self-esteem. Nyhan worked on one study in which he showed that people who were given a self-affirmation exercise were more likely to consider new information than people who had not. In other words, if you feel good about yourself, you’ll listen — and if you feel insecure or threatened, you won’t. This would also explain why demagogues benefit from keeping people agitated. The more threatened people feel, the less likely they are to listen to dissenting opinions, and the more easily controlled they are.
 
There are also some cases where directness works. Kuklinski’s welfare study suggested that people will actually update their beliefs if you hit them “between the eyes” with bluntly presented, objective facts that contradict their preconceived ideas. He asked one group of participants what percentage of its budget they believed the federal government spent on welfare, and what percentage they believed the government should spend. Another group was given the same questions, but the second group was immediately told the correct percentage the government spends on welfare (1 percent). They were then asked, with that in mind, what the government should spend. Regardless of how wrong they had been before receiving the information, the second group indeed adjusted their answer to reflect the correct fact.
 
Kuklinski’s study, however, involved people getting information directly from researchers in a highly interactive way. When Nyhan attempted to deliver the correction in a more real-world fashion, via a news article, it backfired. Even if people do accept the new information, it might not stick over the long term, or it may just have no effect on their opinions. In 2007 John Sides of George Washington University and Jack Citrin of the University of California at Berkeley studied whether providing misled people with correct information about the proportion of immigrants in the US population would affect their views on immigration. It did not.
 
And if you harbor the notion — popular on both sides of the aisle — that the solution is more education and a higher level of political sophistication in voters overall, well, that’s a start, but not the solution. A 2006 study by Charles Taber and Milton Lodge at Stony Brook University showed that politically sophisticated thinkers were even less open to new information than less sophisticated types. These people may be factually right about 90 percent of things, but their confidence makes it nearly impossible to correct the 10 percent on which they’re totally wrong. Taber and Lodge found this alarming, because engaged, sophisticated thinkers are “the very folks on whom democratic theory relies most heavily.”
 
In an ideal world, citizens would be able to maintain constant vigilance, monitoring both the information they receive and the way their brains are processing it. But keeping atop the news takes time and effort. And relentless self-questioning, as centuries of philosophers have shown, can be exhausting. Our brains are designed to create cognitive shortcuts — inference, intuition, and so forth — to avoid precisely that sort of discomfort while coping with the rush of information we receive on a daily basis. Without those shortcuts, few things would ever get done. Unfortunately, with them, we’re easily suckered by political falsehoods.

Nyhan ultimately recommends a supply-side approach. Instead of focusing on citizens and consumers of misinformation, he suggests looking at the sources. If you increase the “reputational costs” of peddling bad info, he suggests, you might discourage people from doing it so often. “So if you go on ‘Meet the Press’ and you get hammered for saying something misleading,” he says, “you’d think twice before you go and do it again.”
 
Unfortunately, this shame-based solution may be as implausible as it is sensible. Fast-talking political pundits have ascended to the realm of highly lucrative popular entertainment, while professional fact-checking operations languish in the dungeons of wonkery. Getting a politician or pundit to argue straight-faced that George W. Bush ordered 9/11, or that Barack Obama is the culmination of a five-decade plot by the government of Kenya to destroy the United States — that’s easy. Getting him to register shame? That isn’t.
 
 
Joe Keohane is a writer in New York.