Like way too many social scientists, I’ve been overly obsessed with the LaCour/Green retraction. Partly, it’s the perverse attraction of watching the slow-motion crash of a career, but more substantially, it’s a chance to reflect on how social science is supposed to work, and where activism fits in–or not.
I’m glad that I try to read across the political spectrum, although I’m less happy that I drift into the comments sections of websites reporting on politics. Anyway, one surprise is that many conservative sites have used the opportunity to rail against both same sex marriage and science, suggesting that the publication of the article reflects the inherently liberal bias of college professors. This seems misplaced for a number of obvious social sciencey reasons:
1. The study was about opinion change; gay marriage was simply the case used to explore that process. Whether or not you support marriage equality, those results–had they been real–should be of interest.
2. The fraud was outed through normal social science procedures.
3. The only principal to go public about his sexual orientation or his position on marriage equality was David Broockman. He says he’s gay and supports same sex marriage, yet he was critical in debunking a study he had initially been hyper-enthusiastic about.
In summary, I don’t think the retraction story tells us anything about the political biases of academic social science. To the extent that Michael LaCour displayed a visible agenda, it was about self-advancement, not any apparent cause. If he cared about the causes at stake, the story is even worse, for he convinced groups he would have supported to embrace and fund strategies that were, minimally, unproven, and likely less effective than other approaches. [Kieran Healy made this point effectively on his invaluable blog a week ago.] It could be worse: the strategy of long-form, open-ended conversations with gay canvassers might have been completely counterproductive to causes he would endorse. There is, in fact, non-retracted, peer-reviewed social science finding that appeals from someone similar to the target are more likely to be effective than appeals from someone different.
Many researchers go wrong when they decide to research questions that they already know the answers to. Mark Regnerus, a tenured sociologist at the University of Texas, provides an unfortunate example of how activist social science goes wrong. Regnerus has been clear about his moral and political commitments, particularly his opposition to same sex marriage. With initially camouflaged support from a group that agreed with him, he assembled a large data set to find the downsides for children of growing up in a gay household. To get to this finding, he had to squeeze and twist the data in ways that violate basic social science practices. He gave his political opponents–and the federal judge he testified before–plenty of material to use in discrediting both his claims and his integrity. The work didn’t help his cause.
But lots of people go into social science with moral and political commitments. You can serve both ends, but only by picking questions that you can’t answer without honest data and research. (I have written more on this.) If you know that gay marriage is bad for kids–or that universal pre-school is good, for that matter–you can’t really study those questions without gathering and evaluating your data from a distorted perspective.
Activist social science means seeking out answers that you don’t yet know, where you are completely invested in getting the right answer. As an analogy, think about the science behind developing an effective vaccine. An outcome that works is more important, obviously, than any particular strategy, and the costs of compromising on research or evaluating the data inaccurately are obvious. Bad science hurts the cause, and maybe people too.
Oddly, the retracted article asked the right kind of social science question: are people more likely to respond to a gay canvasser than a straight one? Presumably, opponents and supporters of same sex marriage would both want to know the truth. Actually, advocates of any cause would want to know.
Alas, we still don’t know.
I always like to point out Jane Mansbridge’s contention that activist polling that tried to “support” the ERA wound up, perhaps, mobilizing the opposition.
Moral and instrumental reasons to tell the truth.
Two excellent posts, Professor Meyer.
But I think the problem of social science finding verifiable truths or social laws goes much further than mere manipulation of data or statistics.
The problem can be summed up in (Erwin) Schrödinger’s cat thought experiment.
I’m a firm believer that no matter the methodology, the observer will always find, consciously or even subconsciously, what he or she is looking for, since you must begin any research with a premise behind the examination of any statistical data collected.
And the structure of the premise alone prejudices the outcome.
J.R.Werbics is a Canadian writer and philosopher.
I operate as though my eating, breathing, moving cat is indeed alive. Science is surely imperfect, one reason why we look for a larger community conducting investigation. I, for one, lack a better epistemology for navigating the world.
Your cat may indeed be alive, but any and all data is dead.
J.R.Werbics is a Canadian writer and philosopher.