Where Terrorism Research Goes Wrong

    From nieuwsblog.burojansen.nl

    TERRORISM is increasing. According to the Global Terrorism Database at the University of Maryland, groups connected with Al Qaeda and the Islamic State committed close to 200 attacks per year between 2007 and 2010, a number that grew by more than 200 percent, to about 600 attacks, in 2013.
    Since 9/11, the study of terrorism has also increased. Now, you might think that more study would lead to more effective antiterrorism policies and thus to less terrorism. But on the face of it, this does not seem to be happening. What has gone wrong?
    The answer is that we have not been conducting the right kind of studies. According to a 2008 review of terrorism literature in the journal Psicothema, only 3 percent of articles from peer-reviewed sources appeared to be rooted in empirical analysis, and in general there was an “almost complete absence of evaluation research” concerning antiterrorism strategies.
    The situation cries out for the techniques of prevention science. For a given problem (like terrorism), prevention science identifies key risk factors (like alienation), develops interventions to modify those risk factors (like programs to promote positive relations with the dominant culture) and tests those interventions through randomized trials. Using this methodology, scientists have identified interventions that effectively prevent problems as diverse as antisocial behavior, depression, schizophrenia, cigarette smoking, alcohol and drug abuse, academic failure, teenage pregnancy, marital discord and poverty.
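    To make that evaluation step concrete, here is a minimal sketch in Python of the kind of two-arm randomized trial prevention science relies on: units are assigned to an intervention or a control condition purely at random, and a risk-factor outcome is then compared between the two arms. The unit count, baseline rate and effect size are illustrative assumptions, not figures from any study mentioned in this article.

import random
import statistics

# A minimal sketch of the evaluation step in prevention science:
# randomly assign units (e.g., communities) to receive an intervention
# or not, then compare a measured outcome between the two arms.
# All names and numbers here are illustrative assumptions.

random.seed(42)

def run_trial(n_units=200, baseline=0.30, intervention_effect=-0.08):
    """Simulate a two-arm randomized trial on a binary outcome rate."""
    units = list(range(n_units))
    random.shuffle(units)
    treatment = set(units[: n_units // 2])  # random assignment to the intervention arm

    outcomes = {"treatment": [], "control": []}
    for u in units:
        rate = baseline + (intervention_effect if u in treatment else 0.0)
        hit = 1 if random.random() < rate else 0  # 1 = the problem occurred
        outcomes["treatment" if u in treatment else "control"].append(hit)

    # Average outcome rate per arm; the gap estimates the intervention effect.
    return {arm: statistics.mean(vals) for arm, vals in outcomes.items()}

print(run_trial())  # e.g. {'treatment': ~0.22, 'control': ~0.30}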
    Jon Baron, who leads the Coalition for Evidence-Based Policy, which advocates for the use of randomized trials to evaluate government programs, reports that his organization has been able to identify only two experimental evaluations of antiterrorism strategies. One of them, a field experiment reported in a paper from a World Bank office in 2012, randomly assigned 500 Afghan villages to receive a development aid program either in 2007 or after 2011. The aid program had significant positive effects on economic outcomes, villagers’ attitudes toward the government and villagers’ perceptions of security. The aid program also reduced the number of security incidents, though that effect was not maintained after the program ended and was observed only in villages that were relatively secure before the program began.
    Thus the study found an unequivocal but limited benefit of an aid program in reducing insurgent violence. I say “unequivocal” because randomizing villages to receive or not receive the aid made it extremely unlikely that differences in attitudes and security resulted from anything other than the aid program itself.
    The second study was published last year in The Economic Journal. The researchers randomly assigned neighborhoods and villages in Nigeria to have, or not have, a campaign to reduce pre-election violence. The campaign made use of town meetings, theater and house-to-house distribution of material. The study found that the campaign increased voter turnout and people's sense of empowerment to counteract violence, and reduced both perceptions of violence and its actual intensity.
    Imagine how much more we would know about the prevention of terrorism if even a small proportion of the hundreds of antiterrorism efforts implemented worldwide in the past 15 years had been properly evaluated. As it is, we can say almost nothing about their efficacy. Do we know whether drones are increasing or decreasing the rate of terrorists’ attacks? Whether our current surveillance activities are thwarting more terrorists than they are radicalizing young people?
    In 2012, the National Institute of Justice (the research arm of the Department of Justice) began a program to study domestic radicalization. Over the first three years it has funded nearly $9 million in research. While the studies underway will undoubtedly contribute to our understanding of the risk factors that contribute to radicalization, none of the projects funded thus far are adequately evaluating a strategy to prevent radicalization.
    One of the projects, for example, is an effort to increase awareness of risk factors for radicalization, as well as civic-minded responses to them, among members of the Muslim community. The program’s impact will be assessed by comparing outcomes for those who never participate, those who participate once and those who participate multiple times. If the project finds that those who participated multiple times were less radicalized than those who never participated, you might be inclined to conclude that the program is working. But experience from evaluation research over many years has taught us that such a difference could just as easily arise because those who were less inclined to become radical were more likely to participate.
    The only way to really be confident that it is the program that is making the difference is to randomly assign some people to get it and others not. That way any differences are very unlikely to be caused by pre-existing differences between the two groups.
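    That argument can be illustrated with a small simulation (a hedged sketch, not an analysis of any real program): give a program no effect at all, and a comparison of volunteers with non-volunteers still shows a large apparent benefit, while a randomized comparison does not.

import random
import statistics

# Illustration of the selection-bias argument above. Each person has a
# latent inclination toward the outcome, and the program has ZERO true
# effect. The observational comparison still favors participants because
# the less-inclined are more likely to volunteer; random assignment is
# immune to that.

random.seed(0)
N = 10_000
people = [random.random() for _ in range(N)]  # latent inclination, between 0 and 1

def outcome(inclination):
    # The outcome depends only on inclination; the program does nothing.
    return 1 if random.random() < inclination else 0

# Observational design: the chance of participating falls as inclination rises.
obs_part, obs_non = [], []
for p in people:
    (obs_part if random.random() < (1 - p) else obs_non).append(outcome(p))

# Randomized design: a coin flip decides who gets the program.
rnd_part, rnd_non = [], []
for p in people:
    (rnd_part if random.random() < 0.5 else rnd_non).append(outcome(p))

print("observational:", statistics.mean(obs_part), "vs", statistics.mean(obs_non))
print("randomized:   ", statistics.mean(rnd_part), "vs", statistics.mean(rnd_non))

    Under these assumptions the observational comparison makes participants look far better off (an outcome rate of roughly 0.33 versus 0.67) purely because of who chose to take part, while the two randomized arms come out nearly identical at about 0.50 each.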
    Estimates of the cost of the war on terror have varied between one and five trillion dollars. Surely we can invest a tiny fraction of that in improving our antiterrorism strategies through rigorous experimental evaluations.
    Correction: March 15, 2015
    The Gray Matter feature last Sunday misstated an estimate for the growth in the annual number of attacks by groups connected with Al Qaeda and the Islamic State. It was more than 200 percent, not more than 300 percent.
    MARCH 6, 2015
    By ANTHONY BIGLAN
    © 2015 The New York Times Company