As an organizational psychologist, Kevin Murphy has spent most of his career studying how people make decisions about other people, typically in the context of job interviews or performance appraisals.
In the mid-1980s, Murphy, now professor and head of Penn State's psychology department, wrote a paper about the hazards of relying on polygraph tests for screening potential employees, at the time a fairly widespread practice. "The responses I got from the polygraph community were so intense—often really vituperative—that it got me interested in the whole field of deception," he says.
In the late 1990s, as a result of concerns raised by the use of polygraphs in the investigation of Wen Ho Lee, the Los Alamos scientist accused of spying, the National Academy of Sciences convened a panel to look closely at the scientific evidence on the polygraph, and Murphy was asked to serve. "We issued a report that said largely the same thing people have been saying for 100 years," he says: "The science is weak, and we don't know how to interpret a lot of this stuff. There are just a host of reasons to be concerned about using polygraphs as a way to make important decisions about people."
At the same time, the Department of Homeland Security was being created in the aftermath of the 9/11 attacks. "That got me thinking seriously about trying to improve the quality of the science in the field of terrorism and counterterrorism," Murphy says.
In late 2004, in response to a Homeland Security call for proposals for research into the social psychology of terrorism, he put together a team of psychologists, sociologists, political scientists, religion scholars and historians from Penn State and numerous other institutions, with the aim of creating a multidisciplinary center. "I think there's a lot of room for improvement, a lot of room for good basic and applied science that could attack this problem," Murphy says.
Research Penn State editor David Pacchioli interviewed Murphy in January.
Q: Where do we begin?
A: If you look at the folk theories about the psychology of terrorism, there are two broad theories out there, both of which are pretty radically wrong. One, that these people are crazy—in the sense of psychotic—or the other, that these people are stupid. Neither of these is a very useful way of thinking about what's going on.
In fact we know a fair amount about motivational factors that tie into willingness or lack of willingness to do all sorts of things, to further all sorts of causes. And we can use and expand that research to do a better job of understanding the why, the when, the how of terrorism—who's likely to engage in terrorist acts, and what could be done to reduce the likelihood of terrorism.
Q: What are some of those motivational factors?
A: People tend to think of poverty and oppression as the things that breed terrorism. It's not that simple. Some of the best scholarship suggests that it is in fact the combination of seeing opportunities and feeling powerless to achieve them that is one of the most important factors. If you look at the places, especially in the Middle East, where terrorism has been a large-scale problem, it's often a place where people feel like things really could be better but we can't get there because someone's blocking our way. This notion of blocked aspirations is very powerful.
Q: What role does the increasing pervasiveness of Western media play in that sense of blocked aspirations?
A: It's hard to parse all this stuff out. We spend a lot of time worrying, "Gee, the media spend hours showing a terrorist attack over and over, and that's just going to cause more terrorists." That may be true—we don't know—but it might also be that a lot of these other images, contributing to a feeling of blocked progress, may turn out to be more important in the long run.
Q: In the realm of prevention, you talk about learning to identify the suicide-bomber type person, trying to spot characteristic behaviors.
A: Right. There's research going on, mainly in Israel, focusing on exactly that. There is some empirical evidence that for people engaging in a suicide attack—getting on a bus with a bomb under your coat—there might be some behavioral cues that are consistent enough to be useful.
This is a very difficult area. My guess is that the answer will be very similar to what we see in the field of deception, that there are a set of consistent behavioral cues, but they are consistent within a person, and not consistent from one person to the next.
One of the recurring things you hear in the field of deception is: "If you want to know whether somebody is lying, ask their mom." Somebody who's observed you over a long period of time under all sorts of conditions might be able to pick up behavioral predictors with some reasonable accuracy, but those won't be very helpful for telling whether the next kid down the street is lying.
Q: Is the suicide bomber crazy?
A: Probably not. The recurring themes that do come up are a strong level of commitment to a cause and some sense of helplessness or frustration. The other thing is that suicide bombing is a desperation tactic. If you look at the evolution of groups for whom this is a tactic of choice, they have run through a gamut of things previously and those have not been effective.
Q: The "evolution of groups" is an interesting idea. Is there an identifiable pattern here that might be used in combating terrorism?
A: There is, in a broad historical sense. Terrorist groups engaged in armed political action are born, they grow, and they die. Are there things that might be done to accelerate that process?
One of the things we have proposed is to study the notion of trying to develop strategies for undermining charismatic leaders—to identify what we know about charismatic leaders in nonviolent situations, and see if we could use that information to develop specific strategies for violent groups. Charismatic leaders usually do very well as long as they're winning. And at some point the winning streak ends and their charisma fades away pretty quickly.
Q: Another element of the organizational behavior research that comes into play here has to do with understanding group dynamics. "Diffused" organizations is one term you use, and also "virtual" organizations.
A: We're seeing more work organizations adopt this sort of model, and we're starting to learn a lot about what works and what doesn't, and what problems come up when you're trying to coordinate the effort of 100 people who are in 10 different countries. I think there are likely to be lessons learned that could be applied to Al Qaeda.
Q: But I guess it's hard to generalize from everyday organizations to terrorist organizations.
A: We don't know which sort of knowledge is transportable, and which sort isn't. In part that's because it's hard to get the two worlds—organizational psychology and counterterrorism research—into the same room talking.
Q: Another aspect to this is the whole business of understanding risk.
A: Terrorism, like all forms of asymmetric warfare, is essentially psychological warfare. It's most effective when people's response is disproportionate to the real risk. And certainly if you think about it in those terms, terrorists have been successful on a variety of fronts.
One of the ways of attacking terrorism is to build skills in people to understand the real risks, to not panic when panic's not called for, to learn how to bounce back from trauma, physical and psychological. In the jargon of the military, this is the whole notion of hardening the target. If it's really clear that people are not going to jump off a cliff when there's a terror event, and that people are well-prepared to respond properly and constructively to the risk that's really there and not to respond to phantom risks, it makes terrorism a much less effective set of weapons.
Q: It seems to me that terrorism has in some ways defined us as a nation since 9/11. Is that skewing people's understanding of risk?
A: There are two competing schools of thought on that. One is that before 9/11 we were never serious enough, and the other is that the response is disproportionate in a way that's dysfunctional—that we're not buying hundreds of times more safety with hundreds of times more spending. And that we may create a variety of other problems for ourselves. This is one of the recurring civil liberties concerns.
Q: You mention civil liberties, and one of the tactics you have talked about is identifying patterns. Does that verge into the area of profiling?
A: I think it really does. I think the concern about profiling is right on. You could argue that profiling would be a good thing if you had the right variables. We just don't really know how to do it especially well.
The notion of pulling people over on the basis of their skin color has so many more downsides than upsides—it ties into the perception of differential enforcement, for example—that even if it sometimes might turn up the right result, it's not a sensible way to proceed.
On the other hand, if we have evidence that suicide bombers are visibly nervous when getting on a bus, we ought to take advantage of that. The challenge is to identify cues that are reliable, easy to pick up, and don't have side effects that outweigh the benefits.
Q: What are the best long-term strategies for prevention?
A: Understanding the nature of people's aspirations—what it is they really care about—at a more sophisticated level would certainly give us a leg up in terms of understanding how to predict and prevent terrorist acts. Understanding who is and who is not the focus of terrorism, and why.
We really could use a variety of tools to frame these things as scientific questions. And I think there's tremendous potential to do good work that combines the societal level examinations that come from sociologists and political scientists with the individual and small-group-level thinking that comes from psychologists.
Kevin R. Murphy, Ph.D., is professor and head of the department of psychology in the College of the Liberal Arts.