
Machine That Predicts Terrorists’ Intent Showing Progress 

November 2009

By Austin Wright 

Metal detectors screen for the means to commit a crime. The Department of Homeland Security is developing technology that screens for the intent to do so.

The department’s Future Attribute Screening Technology, or FAST, uses body scanners to sense the fear in your eyes — and in your skin, your heartbeat and even your movements. The system places these and other variables into an algorithm that may be able to determine whether the sum of certain bodily signals is the result of hostile intent, or just someone having a bad day.

Robert P. Burns, deputy director of the Homeland Security Advanced Research Projects Agency, says the technology could help security officers at checkpoints decide which travelers should be called aside for secondary questioning. The technology is still years from completion, he adds.

About 5 percent of Homeland Security’s science and technology budget goes toward these reach-for-the-sky endeavors, which could have game-changing effects on national security if they succeed. The agency is also developing technologies that would detect drug-trafficking tunnels from above ground, reduce the likelihood of massive power outages and strengthen levees. “We go after the projects or ideas that other people don’t want to go after because they are incredibly high-risk and could fail,” says Burns, a 1981 Naval Academy graduate who spent 21 years in the private sector. “We’re really pushing the envelope in terms of science and technology.”

The FAST project, which began in 2006, involves 40 to 50 developers from several organizations and so far has cost the agency $20 million. It combines research from a number of academic fields into a futuristic model for stopping crime before it happens. Project collaborators from Draper Laboratory, a Massachusetts-based not-for-profit research company, have been working with the agency to translate decades’ worth of physiological studies into algorithms that gauge people’s involuntary bodily signals.

“We look at a series of physical cues or behavioral cues that you give off that are a direct relation to your physical and emotional thought processes,” Burns says. “You can’t base anything on any one of these signals. It’s the compilation that we look at and that come together.”

The technology could be described as a more comprehensive, less intrusive polygraph exam. Already, privacy advocates are expressing concern. The Electronic Privacy Information Center, a Washington-based think tank, plans to push for laws that would ensure that the federal government doesn’t keep records of the FAST system’s measurements, and the American Civil Liberties Union is exploring its legal options for trying to halt the project.

“We think that it’s an invasion of privacy to read someone’s physiological bodily functions without their permission,” says Jay Stanley, a spokesman for the ACLU. “It’s nobody’s business what my pulse rate is. It’s a profound invasion of human dignity.”

Burns counters that the system was designed as a way to help checkpoint security guards make better-informed decisions about which travelers to call aside for further questioning — and not as an Orwellian device for keeping medical tabs on unassuming citizens. He says he worked with Homeland Security’s privacy office to make sure the program adheres to all federal laws. “The system does not record or maintain your information,” Burns says. “Once any issues are resolved, the information is dumped.”

The program’s long-term goal is to allow the public to move with greater freedom through airports, border checkpoints and government buildings, he adds. But in its current form, the FAST system can scan only one traveler at a time, and it requires that each person answer a series of questions. Computer software compares physiological measurements taken during questioning to measurements taken before questioning.
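The baseline-comparison step described above can be sketched in a few lines. This is a hypothetical illustration only: the signal names, units and the use of relative change are assumptions for clarity, not details of the actual FAST software.

```python
# Hypothetical sketch of comparing measurements taken during questioning
# to a baseline taken before questioning. Signal names and values are
# illustrative assumptions, not details of the real FAST system.

def signal_deltas(baseline: dict, questioning: dict) -> dict:
    """Relative change in each physiological signal versus its baseline."""
    return {
        name: (questioning[name] - baseline[name]) / baseline[name]
        for name in baseline
    }

# Example readings (invented numbers).
baseline = {"heart_rate": 72.0, "blink_rate": 17.0, "skin_temp": 33.1}
questioning = {"heart_rate": 95.0, "blink_rate": 26.0, "skin_temp": 34.0}

deltas = signal_deltas(baseline, questioning)
```

Expressing each signal as a relative change against the person’s own baseline is one simple way to account for the fact that resting heart and blink rates vary widely between individuals.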

“This is a case where technology has finally caught up to the theory, and each of our sensors has a specific theory behind it,” Burns says. The system relies mainly on low-cost, widely available equipment. “We should have something that’s reasonably cheap.”

In 2007, Burns left his job as an executive at American Systems, a technology consulting and engineering firm, to become manager of the FAST program. In mid-2009, he was tapped to become deputy director of Homeland Security’s entire advanced-projects agency. The former submarine officer is loud and animated as he describes what he says have been the most fulfilling two years of his professional life. “I was raised to value duty, honor and country,” he says.

“Being a public servant works really well for me.”

In addition to FAST, he oversees several cutting-edge, often-secretive projects. Department researchers are developing contact-less fingerprint scanners, liquid-explosive detection devices and software to help public-safety officials make snap judgments during disasters. Another project, the resilient tunnel, aims to seal leaks and smother fires in commuter tunnels that run under rivers and other bodies of water.

University and private researchers collaborated on the tunnel project in an effort to develop gigantic deployable airbags capable of withstanding extreme water pressure. If the project succeeds, airbags would be able to keep cracking tunnels temporarily intact, giving rescue workers more time to search for trapped survivors. And in the event of a tunnel fire, airbags would be able to seal off the entrances and deprive the fire of oxygen.

“We needed to find a way to protect tunnels, because today’s safeguards are super expensive,” Burns says. “If we’re able to bring some of these projects forward, they could truly change the game.”

The FAST system has the potential to catch terrorists and their accomplices before they ever get the chance to launch their attacks, he says. Agency researchers showcased FAST at a September technology exhibition in Cambridge, Mass. They tested the system’s ability to identify study participants who planned to commit a crime.

Some of the participants were given items that, if smuggled into the exhibition, would have been capable of causing a major disruption. Others were told to enter the exhibition hall, locate a hidden device and set it off. In both cases the items were inert, a fact that was unknown to the participants. Burns declines to discuss the items, the hidden device or the study’s findings. Such information, he says, could compromise future experiments.

“We’re doing amazingly well, but this is not ready for prime time,” he says. “We would like, in fiscal year 2011, to have a single prototype that we can take to an operational location and perhaps test and do further data collection in a real-world environment.”

At the exhibition, study participants, including some who were part of a control group, walked single-file through a security checkpoint. A guard asked them a series of questions. Meanwhile, a laser measured their heart and respiratory rates, an eye tracker measured their blink rates and pupil dilation, a thermal camera measured the heat on their skin and a reconfigured Nintendo Wii Balance Board measured their fidgeting. Nearby computers processed the data, and the system’s software recommended to the security guard which participants should be taken aside for follow-up questioning.

“We’re looking at the combination of those factors,” Burns says. “We’ve got to make sure that whatever system is developed doesn’t cause false readings. I want to make sure that if you’re running late to catch your mass transit and you’re carrying a large backpack, that I don’t pull you over for secondary questioning because you’re hot and sweaty. If you’ve had a bad day and are a little terse, I don’t want to pull you over because you’re grumpy.”

So far, the results show that “we’ve made great progress,” he says.

Paul Ekman, a prominent psychologist and a consultant on the FAST project, says he’s skeptical of the technology. Over the past 40 years, Ekman has pioneered the study of human emotions and their effects on facial expressions. He is ranked 59th on the Review of General Psychology’s list of the 100 most eminent psychologists of the 20th century.

Ekman was glad to offer recommendations to project researchers, but he says he doubts the system will ever outperform human observers. “I’m a little dubious, but data could convince me,” he says. Ekman runs a company that trains security workers to detect signs of malicious intent in people’s behaviors, such as body posture and facial twitches that last a quarter of a second. The premise of the Fox television show “Lie to Me” is based on his research.

“Whether the FAST project will succeed in being practically useful is unknown at this point,” he says. “Also, testing the technology is very difficult. You can’t get the stakes high enough, because of ethical constraints.”

People’s physiological behaviors change in measurable ways only when the cost of failure is high, and the agency’s experiments lack strong negative consequences for the mock criminals, he says. Also, he adds, innocent people sometimes display physiological behaviors that would make them appear malicious, while criminals might be able to train themselves not to display those behaviors.

“You can’t, in my view, simulate a terrorist.”