RESEARCH AND DEVELOPMENT
Robots Primed for Disaster Operations
Scientists here at one of Japan’s leading engineering universities are developing robotics technology and information systems that can help first responders track down victims in the aftermath of a natural disaster.
Earthquakes are a “really serious problem for our country,” says Tohoku University professor Satoshi Tadokoro, two days after a 7.2 magnitude quake shook this city of 1 million people and a large portion of northeastern Honshu, Japan’s main island.
Spurred to action by the 7.3 magnitude earthquake that rocked the city of Kobe in 1995 and killed 6,432 people, mechanical engineers and scientists across the nation are focusing their efforts on search-and-rescue robots.
“No living person has been rescued by a robot. It will come true soon,” says Tadokoro, who is president of the International Rescue System Institute, a non-profit organization promoting research and technology development for disaster relief.
With government funds, the institute organized a five-year nationwide project to develop technologies for emergency responders conducting search-and-rescue operations in urban areas. More than 100 academic researchers participated in the effort.
Out of the initiative, the researchers produced ground and aerial vehicles, sensors and information and communications equipment.
Information gathering is the most critical issue in search and rescue, says Tadokoro. Studies that followed the 1995 Kobe earthquake showed that reconnaissance technologies are critical to being able to rapidly locate victims. Robotics technologies are particularly effective in this realm, he says.
“The robots should not be an intelligent system. They should be a good tool first, and the human responder makes the rescue,” says Tadokoro.
Tracked ground vehicle robots have been used widely in the U.S. military for bomb disposal. These systems also are ideal for search and rescue. Tadokoro’s laboratory developed a high-mobility tracked and wheeled robot named Kenaf that can climb steps and traverse 70-degree slopes of rubble. The six-motor wireless robot carries cameras and other sensors and is remotely operated with a joystick.
Last year, it outperformed iRobot’s PackBot and Foster-Miller/QinetiQ North America’s Talon on an obstacle course and maze at the RoboCup conference in Atlanta, and subsequently won a prize for mobility.
Tadokoro’s lab tested the robot in the Kobe subway system during the summer. “We could confirm its effectiveness,” he says. “However, we still have several problems to be solved.” The scientists are developing better sensors and other technologies to help the robot navigate the rubble-strewn environment semi-autonomously.
His team also developed a serpentine robot, a three-segmented tracked vehicle that can move its body like a snake. “Rubble is a very complicated place. Usually robots get stuck,” says Tadokoro. But not this robot, which can climb over low walls and roll to escape tight situations. “It’s very good in confined spaces,” says Tadokoro.
The robot was demonstrated at a U.S. Federal Emergency Management Agency training site for first responders near Las Vegas.
Another serpentine robot developed by Tadokoro is based on a video fiberscope. Traditional scopes can be threaded underneath rubble to allow operators to see what lies buried, but operators have had a difficult time maneuvering the flexible tube around corners and over obstacles.
A so-called “active scope” camera solves that problem, says Tadokoro. It looks like a long bristled caterpillar. The white bristles are made of nylon thread. Beneath them, actuators run along the length of the scope’s 8-meter body. They allow the video camera to squeeze through 3-centimeter gaps by ciliary vibration drive and snake deep into debris piles and drain pipes where most robots cannot go. Users simply twist the cable toward the direction they want the scope to turn. It can climb 20-degree slopes and clear obstacles 200 millimeters high.
FEMA officials last year tested the system at a training site in Texas. The device also was employed earlier this year during the investigation of a parking garage collapse in Jacksonville, Fla. Investigators inserted the scope beneath the rubble to examine the debris and determine the cause of the collapse. The system revealed concrete flakes and cracks in the structure and pillars that should have remained standing.
A number of fire departments in Japan have been testing the device. Initial feedback has been good, Tadokoro says, and some have suggested that the scientists give operators a wider field of view on the monitor. A unit also was lent to the Center for Robot-Assisted Search and Rescue — which recently moved to Texas A&M University — for further study.
Kazuya Yoshida, a professor in Tohoku University’s department of aerospace engineering, has been developing robotics systems for planetary and lunar exploration. But his research can be applied to robots in the disaster response arena.
“In such operations, we need rough terrain mobility,” he says in his laboratory, where he has developed a number of rover test beds to improve the mobility of wheeled space vehicles.
The soil on the moon and other planets can be rocky, sandy or dusty — similar to the rubble environment after an earthquake. It can be difficult for robots to navigate those types of terrain. During its exploration of Mars, NASA’s rover, Opportunity, encountered a sandy environment and became stuck because its wheels were slipping in the soil.
“Slippage is a very critical issue,” says Yoshida.
To track how far remotely operated robots travel, scientists have relied upon a method called odometry, in which distance is estimated from wheel rotation. If there’s no slippage, scientists can get an accurate measurement. But if the wheels are sliding in sand or soil, the reading can be skewed because it takes more rotations to move the vehicle.
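The skew Yoshida describes follows directly from how odometry is computed: distance is wheel rotations times circumference, so any rotation lost to slip inflates the estimate. A toy calculation (all numbers hypothetical, not from Yoshida’s test beds) makes the error concrete:

```python
import math

def odometry_distance(rotations: float, wheel_radius_m: float) -> float:
    """Estimate distance traveled from wheel rotations alone (no slip assumed)."""
    return rotations * 2 * math.pi * wheel_radius_m

# Hypothetical example: a wheel of 0.1 m radius makes 100 rotations.
estimated = odometry_distance(100, 0.1)   # about 62.8 m by odometry

# With 20 percent slip, only 80 percent of each rotation becomes motion.
slip_ratio = 0.2
actual = estimated * (1 - slip_ratio)     # about 50.3 m actually traveled

# Odometry alone overstates the distance by the slip fraction.
error = estimated - actual
```

The rover reads roughly 62.8 meters while moving only about 50.3 — which is why slip must be measured or compensated before odometry can be trusted on loose soil.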
Yoshida’s lab is researching a slippage compensation control system that would allow robots to counter-steer against the pull of gravity and travel over the terrain in a straight line.
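The article does not give the lab’s control law, but the counter-steering idea can be sketched as a simple proportional correction (the gain value here is an assumption for illustration): measure how far the heading has drifted downhill, and command steering in the opposite direction.

```python
def counter_steer(desired_heading: float, actual_heading: float,
                  gain: float = 1.5) -> float:
    """Proportional counter-steering sketch: command a steering correction
    opposite to the heading drift caused by downhill slip.
    All angles in radians; gain is a hypothetical tuning constant."""
    drift = actual_heading - desired_heading
    return -gain * drift

# Hypothetical: the rover has drifted 0.1 rad downhill of its intended line,
# so the command steers back uphill (negative direction).
cmd = counter_steer(0.0, 0.1)
```

A real compensator would also fold in the slip estimate from odometry rather than heading alone, but the sign logic — steer against the pull, not along the commanded line — is the core of the technique.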
His lab also is investigating a new way for a robot to simultaneously map an environment as it rolls through an area. Using commercial technologies, the team has integrated five scanners onto a 1-meter-high ground robot. As the robot ventures through the lab in a demonstration, it detects objects and structures and beams the data to a computer. The data are assembled into a 3-D map that can be displayed from a bird’s eye view. “You don’t need a flying robot. We can get it from the ground,” says Yoshida. The mapping tool could be useful for first responders who are trying to locate disaster victims. But further research is necessary to apply the mapping technology to such an environment where the debris is strewn more randomly. “It’s difficult to correspond the different images,” he says.
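The article doesn’t describe Yoshida’s mapping algorithm, but the general idea behind fusing scans into one map can be sketched as a minimal 2-D occupancy grid (all names and numbers here are illustrative): each scanner return, given in range and bearing relative to the robot, is projected through the robot’s pose into world coordinates, so returns from different positions land in the same cells when they see the same object.

```python
import math

def mark_hits(grid: dict, pose: tuple, scan: list, cell_size: float = 0.5) -> None:
    """Project range/bearing returns from the robot's pose into a world-frame
    occupancy grid, marking each return as an obstacle cell.
    grid: dict mapping (ix, iy) cell indices to True
    pose: (x, y, heading_rad) of the robot in the world frame
    scan: list of (range_m, bearing_rad) returns relative to the robot
    """
    x, y, theta = pose
    for r, bearing in scan:
        wx = x + r * math.cos(theta + bearing)
        wy = y + r * math.sin(theta + bearing)
        grid[(int(wx // cell_size), int(wy // cell_size))] = True

# Hypothetical demo: two poses observe the same wall; the shared world
# frame makes both returns fall into the same grid cell.
grid = {}
mark_hits(grid, (0.0, 0.0, 0.0), [(2.0, 0.0)])  # wall seen 2 m ahead
mark_hits(grid, (1.0, 0.0, 0.0), [(1.0, 0.0)])  # same wall, now 1 m ahead
```

Matching scans this way requires knowing the pose accurately, which is exactly what slippage corrupts — and why Yoshida notes it is “difficult to correspond the different images” in rubble.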
The search-and-rescue robotics project concluded last year, but Tadokoro is hoping for additional funding to further the research.
“In addition to the technologies, we need more things in order for the robots to be used in real situations,” he says. For example, first responders need to be well trained to use the technologies, and that is an area that has not yet been fully addressed.