One of the fictional technologies on the “Star Trek” television series enabled humans to interact with a ship’s computer system simply by talking. The sci-fi computer could understand entire sentences and converse almost naturally.
In today’s automated world, rudimentary versions of the sci-fi technology come in the form of speech recognition systems that pick up key words and phrases to help consumers accomplish menial tasks. But the communications are often stilted and awkward at best, and limited in scope.
To help computers comprehend conversational speech, researchers are developing software that enables systems to extract meaning and context from a string of sentences. One such technology is called “Spoken Language Interaction for Computing Environments.” Because it can understand a speaker’s intent and draw logical inferences from available information, the system can interact with humans in a richer way, says Kenny Sharma, an engineer at Lockheed Martin Corp.’s Advanced Technology Laboratories. The software is flexible and can be applied to a broad range of industries, he says.
One application is in battlefield medicine. The U.S. Army Institute of Surgical Research and Lockheed Martin Information Systems and Global Services have paired up to fund the development of a medical voice documentation system that will allow medics and surgical teams to collect patient information as they administer care to casualties.
Little, if any, data about a trauma case gets passed along the chain of medical intervention in the war zones. From the time a casualty first receives tactical care to when a medevac transports him to a field hospital, the focus is on saving the patient’s life rather than spending precious minutes gathering information on injuries and treatments.
“If we can passively capture that information, it’s really valuable stuff,” says Sharma, a member of the Lockheed Martin team developing the prototype.
The system records audio of medical personnel as they speak during treatment. It simultaneously transcribes the information into an electronic form based on the medic casualty card created by the 75th U.S. Army Ranger Regiment. Key data, including the cause of injuries, the types and doses of medication administered, the timing of treatments and the status of the casualty, are captured instantly and can be transmitted in seconds to the next intervention team.
“If you have a system that can fill in that report format for you and relate the data together … [and] you can store that in a digital format that’s searchable and retrievable, then you’re maintaining what really happened at that specific time frame,” says Sharma.
Having treatment history readily available can mean the difference between life and death, because surgical teams can prepare to receive the patient with full knowledge of how best to treat him. Later, they can review the form for analysis.
“Medical personnel in general would like to know what interventions were most effective,” says Polly Tremoulet, program manager with Lockheed Martin. “Having the data of what was done to a patient and later being able to see what the outcome was can help inform what kind of equipment should go with the medics and what they should be doing.”
In a video demonstration of the technology, two actors show how the system can interpret different speaking styles to glean pertinent information. The first medic is curt and to the point: “Huge wound present on right leg. Casualty got hit by shrapnel from IED. Packed the wound with Kerlex. Applied an ACE wrap. Five hundred milliliters of normal saline being administered. Casualty stable.” As she speaks, the computer quickly fills in the corresponding fields on a display.
The other medic speaks in complete sentences: “Hemorrhage found on chest — chest has been shot. I applied a Bolin chest seal. I applied HemCon. I put in a surgical airway. Five hundred milliliters of hextend on board. One gram fentanyl lollipop delivered orally. Casualty unstable.”
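The demonstration above amounts to filling slots on a casualty card from free-form speech. A minimal sketch of that idea, in Python, might use rule-based pattern matching over a transcript. The field names and patterns here are illustrative assumptions, not Lockheed Martin's actual design, and a real system would also handle spoken number words ("five hundred") rather than digits.

```python
import re

# Hypothetical slot-filling rules mapping medic utterances to
# casualty-card fields. Patterns and field names are assumptions
# for illustration only.
FIELD_PATTERNS = {
    "mechanism_of_injury": re.compile(r"\b(shrapnel|IED|gunshot|shot|blast)\b", re.I),
    "fluids_ml": re.compile(r"\b(\d+)\s*milliliters?\b", re.I),
    "status": re.compile(r"\bcasualty\s+(stable|unstable)\b", re.I),
}

def extract_fields(transcript: str) -> dict:
    """Scan a transcript and return any casualty-card fields it mentions."""
    fields = {}
    m = FIELD_PATTERNS["mechanism_of_injury"].search(transcript)
    if m:
        fields["mechanism_of_injury"] = m.group(1).lower()
    m = FIELD_PATTERNS["fluids_ml"].search(transcript)
    if m:
        fields["fluids_ml"] = int(m.group(1))
    m = FIELD_PATTERNS["status"].search(transcript)
    if m:
        fields["status"] = m.group(1).lower()
    return fields

print(extract_fields(
    "Casualty got hit by shrapnel from IED. "
    "500 milliliters of normal saline being administered. Casualty stable."
))
```

Note that both the curt report and the full-sentence report in the demonstration would trigger the same patterns, which is the point of the video: the form fills in regardless of speaking style.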
Development team members work with medical personnel to gain a broad sampling of the lexicon and grammar typically used in trauma or triage situations.
“We’ll ask them all the different ways they might want to report this type of information. And we’ll ask enough of them to where we think we have a pretty good sample,” says Tremoulet.
Sharma adds that the breakthrough lies not so much in the software technology as in the approach of building the system dynamically with its users. “We try to understand what they understand, and they’ll tell us if the system is completely off or wrong,” he says.
Correction: The January 2010 science and technology column, page 17, “Water ‘Jerrycans’ Quench Thirst, Save Lives,” misidentified Michael Pritchard, inventor and CEO of Lifesaver Systems.