Stand still too long and you'll be watched
New imaging software alerts surveillance-camera operators to suspect situations by monitoring patterns of motion
KINGSTON UPON THAMES, ENGLAND
Spotting crime before it happens may sound like the stuff of sci-fi novels, but researchers here at Kingston University are trying to do just that.
Potential uses of their system, which marries imaging software with the closed-circuit TVs that are abundant on this side of the Atlantic, include detecting overcrowding and spotting suspicious packages. Airports and train stations are prime locations for the new technology, which is currently being tested in subway systems in London and Paris.
Comparisons are frequently made between the project and the precrime fighting offered in the movie "Minority Report." But Prof. Sergio Velastin is adamant that his team's work is not about predicting events, but flagging the characteristics of potentially problematic situations.
"I prefer the word 'correlate' rather than 'predict,' " he says in an interview in his office at the university's campus near London. "We have a rough idea of what might be happening, but we leave the final decision to the human operator."
Interest in the system is coming from transit groups in Europe - where many of the project's partners are located - and even some organizations in the United States, where citizens are still sorting out how they feel about widespread use of surveillance cameras.
Britain leads the West in the use of closed-circuit TV (CCTV), which proliferated after IRA bombings in the early 1990s. The number of cameras countrywide is possibly as high as 2.5 million, with estimates suggesting that Britons are photographed by 300 separate cameras in a given day. The London Underground currently has about 5,000 CCTVs, a number that is expected to double in the next two years, says Professor Velastin.
As CCTVs increase in Britain, so do the challenges of finding enough people to do the tedious work of watching them. Even when operators are available, they might have dozens of cameras to keep track of. As a result, many of the cameras are not watched at all, says Velastin.
Instead, video from CCTVs is often used after the fact - to try to figure out if a missing girl got into someone's car, or to track the movements of a murdered TV personality. "The cameras on the whole are very useful for reacting to things that you know have happened," says Velastin.
But he and others in Europe envision ways to use the CCTVs that would allow those monitoring them to act quickly before, or while, an event is happening.
"What we want to do is bring in technological support to the operator, so this operator has much better information, more timely information," says the professor.
After a decade of research, Velastin and his colleagues at Kingston's Digital Imaging Research Centre have come up with a system that can be hooked up to existing CCTVs and does not require the operator to navigate complicated computer menus. Remembering the system's name is perhaps more difficult than using it: Modular Intelligent Pedestrian Surveillance Architecture, or MIPSA.
The system is programmed with scenarios - such as unattended objects, too much congestion, or people loitering - and when it detects one of those, it alerts the operator through a series of flashing lights and messages.
To determine what is suspect, the system memorizes the features of an image that are constant, and then subtracts those to figure out what is happening. It looks at patterns of motion and their intensity. Things that are stationary for too long in a busy environment raise alarms.
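The approach described above - learn the constant features of the scene, subtract them, then flag things that stay put too long - can be sketched in a few lines. This is a minimal illustration of the general technique, not MIPSA's actual (unpublished) algorithms; the frame sizes, thresholds, and function names here are all invented for the example.

```python
import numpy as np

# Toy frames: 8x8 grayscale images. The "scene" is uniform gray;
# an "object" is a bright patch that appears and stays put.
H, W = 8, 8
scene = np.full((H, W), 100.0)

def make_frame(with_object):
    frame = scene.copy()
    if with_object:
        frame[2:4, 2:4] = 200.0   # bright stationary patch
    return frame

# 1) Memorize the constant features: average many empty frames.
background = np.mean([make_frame(False) for _ in range(10)], axis=0)

# 2) Subtract the background to isolate what is happening now.
def foreground_mask(frame, threshold=25.0):
    return np.abs(frame - background) > threshold

# 3) Flag things stationary for too long: the same pixels must
#    remain in the foreground over several consecutive frames.
def stationary_too_long(frames, max_frames=5, threshold=25.0):
    persistent = np.ones((H, W), dtype=bool)
    for frame in frames[-max_frames:]:
        persistent &= foreground_mask(frame, threshold)
    return len(frames) >= max_frames and bool(persistent.any())

frames = [make_frame(True) for _ in range(6)]
print(stationary_too_long(frames))  # True: unattended-object alert
```

A real system would use an adaptive background model (so lighting changes don't trigger alarms) and group foreground pixels into objects before timing them, but the learn-subtract-persist structure is the same.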
Operators with the London Underground who have tested the system are happy with it, Velastin says. "We detect 90 percent of things that a human would have detected, and about two to four percent of our detections are false alarms," he says of the system's effectiveness. Within two years it could be used more extensively in the London Underground, and also in the Paris Métro, he says.
Still, use of CCTV increasingly raises issues of privacy among those being watched. In France, the public is generally more protective of its privacy. But in Britain, says Velastin, "people recognize that there is a balance between privacy and being looked after."
He fends off concerns by explaining that it's very difficult to recognize people in these images, and also that no one is going to be targeted for features like unusual clothes or an identifiable walk. Instead, he says, more universal situations will be programmed into the system, such as people walking against a crowd, which can be a sign of pickpocketing. "They're only going to be stopped and investigated if there is sufficient reason to do so," he says.
Eventually researchers may be able to track the whereabouts in a facility of a person who left a suspicious package.
Perhaps even more intriguing is the idea of reaching a time when computers determine what we're up to without any humans having a say. Concerns about computer judgment in this case are misplaced, Velastin says. "Should we not then have the same concern about police officers?"
The way he sees it, "They make decisions all the time about who they call suspicious, and they seem to be much more difficult to control than a computer - they bring their own bias."
He says there could come a time 1,000 years from now when decisions might be made by a huge computer whose programming is unknown and over which humans have no control.
But, he notes, "We are very, very far away from that."