Facial recognition - the death knell of anonymity?

By Greig Watson
BBC News

Image caption: Leicestershire Police said the software showed them "which door to knock on" but was not itself evidence

Catch the terrorist, secure the computer, even feed the cat - facial recognition software has varied, often astounding possibilities.

Leicestershire Police has become the first in the UK to use the NeoFace system to compare suspect images with the 92,000 faces it has on file.

The advantages are obvious, reducing a job that manually could take weeks to a matter of seconds. Justice becomes, in theory, swift, cheap and precise.

But could it also mean a walk in the street becomes a trial by pixel, with cameras analysing everything from your criminal record to your credit rating?

The idea of getting computers to respond to faces has existed for decades.

After a secretive start in the US in the 1960s, the technology grabbed UK headlines in 1998 when Newham Council in east London linked it to CCTV.

It was quietly abandoned some time later and the council now says it does not have "any formal data from that period" - but reports suggested the system did not yield a single positive identification.

Image caption: Will faces become as readable as number plates - and as easy for CCTV to track and analyse?

Dr Chris Solomon, managing director of VisionMetric and Reader in Physics at the University of Kent, explained how most systems work with two properties of the face.

"One is geometry - the configuration and placement of features," he said. "And the other is texture - colours, tones and condition.

"Both of these can change. If I shout or turn my head, distances between the tip of my nose and my eyes, for example, can change. If I go for a run, the skin on my face will get much redder.

"Given a bit of time I can grow a beard or - with a bit more time - acquire wrinkles.

"Those are natural factors but there are also things like glasses, hats and lighting conditions, which can dramatically alter the way a face looks.

"The human brain can compensate for all this but early software, which could only map a static face, really struggled.

"But systems have become more robust by recognising that these factors can change, but only within certain limits - for example, I can't put my mouth where my ear is.

"It's by no means foolproof but it is becoming ever more sophisticated."
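The geometric side of what Dr Solomon describes can be sketched in a few lines: represent a face as distances between landmark points, normalise them so the measure survives changes of scale, and compare two faces with a similarity score. The landmark names, the inter-eye normalisation and the cosine comparison here are illustrative assumptions for a toy example, not details of any real system.

```python
import math

def geometry_features(landmarks):
    """Turn named (x, y) landmark points into a feature vector of
    pairwise distances, normalised by the inter-eye distance so the
    result does not change when the whole face is scaled up or down."""
    names = sorted(landmarks)  # fixed order, so vectors are comparable
    eye_dist = math.dist(landmarks["left_eye"], landmarks["right_eye"])
    feats = []
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            feats.append(math.dist(landmarks[a], landmarks[b]) / eye_dist)
    return feats

def similarity(f1, f2):
    """Cosine similarity between two feature vectors (1.0 = identical)."""
    dot = sum(x * y for x, y in zip(f1, f2))
    norm = math.sqrt(sum(x * x for x in f1)) * math.sqrt(sum(x * x for x in f2))
    return dot / norm

# Hypothetical landmarks; a real system would get these from a detector.
face_a = {"left_eye": (30, 40), "right_eye": (70, 40),
          "nose_tip": (50, 60), "mouth": (50, 80)}
# The same face uniformly doubled in size: normalisation cancels the scaling.
face_b = {k: (2 * x, 2 * y) for k, (x, y) in face_a.items()}

print(similarity(geometry_features(face_a), geometry_features(face_b)))
```

As Dr Solomon notes, this simple kind of geometry breaks down as soon as the head turns or the expression changes; modern systems have to allow each measurement to vary within learned limits rather than treat it as fixed.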

Techno fear - Other options

  • Artificial Intelligence: Smarter and faster: Could computers soon outwit their creators?
  • Geo-engineering: Altering the atmosphere to stop climate change - what could possibly go wrong?
  • Cyberwar: Everything with a link to the web stops working: No money, no light, no help.
  • Drones: Prototype emotionless killing machine: Military now, civilian security soon, bad guys not too long?
  • Driverless cars: Why worry? My sat nav is 100% reliable...

The NeoFace system being used by Leicestershire Police is made by NEC.

Chris de Silva, the company's head of global face recognition solutions, says the strength of their technology lies in its tolerance of poor quality, highly compressed surveillance videos and images.

"While facial recognition technology was once considered of marginal value, use of NEC's face recognition technology is now leading to a higher rate of positive identifications in real life investigations," he adds.

NeoFace chalked up its first conviction in June in the US when a Chicago armed robber was sentenced to 22 years in prison after having his mugshot matched to crime scene footage.

Hilary Gazzard, identification officer at Leicestershire Police, explains that the system produces 50 "possibilities", ranked by similarity to the original picture.

"It is amazing to be able to sort through 92,000 images in a matter of moments and it is going to be such a useful tool to officers out on the street, from low-level crime like shoplifting to murder scenes.

"It's not DNA, it's not fingerprints, it's a suggestion that this person is potentially of interest to your inquiry," she says.
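The workflow Ms Gazzard describes - score every image on file against the probe picture and hand officers a ranked shortlist of leads, not proof - can be sketched as below. The records and the one-number scoring function are stand-in assumptions for illustration; a real matcher would compute similarity from face features.

```python
def shortlist(probe, database, score, k=50):
    """Rank every record in the database by its similarity score
    against the probe image, best first, and return the top k as
    candidate leads - suggestions for officers, not evidence."""
    ranked = sorted(database, key=lambda rec: score(probe, rec), reverse=True)
    return ranked[:k]

# Toy stand-in: each record is (name, feature_value) and "similarity"
# is just closeness of one number (negated distance, so bigger = closer).
records = [("A", 0.2), ("B", 0.9), ("C", 0.55), ("D", 0.6)]
sim = lambda probe, rec: -abs(probe - rec[1])

print(shortlist(0.58, records, sim, k=2))  # the two closest records
```

The point of the design is that the ranking is advisory: the system narrows 92,000 images to a short ordered list in moments, and a human still decides "which door to knock on".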

Image caption: Fur-cial recognition: safeguard your cat's wellbeing with a feeding station which responds only to it

While briskly analysing thousands of file photos is new for UK police, the US Defence Department is a step ahead.

It has reportedly ordered 500 X6 'spy glasses' which allow wearers to match faces they are looking at against a database, in real time, as they walk down the street.

But security applications are just the tip of the iceberg. Facial recognition has dizzying potential.

It is being developed for keyless doorbells, diagnosing genetic disorders, online education courses - to both verify identity and make sure you are paying attention - interactive cars, longevity prediction for possible use in the insurance industry and, yes, even making sure only your cat eats from its bowl.

Facebook's DeepFace system, while not yet live, has been claimed to almost match the human brain for distinguishing between faces. The social media giant, recently criticised over an experiment on users' emotions, has said the software will only be used to improve tagging in photographs.

Inevitably, such powerful technology has prompted strident misgivings.

While other biometric identification - fingerprints and iris scans - requires the subject's co-operation, facial recognition does not. Images can be recorded anywhere.

Image caption: Once recorded, a face can be matched against any data which a government, company or individual holds

And crucially, who controls which 'mugshots' these are matched against? How many times has your face been recorded, for work? By the council? Everyday government agencies?

The highest profile case surrounds the FBI's integrated biometric Next Generation Identification (NGI) programme designed, officials say, to protect public safety and national security.

Technology watchdog Electronic Frontier Foundation (EFF) says NGI's facial recognition database, which could be 52 million strong by 2015, will combine images from both criminal sources, such as case files, and non-criminal sources, such as work background checks.

The group also highlights research which indicates facial recognition becomes less reliable the larger the database gets.

Techno fear - A proud history

  • Writing: Socrates warned it would "create forgetfulness in the learners' souls, because they will not use their memories."
  • Printing: 16th Century monk Filippo di Strata was unimpressed: "The pen is a virgin, the printing press a whore."
  • Education: 1883 medical journal the Sanitarian argued schools "exhaust the children's brains and nervous systems with complex and multiple studies."
  • The Y2K bug: Computers used two-digit year fields. Going back to 00 signalled doom. Or not.
  • Mobile phones: Crashing planes, exploding petrol stations and brain cancer. Has that happened to you lately?

EFF Senior Staff Attorney Jennifer Lynch said: "Facial recognition technology poses critical threats to privacy and civil liberties.

"Images of our face can be captured covertly and remotely, as well as repeatedly and on a mass scale, as we go about our day in the public sphere.

"This kind of tracking can expose intimate details about our lives - details that we don't want the government, law enforcement, or data marketers to know."

And it is perhaps the commercial dangers of this technology that cause the most concern.

Stores in Japan pioneered systems which recorded general information, like age and gender, to build marketing data on who shopped where and when.

While makers insisted these did not match the information to an outside database, they admitted it could identify a repeat shopper.

So how long can companies be trusted to resist such a powerful opportunity?

Analysis by Dr Noel Sharkey, Emeritus Professor of Artificial Intelligence and Robotics at the University of Sheffield

This new technology can all seem benign for many applications in civil life and helpful for catching dangerous criminals, but we urgently need informed public debate and parliamentary discussion.

We need to regulate this technology while we still have a shred of privacy left.

We are putting tools into the hands of authorities that can expand their reach beyond what is necessary. It will make it possible to create new laws against peaceful protest and strikes. It will make it possible to monitor all of us and our family members without us committing a crime.

We do not want fishing expeditions looking for anti-social behaviour; we need clear police guidelines and possible new laws.

Dr Solomon says: "Already many websites remember what you have previously bought and recommend related items - and some people get very uptight about that.

"What if they also knew about how much money you had? What sort of job you had?

"I can imagine a situation, with facial recognition, where adverts just about anywhere will start to say to an individual: 'Don't you think you would look better with this lipstick? This food? Even plastic surgery?'"

In November last year Amscreen installed advertising screens in 509 UK Tesco petrol station shops which detected the age and gender of a customer and then displayed products targeted at that group.

But the firm insists personalised billboards are not imminent.

Mike Hemmings, Amscreen's marketing director, says: "Once we pointed out nothing was being recorded, the furore, the sense of being spied on, subsided and everyone carried on regardless.

"There is an acknowledgment in the industry that online can do (personalised adverts) and mobile is good at it but outdoor is so much more tricky.

"To be honest, it might happen but it is too much of a leap now, it is five or six steps down the road and you may well get a backlash against that sort of physical tagging.

"Our next step is to incorporate moods, so that we can look at you and say 'You are happy', meaning you are more likely to buy this sort of a product, or that 'You are sad' and the ad can react to your mood."
