With an estimated one CCTV camera for every 14 people, London would appear to be one of the most heavily surveilled cities in the world outside of China. It has been calculated that during the working day an average person in the UK's capital would be caught on around 300 different cameras. Certainly CCTV cameras have a place in society nowadays, monitoring traffic and supposedly acting as a deterrent to criminal activity. Most of us have learned to live with them, but while we are monitored like this as we drive and walk around or simply hang out with friends and family, we should not ignore the fact that our personal privacy is at risk. The trouble is, these are public spaces and currently there is not much that we can do about it.

But do they really reduce crime? Studies in the UK suggest that the use of cameras has had little effect on crime prevention in town centres. A study of 14 CCTV systems, commissioned by the UK Home Office from a local university, found that only one showed a drop in crimes committed. Car parks were the only real winners, showing a decrease in crime due to the presence of cameras. The reliance upon humans to constantly watch a vast array of monitors seemed to be the weak link in the chain.

Then along came facial recognition. This works by mapping facial features, mainly the eyes, nose and chin, identifying the light and dark areas and then calculating the distances between them. The resulting information is then matched against records held on a database.
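To make that matching step concrete, here is a minimal, illustrative sketch in Python using the open-source face_recognition library. The file names and the tiny watch-list are invented for this example, and this is not how any police system is actually built; the underlying idea, though, is the same: reduce each face to a numerical "faceprint" and look for the closest one already on file.

    import face_recognition

    # Hypothetical watch-list: images of people already held on file.
    # (Illustrative file names only; each image is assumed to contain one face.)
    watchlist_paths = ["suspect_01.jpg", "suspect_02.jpg"]
    watchlist_encodings = [
        face_recognition.face_encodings(face_recognition.load_image_file(p))[0]
        for p in watchlist_paths
    ]

    # A frame grabbed from a camera feed (again, just an example file).
    probe_image = face_recognition.load_image_file("camera_frame.jpg")

    # Each face detected in the frame becomes a 128-number "faceprint".
    for probe in face_recognition.face_encodings(probe_image):
        # Distance between faceprints: smaller means more similar.
        distances = face_recognition.face_distance(watchlist_encodings, probe)
        best = distances.argmin()
        if distances[best] < 0.6:  # the library's usual default threshold
            print(f"Possible match: watch-list entry {best} "
                  f"(distance {distances[best]:.2f})")
        else:
            print("No match on the watch list")

Note that the threshold is a tuning choice: set it too loose and innocent passers-by are flagged, too tight and genuine matches are missed, which is exactly the trade-off behind the accuracy figures discussed below.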

London's Metropolitan Police (the Met) have been setting up vans outside shopping centres which are known for crime in order to scan the faces of the general public and compare the results with a database of 5,000 registered villains and troublemakers. Admittedly the police had a signboard up saying that live facial recognition (LFR) scanning was in progress. Its effectiveness is certainly questionable, with one survey showing that 93% of those stopped had been inaccurately identified, and an independent review by Essex University finding accurate matches in just 19% of cases.

Privacy advocates are certainly not taking kindly to this move by the Met as, unlike with fingerprinting and DNA, there are no regulations currently in place governing how this biometric data can be snatched, stored and used. Since the data is being taken without the individual's consent, it is likely to sour the attitude of the general public towards the police force, and legal challenges to the practice may well arise. Use of this technology in such an indiscriminate manner is like a virtual identity parade, with all the good people amongst us being lined up with the bad in an effort to get a match. We don't want to move down the path where those attending legitimate protests and meetings are identified and targeted as troublemakers.

In the US and Europe, LFR has been shown to be more accurate when it comes to identifying white people, leaving ethnic minorities vulnerable to false positive readings. In the States, cellphone cameras are used by police to compare suspects with databases which hold disproportionately more African Americans. AI, however, is only as good as the material upon which it has been trained, and in the US it has been trained predominantly on white faces. In China, the AI has been trained on Asian faces and therefore works far more effectively in that environment. Studies have indicated that people have difficulty recognizing faces of another race, and this kind of bias could be carrying over into AI. LFR can also struggle with the lower contrast of darker skin, and with women wearing makeup or wearing their hair differently.

But now in the UK one major concern is whether the technology could soon make the jump from a simple arrangement on a van outside a shopping mall to operating via a city-wide network of cameras. If it goes unchecked, this will just be the start of more and more surveillance as the authorities push the boundaries of what they can get away with.

An adviser to the Home Office has suggested that such use of LFR is something for Parliament to decide upon and that a legal framework needs to be put in place. It does not stop there, though, as other technologies such as voice recognition and gait and iris analysis will require legislative control, as will other biometric technologies which will no doubt emerge in the future.

A leaked internal memo from the European Parliament showed that it was considering the use of LFR to provide "security and services" to its members. This did not go down well with MEPs, and the Parliament was forced to drop its "digital transformation programme". Taking the EU's data protection laws into consideration, it has now transpired that the EU is debating a ban on LFR in public places such as stations, stadiums and shopping centres. Good news indeed for the vast majority who value their privacy.

In China the technology is far more advanced and the shift has already been made, with many aspects of people's lives being subjected to LFR. In the PRC it is not just a means of catching criminals but a formidable method of social control. Banks, airports, hotels and even public toilets use it to verify identities, but it is the police and state security who are the most enthusiastic about it, the plan being to interconnect all private and public CCTV cameras into one vast surveillance network. So in China, who you are, where you go and with whom you socialise will soon be stored on your own government-customized record thanks to LFR technology. The eyes of the masses will also be used, with neighborhood committees and other informers able to watch security camera footage on their mobiles and TVs. The advances in the technology in China have been quite something: their LFR is now able to classify targets by their gender, clothing and hair length, and they can be followed from one camera to the next based upon their faces alone. This has been particularly useful for the central government in keeping tight control over the population in the Xinjiang region. The ultimate surveillance state has now been formed.

From a privacy point of view, one particularly dangerous piece of facial recognition software is Clearview. Its developers have been scraping images from social media accounts for some years now, and users' activities, interests and networks are all sitting in the Clearview database. From a single photo, it can retrieve a complete profile of a person by matching it against the more than three billion images it claims to hold on file. In a recent in-depth analysis of Clearview and its Australian founder, Mr Hoan Ton-That, the New York Times described it as "the secretive company that might end privacy as we know it". Law enforcement, however, take a different view and are certainly very upbeat, describing Clearview as "the biggest breakthrough of the last decade". They have been able to identify criminals using the software, and it is thought that some 600 police departments in the USA and Canada are using it. No doubt it will also have caught the attention of the FBI and the Department of Homeland Security. However, a powerful tool like this in the wrong hands can be extremely dangerous. It says a lot that even Google, back in 2011, declined to go down the path of creating something akin to Clearview for fear of misuse. Then-CEO Eric Schmidt said it was a perfect tool for dictators to use. So far, other than for law enforcement use, we do not know into whose hands this software has fallen. Since it is against their "rules", Facebook, in a futile attempt to stem the flow of photos, have sent cease-and-desist letters to Mr Ton-That. Kind of ironic when you think about the things Mr Zuckerberg gets up to.

So now that you know about Clearview, will you regain control of your privacy by jumping into action and making your Facebook account private? Certainly from that point onward, but here's the rub: whatever was out there previously for all to see, Clearview will already have it and won't be wiping it from their database any time soon. Mr Ton-That has released this product into the market perhaps not realising the negative consequences, or simply not caring. When finally tracked down, he did admit that his product needed some form of government regulation, but at this juncture the genie is truly out of the bottle.

For further background on this feature, here is a selection of recent press articles for your reference:

How many CCTV Cameras are there in London in 2019?
In London there is 1 CCTV Camera for every 14 people, meaning there are now...
Met police deploy live facial recognition technology
Cameras used at east London shopping centre despite experts warning against them
China bets on facial recognition in big drive for total surveillance
Facial recognition is the new hot tech topic in China. Banks, airports, hotels and even public toilets are all trying to verify people’s identities by analyzing their faces. But the police and security state have been the most enthusiastic about embracing this new technology.
‘I think my blackness is interfering’: does facial recognition show racial bias?
The latest research into facial recognition technology used by police across the US has found that systems disproportionately target vulnerable minorities
Facial Recognition Is Accurate, if You’re a White Guy
Commercial software is nearly flawless at telling the gender of white men, a new study says. But not so for darker-skinned women.
European parliament says it will not use facial recognition tech
Statement comes after leaked memo on use of technology in security provoked outcry
Hiding in plain sight: activists don camouflage to beat Met surveillance
Privacy campaigners bid to beat police facial recognition plans by wearing ‘dazzle’ makeup
Goodbye Privacy: Police Are Silently Stealing Your Photos to Use Against You
Privacy is a thing of the past now that Clearview has created a database to identify a person’s entire online profile from a single photo.
The Secretive Company That Might End Privacy as We Know It
An unregulated facial recognition app can probably tell the police your name, and help them find out where you live and who your friends are.
This man says he’s stockpiling billions of our photos
Ton-That told O’Sullivan that he’s ready to defend Clearview AI’s technology in court, if necessary. (Niamh McDonnell / CNN)