Credas CEO Defends Facial Recognition Technology

The bottom line is: facial recognition works.

There are countless good use cases for facial recognition technology, one of which we are very close to.

Credas helps combat the multi-billion pound problem of money laundering, particularly in sectors such as estate agency, accountancy, financial services and legal.

You can read more about how Credas helps AML here.

In fact, the Home Office calls out electronic verification solutions as providing the highest level of assurance possible when a business is completing these types of checks.

More recently, an independent review conducted by the University of Essex concluded that the facial recognition technology used by the Met Police has an 81% failure rate.

The report concludes that the tech regularly misidentified people, who were then wrongly stopped, and as a result its authors claim that use of this tech should be stopped immediately.

For me, this is an absurd suggestion. Read on, and I'll tell you why.

An article on this report in the Guardian calls out:

Of 42 people flagged up during the Met’s trials, 22 people were stopped, but of those only eight were being sought – some of whom were wanted for serious violent crime.

Identifying people who are wanted for serious crime: that's a win, surely?
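It is worth working through where the headline figure comes from. Taking the Guardian's numbers at face value (42 matches flagged, of which 8 were people actually being sought), the quoted 81% failure rate is simply the share of flagged matches that were wrong:

```python
# Arithmetic behind the headline figure, using the numbers quoted
# in the Guardian article (42 flagged, 8 genuinely being sought).
flagged = 42
correct = 8

failure_rate = (flagged - correct) / flagged
print(f"Failure rate: {failure_rate:.0%}")  # prints "Failure rate: 81%"
```

Note that this measures the proportion of alerts that were false, not the proportion of all faces scanned that were misread; those are very different denominators.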

It’s fantastic that facial recognition is getting more airtime, it’s incredibly complicated but a hugely valuable piece of tech in my opinion. It’s no surprise, however, that there is a lot of scrutiny, negativity, and, most importantly, a lack of knowledge, which is causing unbalanced views and influencing public perception.

As I’ve already alluded to, this technology, like most, is complex, especially to those who aren’t from a technical background. Facial recognition can almost appear space-age to some, and understandably so: we’re talking about technology that is clever enough to distinguish the most minute details of a human face.

Having said this, without downplaying the ‘how’, we need to focus on ‘why’ this technology needs to be used by security services to keep us safe.

An article by Sky News said this about the report:

Researchers found that the controversial system is 81% inaccurate – meaning that, in the vast majority of cases, it flagged up faces to police when they were not on a wanted list.

However, what about the ones it did get right?

At a high level, facial recognition is simple. At least two images are required for comparison, and depending on the algorithms and tolerances used, you will have either an exact match, a partial match or no match.

It’s as simple as that!

The real difficulty is taking that secondary image. In the use case that is under scrutiny, there are so many mitigating factors to consider.

The Met Police and other law enforcement agencies are capturing this second image in real time and in environments that cannot be controlled. Lighting, depth, glare, background objects, angles; the list goes on, and none of these things can be controlled.

The ability to overcome these things is improving all the time and has dramatically improved since I took my first AI module back in 2001!

We must remember that substantial investment and R&D is taking place in this space, and things will only improve. Like all change, we cannot expect things to be immediately perfect.

It’s frustrating that there is so much negativity out there on this topic when the objective is to catch criminals and protect the public.

Maybe this is where those who are tasked with protecting us need to do more with their explanations to the public. If the police laid out their plans, were honest about the limitations in the current implementation and shared their objectives with the public, maybe we’d all be a little less ruffled by the use of the technology.

We may even find that the public embraces the fact that they are playing a crucial part in the development of this incredible tech. Having said all that, I totally understand the civil liberty and human rights debate.

But while I appreciate the stress that may be caused by being incorrectly identified as a ‘person of interest’, I am struggling to see what other genuine concerns there are.

I can’t imagine the security services want our details so they can sell to us; the GDPR police would have a field day. For me, being caught in the facial recognition net is a necessary step to keep us all safe.

It will undeniably help security and law enforcement services to do their jobs. They have faced austerity measures over the last few years, which has meant less resource available to fight crime at a time when crime rates are generally increasing and crime is becoming more sophisticated.

Using technology in combination with more traditional methods should be encouraged and celebrated, not torn to shreds.


Sky Article
Guardian Article
Updated HMRC Guidelines for Electronic Verification
