Biometric data such as fingerprints are used around the world to unlock mobile devices and to verify identity at immigration and customs counters. Despite this wide application, a fingerprint cannot be changed: once a fingerprint scan is stolen or hacked, the owner cannot replace his or her fingerprints and has to look for another identity security system. In view of this, a scholar at HKBU has invented a new technology entitled “lip motion password” (lip password), which utilises a person’s lip motions to create a password. The system verifies a person’s identity by simultaneously matching the password content with the underlying behavioural characteristics of lip movement. Nobody can mimic a user’s lip movement when uttering the password, and the password can be changed at any time. This novel technology, the first of its kind in the world, was granted a US patent in 2015 and is expected to be used in financial transaction authentication.
Professor Cheung Yiu-ming of HKBU’s Department of Computer Science, who is in charge of the research, said the new technique has a number of advantages over conventional security access control methods:

1) The dynamic characteristics of lip motions are resistant to mimicry, so a lip password can be used on its own for speaker verification: the system detects and rejects a wrong password uttered by the user, as well as the correct password spoken by an impostor.

2) Verification based on a combination of lip motions and password content ensures that access control is doubly secure.

3) Compared with traditional voice-based authentication, the acquisition and analysis of lip movements are less susceptible to background noise and distance; moreover, the system can even be used by a speech-impaired person.

4) A user can reset the lip password in a timely manner to strengthen security.

5) There is no language barrier; in other words, a person from any country can use the lip password verification system.
Professor Cheung said: “The same password spoken by two persons is different, and a learning system can distinguish them.” The study adopted a computational learning model that extracts visual features of lip shape, texture and movement to characterise lip sequences. Samples of lip sequences were collected and analysed to train the models and to determine the threshold for accepting or rejecting a spoken password.
The potential applications of this newly patented technology include, but are not limited to, financial transaction authentication, such as electronic payment using mobile devices, transactions at ATMs, and credit card user passwords. It can also be applied to enhance the security access control systems currently used at the entrances of companies or private premises.
In addition, the lip password can be used together with other biometrics to enhance the security level of a system. For instance, it can be combined with face recognition, addressing the problem of spoofing face recognition with 3D masks in personal identity verification.
Professor Cheung Yiu-ming demonstrates using the world’s first “lip motion password” technology, which can provide double security in identity authentication.
A diagram shows the basic concept of lip motion password and how it works.