Voiceprints: Impersonators might find them a lot harder to fake

Voiceprints are becoming the next supposedly foolproof security measure
    • Author: Quantumrun Foresight
    • September 9, 2022

    Voice-enabled devices and systems have existed for some time, and they have enabled the development of voiceprints, which the security industry is increasingly prioritizing as its next fraud prevention mechanism.

    Voiceprints context

    Voiceprints are created by storing a specific voice sample in a digital vault. The system then uses this sample to determine if the caller or user’s voice is a legitimate match to the sample on record. 
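    The enroll-then-match flow described above can be sketched in a few lines. This is an illustrative toy only: the "digital vault" is a dictionary, the voice samples are stand-in embedding vectors (real systems derive these from audio with a trained speaker-recognition model), and the similarity threshold is a made-up value.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

class VoiceprintVault:
    """Toy 'digital vault': stores one enrolled embedding per user
    and checks whether a new sample matches the one on record."""

    def __init__(self, threshold=0.85):
        # Threshold is a tuning knob: higher means fewer false accepts
        # but more false rejects. 0.85 is an arbitrary illustration.
        self.threshold = threshold
        self._vault = {}

    def enroll(self, user_id, embedding):
        self._vault[user_id] = embedding

    def verify(self, user_id, embedding):
        enrolled = self._vault.get(user_id)
        if enrolled is None:
            return False  # no voiceprint on record for this user
        return cosine_similarity(enrolled, embedding) >= self.threshold

vault = VoiceprintVault()
vault.enroll("alice", [0.9, 0.1, 0.3])
print(vault.verify("alice", [0.88, 0.12, 0.29]))  # near-identical voice
print(vault.verify("alice", [0.1, 0.9, -0.5]))    # very different voice
```

    In practice the comparison runs on high-dimensional embeddings and the threshold is calibrated against labeled data, but the enroll/verify shape is the same.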

    As remote work becomes the norm rather than the exception, organizations are exploring better methods to ramp up security. PINs, passwords, and tokens all help, but the biometrics space is viewed as the key to fraud prevention. Voiceprints in particular, like fingerprints and facial scans, are so specific to an individual's vocal cords and speech patterns that even the most seasoned impersonator finds them difficult to copy. However, the main reason voiceprints are becoming popular is that consumers find them user-friendly, instantaneous, and natural to use.

    Disruptive impact

    Financial services companies are increasingly adopting voiceprints to authenticate callers and verify identity. Using artificial intelligence (AI) and natural language processing, a deep neural network can scan thousands of voiceprints and learn how to interpret tone and even mood by detecting pitch changes and word usage. Companies can also configure an alert system that triggers when a potential fraudster’s voice matches that of a previously flagged voiceprint. Aside from applying voiceprints for caller verification, organizations can use big data to detect other kinds of anomalies, such as elder abuse, where another person is forcing an older person to make financial transactions. 
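    The alert system described above, which matches an incoming caller against previously flagged voiceprints, can be sketched as a simple watchlist screen. Again, this is a minimal illustration: the flagged "voiceprints" are toy vectors, the IDs are hypothetical, and production systems would compare model-derived embeddings at scale.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def screen_caller(embedding, watchlist, threshold=0.9):
    """Return the IDs of flagged voiceprints that the caller's
    embedding matches closely enough to trigger an alert."""
    return [fraud_id
            for fraud_id, flagged in watchlist.items()
            if cosine_similarity(embedding, flagged) >= threshold]

# Hypothetical watchlist of previously flagged fraudster voiceprints.
watchlist = {
    "fraudster-1": [1.0, 0.0, 0.0],
    "fraudster-2": [0.0, 1.0, 0.0],
}
print(screen_caller([0.99, 0.05, 0.0], watchlist))  # close to fraudster-1
```

    A real deployment would also log the match score and route the call for human review rather than acting on the alert automatically.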

    Financial service companies are now rolling out additional services that use voice biometrics through apps and interactive voice response systems to handle basic tasks, such as checking balances, and self-service transactions (i.e., voice commerce). However, voiceprints are not without weaknesses; for example, not everyone can use their voice, and background noise may interfere with voice detection.

    Implications for voiceprints

    Wider implications for voiceprints may include:

    • More companies using voice biometrics to grant employee access to facilities, systems, and even emails and chats.
    • Access to phone-based government services increasingly integrating voiceprints for authentication purposes.
    • Customer service becoming reliant on voiceprints to launch self-serve options, detecting needs based on tone and pace.
    • Biometrics such as voiceprints, fingerprints, and facial recognition being used alongside traditional security options, such as PINs, to better secure systems.
    • Fraudsters using voiceprint technology to emulate a passable copy of a person’s voice to commit social engineering crimes to steal data or money.

    Questions to comment on

    • Would you be willing to use voiceprints to make financial transactions?
    • How else do you think voiceprints can be used?
