ABA Banking Journal

Is deepfake technology shifting the gold standard of authentication?

November 8, 2023

As deepfake technology advances and becomes more widely available and democratized, a challenge for financial institutions will be improving the certainty rate of user authentication to prevent breaches.

By Gaelan Woolham

Fraud has long plagued the financial services sector, and deepfakes have emerged as a threat to a secure customer experience.

As customer interactions become more digital, financial institutions rely on three key pillars of authentication to verify users’ identities:

  1. Something you have. (Example: SMS push to a trusted number or device.)
  2. Something you know. (Example: security challenge questions.)
  3. Something you are. (Example: fingerprint, facial or voice recognition.)
MORE INFO: Download the report, The Growing Significance of Trusted Digital Identities in U.S. Financial Services, from ABA’s Office of Innovation and consultants Oliver Wyman.
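As a hypothetical illustration, the three-pillar model above can be expressed as a simple policy check that requires verified factors from more than one category. The factor names and pillar mapping below are assumptions for the sketch, not any vendor's API:

```python
# Hypothetical sketch: multi-factor authentication succeeds only when
# the verified factors span at least two distinct pillars.
VERIFIED_PILLARS = {
    "sms_otp": "something_you_have",          # trusted device / number
    "security_question": "something_you_know",
    "voice_biometric": "something_you_are",
}

def authenticate(verified_factors):
    """Return True if the verified factors cover two or more pillars."""
    pillars = {VERIFIED_PILLARS[f] for f in verified_factors
               if f in VERIFIED_PILLARS}
    return len(pillars) >= 2

print(authenticate(["sms_otp", "voice_biometric"]))  # spans two pillars: True
print(authenticate(["sms_otp"]))                     # single pillar: False
```

The point of the sketch is that compromising one pillar (say, a stolen phone) is not enough on its own when factors must come from different categories.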
However, these pillars vary in robustness. “Something you have” can be compromised if a device is stolen. “Something you know” can often be found online through social media and similar resources. This makes “something you are” — verified via biometrics — the gold standard for fraud prevention, especially on contact center platforms.

The advantages of biometrics are not only their resilience to current technology-based attacks, but also their minimal user friction. Think Face ID versus remembering and typing multiple complex usernames and passwords. In the contact center context, voice biometrics have become a popular and secure alternative to PINs, passcodes and challenge questions. As the technology has matured, as little as three seconds of talking can be sufficient to verify the user’s identity.

The emergence of deepfakes in sophisticated cybercrime

In recent years we have seen the emergence of deep learning techniques within artificial intelligence. Early examples included impressive image recognition technology, which was quickly adapted to generate deepfakes, i.e., simulated images on demand, and will soon be able to generate real-time video.

While current deepfake images and videos are impressive, they still give the uncanny feeling that something is off. The rapid rate of advancement, however, will likely make these media indistinguishable from reality to the average viewer faster than we can learn to spot them.

Another application of these techniques is the ability to mimic individual voices. Using minimal input, such as a voicemail message or a social media post, systems can be trained to mimic human voices with remarkable fidelity, even achieving conversational interactions when combined with technologies like ChatGPT. If systems can mimic an individual’s voice including tone, word choice, and cadence, should we be concerned about the future of voice biometrics security?

Deepfakes and the ability to overcome existing authentication

Deepfake scams within financial services include fraudulent claims, account opening fraud and synthetic identity fraud. Financial services institutions need to consider how deep learning technology has the potential to defeat current voice authentication systems.

A recent study at the University of Waterloo showed that voice biometric authentication systems, including those of industry leaders such as Amazon and Microsoft (Nuance), can be bypassed by deepfake technology in as few as six attempts.

As deepfake technology advances and becomes more widely available and democratized, a challenge for agile financial institutions will be improving the certainty rate of user authentication to prevent breaches. To achieve this, they must ensure that their voice biometric tools are actively tested against deepfake audio samples. Given the fast pace of these advancements, incumbent infosec players and emerging startups are already refining their tools to improve the efficacy rates of differentiating synthetic voices from real ones.

An artificial intelligence arms race?

Advancements in countermeasures, including the use of machine learning for detection, are leading to authentication systems that produce a probability score. Leading biometric security products are being consistently updated to identify and prevent deepfakes. This includes approaches that separate real and synthetic voices using factors too subtle for the human ear, combined with other session metadata such as behavioral patterns, device data, number spoofing signals and liveness detection.
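A minimal sketch of how such a probability score might be assembled from multiple signals, assuming illustrative signal names and weights rather than any specific vendor's scoring model:

```python
# Hypothetical sketch: fuse a voice-liveness score with session
# metadata into one authentication confidence score in [0, 1]
# via a weighted average. Names and weights are illustrative.
SIGNAL_WEIGHTS = {
    "voice_liveness": 0.5,    # ML synthetic-voice detector output
    "device_trust": 0.2,      # known device, no number spoofing
    "behavioral_match": 0.3,  # interaction-pattern similarity
}

def session_confidence(signals):
    """Weighted average of per-signal scores; missing signals score 0."""
    total = sum(SIGNAL_WEIGHTS[name] * signals.get(name, 0.0)
                for name in SIGNAL_WEIGHTS)
    return round(total, 3)

score = session_confidence({
    "voice_liveness": 0.9,
    "device_trust": 1.0,
    "behavioral_match": 0.8,
})
print(score)  # 0.89
```

In practice the fusion would be a trained model rather than fixed weights, but the output contract is the same: a single confidence score the downstream controls can act on.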

Once a session score is assigned, it can be processed by additional controls and authentication checkpoints, tuned to an organization’s risk tolerance, to grant access or to trigger additional actions such as session termination or step-up authentication. Further, user activity can be monitored for higher risk actions, such as initiating large transactions or changing authentication preferences. A low confidence session combined with suspicious activity could be used to trigger additional security checks, or to trigger alerts for further investigation. The pace of progress in deep learning for both detection and evasion has resulted in a continuous ‘arms race’ between information security teams, authentication service providers and fraudulent actors.
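The checkpoint logic described above can be sketched as a threshold policy, with hypothetical cutoff values standing in for an organization's tuned risk tolerance:

```python
# Hypothetical sketch: map a session confidence score to an action.
# Thresholds are illustrative; a real deployment would tune them to
# the organization's risk tolerance, with stricter cutoffs for
# higher-risk activity such as large transactions.
def next_action(confidence, high_risk_activity=False):
    grant_at = 0.9 if high_risk_activity else 0.75
    step_up_at = 0.6 if high_risk_activity else 0.4
    if confidence >= grant_at:
        return "grant"
    if confidence >= step_up_at:
        return "step_up_authentication"
    return "terminate_and_alert"

print(next_action(0.89))                           # grant
print(next_action(0.89, high_risk_activity=True))  # step_up_authentication
print(next_action(0.3))                            # terminate_and_alert
```

Note how the same score triggers different actions depending on what the user is attempting: a low-confidence session combined with a large transfer escalates, while the same session browsing balances might merely be monitored.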

The key to successfully implementing step-up authentication in response to potential deepfake fraud is understanding organizational data and risk indicators and properly tuning the responses.

Given that voice biometric authentication is now widely adopted, trusted by clients and highly effective, banks face the challenge of maintaining security without reverting to older, more intrusive authentication techniques. Introducing multi-factor authentication by default, especially on the voice channel, could degrade the customer experience.

We believe a layered approach to fraud detection — step-up multi-factor authentication tuned against customer and session metadata, combined with robust behavioral analytics — provides a path forward that protects the customer experience while maximizing fraud prevention.

Gaelan Woolham is an executive director at Capco, a global technology and management consultancy specializing in driving digital transformation in the financial services industry.

Tags: Artificial intelligence, Biometrics, Cyber crime, Data security, Voice banking
© 2026 American Bankers Association. All rights reserved.
