ABA Banking Journal

Is deepfake technology shifting the gold standard of authentication?

November 8, 2023

As deepfake technology advances and becomes more widely available and democratized, a challenge for financial institutions will be improving the certainty rate of user authentication to prevent breaches.

By Gaelan Woolham

Fraud has long plagued the financial services sector, and deepfakes have emerged as a threat to a secure customer experience.

As customer interactions become increasingly digital, financial institutions rely on three key pillars of authentication to verify users’ identities (see the sketch after this list for how the factors can be combined):

  1. Something you have. (Example: SMS push to a trusted number or device.)
  2. Something you know. (Example: security challenge questions.)
  3. Something you are. (Example: fingerprint, facial or voice recognition.)
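
To make the pillars concrete, the minimal Python sketch below combines the outcomes of individual factor checks into a single access decision. The factor names, data structures and two-factor requirement are illustrative assumptions, not any particular institution's implementation.

```python
from dataclasses import dataclass

@dataclass
class FactorResult:
    """Outcome of checking one authentication factor (names here are illustrative)."""
    factor: str      # "possession", "knowledge", or "inherence"
    passed: bool

def authenticate(results: list[FactorResult], required: int = 2) -> bool:
    """Grant access only if at least `required` distinct factor types passed.

    A minimal sketch of multi-factor authentication; real systems also weigh
    device trust, session context and risk scores.
    """
    passed_types = {r.factor for r in results if r.passed}
    return len(passed_types) >= required

# Example: OTP to a trusted device passed, challenge question failed,
# voice biometric matched -- two distinct factors, so access is granted.
session = [
    FactorResult("possession", True),   # something you have (SMS OTP)
    FactorResult("knowledge", False),   # something you know (challenge question)
    FactorResult("inherence", True),    # something you are (voice biometric)
]
print(authenticate(session))  # True
```
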
MORE INFO: Download the report The Growing Significance of Trusted Digital Identities in U.S. Financial Services from ABA’s Office of Innovation and consultants Oliver Wyman.
However, these pillars vary in robustness. “Something you have” can be compromised if your devices are stolen. “Something you know” can often be found online through social media and similar resources. This makes “Something you are,” verified via biometrics, the gold standard for fraud prevention, especially in contact center platforms.

The advantages of biometrics are not only their resilience to current technology-based attacks, but also their minimal user friction. Think Face ID versus remembering and typing multiple complex usernames and passwords. In the contact center context, voice biometrics have become a popular and secure alternative to PINs, passcodes and challenge questions. As the technology has matured, sometimes as little as three seconds of speech can be sufficient to verify the user’s identity.
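
As an illustration of how voice biometric verification typically works under the hood, the sketch below compares a live caller's voice embedding against an enrolled voiceprint using cosine similarity. The embedding size, the 0.80 threshold and the use of random vectors in place of a real speaker-encoder model are all assumptions for demonstration only.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two speaker embeddings."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify_voice(enrolled: np.ndarray, live: np.ndarray, threshold: float = 0.80) -> bool:
    """Accept the caller if the live utterance is close enough to the enrolled voiceprint.

    `enrolled` and `live` stand in for fixed-length speaker embeddings produced
    by a speaker-encoder model (not shown); the threshold is illustrative and
    would be tuned per deployment.
    """
    return cosine_similarity(enrolled, live) >= threshold

# Toy example with random 192-dimensional vectors standing in for real embeddings.
rng = np.random.default_rng(0)
enrolled = rng.normal(size=192)
live = enrolled + rng.normal(scale=0.1, size=192)   # same speaker, slight variation
print(verify_voice(enrolled, live))                 # True for this toy pair
```
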

The emergence of deepfakes in sophisticated cybercrime

In recent years we have seen the emergence of deep learning techniques within artificial intelligence. Early examples included impressive image recognition technology, which has quickly evolved into generative systems that can produce deepfakes (simulated images on demand) and will soon be able to generate video in real time.

While current deepfake images and videos are impressive, they still give the uncanny feeling that something is off. The rapid rate of advancement will surely make these media indistinguishable from reality to the average viewer faster than we can learn to spot them.

Another application of these techniques is the ability to mimic individual voices. Using minimal input, such as a voicemail message or a social media post, systems can be trained to mimic human voices with remarkable fidelity, even achieving conversational interactions when combined with technologies like ChatGPT. If systems can mimic an individual’s voice including tone, word choice, and cadence, should we be concerned about the future of voice biometrics security?

Deepfakes and the ability to overcome existing authentication

Deepfake scams within financial services include fraudulent claims, account opening fraud and synthetic identity fraud. Financial services institutions need to consider how deep learning technology has the potential to defeat current voice authentication systems.

A recent study at the University of Waterloo showed that voice biometric authentication systems, including those from industry leaders such as Amazon and Microsoft (Nuance), can be bypassed by deepfake technology in as few as six attempts.

As deepfake technology advances and becomes more widely available and democratized, a challenge for agile financial institutions will be improving the certainty rate of user authentication to prevent breaches. To achieve this, they must ensure that their voice biometric tools are actively tested against deepfake audio samples. Given the fast pace of these advancements, incumbent infosec players and emerging startups are already refining their tools to improve their efficacy at distinguishing synthetic voices from real ones.
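
One hedged sketch of what actively testing against deepfake audio samples could look like in practice is a small evaluation harness that measures how often a verifier wrongly accepts synthetic clips. The `verify` callable, the `deepfake_samples` collection and the 1 percent tolerance in the comment are hypothetical placeholders, not a reference to any vendor's API.

```python
from typing import Callable, Sequence

def spoof_acceptance_rate(
    verify: Callable[[bytes], bool],
    deepfake_samples: Sequence[bytes],
) -> float:
    """Fraction of deepfake audio clips that the verifier wrongly accepts.

    `verify` stands in for the institution's voice-authentication check and
    `deepfake_samples` for a curated set of synthetic clips of enrolled
    customers; both are assumptions for illustration.
    """
    if not deepfake_samples:
        return 0.0
    accepted = sum(1 for clip in deepfake_samples if verify(clip))
    return accepted / len(deepfake_samples)

# In practice the harness would run on every model or vendor update and the
# resulting rate would be tracked against an agreed tolerance, e.g.:
#   rate = spoof_acceptance_rate(voice_gateway.verify, red_team_clips)
#   assert rate <= 0.01, "deepfake acceptance rate above tolerance"
```
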

An artificial intelligence arms race?

Advancements in countermeasures, including the use of machine learning for detection, are leading to authentication systems that produce a probability score. Leading biometric security products are being consistently updated to identify and prevent deepfakes. This includes proprietary approaches to separating real and synthetic voices using factors too subtle for the human ear, as well as combining the voice score with other session metadata such as behavioral patterns, device data, number spoofing and liveness detection.
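
A simplified illustration of such a probability-scoring approach is shown below: a handful of signals (voice score, liveness, device trust, behavioral similarity) are blended into one session confidence value. The signal names and fixed weights are assumptions made for the sketch; real products typically learn the combination from labelled data.

```python
def session_confidence(
    voice_score: float,       # probability the voice is genuine, from the biometric engine
    liveness_score: float,    # anti-spoofing / liveness check result, 0..1
    device_trusted: bool,     # known device or number with no spoofing indicators
    behavior_score: float,    # similarity to the customer's usual behavioral patterns, 0..1
) -> float:
    """Blend biometric and session signals into one confidence score in [0, 1].

    The weights below are illustrative; production systems typically learn the
    combination from labelled fraud and genuine sessions.
    """
    weights = {"voice": 0.4, "liveness": 0.3, "device": 0.1, "behavior": 0.2}
    score = (
        weights["voice"] * voice_score
        + weights["liveness"] * liveness_score
        + weights["device"] * (1.0 if device_trusted else 0.0)
        + weights["behavior"] * behavior_score
    )
    return max(0.0, min(1.0, score))

print(session_confidence(0.95, 0.90, True, 0.85))  # 0.92: a high-confidence session
```
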

Once a session score is assigned, it can be processed by additional controls and authentication checkpoints, tuned to an organization’s risk tolerance, to grant access or to trigger additional actions such as session termination or step-up authentication. Further, user activity can be monitored for higher-risk actions, such as initiating large transactions or changing authentication preferences. A low-confidence session combined with suspicious activity could trigger additional security checks or alerts for further investigation. The pace of progress in deep learning for both detection and evasion has resulted in a continuous ‘arms race’ between information security teams, authentication service providers and fraudulent actors.
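
The sketch below shows how a session confidence score might be mapped to a control decision, with a stricter bar applied when the customer attempts a higher-risk action. The threshold values and action names are illustrative stand-ins for settings an institution would tune to its own risk tolerance.

```python
def next_action(confidence: float, high_risk_action: bool) -> str:
    """Map a session confidence score and activity risk to a control decision.

    The 0.9 / 0.6 thresholds are illustrative, not recommended values.
    """
    allow_threshold = 0.9
    step_up_threshold = 0.6
    if high_risk_action:                 # e.g. large transfer or changing auth preferences
        allow_threshold += 0.05          # demand more confidence for risky activity
    if confidence >= allow_threshold:
        return "allow"
    if confidence >= step_up_threshold:
        return "step_up_authentication"  # e.g. OTP to a trusted device
    return "terminate_and_alert"         # end the session and raise a fraud alert

print(next_action(0.92, high_risk_action=False))  # allow
print(next_action(0.92, high_risk_action=True))   # step_up_authentication (0.92 < 0.95)
print(next_action(0.40, high_risk_action=False))  # terminate_and_alert
```
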

The key to successfully implementing step-up authentication in response to deepfake fraud potential is to understand organizational data and risk indicators and to tune the responses properly.
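
One way to ground that tuning in organizational data, shown as a hedged sketch below, is to sweep candidate thresholds over historical sessions with confirmed outcomes and choose the lowest threshold that keeps the rate of accepted fraudulent sessions under an agreed target, minimizing friction for genuine customers. The data format and the 1 percent default target are assumptions.

```python
from typing import Sequence, Tuple

def tune_threshold(
    labelled_sessions: Sequence[Tuple[float, bool]],  # (confidence score, is_fraud)
    max_fraud_pass_rate: float = 0.01,
) -> float:
    """Choose the lowest threshold that keeps accepted fraud under the target.

    A simple sweep over candidate thresholds; `labelled_sessions` stands in
    for an institution's historical sessions with confirmed outcomes.
    """
    fraud_scores = [score for score, is_fraud in labelled_sessions if is_fraud]
    best = 1.0                                   # fall back to the strictest setting
    for candidate in sorted({score for score, _ in labelled_sessions}):
        passed_fraud = sum(1 for score in fraud_scores if score >= candidate)
        rate = passed_fraud / len(fraud_scores) if fraud_scores else 0.0
        if rate <= max_fraud_pass_rate:
            best = candidate                     # lowest threshold meeting the target
            break
    return best
```
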

Given that voice biometric authentication is now widely adopted, trusted by clients and highly effective, banks face the challenge of maintaining security without resorting to older, more intrusive authentication techniques. Introducing multi-factor authentication by default, especially on the voice channel, could negatively impact the customer experience.

We believe a layered approach to fraud detection, such as step-up multi-factor authentication tuned against other customer and session metadata, combined with robust behavioral analytics, provides a path forward that protects the customer experience while maximizing fraud prevention.

Gaelan Woolham is an executive director at Capco, a global technology and management consultancy specializing in driving digital transformation in the financial services industry.

Tags: Artificial intelligence, Biometrics, Cyber crime, Data security, Voice banking

© 2025 American Bankers Association. All rights reserved.
