PaymentsJournal

Deepfakes Mean Deep Financial Loss for Banking and Payment Industries

How to address workforce training needs so employees can spot and guard against deepfakes.

By Keith Vincent
December 11, 2020
in Fraud & Security, Fraud Risk and Analytics, Industry Opinions, Security

The banking and payments industries are on high alert due to a new threat in the cybersecurity landscape. Like many things originally intended for good, artificial intelligence and deep learning have enabled the proliferation of deepfake technology, an insidious problem for these industries.

According to the Wall Street Journal, a scam involving an audio call to the CEO of a U.K.-based energy company succeeded in extracting approximately $243,000 from the firm. The voice was generated with artificial intelligence to sound real to the victim, who believed he was speaking with his superior at the parent company.

The man was directed to make an urgent transfer of funds to a supplier of the firm. Follow-up calls made the victim suspicious, so he declined to send more funds, but by then it was too late to recover the initial transfer. According to the story, the CEO reported that he “…recognized his boss’ slight German accent and the melody of his voice on the phone.” Although this type of sophisticated cyberattack was predictable, at the time it stood out for its novelty and success.

“Then I’ll get down on my knees and pray…we don’t get fooled again!” (The Who)

Deepfakes are intentionally distorted videos, images, or audio recordings that portray something fictitious or false, giving malicious actors a novel and sophisticated social engineering tool. Technology innovations make deepfakes look and sound authentic and convincing, opening the door to abuse and misuse.

Social engineering is the practice of leveraging human tendencies to produce a desired result; in this case, to commit a cybercrime. Cybercriminals manipulate their victims, often by enticing them to click on a malicious file or hyperlink or to divulge information they would otherwise protect. Social engineering is a favorite of cybercriminals because humans are often too trusting and easily manipulated under the right circumstances.

The average consumer of social media is probably familiar with deepfakes from an entertainment and social sharing perspective, and online searches are replete with legitimate use cases for the underlying artificial intelligence. For example, in May 2019 three machine learning engineers at Dessa showcased a realistic artificial intelligence voice simulation of popular podcast host Joe Rogan. The demonstration is an outstanding example of how easily the lines between synthetic and real are blurred. A cursory online search also returns practical applications such as text-to-speech and video editing.

A recent study reports that personal banking and payment transfers are considered “…most at risk of deepfake fraud, above social media, online dating, and online shopping.” Financial institutions in general are obvious targets for cybercriminals because of the volume of assets and customer data they hold. The report outlines the impact of deepfakes on the financial services industry; areas of concern include onboarding processes, payment/transfer authorization, account hijacking, synthetic identities, and impersonation, among others.

Banking and Payment Services organizations need to prepare their workforce to meet this credible threat by updating their security programs with the following objectives:

  • Awareness of the good use cases of artificial intelligence, deep learning, and deepfakes as well as their weaponization by malicious actors
  • Process and procedure training to address critical functions such as onboarding, payment/transfer authorization, account monitoring, identification procedures, etc.
  • Training on technology deployed to detect and eradicate deepfakes
  • Cybersecurity awareness training to promote awareness and vigilance

Workers should be trained to handle ad-hoc urgent requests through a pre-defined authorization protocol, perhaps requiring an approval chain so that authorization has the appropriate checks and balances, as in the sketch below.
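A minimal sketch of what such an authorization check might look like is shown below, written in Python. The amount threshold, the number of required approvers, and helper names such as verified_out_of_band are illustrative assumptions, not a description of any institution's actual controls.

from dataclasses import dataclass, field

# Hypothetical sketch of an approval-chain check for urgent transfer requests.
# Field names, thresholds, and helpers are assumptions for illustration only.

@dataclass
class TransferRequest:
    requester: str        # who is asking (e.g., a "CEO" on a phone call)
    beneficiary: str      # destination account or supplier
    amount: float
    channel: str          # "phone", "email", "video", ...
    approvals: set = field(default_factory=set)  # IDs of independent approvers

URGENT_AMOUNT_THRESHOLD = 10_000   # assumed cutoff for extra scrutiny
REQUIRED_APPROVERS = 2             # assumed number of independent sign-offs

def verified_out_of_band(request: TransferRequest) -> bool:
    """Placeholder: call the requester back on a known-good number or confirm
    through an authenticated internal channel before trusting the request."""
    return False  # default to "not verified" so large transfers stay blocked

def authorize_transfer(request: TransferRequest) -> bool:
    # Voice, email, and video can all be spoofed by deepfakes, so the channel
    # alone is never treated as proof of identity.
    if request.amount >= URGENT_AMOUNT_THRESHOLD:
        if not verified_out_of_band(request):
            return False
        if len(request.approvals) < REQUIRED_APPROVERS:
            return False
    return True

# An "urgent" phone request with no callback verification is refused.
request = TransferRequest("parent-company CEO", "supplier-123", 243_000.0, "phone")
print(authorize_transfer(request))  # False until verification and approvals exist

The design point is simply that the communication channel is never treated as proof of identity; releasing funds depends on out-of-band verification and a quorum of independent approvals.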

Particular attention needs to be paid to brand reputation and the customer experience. When a breach occurs, the long-term effects of losing customer confidence and brand reputation can dwarf the short-term financial and systems damages. Banks and payment companies understand the trust consumers put in their products and the care taken to protect personal assets. Once that trust is gone it can rarely, if ever, be reclaimed.

Institutions that deploy effective deepfake training provide the heightened awareness, procedural discipline, and hypervigilance that reduce the risk of getting “fooled again.”

Tags: Cybercrime, Cybersecurity, Deep Fake, Fraud, Industry Opinions, Scam, Security
