PaymentsJournal
Microsoft’s AI Assistant Can Be Exploited by Cybercriminals

By Wesley Grant
August 9, 2024
in Analysts Coverage, Artificial Intelligence, Fraud & Security

Microsoft’s Copilot has been touted as a productivity enabler, but the ubiquitous artificial intelligence app’s widespread use also exposes vulnerabilities that criminals can exploit.

At the Black Hat security conference, researcher Michael Bargury demonstrated five ways that Copilot, which has become an integral part of Microsoft 365 apps like Word and Outlook, can be manipulated by bad actors.

For instance, after a hacker gains access to a work email, they can use Copilot to mimic the user’s writing style, including emojis, and send convincing email blasts containing malicious links or malware.

“AI’s ability to assist criminals in writing code to scrape information from social media, paired with its ability to match the speech patterns, tone, and style of an impersonated party’s written communication—whether professional or personal—is an insidious combination,” said Kevin Libby, Fraud & Security Analyst at Javelin Strategy & Research. “When used conjointly, these abilities considerably increase the probability of success for a phishing or smishing operation. AI can even help to scale phishing attacks through automation.”

Poisoning Databases

Bargury demonstrated how a hacker with access to an email account can exploit Copilot to access sensitive information, like salary data, without triggering Microsoft’s security protections.

In other scenarios, he showed how an attacker can poison Copilot's database by sending a malicious email and then steer the assistant into providing banking details. Copilot could also be maneuvered into furnishing critical company data, such as upcoming earnings call forecasts.

During the demonstration, Bargury largely used Copilot for its intended purpose, but he also introduced misinformation and gave the assistant misleading instructions to illustrate how easily the AI could be manipulated.

A Glaring Weakness

The demonstration highlighted a glaring weakness in AI: the mingling of secure corporate data with unverified external information. Copilot's flaws raise concerns about AI's rapid adoption across nearly every industry, especially in large organizations where employees frequently interact with the technology.

AI can also be one of the strongest tools in fraud detection, helping companies discover breaches much faster. Still, the technology is clearly maturing, and its growing pains open opportunities for criminals.

“While AI tools promise innumerable benefits, they also pose significant risks,” Libby said. “Criminals can use AI tools to help them with everything from malicious coding of malware, to scraping social media accounts for PII and other information about potential targets to fortify social engineering attacks, to creating deepfakes of CEOs to scam organizations out of tens of millions of dollars per video or audio call.”

According to Wired, after the demonstration, Bargury praised Microsoft and said the tech giant worked hard to make Copilot secure, but he was able to discover the weaknesses by studying the system’s infrastructure. Microsoft’s leadership responded that they appreciated Bargury’s findings and would work with him to analyze them further.

Tags: Automated Fraud, Copilot, Digital Fraud, Hackers, Microsoft


