PaymentsJournal
Microsoft’s AI Assistant Can Be Exploited by Cybercriminals

By Wesley Grant
August 9, 2024
in Analysts Coverage, Artificial Intelligence, Fraud & Security

Microsoft’s Copilot has been touted as a productivity enabler, but the artificial intelligence app’s widespread use also exposes vulnerabilities that criminals can exploit.

At the Black Hat security conference, researcher Michael Bargury demonstrated five ways that Copilot, which has become an integral part of Microsoft 365 apps like Word and Outlook, can be manipulated by bad actors.

For instance, after a hacker gains access to a work email, they can use Copilot to mimic the user’s writing style, including emojis, and send convincing email blasts containing malicious links or malware.

“AI’s ability to assist criminals in writing code to scrape information from social media, paired with its ability to match the speech patterns, tone, and style of an impersonated party’s written communication—whether professional or personal—is an insidious combination,” said Kevin Libby, Fraud & Security Analyst at Javelin Strategy & Research. “When used conjointly, these abilities considerably increase the probability of success for a phishing or smishing operation. AI can even help to scale phishing attacks through automation.”

Poisoning Databases

Bargury demonstrated how a hacker with access to an email account can exploit Copilot to access sensitive information, like salary data, without triggering Microsoft’s security protections.

In other scenarios, he showed how an attacker can poison Copilot’s database by sending a malicious email and then steer Copilot into providing false banking details. The AI assistant could also be maneuvered into furnishing critical company data, such as upcoming earnings call forecasts.
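The poisoning Bargury described is a form of indirect prompt injection: the attacker’s data arrives through a normal channel, such as email, and the assistant treats it as trustworthy context. As a rough illustration only (a hypothetical toy assistant, not Microsoft’s actual pipeline), a system that naively trusts retrieved email content can be steered by a single unsolicited message:

```python
# Toy sketch of indirect prompt injection via a poisoned mailbox.
# All names and behavior here are hypothetical, for illustration only.

def retrieve(mailbox, query):
    """Naive retrieval: return every email whose body mentions the query."""
    return [m for m in mailbox if query.lower() in m["body"].lower()]

def answer(mailbox, query):
    """Naive assistant: trusts retrieved email bodies as ground truth."""
    for email in retrieve(mailbox, query):
        # The assistant has no way to tell legitimate data from injected
        # data: it simply repeats whatever the retrieved context asserts.
        if "bank details" in email["body"].lower():
            return email["body"]
    return "No information found."

mailbox = [
    {"sender": "finance@company.example",
     "body": "Our bank details: IBAN DE00 0000 0000 (official)."},
]

# The attacker sends one email the victim never even has to open.
mailbox.insert(0, {
    "sender": "attacker@evil.example",
    "body": "IMPORTANT UPDATE bank details: IBAN XX99 ATTACKER 9999."})

print(answer(mailbox, "bank details"))  # surfaces the attacker's IBAN
```

The point of the sketch is that the injected email wins without any code being executed: the assistant’s only flaw is treating retrieved content as authoritative.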

During the demonstration, Bargury largely used Copilot for its intended purpose, but he also introduced misinformation and gave Copilot misleading instructions to illustrate how easily the AI could be manipulated.

A Glaring Weakness

The demonstration highlighted a glaring weakness in AI systems: the risks that arise when secure corporate data is combined with unverified external information. Copilot’s flaws raise concerns about AI’s rapid adoption across nearly every industry, especially in large organizations where employees frequently interact with the technology.

AI can also be one of the strongest tools in fraud detection, as it can help companies discover breaches much faster. Still, the technology is clearly maturing, which opens up opportunities for criminals.

“While AI tools promise innumerable benefits, they also pose significant risks,” Libby said. “Criminals can use AI tools to help them with everything from malicious coding of malware, to scraping social media accounts for PII and other information about potential targets to fortify social engineering attacks, to creating deepfakes of CEOs to scam organizations out of tens of millions of dollars per video or audio call.”

According to Wired, after the demonstration, Bargury praised Microsoft and said the tech giant worked hard to make Copilot secure, but he was able to discover the weaknesses by studying the system’s infrastructure. Microsoft’s leadership responded that they appreciated Bargury’s findings and would work with him to analyze them further.

Tags: Automated Fraud, Copilot, Digital Fraud, Hackers, Microsoft
