August 8, 2023

Artificial-intelligence tools that clone voices have opened an entirely new realm of risk for companies and individuals alike.
Generative AI (GAI) has become a catalyst for change, introducing new ways of conducting business, managing data, gathering insights, and collating content. As an intelligent and highly capable technology, it has become a powerful tool in the business toolbox, providing rapid analysis, support, and functionality.
Regrettably, the immense potential of GAI is being exploited by cybercriminals, who have harnessed it for malicious purposes such as creating convincing deepfakes and perpetrating unnervingly realistic voice scams. In 2019, the technology was used to impersonate a CEO's voice and defraud a UK energy company of $243,000. In 2021, a company in Hong Kong was defrauded of $35 million. These attacks are not aimed solely at large corporations; individuals are also targeted. Voice-clone scams, including kidnapping hoaxes, requests for money supposedly from friends or family, and fake emergency calls, are all notoriously difficult to detect.
“The scammers are incredibly clever,” says Stephen Osler, Co-Founder and Business Development Director at Nclose. “Using readily available tools online, scammers can create realistic conversations that mimic the voice of a specific individual using just a few seconds of recorded audio. While they have already targeted individuals making purchases on platforms like Gumtree or Bob Shop, as well as engaged in fake kidnapping scams, they are now expanding their operations to target high-level executives with C-Suite scams.”
It is easy to see the opportunity this creates for cybercriminals, considering how many people use voice notes to quickly convey instructions to team members or arrange payments. Busy executives frequently use platforms like WhatsApp to message others while driving or rushing between meetings, making it difficult, if not impossible, for employees to discern that a message is fake.
“An IT administrator might receive a voice note from their manager, requesting a password reset for their access to O365,” explains Osler. “Unaware of the malicious intent, the administrator complies, thinking it’s a legitimate instruction. However, in reality, they unintentionally provide privileged credentials to a threat actor. This information can then be exploited to gain unauthorized access to critical business infrastructure and potentially deploy ransomware.”
And where do these voice clips come from? They originate from voice notes sent via platforms like WhatsApp or Facebook Messenger, social media posts, and phone calls. Scammers can exploit various methods, such as recording calls with CEOs, to create deepfakes, or extracting voice samples from videos or posts on individuals’ online profiles. Cybercriminals have many techniques at their disposal to capture the distinctive voice identity of anyone who has shared their lives online. Subsequently, they employ AI technology to manipulate these recordings, making it appear as though the person is speaking live during the call or voice note.
Deepfake technology will only become more proficient at deceiving victims and breaching organizations. To defend against this, organizations must ensure they have robust processes and procedures in place that require multiple levels of authentication, particularly for financial transactions and credential changes.
Companies should establish a clearly defined formal process for all such transactions. A voice note from the CIO or CISO should never, on its own, suffice to change a password or authenticate a monetary transaction; treating it as sufficient is precisely what grants hackers access to the business. It is equally crucial to educate employees and end-users about these evolving threats: people who are aware of this type of scam are far more likely to pause and verify an instruction before making a costly mistake.
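To make that rule concrete, here is a minimal sketch in Python of such a policy gate. It is purely illustrative: the names (SensitiveRequest, TRUSTED_CHANNELS, MIN_CONFIRMATIONS) and the two-confirmation threshold are assumptions, not a real product or library. The rule it encodes is the one described above: an instruction arriving over a single channel, such as a voice note, is never acted on until enough confirmations have been gathered over independent channels.

# Minimal sketch of an out-of-band verification gate for sensitive requests.
# All names and thresholds here are hypothetical, for illustration only.
from dataclasses import dataclass, field

# Channels treated as independent of the one a request arrived on.
TRUSTED_CHANNELS = {"phone_callback", "in_person", "ticketing_system"}
MIN_CONFIRMATIONS = 2  # policy: at least two independent confirmations

@dataclass
class SensitiveRequest:
    requester: str        # e.g. "CIO"
    action: str           # e.g. "password_reset" or "wire_transfer"
    arrival_channel: str  # e.g. "whatsapp_voice_note"
    confirmations: set = field(default_factory=set)

    def confirm(self, channel: str) -> None:
        # Only count a confirmation if it comes over a trusted channel
        # that is different from the channel the request arrived on.
        if channel in TRUSTED_CHANNELS and channel != self.arrival_channel:
            self.confirmations.add(channel)

    def approved(self) -> bool:
        # A voice note alone never suffices; independent sign-off is required.
        return len(self.confirmations) >= MIN_CONFIRMATIONS

req = SensitiveRequest("CIO", "password_reset", "whatsapp_voice_note")
assert not req.approved()        # the voice note by itself is rejected
req.confirm("phone_callback")    # call the requester back on a known number
req.confirm("ticketing_system")  # require a matching change ticket
assert req.approved()            # only now may the action proceed

The check is deliberately channel-aware: a second message over the same compromised channel cannot confirm itself, so an attacker would also have to compromise the callback number or the ticketing system before the request could proceed.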
“Always ensure that any voice note or instruction you receive is from a trusted source. It is important to double-check and confirm that the communication is indeed from the intended person,” concludes Osler. “Cultivate an inquisitive mindset and question the source, whether it is a call, email, or message. By doing so, both organizations and individuals can be better prepared to identify and protect themselves against potential voice-cloning scams.”
By Stephen Osler, Co-Founder and Business Development Director at Nclose
