'He Is Not Mr. Chauhan': NSE Cautions Against Deepfake Videos Of CEO's Stock Advice
National Stock Exchange cautioned investors against deepfake videos of its MD and CEO Ashishkumar Chauhan giving stock recommendations.

The National Stock Exchange (NSE) on Wednesday issued a warning to investors regarding the dissemination of deepfake videos featuring its MD and CEO, Ashishkumar Chauhan, providing stock recommendations.

In a statement, NSE said it has observed the use of the face or voice of Chauhan and the NSE logo in a few investment and advisory audio and video clips falsely created using technology.


“Investors are hereby cautioned not to believe in such audio and videos and not follow any such investment or other advice coming from such fake videos or other mediums,” NSE said.

Such videos seem to have been created using sophisticated technologies to imitate the voice and facial expressions of Chauhan, it added.

It may be noted that NSE’s employees are not authorised to recommend any stock or deal in those stocks.

Additionally, the exchange said it is requesting the platforms hosting these clips to take down such videos, wherever possible.

As per the NSE’s process, any official communication is made only through its official website and the exchange’s social media handles, the bourse said.

Verify before trusting

The exchange has asked investors to verify the source of communication and content that is sent out on behalf of the NSE and to check the official social media handles.

Deepfakes are manipulated videos or other digital representations that use artificial intelligence to create convincing footage or audio of individuals saying or doing things they never did, posing a risk of spreading misinformation and damaging their reputation.

Dangers of deepfake in the financial world

  • Fraudulent Transactions: Deepfake technology can be used to create convincing videos or audio recordings of individuals, including high-ranking financial executives or clients. These deepfakes could be employed to authorise fraudulent transactions or manipulate financial data, leading to significant financial losses for individuals and institutions.
  • Market Manipulation: Deepfake videos or audio recordings could be used to spread false information about companies, stocks, or financial markets. By creating convincing fake statements from influential figures, malicious actors could manipulate market sentiment, leading to artificial fluctuations in stock prices or other financial instruments.
  • Identity Theft: Deepfakes can also be utilised for identity theft purposes within the financial sector. Fraudsters could create convincing fake videos or audio recordings of individuals to gain unauthorised access to their financial accounts, sensitive information, or proprietary data.
  • Reputation Damage: A deepfake video or audio recording could tarnish the reputation of financial institutions or key figures within the industry. Even if later proven false, the initial spread of such content could cause significant damage to trust and credibility, impacting business relationships and financial stability.
  • Regulatory Challenges: Regulators may struggle to keep pace with the evolving capabilities of deepfake technology. Implementing effective regulations and safeguards to detect and prevent deepfake-related financial crimes poses a significant challenge, requiring collaboration between industry stakeholders and government agencies.

(With agency inputs)
