
Murthy flags fake news about him; cautions public

December 14, 2023 23:10 IST

Infosys co-founder NR Narayana Murthy on Thursday called out "fake news items" claiming he had endorsed automated trading applications, and warned the public not to fall for such fraudulent claims.


He also slammed "fake interviews" that used "deepfake pictures and videos" of him.

In a series of posts on X, Murthy drew the public's attention to false content being put out by malicious sites and to products or services being sold fraudulently in his name, and urged people to report any such instances to the concerned regulatory authorities.

"In recent months, there have been several fake news items propagated via social media apps and on various webpages available on the internet, claiming that I have endorsed or invested in automated trading applications...named BTC AI Evex, British Bitcoin Profit, Bit Lyte Sync, Immediate Momentum, Capitalix Ventures etc," Murthy said.

The news items appeared on fraudulent websites masquerading as popular newspaper websites, some of which even published "fake interviews using deepfake pictures and videos".

"Public warning issued in respect of fake videos and posts on social media and internet about me," Murthy said.

Murthy categorically denied any endorsement, relation or association with these applications or websites.

He cautioned the public not to fall prey to the content of these malicious sites or to products and services being sold in his name through such fraudulent means.

"Please report any such instances to the concerned regulatory authorities," he said.

The development comes just days after veteran industrialist and former Tata Group chairman Ratan Tata issued an alert about his name being misused on social media to promote a scheme claiming to "exaggerate investment" risk-free, with a 100 per cent guarantee.

In a post on Instagram, Tata called out a post by a user named Sona Agrawal, who used a fake video interview of him to recommend investments.

In the fake video, Tata addresses Sona Agrawal as his manager.

"A recommendation from Ratan Tata for everyone in India.

"This is your chance to exaggerate your investment right today risk-free with a 100 per cent guarantee. Go to the channel right now," read the caption of the video post.

The video also showed messages of people receiving money in their accounts.

Tata wrote FAKE across the video and on a screenshot of its caption.

The issue of deepfakes has come under regulatory glare over the last few days, after several 'deepfake' videos targeting leading actors went viral, sparking public outrage and raising concerns over the misuse of technology and tools for creating doctored content and fake narratives.

Deepfakes refer to synthetic or doctored media that is digitally manipulated and altered to convincingly misrepresent or impersonate someone, using a form of artificial intelligence.

The government has been holding meetings with social media platforms to review progress made by them in tackling misinformation and deepfakes.

Platforms have been reminded that the 11 areas of "user harms" or "illegalities" flagged under the IT Rules are also mapped to equivalent provisions of the IPC (Indian Penal Code), and hence criminal consequences can follow even under current laws.

The government has said that the terms of service/community guidelines put out by platforms should clearly mention that a violation under Rule 3(1)(b) of the IT Rules also amounts to a violation of the relevant provisions of other laws such as the IPC.

The platforms had been asked to align their terms of service/community guidelines with the provisions of the IT Rules, in particular the 11 prohibited areas mentioned, and to map them to other laws.

Earlier this month, Minister of State for IT and Electronics Rajeev Chandrasekhar said in a post on X that new, amended IT Rules to further ensure platform compliance and the safety and trust of online users were under active consideration.

© Copyright 2024 PTI. All rights reserved. Republication or redistribution of PTI content, including by framing or similar means, is expressly prohibited without the prior written consent.