Bollywood

Nora Fatehi is the victim of a deepfake, watch the viral video!!

From Indian actresses and billionaires to Prime Minister Narendra Modi himself, many have been victims of deepfake videos. “Anybody who has a computer and access to the internet can technically produce a ‘deepfake’ video,” says John Villasenor, a professor of electrical engineering at the University of California, Los Angeles. The word “deepfake” combines the terms “deep learning” and “fake”, and the technology is a form of artificial intelligence. After Rashmika Mandanna, Katrina Kaif, Kajol, and many other Bollywood actresses, Nora Fatehi has now become a victim of a deepfake.

Nora Fatehi’s deepfake video is going viral!

Deepfakes are videos that create an illusion using deep learning, AI, and photoshopping techniques, and these images and videos spread disinformation. Technologies like GANs (Generative Adversarial Networks) and ML (Machine Learning) are combined to create the videos. A deepfake can imitate a face, body, voice, speech, environment, or any other personal attribute, manipulated to create an impersonation.
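For readers curious how the “adversarial” part of a GAN works, here is a minimal, purely illustrative sketch in Python using PyTorch. It is a toy, not any actual deepfake tool: a small generator learns to produce vectors that a small discriminator cannot tell apart from “real” random data. Real deepfake systems apply the same adversarial training idea to faces, voices, and video frames at a vastly larger scale; all dimensions, model sizes, and data below are arbitrary placeholders.

```python
# Toy GAN sketch: generator vs. discriminator trained adversarially.
# Illustrative only; the "real" data here is just random vectors.
import torch
import torch.nn as nn

latent_dim, data_dim = 16, 64

# Generator: maps random noise to a fake sample.
G = nn.Sequential(nn.Linear(latent_dim, 128), nn.ReLU(), nn.Linear(128, data_dim))
# Discriminator: scores how "real" a sample looks.
D = nn.Sequential(nn.Linear(data_dim, 128), nn.ReLU(), nn.Linear(128, 1))

opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
loss_fn = nn.BCEWithLogitsLoss()

for step in range(200):
    real = torch.randn(32, data_dim)      # stand-in for real training data
    noise = torch.randn(32, latent_dim)
    fake = G(noise)

    # Train the discriminator to tell real from fake.
    d_loss = (loss_fn(D(real), torch.ones(32, 1))
              + loss_fn(D(fake.detach()), torch.zeros(32, 1)))
    opt_d.zero_grad()
    d_loss.backward()
    opt_d.step()

    # Train the generator to fool the discriminator.
    g_loss = loss_fn(D(fake), torch.ones(32, 1))
    opt_g.zero_grad()
    g_loss.backward()
    opt_g.step()
```

Over many such rounds the two networks push each other: the discriminator gets better at spotting fakes, and the generator gets better at producing convincing ones, which is why GAN-made imagery can look so realistic.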

Nora Fatehi, a famous actor and dancer, has now become the latest victim of deepfake videos. Her image has been manipulated and is being used to promote an online shopping website without her permission or consent, as she revealed on her Instagram story. The website, named Lulumelon, sells technical apparel and athletic shoes. Nora has said the brand used a deepfake video of her to promote itself, writing, “SHOCKED!!! This is not me!”

How can deepfakes affect people?

According to a report from a cybersecurity firm, deepfake pornographic videos target women far more than men. Women make up 90% of the victims of crimes like revenge porn, non-consensual porn, and other forms of harassment, and deepfakes are one more entry on that list.

Deepfakes are not limited to the creation of images and videos. They can cost you politically and financially too: AI tools now exist that clone individuals’ voices to execute financial scams. About 47% of Indian adults have experienced, or know someone who has experienced, some kind of AI voice scam.
