GLOBAL RESEARCH SYNDICATE

Deep Fakes – Perspectives – Business Recorder

by globalresearchsyndicate | January 9, 2021 | Data Collection

What are Deepfakes?

Have you seen that video of Jim Carrey as Jack Torrance in “The Shining,” or the Queen dancing and issuing a TikTok challenge, or even the one of Steve Buscemi with Jennifer Lawrence’s body? If so, then you have seen a deepfake.


A deepfake is a piece of media in which an existing video or image has been digitally altered using AI and machine learning. Researchers have been working on the technology since the 1990s, though it has improved tremendously over the past few years.

The term was adopted when a Reddit user called “DeepFakes” shared digitally altered adult videos of celebrities in late 2017. He used deep learning to insert the faces of celebrities into those videos, which made them look extremely realistic.

The technology has advanced to the point that voice can also be manipulated to accompany a deepfake video. Baidu, the Chinese tech giant, developed an AI algorithm that can clone a voice from just 3.7 seconds of sample audio, where 30 minutes used to be required. Fortunately, the result does not sound fully realistic; far more sample data is needed for better-quality output.

Deepfake algorithms use a machine-learning technique called a deep neural network to examine the facial movements of one person, synthesize images of another person’s face making the same movements, and swap one face for the other in an image or video. Movies have done this for years, but it took hundreds of hours of work by video editors and CGI experts to get half-decent results.
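Popular face-swap tools follow an autoencoder design: a single encoder shared by both identities compresses a face into a small code, and a separate decoder per identity renders that code back into a face. The sketch below is a toy with random, untrained weights and made-up layer sizes; it only illustrates the data flow, not a working model.

```python
import numpy as np

rng = np.random.default_rng(0)

def layer(n_in, n_out):
    # Random, untrained affine layer. These weights are placeholders;
    # a real deepfake model trains them on thousands of face crops.
    return rng.standard_normal((n_in, n_out)) * 0.1, np.zeros(n_out)

def forward(x, params):
    W, b = params
    return np.tanh(x @ W + b)

# One encoder shared between both identities compresses any face crop
# into a small code capturing pose and expression...
encoder = layer(64, 8)
# ...and one decoder per identity reconstructs that person's face.
decoder_a = layer(8, 64)
decoder_b = layer(8, 64)

face_a = rng.standard_normal(64)    # stand-in for a flattened face crop of person A

code = forward(face_a, encoder)     # A's pose/expression, identity stripped
swapped = forward(code, decoder_b)  # decoded with B's decoder: B's face, A's expression
```

The swap is simply a matter of routing person A's encoded expression through person B's decoder, which is why the same trained model can generate the fake frame by frame across a whole video.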

The technology has improved by leaps and bounds, to the point that anyone can create believable fake videos with a powerful GPU and training data. All it requires is a working understanding of deep learning and a few hundred sample images of the two people whose faces are to be swapped, which are then fed into the algorithm. No video-editing skills are needed.

The code is readily available online and simple to use for anyone with a technical background. The only caveat is collecting and preparing the training data.

In one example, a user swapped the faces of two late-night talk show hosts simply by going through a few of their videos and extracting pictures of their faces. It took him no more than 72 hours on an ordinary GPU.

Perils of Deepfakes

The technology can create havoc in an era where disinformation campaigns are the norm. It can be weaponized for revenge porn, political fake news, propaganda (via sock puppets), and fabricated evidence in court proceedings, among other abuses.

Deepfakes are a potential threat to state institutions, including the military, as fake footage can be used to sow discord among the masses, frame political rivals, or create misunderstandings between people. The public is slowly becoming familiar with the technology, and its easy availability means it may not be long before someone produces a genuinely dangerous video.

Financial Fraud

In 2019, the head of a UK energy firm was scammed out of £200,000 after con artists used deepfake audio of his boss to request an urgent transfer of funds. In another incident, an employee at an unnamed tech firm was approached the same way; that scam failed because the audio sounded slightly robotic, and he flagged it to the legal department. As the technology keeps improving, cases of financial fraud using deepfake audio will multiply.

The financial sector therefore needs far greater awareness, training employees and tightening security measures to reduce the likelihood of fraud. The public also needs educating so that people do not fall for such scams: as the technology becomes more accessible, it will only get easier for con artists to find their marks.

One example is a lawyer in Philadelphia who was nearly scammed by someone impersonating his son, claiming to be in trouble and in need of bail money. The scammer sounded just like his son, down to his choice of words and cadence. He had a narrow escape: he called his daughter-in-law, who alerted her husband, and only when the real son phoned to say it was not him did he realize he had nearly been scammed.

Political Propaganda

Imagine a miscreant making deepfake audio of a political target in which the target makes remarks that could land them in trouble. All it takes is training on as much of the target’s audio as can be gathered to raise the quality, then spreading the clip via messaging apps and other social media platforms. Even if the audio is later proven fake, the damage is done, because ordinary people tend to believe what they hear.

One example is the video of Nancy Pelosi, Speaker of the House, in which the audio was altered to make her appear drunk. Despite repeated requests, Facebook took no timely action to remove it. Partisan viewers chose to believe it was genuine, even though the original footage showed no slurring.

Revenge Porn

Revenge porn is the most dangerous scenario here: miscreants can use a woman’s likeness without her consent to create fake videos and ruin her life. The usual victims are ex-girlfriends or ex-wives. In some cases victims have been blackmailed into paying up under the threat of having the videos sent to family and friends; in others, videos were leaked simply to destroy the woman’s reputation.

Rana Ayyub, a journalist based in Mumbai, was the victim of a deepfake video spread online after she criticized the Indian government in an article. An average viewer would not examine the details and would assume it was really her, even though it was clearly not her body. It was an extremely traumatizing ordeal, and law enforcement authorities refused to cooperate with her even as members of the ruling party openly shared the video.

According to Deeptrace Labs, 96% of fake videos across the internet are of women, mostly celebrities, whose likenesses are used in porn videos without their consent. The firm detected 14,678 deepfake videos across a number of streaming platforms and porn sites, a huge increase from the 7,964 videos detected in December of the previous year.

Sock Puppets

Sock puppets can also use images from generative adversarial networks (GANs) to look believable while spreading propaganda on social networks. A GAN uses neural networks to create realistic-looking pictures from a pool of training photos. The technology is still imperfect: the images usually contain defects a sharp-eyed observer can spot. Advances will eventually make them indistinguishable to the naked eye, but even today the pictures are unique and will not be questioned by the average person, allowing the accounts to pass themselves off as real people. Expect this to get worse as more propaganda campaigns use GAN-generated faces for their sock puppets.
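The adversarial idea behind a GAN fits in a few lines. The sketch below uses tiny random, untrained networks, so the loss values are meaningless numbers; it only illustrates the two objectives being played against each other, not a trainable model.

```python
import numpy as np

rng = np.random.default_rng(1)

def generator(z, W):
    # Maps random noise to a fake "image" (here just a 16-dim vector).
    return np.tanh(z @ W)

def discriminator(x, v):
    # Outputs the probability that x is a real sample.
    return 1.0 / (1.0 + np.exp(-(x @ v)))

W = rng.standard_normal((4, 16)) * 0.1   # generator weights (untrained)
v = rng.standard_normal(16) * 0.1        # discriminator weights (untrained)

z = rng.standard_normal(4)               # noise input
fake = generator(z, W)
real = rng.standard_normal(16)           # stand-in for a real training photo

# The two networks play a minimax game: the discriminator lowers d_loss by
# telling real from fake, while the generator lowers g_loss by fooling it.
d_loss = -(np.log(discriminator(real, v)) + np.log(1.0 - discriminator(fake, v)))
g_loss = np.log(1.0 - discriminator(fake, v))
```

Training alternates updates to the two networks until the generator's outputs are good enough that the discriminator can no longer tell them apart, which is exactly why GAN faces fool casual viewers.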

Limitations and learnings

Believable fake videos take a lot of time to produce: the longer the model trains on the data, the more realistic the results. Electricity and GPU costs add to the total, unless you run the job in the cloud, where you are charged only for GPU time.

Detection

There are ways of detecting fakes, but it takes alertness and the use of AI to fight back. The US is funding work to detect and even trace deepfakes under a DARPA program, and many universities, research teams, and organizations are working on AI-based countermeasures. Microsoft, Facebook, Amazon, and several universities are also backing the Deepfake Detection Challenge so that open-source detection tools become available.

Experts believe blockchains could help by creating a public ledger of all videos as they are published: cryptography prevents tampering, and any changes are recorded in the ledger.

Cryptographic hashes can also be embedded in footage so that it can be authenticated later and any fakes detected.
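The hashing step is straightforward to illustrate with Python's standard hashlib module; the byte strings below are stand-ins for real video files.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    # SHA-256 digest of the raw file bytes; changing even one byte
    # produces a completely different hash.
    return hashlib.sha256(data).hexdigest()

original = b"raw bytes of the published video"
tampered = b"raw bytes of the Published video"  # a single-byte edit

ledger_hash = fingerprint(original)  # recorded in the ledger at publication time

# Later, anyone can re-hash a copy and compare it against the ledger entry.
print(fingerprint(original) == ledger_hash)  # True  -> authentic
print(fingerprint(tampered) == ledger_hash)  # False -> altered
```

A hash alone only proves a file has not changed since the hash was recorded; it cannot prove the original footage was genuine, which is why it is paired with a tamper-evident ledger in the schemes described above.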

Forensics Lab

A good preventive step is a proper forensics lab staffed by AI researchers and experts who can scrutinize videos for authenticity. The necessary research should be conducted in-house or at least obtained from the public domain, where some work is already under way. Staying ahead of the curve can prove quite useful in defusing potential issues.

Safety

Make sure your accounts are private so that you share pictures only with people you trust. Even then, limit what you share: it does not take much material to create a convincing fake video that an average viewer cannot tell apart from the real thing. Conduct regular online searches to check whether someone has uploaded images or videos of you.

Conclusion

The technology will keep improving, which will lead to more problems. All we can do is be prepared and fight back with the same tools that created it in the first place. Vigilance and preventive measures go a long way.

Copyright © 2024 Globalresearchsyndicate.com
