Deep Truths of Deepfakes — Tech That Can Fool Anyone

Deepfakes: How do they work? And how can we detect them and stop them? Experts believe that blockchain technology can play a part.

A deepfake, in its most basic form, is an amalgamation of voice- and face-cloning AI technology that allows for the creation of computer-generated videos that look like real people.

Developers need tens of hours' worth of footage of the individual whose voice and face are to be cloned. They also need a human imitator who can mimic the facial expressions and voice of the target.

In practice, then, two people are involved in creating a deepfake: the famous individual whose likeness is cloned, and the unknown imitator who drives the performance.

 

From technology to reality

 

Viewed from a technical perspective, visual deepfakes are created by machine learning tools that decode facial images into a matrix of key attributes, such as the position and size of the target's eyes, nose, and mouth. Finer details, such as facial hair and skin texture, are considered secondary and given less weight.

In general, the deconstruction is done in such a way that each face can be fully recreated from its elements. A quality deepfake reconstructs the final image so that the imitator's movements are reflected in the target's face.
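The reconstruction idea described above can be sketched in miniature: if both faces are encoded as matrices of landmark coordinates, the imitator's frame-by-frame displacements can simply be added to the target's neutral face. The five-landmark encoding and all coordinates below are hypothetical, chosen only to make the arithmetic visible; real systems use hundreds of landmarks plus texture models.

```python
import numpy as np

# Hypothetical 5-landmark face encoding: rows are (x, y) positions of
# left eye, right eye, nose tip, left mouth corner, right mouth corner.
target_neutral = np.array([[30.0, 40.0], [70.0, 40.0], [50.0, 60.0],
                           [38.0, 80.0], [62.0, 80.0]])
imitator_neutral = np.array([[32.0, 42.0], [68.0, 42.0], [50.0, 62.0],
                             [40.0, 78.0], [60.0, 78.0]])

# One video frame of the imitator smiling: mouth corners move out and up.
imitator_frame = imitator_neutral.copy()
imitator_frame[3] += [-2.0, -3.0]   # left mouth corner
imitator_frame[4] += [+2.0, -3.0]   # right mouth corner

def transfer_motion(target_neutral, imitator_neutral, imitator_frame):
    """Map the imitator's landmark displacements onto the target's face."""
    displacement = imitator_frame - imitator_neutral
    return target_neutral + displacement

driven = transfer_motion(target_neutral, imitator_neutral, imitator_frame)
print(driven[3], driven[4])  # the target's mouth corners now "smile" too
```

The target's mouth corners move from (38, 80) and (62, 80) to (36, 77) and (64, 77), while the unmoved landmarks stay put; that is the sense in which the imitator's motion is "reflected in the target's face."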

Matthew Dixon, assistant professor and researcher at Illinois Institute of Technology’s Stuart School of Business, explained that both voice and face can be easily recreated using certain programs and techniques.

“Once a person has been digitally cloned it is possible to then generate fake video footage of them saying anything, including speaking words of malicious propaganda on social media. The average social-media follower would be unable to discern that the video was fake.”

Vlad Miller, CEO of Ethereum Express, a cross-platform solution built on its own blockchain with a proof-of-authority consensus protocol, spoke on the finer points of deepfake technology. He explained that deepfakes are a method of synthesizing images using a machine learning technique called a generative adversarial network (GAN), an algorithm that pits two neural networks against each other.

The first network generates image samples, while the second distinguishes genuine samples from counterfeit ones. A GAN's operation can be likened to the work of two people: the first counterfeits, while the second tries to tell originals from copies. Whenever the second network detects a fake, the first produces a more realistic image, and the cycle repeats.
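The counterfeiter-versus-detector loop can be shown with a deliberately tiny numerical sketch, not an image model: an affine "generator" learns to mimic samples drawn from a normal distribution with mean 4, while a logistic "discriminator" learns to tell real samples from generated ones. The architectures, learning rate, and step count here are illustrative choices, but the alternating gradient updates are the GAN training scheme the paragraph describes.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

w_g, b_g = 1.0, 0.0   # generator: g(z) = w_g * z + b_g, z ~ N(0, 1)
w_d, b_d = 0.1, 0.0   # discriminator: D(x) = sigmoid(w_d * x + b_d)
lr = 0.02

for step in range(3000):
    x = rng.normal(4.0, 1.0)          # one real sample from N(4, 1)
    z = rng.normal(0.0, 1.0)          # generator noise
    g = w_g * z + b_g                 # one counterfeit sample

    # Discriminator step: push D(real) toward 1 and D(fake) toward 0.
    d_real, d_fake = sigmoid(w_d * x + b_d), sigmoid(w_d * g + b_d)
    w_d -= lr * ((d_real - 1.0) * x + d_fake * g)
    b_d -= lr * ((d_real - 1.0) + d_fake)

    # Generator step: push D(fake) toward 1, i.e. fool the detector
    # (non-saturating generator loss, so the gradient never dies out).
    d_fake = sigmoid(w_d * g + b_d)
    dloss_dg = (d_fake - 1.0) * w_d
    w_g -= lr * dloss_dg * z
    b_g -= lr * dloss_dg

fakes = w_g * rng.normal(0.0, 1.0, 1000) + b_g
print(float(fakes.mean()))  # should drift toward the real mean of 4
```

Each round of "the detector catches the counterfeit, so the counterfeiter improves" is one pass through the loop; at scale, with deep convolutional networks in place of these two affine maps, the same dynamic produces photorealistic faces.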

Regarding the negative social and political implications that deepfake videos can have on the masses, Steve McNew, an MIT-trained blockchain and cryptocurrency expert and senior managing director at FTI Consulting, told Jangopedia:

“Online videos are exploding as a mainstream source of information. Imagine social media and news outlets frantically and perhaps unknowingly sharing altered clips — of police bodycam video, politicians in unsavory situations or world leaders delivering inflammatory speeches — to create an alternate truth. The possibilities for deepfakes to create malicious propaganda and other forms of fraud are significant.”

Examples of deepfakes being used for nefarious purposes

Deepfake technology can manipulate the likeness and facial features of real-world people, raising many legitimate concerns, particularly regarding its possible use for shady purposes.

Additionally, many tutorials have been available on the internet for years that show how to digitally alter audio/video data so that it can fool facial recognition systems.

Some troubling cases of audio/video manipulation have come to light recently, raising questions about how deepfakes might be abused. A recent article states that deepfake technology has improved to the point where videos can now make targets not only express certain emotions but also appear to belong to certain ethnic groups and age ranges. Martin Zizi, CEO of Aerendir, a provider of physiological biometric technology, pointed this out to Jangopedia:

“AI does not learn from mistakes, but from plain statistics. It may seem like a small detail, but AI-based on plain statistics — even with trillion bytes of data — is just that, a statistical analysis of many dimensions. So, if you play with statistics, you can die by statistics.”

Zizi added that facial recognition relies on neural networks that are fragile in nature. These networks can be compared to cathedrals. If one cornerstone is removed, the entire structure crumbles. Zizi elaborated further on the topic:

“Removing 3 to 5 pixels from a 12-million-pixel image of someone’s face can bring recognition to zero! Researchers have found that adversarial attacks on neural nets can find those 3 to 5 pixels that represent the ‘cornerstones’ in the image.”
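The "cornerstone pixel" fragility Zizi describes can be illustrated with a toy attack. The linear face matcher below is a stand-in assumption (real recognizers are deep networks, and real attacks use gradient search rather than direct weight inspection), but it shows the core idea: when a few inputs carry most of the decision, zeroing just those inputs destroys recognition.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical linear face matcher: score = w . x + b, "recognized" if > 0.
n_pixels = 10_000
w = rng.normal(0.0, 0.01, n_pixels)      # most pixels barely matter...
w[[17, 512, 4096]] = [5.0, 4.0, 3.0]     # ...a few "cornerstone" pixels dominate
b = -5.0

face = rng.uniform(0.5, 1.0, n_pixels)   # a face the matcher recognizes

def recognized(x):
    return float(w @ x + b) > 0.0

# Adversarial "pixel removal": find the pixels contributing most to the
# score and zero them out, analogous to the 3-to-5-pixel attacks described.
contribution = w * face
targets = np.argsort(contribution)[-3:]  # the 3 highest-contribution pixels
attacked = face.copy()
attacked[targets] = 0.0

print(recognized(face), recognized(attacked))
```

Out of 10,000 pixels, changing only three flips the matcher from "recognized" to "not recognized"; removing a cornerstone collapses the cathedral.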

Another example of deepfake tech being used for financial fraud came when the CEO of a UK-based energy company was scammed into transferring 220,000 euros ($243,000) from a company bank account. He believed he was talking to his boss, the chief executive of the firm's parent company; the voice actually belonged to a fraudster using deepfake voice technology.


Blockchain may help against deepfakes

Witness Media Lab’s 72-page report cites blockchain as a valid tool to counter the many digital threats posed by deepfake tech.

This works by using blockchain to digitally sign audio and video files and verify their authenticity. The more digital signatures a video accumulates, the more likely it is to be authentic.
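The sign-then-verify flow can be sketched as follows. Everything here is an illustrative assumption: the HMAC shared-secret signature stands in for the public-key signatures (e.g. Ed25519) a real system would use, and the in-memory `ledger` list stands in for an actual blockchain. The shape of the flow, fingerprint the file, sign the fingerprint, publish the record, re-check on playback, is what the paragraph describes.

```python
import hashlib
import hmac
import time

SECRET_KEY = b"publisher-signing-key"   # hypothetical publisher key

def fingerprint_and_sign(media_bytes: bytes) -> dict:
    """Hash the media file and sign the hash, producing a ledger record."""
    digest = hashlib.sha256(media_bytes).hexdigest()
    signature = hmac.new(SECRET_KEY, digest.encode(), hashlib.sha256).hexdigest()
    return {"sha256": digest, "signature": signature, "signed_at": int(time.time())}

def verify(media_bytes: bytes, record: dict) -> bool:
    """Re-hash the file and check it against the signed ledger record."""
    digest = hashlib.sha256(media_bytes).hexdigest()
    expected = hmac.new(SECRET_KEY, digest.encode(), hashlib.sha256).hexdigest()
    return digest == record["sha256"] and hmac.compare_digest(expected, record["signature"])

ledger = []                                  # stand-in for an on-chain record
video = b"\x00\x01 raw video bytes \xff"
ledger.append(fingerprint_and_sign(video))

print(verify(video, ledger[0]))              # untouched file checks out
print(verify(video + b"tamper", ledger[0]))  # any altered copy fails
```

Because even a one-byte edit changes the SHA-256 fingerprint, a deepfaked copy of a signed original can never match its ledger record.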

Greg Forst, director of marketing at Factom Protocol, commented that blockchain could offer a unique solution to deepfakes, or at least a significant part of one. He noted:

“If video content is on the blockchain once it has been created, along with a verifying tag or graphic, it puts a roadblock in front of deepfake endeavors. However, this hinges on video content being added to the blockchain from the outset. From there, digital identities must underline the origins and creator of the content. Securing data at source and having some standardization for media will go a long way.”

McNew pointed to the blockchain's immutability: once a data block is confirmed by the network, it cannot be altered. If videos or photos were uploaded to a blockchain verification app before being shared, altered copies could readily be identified as fake.
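The immutability McNew describes comes from hash-chaining, and a minimal sketch makes the mechanism concrete. This toy chain omits everything a real blockchain adds (consensus, signatures, distribution across nodes); it only shows why a confirmed entry cannot be quietly edited: each block commits to the hash of the previous one, so changing any block breaks every link after it.

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Deterministic SHA-256 hash of a block's contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, media_digest: str) -> None:
    """Append a block committing to both the media hash and the prior block."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev": prev, "media_sha256": media_digest})

def chain_is_valid(chain: list) -> bool:
    """Re-derive every link; any edited block breaks the chain."""
    return all(chain[i]["prev"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain: list = []
for clip in (b"bodycam.mp4", b"speech.mp4", b"interview.mp4"):
    append_block(chain, hashlib.sha256(clip).hexdigest())

print(chain_is_valid(chain))         # the untouched chain verifies
chain[0]["media_sha256"] = "f" * 64  # tamper with a "confirmed" block...
print(chain_is_valid(chain))         # ...and the chain no longer verifies
```

On a real network the re-derived hashes are checked by every node, which is why rewriting a confirmed record is impractical rather than merely detectable.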

Miller also shared a similar opinion. He believes that using blockchain technology with artificial intelligence could help to solve many privacy and security issues raised by deepfakes. He also added:

“AI is able to collect, analyze, sort, and transmit data efficiently, improving the quality and speed of execution of internal processes.” The blockchain, in turn, protects data and its sequence from any interference.

 

Blockchain technology is not without its limitations


There are some limitations to blockchain technology that prevent it from being used actively to monitor deepfakes online. The technology's scalability is limited, given the intense computational and memory requirements of fighting digitally altered audio/video data in real time.

Another potential problem is that blockchain-based deepfake detection could significantly curb crowdsourced video content, such as that currently hosted on YouTube. Dixon raised the following concern:

“How can someone from a poor country reach out to the world with their message, if they must be approved by a Silicon Valley-based firm? Do we trust tech companies with such power or should we? When trust is weakened, liberty is at risk.”

Hibryda, creator and founder of Bitlattice, a distributed ledger system that uses a multidimensional lattice to address issues such as security, timing, and scalability, offered his view:

“The biggest problem with blockchain tech is its inability to determine whether signed media is genuine. That is not a shortcoming of blockchain or related technologies, which only provide ledgers that are extremely difficult to manipulate; the problem is external, and there is no real way to fix it. Crowd-powered verification could be a partial solution, but it is very hard to build a system that provides objective and reliable fact-checking.”

Forst told Jangopedia, however, that while most people assume using blockchain to detect deepfakes is too costly, several open-source solutions exist. He stated that although blockchain cannot solve deepfake detection in its entirety, it can address a portion of the problem.
