These people do not exist. They’re deep fakes

Images via Thispersondoesnotexist.com

Take a close look at the collection of photos at the top of the page. Do these people look familiar? They shouldn’t. They’re not even people! These insanely accurate images are deep fakes created by artificial intelligence (AI). You can see the algorithm at work at Thispersondoesnotexist.com.

The tool combines neural networks developed by Nvidia with a website built by Uber engineer Philip Wang. The neural networks (machine learning systems loosely modelled on the human brain) are, more specifically, ‘generative adversarial networks’, better known as GANs. This GAN works without human supervision: it learns from a huge set of publicly available photos of faces and combines what it has learned, at several levels of detail, to create brand new images of people who don’t actually exist.

What makes Nvidia’s technology so convincing is a second network within the GAN, the ‘discriminator’, which compares the generated faces against real photos and judges whether they look realistic. If you head to Thispersondoesnotexist.com and refresh the page, you’ll be served a never-ending array of realistic human faces generated by the technology. These kinds of AI-created images are often referred to as ‘deep fakes’.
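For the curious, here’s a stripped-down Python (PyTorch) sketch of that adversarial tug-of-war: a ‘generator’ invents images from random noise, while a ‘discriminator’ learns to tell them apart from real photos, and each improves by trying to beat the other. The network sizes, toy image dimensions and stand-in training data below are placeholders for illustration only, not Nvidia’s actual StyleGAN code.

```python
# Minimal GAN sketch (illustrative only, not Nvidia's StyleGAN).
import torch
import torch.nn as nn

LATENT_DIM = 64       # size of the random "noise" the generator starts from
IMAGE_DIM = 28 * 28   # toy image size; real face models are far larger

# Generator: random noise in, fake image out
generator = nn.Sequential(
    nn.Linear(LATENT_DIM, 256), nn.ReLU(),
    nn.Linear(256, IMAGE_DIM), nn.Tanh(),
)

# Discriminator: image in, "probability it's real" out
discriminator = nn.Sequential(
    nn.Linear(IMAGE_DIM, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1), nn.Sigmoid(),
)

loss_fn = nn.BCELoss()
g_opt = torch.optim.Adam(generator.parameters(), lr=2e-4)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=2e-4)

def train_step(real_images: torch.Tensor) -> None:
    batch = real_images.size(0)
    real_labels = torch.ones(batch, 1)
    fake_labels = torch.zeros(batch, 1)

    # 1. Train the discriminator: call real photos real, call fakes fake
    noise = torch.randn(batch, LATENT_DIM)
    fakes = generator(noise).detach()
    d_loss = loss_fn(discriminator(real_images), real_labels) + \
             loss_fn(discriminator(fakes), fake_labels)
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # 2. Train the generator: try to make the discriminator call fakes real
    noise = torch.randn(batch, LATENT_DIM)
    g_loss = loss_fn(discriminator(generator(noise)), real_labels)
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()

# One step with random stand-in "real" images, just to show the loop runs
train_step(torch.rand(32, IMAGE_DIM) * 2 - 1)
```

In the real system both networks are far deeper and are trained on a huge collection of high-resolution face photos over days of GPU time, but the back-and-forth between generator and discriminator is the same basic idea.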

Image via YouTube explainer on Nvidia’s research

You can head to YouTube to get a better idea of how this technology blends together different elements of photographs, like facial features, colouring and hairstyles, to create convincing imagery.
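To make that blending a little more concrete, here’s a rough, purely illustrative sketch of the ‘style mixing’ idea behind this kind of generator: early layers set the coarse structure of a face (shape, pose) and later layers set the finer detail (hair texture, colouring), so feeding each stage a different person’s latent code combines the two. The layer counts, latent sizes and the `generator.synthesise` call below are hypothetical stand-ins, not Nvidia’s real code.

```python
# Illustrative "style mixing" sketch: blend coarse features of person A with
# fine features of person B by giving each generator layer a different code.
import numpy as np

rng = np.random.default_rng(0)
latent_a = rng.normal(size=512)   # "person A" latent code (hypothetical)
latent_b = rng.normal(size=512)   # "person B" latent code (hypothetical)

NUM_LAYERS = 14                   # placeholder layer count
CROSSOVER = 6                     # layers 0-5 take A's code, the rest take B's

# One latent code per generator layer: coarse layers from A, fine layers from B
per_layer_codes = [latent_a if i < CROSSOVER else latent_b
                   for i in range(NUM_LAYERS)]

# A real generator would now turn these codes into an image, e.g.:
# image = generator.synthesise(per_layer_codes)   # hypothetical call
print(f"{CROSSOVER} coarse layers from A, {NUM_LAYERS - CROSSOVER} fine layers from B")
```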

Reckon you could tell a real face from a deep fake? Take the quiz!

Images via Thispersondoesnotexist.com

The technology isn’t quite perfect – how many errors can you spot in the faces above? But if the brisk advancement of technology in the 21st century has taught us anything, it’s that these images are only going to get more convincing with time. So, what kind of devious applications could deep fakes have in the future?

3 ways deep fakes might be misused in future

1. Fake news

Donald Trump’s favourite catchphrase might induce eye-rolls at present, but this type of technology poses a very real (and scary) threat to the legitimacy of news as we know it. Last year a Belgian political party released a fake video of Trump addressing Belgian policies on climate change. Despite the video’s terrible quality and the party’s humorous intentions, many Belgian viewers were outraged, believing the video to be real.

You may also remember this PSA from Obama, or, um, Jordan Peele, which demonstrates how easy it might be to spread realistic fake videos in the future.

2. Cyber crime

A recent rise in phishing scams involving iTunes gift cards reminds us how easy it is to get scammed in the digital age. If this technology gets into the wrong hands (which it most certainly will – Nvidia has made the code publicly available), scammers could, for example, populate entire fake social media profiles with convincing images of people who don’t exist.

We love this episode of Reply All for explaining how effective modern scams can be. And no, we’re not talking about emails from a long-lost Nigerian cousin.

3. Non-consensual image use

Deep fakes already have a history of misuse. For example, faces of women taken from the internet have been overlaid by AI onto pornographic imagery, sparking a conversation around the criminality of non-consensual image use.

Think about it: in the same way, hackers could frame you for a robbery by overlaying your face onto incriminating CCTV footage, or ruin your reputation by leaking a forged video of you spewing racial slurs.

This is all starting to sound like a Black Mirror episode to us!

How do you think this technology could be used in future? Let us know in the comments!


Author: Eliza Brockwell

Eliza is the Digital Producer for Careers with STEM. She is passionate about creating content that encourages diversity of representation in STEM.
