Abstract
In this paper, we propose a new method to expose AI-generated fake face images or videos (commonly known as Deep Fakes). Our method is based on the observation that Deep Fakes are created by splicing a synthesized face region into the original image, and in doing so introduce errors that can be revealed when 3D head poses are estimated from the face images. We perform experiments to demonstrate this phenomenon and further develop a classification method based on this cue: using features derived from the estimated head poses, an SVM classifier is trained and evaluated on a set of real face images and Deep Fakes.
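The pose-inconsistency cue can be illustrated with a short sketch. The following is a minimal, illustrative Python example and not the authors' implementation: it assumes 2D facial landmarks (e.g., 68-point detections) and a generic 3D facial model are available, estimates a head rotation vector from the full landmark set and from a central-face subset using OpenCV's solvePnP, and uses the difference as a feature for an SVM. The landmark subset indices, 3D model points, and camera intrinsics are placeholder assumptions.

```python
# Illustrative sketch (not the paper's exact method): compare 3D head poses
# estimated from two facial-landmark subsets and feed the difference to an SVM.
import numpy as np
import cv2
from sklearn.svm import SVC


def estimate_pose(landmarks_2d, model_3d, frame_size):
    """Estimate a head rotation vector from 2D landmarks via PnP.

    landmarks_2d : (N, 2) array of image landmarks
    model_3d     : (N, 3) array of corresponding generic 3D model points
    frame_size   : (height, width) of the image
    """
    h, w = frame_size
    focal = w  # crude focal-length assumption
    camera_matrix = np.array([[focal, 0, w / 2.0],
                              [0, focal, h / 2.0],
                              [0, 0, 1]], dtype=np.float64)
    dist_coeffs = np.zeros((4, 1))  # assume no lens distortion
    ok, rvec, tvec = cv2.solvePnP(model_3d.astype(np.float64),
                                  landmarks_2d.astype(np.float64),
                                  camera_matrix, dist_coeffs)
    return rvec.ravel()


def pose_difference_feature(landmarks_2d, model_3d, central_idx, frame_size):
    """Feature: difference between the pose estimated from all landmarks and
    the pose estimated from a central-face subset (central_idx is illustrative)."""
    r_whole = estimate_pose(landmarks_2d, model_3d, frame_size)
    r_center = estimate_pose(landmarks_2d[central_idx],
                             model_3d[central_idx], frame_size)
    return r_whole - r_center


# Training an SVM on such features (X: feature vectors, y: 0 = real, 1 = fake):
# clf = SVC(kernel="rbf").fit(X_train, y_train)
# predictions = clf.predict(X_test)
```

In a spliced face, the central region comes from the synthesized face while the outer landmarks follow the original head, so the two pose estimates tend to disagree more than for a genuine face; that discrepancy is what the classifier exploits.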
Publication Info
- Year: 2019
- Type: article
- Pages: 8261-8265
Identifiers
- DOI: 10.1109/icassp.2019.8683164