9,435 research outputs found

    DeepFakes: a New Threat to Face Recognition? Assessment and Detection

    It is becoming increasingly easy to automatically replace the face of one person in a video with the face of another by using a pre-trained generative adversarial network (GAN). Recent public scandals, e.g., the faces of celebrities being swapped onto pornographic videos, call for automated ways to detect these Deepfake videos. To help develop such methods, in this paper we present the first publicly available set of Deepfake videos generated from videos of the VidTIMIT database. We used open-source GAN-based software to create the Deepfakes, and we emphasize that training and blending parameters can significantly affect the quality of the resulting videos. To demonstrate this impact, we generated videos of low and high visual quality (320 videos each) using differently tuned parameter sets. We showed that state-of-the-art face recognition systems based on the VGG and FaceNet neural networks are vulnerable to Deepfake videos, with false acceptance rates of 85.62% and 95.00% respectively, which means methods for detecting Deepfake videos are necessary. Among several baseline approaches, we found that an audio-visual approach based on lip-sync inconsistency detection was not able to distinguish Deepfake videos. The best-performing method, which is based on visual quality metrics and is often used in the presentation attack detection domain, resulted in an 8.97% equal error rate on high-quality Deepfakes. Our experiments demonstrate that GAN-generated Deepfake videos are challenging for both face recognition systems and existing detection methods, and further development of face-swapping technology will make them even more so.
    Comment: http://publications.idiap.ch/index.php/publications/show/398
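    The reported figures (false acceptance rates, equal error rate) come from threshold-based comparisons of verification scores. Below is a minimal sketch, not the paper's code, of how such rates can be computed from similarity scores; the score arrays, distributions, and threshold sweep are illustrative assumptions (e.g., cosine similarities from VGG-Face or FaceNet embeddings).

```python
# Minimal sketch (illustrative, not the paper's implementation):
# compute false acceptance rate (FAR), false rejection rate (FRR),
# and equal error rate (EER) from verification similarity scores.
import numpy as np

def far_at_threshold(impostor_scores, threshold):
    """Fraction of impostor (e.g., Deepfake) comparisons accepted as genuine."""
    return np.mean(np.asarray(impostor_scores) >= threshold)

def frr_at_threshold(genuine_scores, threshold):
    """Fraction of genuine comparisons wrongly rejected."""
    return np.mean(np.asarray(genuine_scores) < threshold)

def equal_error_rate(genuine_scores, impostor_scores):
    """Sweep observed scores as thresholds; return the point where FAR ~= FRR."""
    thresholds = np.unique(np.concatenate([genuine_scores, impostor_scores]))
    best_t, best_gap = None, np.inf
    for t in thresholds:
        gap = abs(far_at_threshold(impostor_scores, t) - frr_at_threshold(genuine_scores, t))
        if gap < best_gap:
            best_gap, best_t = gap, t
    far = far_at_threshold(impostor_scores, best_t)
    frr = frr_at_threshold(genuine_scores, best_t)
    return (far + frr) / 2.0, best_t

# Toy usage with made-up score distributions (hypothetical numbers):
rng = np.random.default_rng(0)
genuine = rng.normal(0.8, 0.10, 1000)    # same-person comparisons
deepfakes = rng.normal(0.6, 0.15, 1000)  # Deepfake-vs-target comparisons
eer, thr = equal_error_rate(genuine, deepfakes)
print(f"EER = {eer:.2%} at threshold {thr:.3f}")
```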

    Deepfakes: False Pornography Is Here and the Law Cannot Protect You

    It is now possible for anyone with rudimentary computer skills to create a pornographic deepfake portraying an individual engaging in a sex act that never actually occurred. These realistic videos, called “deepfakes,” use artificial intelligence software to superimpose a person’s face onto another person’s body. While pornographic deepfakes were first created to produce videos of celebrities, they are now being generated to feature other nonconsenting individuals, such as a friend or a classmate. This Article argues that several tort doctrines and recent non-consensual pornography laws are unable to handle published deepfakes of non-celebrities. Instead, a federal criminal statute prohibiting these publications is necessary to deter this activity.

    After Fake News, Fake Video

    Michelle Obama looks into the lens with her typical smile. Moments later she bares her breasts. It never happened, at least not in front of a camera, and yet the video exists. Welcome to the world of deepfakes, a phenomenon that is still a plaything in the hands of internet pirates but has the potential to cause political and other damage.