Deepfakes, fake videos generated by artificial intelligence, could become a major problem in the future. Videos in which one actor’s face is replaced by another’s can look funny. But the same tools could one day be used to create malicious fabrications and manipulate public opinion. In a world where you can’t tell a real video from a fake, you have to question everything you see on the screen.
However, today deepfakes can still be recognized with the naked eye, if you know where to look. The technology is imperfect and leaves traces in the form of small graphic artifacts in the frame. Here are five ways to spot a fake.
Resolution difference
Pay attention to the difference in resolution between the face and the rest of the video. The sources from which the artificial intelligence extracts the “donor” face are usually high-resolution, which the neural network needs to produce good results. But for the same reason, the transferred face in the final video may look sharper and more detailed than the rest of the frame.
This effect is most clearly seen in a humorous video where actor Steve Buscemi’s face was transferred onto Jennifer Lawrence’s body:
Notice how blurry the Golden Globes logo is in the background compared to Buscemi’s crisp and bright face.
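If you want to go beyond eyeballing it, the sharpness gap can be measured. A common heuristic is the variance of the image Laplacian: regions with fine detail score higher than blurred ones. The sketch below is a hypothetical illustration (NumPy only, synthetic patches rather than real video frames), comparing a textured patch against a blurred copy of itself:

```python
import numpy as np

def sharpness(patch: np.ndarray) -> float:
    """Variance of a discrete Laplacian: higher values mean more
    fine detail (a common focus/sharpness heuristic)."""
    # 4-neighbour Laplacian computed via shifted slices
    lap = (-4 * patch[1:-1, 1:-1]
           + patch[:-2, 1:-1] + patch[2:, 1:-1]
           + patch[1:-1, :-2] + patch[1:-1, 2:])
    return float(lap.var())

# Synthetic demo: a "crisp" textured patch vs a blurred copy of it.
rng = np.random.default_rng(0)
crisp = rng.random((64, 64))
# crude blur: average each pixel with four of its neighbours
blurred = (crisp[:-2, :-2] + crisp[:-2, 2:] + crisp[2:, :-2]
           + crisp[2:, 2:] + crisp[1:-1, 1:-1]) / 5.0

print(sharpness(crisp) > sharpness(blurred))  # True: the crisp patch scores higher
```

Applied to a real frame, you would compare the score of the face region against the score of a background region of similar size; a face that is dramatically sharper than its surroundings is a warning sign.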
A blurry frame around the face
Artificial intelligence does a great job of transferring one face onto another. But when the underlying face is partially covered by an object in the frame, the neural network starts to fail. At such moments, the overlaid “sticker” face shifts slightly, and a barely noticeable blurred frame may appear around it.
This effect is most noticeable in the following video: scenes from The Shining in which Jack Nicholson’s face was replaced with that of another actor, Jim Carrey:
For most of the video, the overlay looks very natural. But at the moment when the character peers through the gap and delivers his legendary line “Here’s Johnny!”, you may notice a slight glitch in the image.
Inconsistent scaling
If the shapes of the “donor” and “recipient” skulls are very different, it is difficult for artificial intelligence to fit the face of one actor to the head of another. As a result, the donor face may scale differently depending on the camera angle. This is very noticeable in the video where they tried to transplant the face of Sylvester Stallone onto the body of Arnold Schwarzenegger in Terminator 2:
The resulting character looks different when the camera faces him head-on (0:57) and when it looks slightly down at him (2:46). When the camera moves relative to the character, the “sticker” face begins to flicker as it tries to adjust to the changing angle.
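That flicker is a temporal artifact, so one crude way to surface it is to track how much the face region changes from frame to frame: a stable shot produces small, smooth differences, while a re-fitting overlay produces sudden spikes. The sketch below is a hypothetical illustration using a synthetic NumPy "clip" rather than real footage:

```python
import numpy as np

def frame_deltas(frames: np.ndarray) -> np.ndarray:
    """Mean absolute difference between consecutive frames
    (frames: T x H x W greyscale array). Spikes suggest flicker."""
    return np.abs(np.diff(frames, axis=0)).mean(axis=(1, 2))

# Synthetic clip: mostly static frames with one sudden jump,
# mimicking the overlay snapping to a new camera angle.
frames = np.zeros((10, 32, 32))
frames[5:] += 0.5  # abrupt change between frame 4 and frame 5

deltas = frame_deltas(frames)
print(deltas.argmax())  # 4: the largest jump is between frames 4 and 5
```

On real video you would crop the face region first, since global camera motion also produces large deltas; this sketch only shows the basic signal.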
Boundary offset
Artificial intelligence does not always understand what size the “sticker” should be. In some frames it uses the whole face; in others, only a fragment of it. As a result, the border of the “donor” face can shift, periodically exposing the features of the original.
This effect can be seen in a video where comedian Bill Hader’s face is periodically replaced with Tom Cruise’s:
Hader and Cruise have similar face shapes, so this deepfake is done very well. Even so, the cheekbones and chin give it away. This is noticeable at the 1:03 mark, although, admittedly, if we had not known about the substitution in advance, we would never have spotted it.
Skin tone mismatch
When overlaying a face, the neural network tries to match its hue to the skin tone and lighting of the original video. It doesn’t always manage to do this perfectly. In a video where Jim Carrey’s face is transplanted onto Alison Brie’s body, this shortcoming is particularly evident:
Look closely at the skin tone around the edges and in the center of the face. Jim Carrey’s skin is more yellow than Alison Brie’s. This is especially noticeable on the cheekbones, where the boundary of the overlay passes. As a result, the face looks as if it were smeared with dark foundation.
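This kind of seam can also be checked numerically: sample the average color at the center of the face and again near the overlay boundary, then compare per-channel. The values below are made up purely for the demo; on a real frame you would crop the two regions from the image instead:

```python
import numpy as np

def mean_tone(region: np.ndarray) -> np.ndarray:
    """Average RGB colour of an image region (H x W x 3, values 0-255)."""
    return region.reshape(-1, 3).mean(axis=0)

# Hypothetical patches: one sampled at the centre of the face, one at
# the edge where the overlay boundary passes (synthetic values).
center = np.full((8, 8, 3), [200.0, 180.0, 120.0])  # yellowish tone
edge   = np.full((8, 8, 3), [190.0, 170.0, 160.0])  # pinker tone

diff = np.abs(mean_tone(center) - mean_tone(edge))
print(diff)  # a large per-channel gap hints at an overlay seam
```

A consistent, well-lit face should show only small differences between the two samples; a large gap concentrated along the overlay boundary is exactly the “dark foundation” effect described above.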
Of course, deepfakes will keep improving. In a couple of years we may see fakes that are impossible to recognize without specialized software. That is all the more reason to make your approach to information consumption more balanced and selective. Technology is creating an unusual world in which you can no longer trust your own eyes, so anyone who wants to retain the ability to think critically will have to learn to select and analyze facts carefully for themselves.