Persona (non) grata

Keep your head


Tuesday, March 20, 2018

Our ancestors didn't need Photoshop to doctor pictures in which the head of one person is attached to the body of another: artists have been using tricks of this kind since ancient times. But even after sophisticated image editors appeared, creating a fake that looks realistic still required considerable skill. Everything changes, though, and machine learning is letting ordinary users accomplish things they could previously only dream of.

A Reddit user going by the name DeepFakes trained neural networks to generate smut videos “starring” celebrities.

Face2Face technology replaces the actors’ faces in real time.

To create videos like this, the author used publicly available how-to guides and machine-learning frameworks such as TensorFlow, which Google makes available free of charge to researchers, postgraduate students, and anyone else interested in machine learning.

https://habrahabr.ru/post/346478

Naturally, anyone can use the technique. Of course, videos of this kind can't be regarded as evidence in a criminal investigation: they only appear genuine at first glance, since the clips contain distortions and the face tracking doesn't work properly. Still, the untrained eye will have trouble exposing a fake.

However, the system can still be employed by unscrupulous individuals to damage the reputation of people they don't like. Suppose two women get into a quarrel, and the next morning videos showing them naked pop up on social media. The resolution is low and distortions have deliberately been introduced, so who will notice that the body and the face actually belong to different people?

Frankly, we didn't plan to write about this topic: after all, it has little to do with information security, and the Internet has never been short of fakes. But the technology is gaining in popularity, and that's why we feel compelled to warn our readers about it.

Two months ago, Motherboard contacted Peter Eckersley, Chief Computer Scientist for the Electronic Frontier Foundation, to discuss the technology's possible impact on people's daily lives. "You can make fake videos with neural networks today, but people will be able to tell that you've done that if you look closely, and some of the techniques involved remain pretty advanced. That's not going to stay true for more than a year or two," Eckersley said.

As it turned out, two months was more than enough. New porn clips "featuring" Jessica Alba and Daisy Ridley appeared on Reddit, and their quality is far superior to that of the videos that appeared in December.

In fact, the clips turned out to be so convincing that people who maintain sites hosting celebrity nude photos perceived them as genuine. A fake video of Emma Watson taking a shower was uploaded to one of the most popular of these sites and was accompanied by this description: "the never-before-seen video above is from my private collection". Meanwhile, Emma Watson had nothing to do with the video except for the fact that a photo of her face was used without her consent.

A user-friendly application lets anyone easily create fake videos, ranging from revenge clips featuring a former girlfriend to footage of high-ranking government officials saying whatever the video maker wants.

https://motherboard.vice.com/en_us/article/bjye8a/reddit-fake-porn-app-daisy-ridley

Can a fake be exposed? That’s often possible, but:

As you learn more about Image Forensics, you start realising there is always a way to get around any technique used to expose fakes. Some can easily be bypassed by JPEG compression. In other cases, colour correction, blurring, image resizing or rotating an image to arbitrary angles will do the job. When magazine or TV output is converted to digital format, errors occur too, and this also complicates any analysis. And it’s at this moment that you start realising an image can be tweaked in Photoshop in such a way that no one will be able to prove it’s a fake.

http://vas3k.ru/blog/390
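
The quoted paragraph lists several ordinary editing operations that disturb the traces forensic analysis relies on. Below is a minimal sketch of what such "laundering" looks like in practice; it assumes Python with the Pillow imaging library is available, and the file names are purely hypothetical.

    from PIL import Image

    # Hypothetical input file; any JPEG photo will do.
    img = Image.open("suspect.jpg")

    # 1. Re-save with lossy JPEG compression: the original compression
    #    blocks are rewritten, which already frustrates error-level analysis.
    img.save("step1_recompressed.jpg", "JPEG", quality=70)

    # 2. Resize slightly: resampling blends neighbouring pixels together.
    smaller = img.resize((img.width * 9 // 10, img.height * 9 // 10), Image.LANCZOS)

    # 3. Rotate by an arbitrary angle: interpolation rewrites every pixel again.
    rotated = smaller.rotate(3, resample=Image.BICUBIC, expand=True)

    # Save the "laundered" copy; the statistical traces a forensic tool
    # would look for have now been disturbed several times over.
    rotated.save("laundered.jpg", "JPEG", quality=70)

None of these steps requires any special skill, which is why an image that has merely been re-posted and re-compressed a few times becomes so difficult to analyse.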

#technologies

The Anti-virus Times recommends

Just be aware of this new fabrication technique. Be prepared, and don't fall for any tricks.
