
How to Spot a Deep Fake Face-Swapped Video

Recently, Reddit has been making news again with a subreddit in which people use a machine learning tool called “Deep Fake” to automatically replace one person’s face with another in a video. Obviously, since this is the internet, people are using it for two things: fake celebrity porn and inserting Nicolas Cage into random movies.
While swapping someone’s face in a photograph has always been relatively easy, swapping someone’s face in a video used to be time-consuming and difficult. Up until now, it’s mainly just been done by VFX studios for big-budget Hollywood movies, where an actor’s face is swapped onto their stunt double. But now, with Deep Fake, anyone with a computer can do it quickly and automatically.
Before going any further, you need to know what a Deep Fake looks like. Check out the SFW video below, which is a compilation of different celebrity face swaps, mainly involving Nic Cage.
The Deep Fake software works using machine learning. It’s first trained with a target face. Distorted images of the target are run through the algorithm and it learns how to correct them to resemble the unaltered target face. When the algorithm is then fed images of a different person, it assumes they’re distorted images of the target, and attempts to correct them. To get video, the Deep Fake software operates on every frame individually.
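The training idea described above can be illustrated with a toy sketch. Everything here is a hypothetical stand-in: the "faces" are small 1-D vectors rather than images, and a linear autoencoder stands in for the deep convolutional networks real Deep Fake tools use. The point is just the mechanism: train a network to undo distortions of person A's face, then feed it person B's face and watch it get "corrected" toward A.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for face images: small 1-D vectors.
target_face = np.linspace(0.0, 1.0, 16)   # "person A" (the training target)
other_face = np.linspace(1.0, 0.0, 16)    # "person B" (fed in after training)

def distort(face):
    """Simulate the warped/noisy training inputs."""
    return face + rng.normal(scale=0.1, size=face.shape)

# A minimal linear autoencoder: encode 16 dims down to 4, decode back to 16.
W_enc = rng.normal(scale=0.1, size=(4, 16))
W_dec = rng.normal(scale=0.1, size=(16, 4))

lr = 0.02
for step in range(3000):
    x = distort(target_face)        # distorted image of the target
    code = W_enc @ x                # encode
    out = W_dec @ code              # decode: attempt to reconstruct the target
    err = out - target_face         # reconstruction error vs. the clean target
    # Gradient descent on mean-squared reconstruction error.
    W_dec -= lr * np.outer(err, code)
    W_enc -= lr * np.outer(W_dec.T @ err, x)

# Now feed a *different* person's face. The network assumes it's a
# distorted image of the target and pulls it toward person A.
swapped = W_dec @ (W_enc @ other_face)
print("output distance to A:", np.abs(swapped - target_face).mean())
print("face B distance to A:", np.abs(other_face - target_face).mean())
```

The output ends up measurably closer to person A than person B's original face is, which is the whole trick. To get video, this correction is simply applied to every frame independently, which is also why the flickering artifacts described below appear.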
The reason that Deep Fakes have largely just involved actors is that there is a lot of footage of them available from different angles, which makes training more effective (Nicolas Cage has 91 acting credits on IMDB). However, given the number of photos and videos people post online, and that you really only need about 500 images to train the algorithm, there’s no reason ordinary people can’t be targeted too, although probably with a little less success.
How to Spot a Deep Fake
Right now, Deep Fakes are pretty easy to spot but it will get harder as the technology gets better. Here are some of the giveaways.
Weird-Looking Faces. In a lot of Deep Fakes, the faces just look weird. The features don’t line up perfectly, and everything appears a bit waxy, like in the image below. If everything else looks normal but the face appears weird, it’s probably a Deep Fake.

Flickering. A common feature of bad Deep Fake videos is the face appearing to flicker and the original features occasionally popping into view. It’s normally more obvious at the edges of the face or when something passes in front of it. If weird flickering happens, you’re looking at a Deep Fake.
Different Bodies. Deep Fakes are only face swaps. Most people try and get a good body match, but it’s not always possible. If the person seems to be noticeably heavier, lighter, taller, shorter, or has tattoos they don’t have in real life (or doesn’t have tattoos they do have in real life) there’s a good chance it’s fake. You can see a really obvious example below, where Patrick Stewart’s face has been swapped with J.K. Simmons in a scene from the movie Whiplash. Simmons is significantly smaller than Stewart, so it just looks odd.

Short Clips. Right now, even when the Deep Fake software works perfectly and creates an almost indistinguishable face swap, it can only really do it for a short amount of time. Before too long, one of the problems above will start happening. That’s why most Deep Fake clips that people share are only a couple of seconds long, the rest of the footage is unusable. If you’re shown a very short clip of a celebrity doing something, and there’s no good reason it’s so short, it’s a clue that it’s a Deep Fake.
No Sound or Bad Lip Syncing. The Deep Fake software only adjusts facial features; it doesn’t magically make one person sound like another. If there’s no sound with the clip, and there’s no reason for there not to be sound, it’s another clue you’re looking at a Deep Fake. Similarly, even if there is sound, if the spoken words don’t match up correctly with the moving lips (or the lips look strange while the person talks, like in the clip below), you might have a Deep Fake.
Unbelievable Clips. This one kind of goes without saying but, if you’re shown a truly unbelievable clip, there’s a good chance you shouldn’t actually believe it. Nicolas Cage has never starred as Loki in a Marvel movie. That’d be cool, though.
Dubious Sources. As with fake photos, where the video supposedly comes from is often a big clue as to its authenticity. If the New York Times is running a story on it, it’s far more likely to be true than something you discover in a random corner of Reddit.
For the time being, Deep Fakes are more of a horrifying curiosity than a major problem. The results are easy to spot, and while it’s impossible to condone what’s being done, no one is yet trying to pass off Deep Fakes as genuine videos.
As the technology gets better, however, they’re likely to be a much bigger issue. For example, convincing fake footage of Kim Jong Un declaring war on the USA could cause a major panic.
Translated from: https://www.howtogeek.com/341469/how-to-spot-a-deep-fake-face-swapped-video/