I wonder if adversarial attacks would work to counter this.
The only way that would help is if you’re completely disconnected from the system and thus have nothing to take. They’re taking photos of you from online and deepfaking them into scenarios, so your adversarial perturbation would need to be applied, from the beginning to the end of your online existence, to the part of you that you want to protect (your face). If it’s only nearby, the attacker can simply crop it out before feeding the image to the machine.
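For anyone curious what “adversarial perturbation” means concretely: the classic idea is the fast gradient sign method (FGSM), which nudges every pixel a tiny step in the direction that most confuses the model. A minimal sketch below, using a toy logistic-regression “recognizer” rather than a real face-recognition network; all names and numbers are illustrative, not from any actual protection tool:

```python
import numpy as np

def fgsm_perturb(x, w, b, y, eps):
    """FGSM on a toy logistic-regression model.

    x: input vector (think: flattened image pixels)
    w, b: the model's weights and bias
    y: true label (0 or 1)
    eps: per-pixel perturbation budget
    """
    # forward pass: predicted probability via the sigmoid
    p = 1.0 / (1.0 + np.exp(-(w @ x + b)))
    # gradient of the cross-entropy loss with respect to the input
    grad_x = (p - y) * w
    # move each pixel one small step in the direction that raises the loss
    return x + eps * np.sign(grad_x)

x = np.array([0.2, 0.8, 0.5])       # "clean photo"
w = np.array([1.0, -2.0, 0.5])      # toy model weights
x_adv = fgsm_perturb(x, w, b=0.1, y=1, eps=0.05)
# each pixel changed by at most eps, so the image looks the same to a human
assert np.max(np.abs(x_adv - x)) <= 0.05 + 1e-12
```

The catch, as noted above, is that the perturbation lives in the pixels of the published photo, so it only protects images you control; any clean photo of you that someone else uploads is unprotected.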
That’s true. I don’t personally post photos of myself online, but I suppose other people have probably put up pictures I’m in at some point. I don’t think it would need to be all your photos, though, just a large enough number to poison the dataset. I could be wrong about that.
I do wonder if adversarial manipulation will become a service offered by image hosts in the future, similar to metadata stripping.
There is a website that will generate deepfakes for a fee. I heard about it being done to women who live-stream on Twitch or wherever, because there are massive datasets of them available. So now their fans (if you’d call them that) are harassing them by making pornos of them. The story I heard was about another streamer, a guy, who had done this and accidentally pulled it up while he was live, so it was seen by many people. And the woman who was the subject of the porno was really shocked, because they were friends and colleagues.
I know exactly who you’re talking about. As far as I know, that website has since been shut down, so that’s at least good news. The streamer who pulled it up has since funded a few female streamers, helping them cover the fees of paid services that combat deepfake pornography, which is good, though it’s obviously still awful what he did in the first place.
You mean the guy who commissioned the deepfake has not only regretted it but has also put his own money towards preventing it going forward?
If I understand correctly, then it’s nice to hear about someone who doesn’t dig in their heels in a moment of humiliation.
The streamer who was caught using that deepfake website. To stop beating around the bush: Atrioc.
And yes, it’s my understanding that he has put down more than $60k, iirc, towards helping streamers prevent it. He talks about it in this video