Melting_Harps
> The documentary is available on iPlayer¹; You may need to be in the UK or have an IP that at least looks that way to see it :/
I watched it, and I recall these events very well as I saw them unfold in real time. To be honest, the AI filter is rather noticeable and really no better than the home-brew deepfake stuff you've already seen. What that means is that the masses have access to tools as sophisticated as the MSM platforms', which is in itself remarkable given how quickly the barrier to entry for building with these tools has dropped.
It's a good watch and should be on people's watchlists amongst other documentaries like 'Revolution of Our Times' [0] and 'Faceless' [1], which both use low-tech masks to achieve the same end. Some have said it's a gimmick, which I partly agree with, but it serves to reinforce the fact that these surveillance tools can and should be thwarted with some basic understanding of OPSEC and precaution. They didn't mention that the HK Police and CCP gathered a ton of data from publicly accessible information while protestors were taking the underground to and from protests; when the arrests came down in 2020-22, people were tracked down for violating the NSL, and everyone from activists to politicians to even a media tycoon like Jimmy Lai was taken down on fake charges.
It's a sad state of affairs, but it's a reminder of how quickly an affluent, well-educated society can come under the yoke of tyranny and lose decades of progress to the expansion of authoritarianism. It should also serve as a reminder of why 'this matters,' despite it being an unpleasant aspect of 'civic duty.'
0: https://www.hkdc.us/revolution-of-our-times
1: https://www.youtube.com/watch?v=j5YKKPizQi8
cjbayliss
Another interesting example of face swapping is DeepPrivacy. [1]
Example of it used by Benn Jordan (aka, The Flashbulb) [2]
hjanssen
It says in the article that the faces are swapped with those of actors. Are those something like film actors? That would actually be amazing, completely dodging the drawback several commenters here have mentioned: that the faces of innocent people could be used.
Actors' faces are already public. Then again, I would probably be pissed if my face were used as a mask to say something I might not agree with. It's an interesting question whether or not that is ethical.
paranoidrobot
Using actors in news stories is pretty standard when the interviewee does not want to appear on camera, or use their own voice.
There's a long history of this kind of masking:
- Backlighting, with the face in deep shadow. Possibly with the original interviewee or an actor.
- Voice-Masking with some kind of voice-changer
- Using a voice-actor to say the words.
I would see this as the next evolution of this process.
So long as the actor whose likeness is used has given their consent (they were hired for this specific job), and the facial expressions/behaviors are genuinely recreated, I don't see any ethical or moral issues.
movedx
> Interesting question if that is ethical or not.
It's likely not ethical, and I'd wager it's probably illegal. For example, could you set up a billboard at the side of a highway with Tom Hanks' face on it and a quote attacking trans rights, making a homophobic remark, or perhaps an anti-CCP statement? Probably not.
I'd say this is no different.
brookst
I didn’t see anything indicating whether or not the actors had licensed their likeness for this use. Wouldn’t that matter?
movedx
It probably would, yeah. And I would imagine the BBC would have such (concrete) agreements in place, for sure.
ethbr0
It depends on the country.
IANAL, but in the US you'd need (a) rights to the image being used (e.g. shot yourself, in public) & (b) a damn good argument that your juxtaposition of their likeness and your words doesn't cause them monetary damages (e.g. in the form of lost revenue from reputation).
Eleison23
>IANAL, but in the US you'd need (a) rights to the image being used (e.g. shot yourself, in public) & (b) a damn good argument that your juxtaposition of their likeness and your words doesn't cause them monetary damages (e.g. in the form of lost revenue from reputation).
Well... "need" is, as I often say, quite a strong word.
Might makes right, and money talks, so if you don't want to bother getting the rights, or the person who you want to exploit can't fight back, why not?
partiallypro
This seems to have a glaring flaw in that a fake AI generated person could have a real-world doppelganger, which a tyrannical government could mistakenly arrest, or worse.
kristofferc
A real person can also have a real-world doppelganger. In fact, isn't this just as likely as with an AI-generated person, maybe even more so? So the swap didn't really change anything when it comes to "innocent" people getting targeted?
richbell
It's a novel approach, but like you've said, I think the potential negatives far outweigh the positives. Perhaps they could use obviously fake faces with weird proportions and features?
Idk, this seems like a case of using cool, flashy tech because it's cool and flashy, not because it's better than the alternatives (using an actor to portray the subject, or blurring or blacking out the subject's face).
trompetenaccoun
With face recognition software and automated edits, they could run any protest footage from autocratic regimes like Iran or China through a filter that obscures everyone's face and replaces it with an AI-generated one. All it would cost is a few extra minutes of editing time, though I'm not sure whether even that is acceptable in the news business, where there's a lot of time pressure.
I think that would be a really cool feature that could help keep protesters safe. Dissidents could send footage to trustworthy outlets who would only make public the edited footage. Because of course people still want to share the event with the world, but they might not want to be identified in a place where if caught they get tortured, raped or even killed.
The individual face isn't really important most of the time; it's not as if it makes a difference to the viewer. They could still hand the unedited footage to police in places with legitimate rule of law, if the government has a court order and can prove a crime was committed by an individual they have on tape.
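The workflow described above can be sketched in a few lines. To be clear, this is a hypothetical outline, not anything the BBC has published: `synthetic_face_id` and `anonymise_footage` are made-up names, detection and rendering are stubbed out (a real pipeline would plug in a face detector and a generative model), and frames are simplified down to lists of detected identities. The one piece of real logic is keeping the mapping consistent, so each person keeps the same synthetic face across the whole clip:

```python
import hashlib

def synthetic_face_id(real_face_id: str, salt: str) -> str:
    """Derive a stable pseudonymous face id for a detected identity.

    A salted hash keeps the mapping consistent across frames (one
    protester keeps one synthetic face) while being hard to reverse
    for anyone without the salt.
    """
    return hashlib.sha256((salt + real_face_id).encode()).hexdigest()[:12]

def anonymise_footage(frames, salt="keep-this-secret"):
    """Replace every detected face id with its synthetic counterpart.

    `frames` is a list of frames; each frame is a list of detected
    face ids (a stand-in for what a real detector would emit).
    """
    mapping = {}  # real id -> synthetic id, shared across frames
    out = []
    for frame in frames:
        new_frame = []
        for face in frame:
            if face not in mapping:
                mapping[face] = synthetic_face_id(face, salt)
            new_frame.append(mapping[face])
        out.append(new_frame)
    return out
```

The salt means the mapping is stable within one edit session but can't be recomputed later by anyone who only holds the published footage.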
justsomehnguy
I think that would be a really helpful feature to convince Regular Joe that there are protests today in *checks notes* Eastasia.
jlarocco
Seems like a publicity stunt.
A face covering would cost nothing and achieve the end goal of hiding their identity.
sportslife
This achieves multiple goals: hiding identity, showing that an anonymous source exists, and showing facial emotion to viewers.
The latter is a huge win for making media people want to watch. Same reason all those cable stations segment the screen into four with a face in each corner, or why streamers overlay their gameplay with their face on a webcam. We like faces.
jlarocco
> The latter is a huge win for making media people want to watch.
I get that people like faces, but the news should be real. Maybe I'm expecting too much of the BBC, but it seems pretty short-sighted to start carrying fake news, even if they are only starting with a fake face.
Eleison23
I guess you're right, but if it's on the news, shouldn't we expect factual data rather than "a pageant" as De Niro repeated in "Wag the Dog"?
josephcsible
But since it's publicly known that the faces are fake, the government isn't going to try to use them to identify anyone.
dmix
A "glaring flaw" because of some extreme and unlikely scenario?
Why would governments be using AI-modified content to look for people to arrest? Is this really a serious enough risk to dissuade using it? It seems pretty unlikely that it would simultaneously a) be used exclusively as the source and b) actually match someone IRL.
mansion7
The USA itself put out a nationwide manhunt, distributing video to news outlets, asking the public to assist in identifying protesters whose only known alleged crime was trespassing.
janalsncm
If you look at the J6 convictions it is for far more than trespassing. Interrupting the electoral college count turns out to be a pretty big deal.
richbell
> Why would governments be using this content to look for people to arrest?
Why wouldn't oppressive regimes that are arresting dissidents review footage of dissidents that makes them look bad? Harassing reporters and their sources is a very common way to suppress information.
Scientology does it, why wouldn't a government? https://www.forbes.com/sites/richardbehar/2020/08/05/sciento...
chrisseaton
> Why wouldn't oppressive regimes that are arresting dissidents review footage of dissidents that makes them look bad?
Because the faces aren’t real? They want to arrest dissidents not random people.
undefined
kramerger
Are you serious?
This is standard practice by security forces of any dictatorship.
dmix
A standard practice to use... potentially AI-modified surveillance photography? Where exactly will they get these photos in this future scenario? From activists and journalists, once these photo apps become widespread (unbeknownst to the government)?
It's an interesting hypothetical for sure, but it's a stretch to call it a glaring flaw. It's not really any worse than people being misidentified in normal photos.
fasthands9
Are we sure this has a greater chance of happening than the alternative (you present the voice as a transcript or a robotic voice, and the government arrests a random person they thought it might be linked to anyway)?
baxtr
Especially since these are generated based on training data from real humans…
upsidesinclude
It's not even AI generated faces!
They claim that they face-swapped real interviews of participants with actors' faces, instead of just having the actors perform the piece....
Aachen
Wow, that's cool! For years I've been wondering about all the places my face must have ended up, given all the crowd shots people like to take at busy train stations and such; now I see a way for this concern to stop mattering: cameras could simply replace incidental faces. Paving the road to the future!
astrea
Why even bother with this? I understand it preserves facial expressions, but why not fully use actors, or sacrifice "the art" of it all and hide the faces completely? This opens up the possibility of reversing the inference of the face-swapping AI and recovering the original face.
janalsncm
I’m not sure how you’d reverse the face swapping process since it’s not a one-to-one function. It’s a many-to-many function. If they mapped everyone’s face to e.g. a player on the Miami Heat there’d be no way to reverse it.
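The pigeonhole argument here is easy to demonstrate with a toy sketch (everything below is invented for illustration: `swap` is a made-up many-to-one mapping and strings stand in for faces). Once many sources map onto a small fixed roster of targets, some target necessarily has many preimages, so "reversing" a swapped face is ill-posed:

```python
from collections import defaultdict

# Small fixed roster of target faces (stand-ins for hired actors).
ROSTER = ["Butler", "Adebayo", "Herro"]

def swap(source_face: str) -> str:
    # Deterministic but many-to-one: hash the source onto the roster.
    return ROSTER[sum(map(ord, source_face)) % len(ROSTER)]

def preimages(sources):
    """Group source faces by the target face they were swapped to."""
    groups = defaultdict(list)
    for s in sources:
        groups[swap(s)].append(s)
    return groups
```

With 30 sources and only 3 targets, at least one target collects 10 or more sources, so there is no unique inverse to compute even with full knowledge of `swap`.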
phillipseamore
Does it really protect identity? Aren't swapped faces mapped to the same landmarks that would be used for facial recognition and could just as easily be extracted?
msie
Reminds me of the scramble suit in A Scanner Darkly.
bennysonething
"an artificial intelligence was used" this is weird phrasing? Why "an" is that normal?
Melting_Harps
> "an artificial intelligence was used" this is weird phrasing? Why "an" is that normal?
Yes, it's typical syntax in the English language to separate vowels in that manner. English has lots of little foibles like this (Y is sometimes a vowel) that you wouldn't be privy to unless you're a native speaker, so I can understand why you'd be confused; take a look here [0] for more info.
0: https://linguistics.stackexchange.com/questions/12798/why-is...
d1sxeyes
I think OP was not asking why the 'n' is inserted, but why we are using an article at all: instead of 'an artificial intelligence', why not just 'artificial intelligence was used'? After all, 'intelligence' is not normally a count noun.
Looking at OP's post history, I doubt they're confused about using 'an' vs 'a'.
In short, I think the use of 'an' here means it's a single specifically trained AI. It would not be wrong to say 'artificial intelligence was used' though, it just doesn't highlight the dedicated nature of the AI used.
Similar difference to 'Someone swapped the faces' and 'An expert swapped the faces'.
bennysonething
Sorry, I should have been a bit clearer what I was questioning.
upsidesinclude
This makes absolutely no sense.
If you wanted anonymity and were willing to go to these lengths, then the face could be fully fictitious and AI generated. Thereby placing no one in fear of government retribution.
This only makes sense if you filmed actors and wanted to cover that glaring mistake in your propaganda, because you lack the necessary talent to actually render them with AI.
I don't buy it. Not a fan of authoritarian regimes, but this stinks.
richbell
> This only makes sense if you filmed actors and wanted to cover that glaring mistake in your propaganda, because you lack the necessary talent to actually render them with AI.
Can you elaborate what you mean by this? I don't understand how you're concluding that this only makes sense if it's to cover up propaganda. Using actors as surrogates to protect people is a common practice, and this is a logical step forward (albeit one I disagree with).
upsidesinclude
Read my comment again.
This press release only makes any sense if they intend to preemptively combat investigations which show that these are actors and not the people that they claim.
richbell
I read your comment several times before replying; it still doesn't make sense.
> This press release only makes any sense if they intend to preemptively combat investigations which show that these are actors and not the people that they claim.
That assumes there was a pretense that the people and identities are real. Documentaries and exposés commonly use methods to protect their subjects' identities, including using actors and distorting the face/voice.
coding123
What happens if the fake looks like a real person?
kristofferc
Same as what would happen if a real person looked like another real person, I presume.
Aachen
Once it's known that the faces are all fictional, it becomes very hard to justify any investigation into any particular face.
How likely is a match anyway? My impression is that even with large databases, faces are different enough to not be overwhelmed by hits. But then it's not as if I've used these systems so I don't know.
One step they could take is to make one or more properties subtly different so that the face definitely matches virtually nobody (on top of all the other features one could randomize).
hosh
Our civilization already has trouble agreeing on facts, and this is another step toward blurring them further.
The article has positive things to say about protecting protestors, but I see a tool that can be used to blatantly manufacture facts.
Might as well make it obviously fake, or be honest that this is art. Otherwise it's propaganda, whether as a tool for the establishment or as a tool for activists.
freddealmeida
what an interesting use of this technology.
JNRowe
The documentary is available on iPlayer¹; You may need to be in the UK or have an IP that at least looks that way to see it :/
It immediately made me think of Channel 4's alternative Christmas message from 2020². While the technique is obviously different between the two, the step change in quality feels immense to me. Channel 4's was a few-minute-long, heavily staged piece that didn't hold up to close scrutiny in my eyes, while the BBC's far more convincing example is applied to — I'll trust — somewhat organic recordings of multiple people.
Edit to add: I attempt to draw the connection here because we're talking about relatively cheap broadcast television and not $500,000,000 movies.
¹ https://www.bbc.co.uk/iplayer/episode/m001f7t5/hong-kongs-fi...
² https://www.channel4.com/press/news/deepfake-queen-deliver-c...