View from the Street: Deepfakes (almost) killed the video star

@TomGillingham

Read it and weep, radio. Online video is unquestionably the go-to medium for the world’s biggest companies, for politicians of all stripes and, of course, for cat owners.

Autoplay videos are ubiquitous – frequently funny or informative, and sometimes both – but as technology continues to evolve, can we believe everything we see online?

In the era of fake news, most of us are aware of how quickly a blank piece of paper can be turned into a meme, or how, in the aftermath of any hurricane, that fake picture of the shark on a freeway will, ahem, surface.

It is less widely known that this sort of digital manipulation is no longer confined to static images, and faked videos are becoming increasingly common online. A recent example is this slightly unsettling video of a robot wandering down a street, which was retweeted by a number of intelligent, influential people, including Irvine Welsh and Derren Brown.

While this is a fairly harmless piece of content, it is another reminder that the technology required to create video ‘deepfakes’ – essentially Photoshop for moving images – is becoming much more prevalent and readily accessible.

The precise definition of the term ‘deepfake’ is contested, but broadly it refers to the use of deep-learning techniques to combine existing images or additional footage with an original video, producing an edited piece of content that can literally put words into the mouth of the subject.
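For readers curious about what sits under the bonnet, the approach popularised by the original deepfake tools is surprisingly compact: a single encoder learns the shared structure of two people's faces, and a separate decoder is trained for each person, so that swapping decoders at playback time re-renders one person's expressions onto the other's face. The sketch below is purely illustrative and rests on some assumptions: it uses PyTorch, random tensors stand in for the aligned face crops a real pipeline would extract frame by frame, and the toy network is far too small to fool anyone. It is offered only to show the shape of the technique, not as a working tool.

```python
# Illustrative sketch of the shared-encoder / two-decoder idea behind early
# deepfake tools. Random tensors stand in for aligned face crops; the model
# is deliberately tiny and the training loop is token-length.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
        )
    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.ConvTranspose2d(32, 16, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(16, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )
    def forward(self, z):
        return self.net(z)

encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()  # one decoder per identity
params = (list(encoder.parameters())
          + list(decoder_a.parameters())
          + list(decoder_b.parameters()))
optimiser = torch.optim.Adam(params, lr=1e-3)
loss_fn = nn.MSELoss()

# Stand-ins for batches of aligned 64x64 face crops of person A and person B.
faces_a = torch.rand(8, 3, 64, 64)
faces_b = torch.rand(8, 3, 64, 64)

for step in range(5):  # a real model trains far longer on thousands of frames
    optimiser.zero_grad()
    # Each decoder learns to reconstruct its own person from the shared code.
    loss = (loss_fn(decoder_a(encoder(faces_a)), faces_a)
            + loss_fn(decoder_b(encoder(faces_b)), faces_b))
    loss.backward()
    optimiser.step()

# The "swap": encode person A's face, then decode it with person B's decoder.
with torch.no_grad():
    swapped = decoder_b(encoder(faces_a[:1]))
print(swapped.shape)  # torch.Size([1, 3, 64, 64])
```

Real tools layer far more on top of this – bigger networks, thousands of training frames, and additional perceptual or adversarial losses – which is precisely why the results are getting harder to spot with the naked eye.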

Sadly, this means a new avenue for deception is rapidly opening up, which presents another layer of difficulty in reaching the truth behind any story. It poses a risk to both private individuals and companies.

In a world in which we quite literally can’t believe our eyes, who is stepping up to help us all identify what is fact and what is fiction?

Firstly, advancing visual technology isn’t entirely a bad thing. The New York Times recently used augmented reality to probe evidence of chemical attacks in Syria, and its work with Bellingcat is well worth a look. Innovative approaches like this demonstrate the value of – and appetite for – this kind of technology in modern investigative journalism.

However, most recognised news outlets have never been under as much pressure as they are today. Severe revenue challenges and anti-mainstream-media sentiment online are making it harder for traditional outlets to tackle the fake news epidemic effectively.

Perhaps as a reaction to this reality, a new sort of digital vigilante has appeared in recent years. Have-a-go heroes like @picpedant and Snopes deliver everyday debunkings, and the tide of responsibility may gradually be turning amongst the big content platforms.

It’s been hard to ignore Facebook’s ‘Fake news is not our friend’ adverts in recent weeks, and Twitter temporarily suspended the far-right conspiracy theorist Alex Jones. Last week the two social media giants also took down almost 1,000 accounts allegedly linked to “inauthentic” or “manipulating” behaviour.

Finally, and perhaps most importantly, personal responsibility, backed up by better education and public information, will be crucial in the war on misinformation. Being open to a multitude of sources may prove the most effective strategy in combating the fake news epidemic.

Social media already creates a feedback loop that reinforces users’ own world views. And before you curse millennials and new technology, bear in mind that it’s not that far removed from the way previous generations relied faithfully on the newspapers that reflected their particular political beliefs or world view.

A recent survey by the National Literacy Trust found that only one in fifty UK children could tell whether a news story is real or fake, and it’s highly likely that deepfake video technology is only going to get better (i.e. worse) and more deceitful.

So, if in doubt, take ten seconds to question the content you’re watching. When it comes to the next generation of fake news, it seems seeing shouldn’t always mean believing.