Reject the evidence of your eyes and ears
The pace of advancement in AI has never been faster. Recent advances in image and video generation can be used for both good and ill. Convincing fakes once demanded enormous effort and a high level of skill; now anyone with a computer can create them.
It will soon be impossible to tell whether what you are seeing is real or fake. What are the implications when news agencies are fooled, or when fake evidence reaches law enforcement?
I’ll show you how these fakes can be created and what hope we have for distinguishing truth from fiction in our digital future.
Session takeaways
- An understanding of generative adversarial networks: what they are, how they work, and how to create one (a minimal training sketch follows this list)
- Implications of fake evidence in a world where this technology is freely available
- A healthy cynicism towards pictures, video and audio presented by any form of media!
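
The sketch below is purely illustrative and not taken from the talk: it shows the core GAN idea on a toy 1D problem, where a generator learns to mimic samples from a normal distribution while a discriminator learns to tell real samples from generated ones. It assumes PyTorch is available; the network sizes, learning rates and target distribution are arbitrary choices for demonstration.

```python
# Minimal GAN sketch (illustrative assumption, not the speaker's code):
# the generator learns to produce samples resembling N(4, 1.25).
import torch
import torch.nn as nn

torch.manual_seed(0)

# Generator: maps random noise to a candidate "real-looking" sample.
G = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))
# Discriminator: outputs the probability that its input is a real sample.
D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())

opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCELoss()

for step in range(2000):
    real = torch.randn(64, 1) * 1.25 + 4.0   # samples from the target distribution
    fake = G(torch.randn(64, 8))             # samples from the generator

    # Train the discriminator: label real data 1, generated data 0.
    d_loss = bce(D(real), torch.ones(64, 1)) + bce(D(fake.detach()), torch.zeros(64, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Train the generator: try to make the discriminator say 1 on fakes.
    g_loss = bce(D(fake), torch.ones(64, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

# After training, generated samples should cluster around 4.
print(G(torch.randn(5, 8)).detach().flatten())
```

The same adversarial loop, scaled up to convolutional networks and image data, is what drives modern image and video fakes.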
Bio
Janet heads up the Artificial Intelligence division at StoryStream and is an AI Venture Partner at the investment firm MMC Ventures. As an experienced C-level professional, she helps start-up companies turn their AI strategy into saleable products. Janet regularly speaks and writes on technical subjects.