
Last year alone, over 60 million pieces of child sexual abuse material (CSAM) were reported online. Social enterprise Krunam is stopping that cycle of abuse with its breakthrough platform.
Krunam CEO Chris Wexler joins the show to talk about how technologies like computer vision and deep learning help content moderators identify and remove CSAM from the internet, how social algorithms incentivize the distribution of harmful content, and why it’s important that we don’t create an internet that’s entirely encrypted.
Note: This episode contains discussions around child sexual abuse.
In this episode, you will learn:
- How content moderation happens on big social platforms
- Why AI is best suited to identifying patterns at scale
- What computer vision is and how Krunam uses it to identify behavioral intent
- The challenge of training AI models on illegal data
- Why right now is a period of rapid growth for technologies that identify digital toxic waste
- How humans’ attraction to outliers fuels social algorithms
- How it takes roughly 30 years for society to adopt a new technology
- How advances in communication technology are reorganizing society around ideas instead of location, and why that shift is producing adverse social outcomes
- Why we need to have more nuanced conversations
This episode is brought to you by The Jed Mahonis Group, where we make sense of mobile app development with our non-technical approach to building custom mobile software solutions. Learn more at https://jmg.mn.
Recorded July 20, 2021 | Edited by Jordan Daoust | Produced by Jenny Karkowski
Show Links
Krunam’s website | https://krunam.co
Krunam on LinkedIn | https://www.linkedin.com/company/krunam/
Chris Wexler on LinkedIn | https://www.linkedin.com/in/chriswexler/
Chris Wexler on Twitter | https://twitter.com/ChrisWexler
JMG Careers Page | https://jmg.mn/careers
Connect with Tim Bornholdt on LinkedIn | https://www.linkedin.com/in/timbornholdt/
Chat with The Jed Mahonis Group about your app dev questions | https://jmg.mn