Haley McNamara and Dani Pinter sit down with Melinda Tankard Reist, Co-Founder and Movement Director of Collective Shout, for a discussion of the recent successful campaign to remove a sexually violent game called “No Mercy” from Steam. Melinda describes how the game allowed players to sexually assault their family members and rewarded them for doing so. In less than a week, the campaign garnered over 70,000 signatures and over 3,000 emails urging the CEO of Valve to remove the game. Despite backlash and threats, Melinda and the team at Collective Shout continue their work to protect women and children from objectification and sexualization.
Melinda is an author, speaker, media commentator and campaigner. She is best known for her work addressing sexualization, objectification, the harms of pornography, sexual exploitation, trafficking and violence against women. Melinda is the author/editor of seven books (no. 8, on boundary-setting for girls, forthcoming 2025). She co-founded Collective Shout, which campaigns for a world free of sexploitation, 15 years ago, and serves as its Movement Director. Melinda is an Ambassador for World Vision Australia, Compassion Australia, Hagar NZ and the youth mentoring body the Raise Foundation. She is also a Senior Lecturer in the Centre for Culture and Ethics at Notre Dame University, Sydney, and is named in the Who’s Who of Australian Women and the World Who’s Who of Women. In 2024 she received the Global Impact Award, presented at the Coalition to End Sexual Exploitation (CESE) Global Summit in Washington DC.
“Children being the targets of this deepfake technology is our worst nightmare”
Haley and Dani discuss the current state of artificial intelligence and deepfake technology in the realm of sexual exploitation. Overwhelmingly, these tools are used to create pornographic material, and 10% of teenagers report being aware of deepfake pornography depicting someone they know. They also dig into how this technology desensitizes users to the sexualization of children and overwhelms law enforcement.
-- Urge your representatives to pass the TAKE IT DOWN Act: https://advocacy.charityengine.net/Default.aspx?isid=2355
-- Urge Apple and Google to have better policies for A.I. apps:
https://endsexualexploitation.org/AI-Deepfake-Apps
Watch the video version of this episode here: https://youtu.be/uItdvTL3pIU
Haley McNamara and Dani Pinter discuss The Guardian article: "‘I didn’t start out wanting to see kids’: are porn algorithms feeding a generation of paedophiles – or creating one?"
They talk about the chilling reality of how pornography platforms' algorithms drive escalation for so many of their visitors.
Read the full article on The Guardian here: https://tinyurl.com/yc48mczp
Learn more about the harms of pornography: https://endsexualexploitation.org/issues/pornography/
Haley McNamara (NCOSE Senior VP of Programs and Initiatives) and Dani Pinter (Senior VP and Director at the NCOSE Law Center) talk about Section 230 of the Communications Decency Act (CDA) and why it's essential to repeal it. They also discuss the history of the Dirty Dozen List and what led to this unique version of the list in 2025.
Since its inception in 1996, Section 230 has effectively provided blanket immunity to big tech companies for harms facilitated on their platforms. It's time to call for a full repeal of CDA Section 230!
Learn more and take action here: www.DirtyDozenList.org
Watch the video version of this episode here: https://youtu.be/G7VZVJ1QRUc