Episodes

  • The datafication of refugees: humanitarian agencies & biometrics with Zara Rahman from the SUPERRR Lab
    Feb 18 2025

    Biometrics – our fingerprints, faces, irises, for instance – are increasingly used to verify identity. But what happens when this data collection is applied to vulnerable populations, like refugees and asylum seekers, in ways that remove agency rather than offer protection? In the humanitarian space, organizations justify biometric data collection as a way to increase efficiency, yet stories have shown that such mechanisms can be weaponized: data handed over to oppressive governments, misidentifications leading to life-altering mistakes, and accountability pushed onto the very people humanitarian programs claim to help. Beyond making survival depend on data-driven systems, racial capitalism also plays a critical role, reinforcing the same global inequalities that force people to migrate in the first place. Who benefits from implementing biometric data collection in a humanitarian context, and who bears the consequences when it fails?

    To answer these questions and more, I had the pleasure of talking with Zara Rahman, author of “Machine Readable Me: The Hidden Ways That Technology Shapes Our Identities”, Strategic Advisor at the SUPERRR Lab and Visiting Research Collaborator at the Citizens and Technology Lab at Cornell University. Zara is a researcher, writer, public speaker and non-profit executive whose interests lie at the intersection of technology, justice and community. For over a decade, her work has focused on supporting the responsible use of data and technology in advocacy and social justice, working with activists from around the world to support context-driven and thoughtful uses of tech and data.

    36 min
  • Mapping for justice: from cartography to GIS, with Cathy Richards from the Open Environmental Data Project
    Feb 7 2025

    If cartography, the ancestor of GIS, already displayed colonial patterns and racist stereotypes back in the day, why would the digital legacy of maps be any different? Maps carry authority and hold power: they represent the world from the perspective of those who create them. However, communities are often excluded from their design, leading to the misrepresentation or omission of important landmarks and third places. In this episode, Cathy Richards explains why it is critical for communities to have the tools to paint their own stories through mapping, what the role of communities is in the development of tech-powered solutions that include GIS, and what risks come with excluding those communities.

    Cathy is the Civic Science Fellow and Data Inclusion Specialist at the Open Environmental Data Project. Previously, she was the Associate for Digital Resilience and Emerging Technology at The Engine Room where she advised civil society organizations on their use of technology and data. As a Green Web Fellow, she investigated the benefits, ethical questions, and security risks associated with using GIS for environmental justice. Cathy holds a Bachelor's degree in International Relations from Boston University, an MPA from the Monterey Institute of International Studies, and she comes from beautiful Costa Rica.

    38 min
  • How MyBranz fights fake online reviews to save the planet — and your wallet with Janani Kumar, Founder of MyBranz
    Jan 31 2025

    A few months ago, in August 2024, the Federal Trade Commission (FTC) announced a final rule banning fake reviews and testimonials, a rule that will help deter AI-generated fake reviews by prohibiting, for instance, fake or false consumer reviews, consumer testimonials, and celebrity testimonials.

    Now, if you have ever shopped online, you know that this decision is pretty groundbreaking. How many of us have been deceived by fake reviews before buying a product? Not only is it a waste of money, but in the midst of a climate crisis it is also an additional source of waste that is very detrimental to our environment. Joining us today is Janani Kumar, the founder of MyBranz.

    Janani knew something had to be done well before the FTC even made a move. MyBranz is an online platform that promotes transparency at every step of the consumer journey, leveraging AI to surface verified reviews from across the web and help users find the best brands and products based on lawful, real feedback, saving money and helping the environment at the same time. In this episode, we explored the sources and impacts of fake online reviews, consumer trust, and what the FTC ruling means for users.

    https://www.mybranz.com/

    24 min
  • Quit Clicking Kids: protecting child influencers through policy with Chris McCarty, founder of Quit Clicking Kids
    Jan 24 2025

    Since #KOSA, protecting kids online has continued to be a very hot topic. However, we often overlook the influencer industry that also affects kids online, for instance through the emergence of thousands of YouTube family channels. Horror stories of behind-the-scenes abuse have surfaced in recent news coverage, alongside a serious lack of protection for these children's privacy and against their financial exploitation. Kids cannot give informed consent to becoming part of the family influencer industry, which, unlike child acting, is barely regulated, if at all: to date, only three US states have signed this type of protective legislation into law, and many more have bills in the works. To talk about this topic, I welcomed the amazing Chris McCarty.

    At 17, after discovering that child social media stars lacked the rights and protections granted to child actors, Chris founded Quit Clicking Kids, an advocacy organization that safeguards the rights of children who grow up on monetized family social media accounts. Since then, they have worked with legislators across the United States to introduce protective legislation. In addition to leading advocacy efforts at Quit Clicking Kids, Chris is a junior at the University of Washington majoring in Political Science. Their work has been featured by The New York Times, CNN, NBC News, and Teen Vogue, and they recently made the Forbes 30 Under 30 list in the social media category.

    For more information on Quit Clicking Kids:

    https://quitclickingkids.com/

    https://www.instagram.com/quit_clicking_kids/

    30 min
  • How existing safety mitigation safeguards fail in LLMs with Khaoula Chehbouni, PhD Researcher at McGill and MILA
    Jan 17 2025

    Large Language Models, or LLMs, may be the most popular type of AI system. They are often treated as an alternative to search engines, even though they should not be: the information they give users only resembles and mimics human speech and is not always factual, among many other issues discussed in this episode.

    Our guest today is Khaoula Chehbouni, a PhD student in Computer Science at McGill University and Mila (Quebec AI Institute). Khaoula was awarded the prestigious FRQNT Doctoral Training Scholarship to research fairness and safety in large language models. She previously worked as a Senior Data Scientist at Statistics Canada and completed her Master's in Business Intelligence at HEC Montreal, where she received the Best Master's Thesis award.

    In this episode, we talked about the impact of the Western narratives on which LLMs are trained, the limits of trust and safety, how racism and stereotypes are mirrored and amplified by LLMs, and what it is like to be a minority in a STEM academic environment. I hope you’ll enjoy this episode.

    37 min
  • Privacy under attack: how the Tor Project fights digital surveillance, with Raya Sharbain & Pavel Zoneff
    Jan 10 2025

    We’re all very used to being surveilled by now, especially through surveillance capitalism, the commodification of our personal data: our age, location, mental state, shopping habits, and tax bracket are collected through various apps and websites and sold to thousands of third parties. On top of that, governments surveil their citizens, and it does not only happen in authoritarian states such as Russia; it also happens in the United States, where activists are watched by authorities during and after lawful protests.

    Once you’re aware of how pervasive tech-enabled surveillance is, it feels like living in a dystopia. Who needs to read Orwell’s 1984 when you can just look into civil society’s reports on mass surveillance or read the news? What we need are anti-surveillance alternatives: tools that, unlike Google, do not track you or your personal data, let alone sell it to whoever is willing to pay, and that address censorship and government firewalls and empower users to access the open web.

    The good news is that such a tool exists, and it’s called Tor: a web browser that protects users' privacy and anonymity by hiding their IP addresses and browsing activity, sending web traffic through a series of routers, called nodes, to anonymize it. The traffic is encrypted in three layers as it passes through the Tor network, a design known as "onion routing" that dates back to the 1990s. The goal was to use the internet with as much privacy as possible, relying on a decentralized network. Today, the Tor Browser has become one of the world's strongest tools for privacy and freedom online.
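
    To make the layered-encryption idea above a little more concrete, here is a minimal, hypothetical Python sketch, not the Tor Project's actual protocol (which uses its own circuit construction and cell format), using the third-party cryptography package purely as a stand-in: the client wraps a message once per relay, and each relay can peel off only its own layer.

        # Toy illustration of onion routing's layered encryption (NOT Tor's real protocol).
        from cryptography.fernet import Fernet

        # Hypothetical symmetric keys the client shares with each relay in a 3-hop circuit.
        relays = ["guard", "middle", "exit"]
        keys = {name: Fernet(Fernet.generate_key()) for name in relays}

        def build_onion(payload: bytes) -> bytes:
            # Encrypt for the exit first, then the middle, then the guard,
            # so the guard's layer ends up outermost.
            for name in reversed(relays):
                payload = keys[name].encrypt(payload)
            return payload

        def peel_layer(relay_name: str, onion: bytes) -> bytes:
            # Each relay removes exactly one layer; the inner layers stay unreadable to it.
            return keys[relay_name].decrypt(onion)

        onion = build_onion(b"GET https://example.org/")
        for name in relays:          # the message travels guard -> middle -> exit
            onion = peel_layer(name, onion)
        print(onion)                 # readable only after the exit peels its layer

    The point of the layering is that no single relay sees both who you are and what you are requesting: the first hop knows your address but only sees ciphertext, while the last hop sees the request but not where it came from.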

    I had the pleasure of welcoming not one but two guests from the Tor Project today. Raya Sharbain is an Education Coordinator with the Tor Project, where she facilitates training for journalists and human rights defenders on Tor and Tails, the anonymous operating system, and also develops and updates educational curricula on the Tor ecosystem, focusing on its use in circumventing network censorship and surveillance. Raya is also a part-time Research Fellow with the Citizen Lab, focusing on targeted surveillance. Pavel Zoneff, with over a decade of experience working for some of the world’s leading tech brands, joined the Tor Project in 2023. As Director of Strategic Communications, he supports the organization’s global outreach and advocacy efforts to champion unrestricted access to the open web and encrypted technologies.

    34 min
  • Queer Arab History, social media, and representation with Marwan Kaabour from Takweer
    Dec 17 2024

    Queer Arabs are often portrayed either as hated by their own Arab community or as Western imports, as if queerness came along with colonialism. And online, queer voices mostly come from Western countries and do not represent queer Arabs. However, Marwan Kaabour is challenging these narratives by researching, digitally archiving, and celebrating queer history in the Middle East, from centuries ago to the present day.

    Takweer, the Instagram page that Marwan started 5 years ago to take ownership of his own story as a queer Arab, quickly turned into a space of inclusion and discussion and went viral.

    Marwan is a Lebanese artist, designer, and the founder of Takweer. He was born and raised in Beirut before moving to London in 2011. The Takweer project gave birth to a book, The Queer Arab Glossary, the first published collection of queer Arabic slang, published in 2024. This episode sits at the intersection of social media, digital archives, queer history, Arab culture, and language, and I hope you’ll enjoy it.

    Takweer page: https://www.instagram.com/takweer_/

    Book: https://saqibooks.com/books/saqi/the-queer-arab-glossary/

    28 min
  • History of tech, power relations, & archiving for social justice with Dr Jeffrey Yost from the Charles Babbage Institute
    Dec 3 2024

    Technology narratives are set in the present, with all of their promises set in the near future. We’ve heard about flying cars, automated jobs, robots able to annihilate humanity, robots able to save humanity, and we have gone through many hype cycles, like crypto – a time that I personally tend to block from my memory. But looking at the past, at the evolution of technology, is actually critical if its impacts are to be relevant and beneficial for everyone.

    We often say that history keeps repeating itself, so if we want to predict the future of technology, why not look at its past? Beyond that, I wondered how the history of technology relates to social justice, how interdisciplinary studies could advance social justice, how to choose who and what to archive when it comes to tech history, and how much AI could be useful or harmful in this endeavour. To answer these questions and more, I had the pleasure of welcoming Dr Jeffrey Yost, who studies power imbalances & societal inequality in our digital world.

    Dr Yost is a historian of science, technology, and medicine focused on the social, political, cultural, and intellectual history of the digital world. He is the Director of the Charles Babbage Institute (CBI) for Computing, Information & Culture, a computing and software studies research institute and the leading and most diverse historical archives center for students and scholars studying digital tech & its contexts.

    Created, hosted and produced by Mélissa M'Raidi-Kechichian.

    42 min