<?xml version="1.0" encoding="UTF-8" standalone="no"?><rss xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:media="http://search.yahoo.com/mrss/" version="2.0">
  <channel>
    <atom:link href="https://feeds.megaphone.fm/CSIS7279986668" rel="self" type="application/rss+xml"/>
    <title>Humanity, Wired</title>
    <link>https://www.csis.org/podcasts/humanity-wired</link>
    <language>en-us</language>
    <copyright>All content © 2019 Center for Strategic and International Studies</copyright>
    <description>AI. Social Media. Blockchain. Gene-edited babies. Are these the greatest innovations in history or the greatest threat to humanity? Humanity, Wired makes sense of the human rights impact of technology today and tomorrow. Host Amy Lehr, Human Rights Initiative director at the Center for Strategic and International Studies (CSIS) in Washington, D.C., sits down with human rights defenders, policymakers, and technologists to discuss how to make technology work for us, not against us.</description>
    <image>
      <url>https://megaphone.imgix.net/podcasts/1d8b79b6-067b-11ea-9ae9-672de0457cf3/image/190612_podcasts_humanitywired_option1_v3.jpg?ixlib=rails-4.3.1&amp;max-w=3000&amp;max-h=3000&amp;fit=crop&amp;auto=format,compress</url>
      <title>Humanity, Wired</title>
      <link>https://www.csis.org/podcasts/humanity-wired</link>
    </image>
    <itunes:explicit>no</itunes:explicit>
    <itunes:type>episodic</itunes:type>
    <itunes:subtitle>AI. Social Media. Blockchain. Gene-edited babies. Are these the greatest innovations in history or the greatest threat to humanity? Humanity, Wired makes sense of the human rights impact of technology today and tomorrow.</itunes:subtitle>
    <itunes:author>Center for Strategic and International Studies</itunes:author>
    <itunes:summary>AI. Social Media. Blockchain. Gene-edited babies. Are these the greatest innovations in history or the greatest threat to humanity? Humanity, Wired makes sense of the human rights impact of technology today and tomorrow. Host Amy Lehr, Human Rights Initiative director at the Center for Strategic and International Studies (CSIS) in Washington, D.C., sits down with human rights defenders, policymakers, and technologists to discuss how to make technology work for us, not against us.</itunes:summary>
    <content:encoded>
      <![CDATA[<p>AI. Social Media. Blockchain. Gene-edited babies. Are these the greatest innovations in history or the greatest threat to humanity? Humanity, Wired makes sense of the human rights impact of technology today and tomorrow. Host Amy Lehr, Human Rights Initiative director at the Center for Strategic and International Studies (CSIS) in Washington, D.C., sits down with human rights defenders, policymakers, and technologists to discuss how to make technology work for us, not against us.</p>]]>
    </content:encoded>
    <itunes:owner>
      <itunes:name>Center for Strategic and International Studies</itunes:name>
      <itunes:email>webmaster@csis.org</itunes:email>
    </itunes:owner>
    <itunes:image href="https://megaphone.imgix.net/podcasts/1d8b79b6-067b-11ea-9ae9-672de0457cf3/image/190612_podcasts_humanitywired_option1_v3.jpg?ixlib=rails-4.3.1&amp;max-w=3000&amp;max-h=3000&amp;fit=crop&amp;auto=format,compress"/>
    <itunes:category text="Technology"/>
    <item>
      <title>Human Rights in a Surveilled World</title>
      <link>https://www.csis.org/podcasts/humanity-wired</link>
      <description>In this episode, host Amy Lehr talks with Steve Feldstein, associate professor and Frank and Bethine Church Chair of Public Affairs at Boise State University, about his research into how AI-powered surveillance technology is being deployed around the world, the concerns it raises, and what we can do about it.</description>
      <pubDate>Fri, 21 Feb 2020 21:02:00 -0000</pubDate>
      <itunes:title>Human Rights in a Surveilled World</itunes:title>
      <itunes:episodeType>full</itunes:episodeType>
      <itunes:author>Center for Strategic and International Studies</itunes:author>
      <itunes:image href="https://megaphone.imgix.net/podcasts/009c007a-54ee-11ea-8a8f-6f175bc310d0/image/uploads_2F1582319153265-9pywgz7b2h6-713f718d6a8da43c5be31412a8fac0ff_2F190612_podcasts_humanitywired_option1_v3.jpg?ixlib=rails-4.3.1&amp;max-w=3000&amp;max-h=3000&amp;fit=crop&amp;auto=format,compress"/>
      <itunes:subtitle>In this episode, host Amy Lehr talks with Steve Feldstein, associate professor and Frank and Bethine Church Chair of Public Affairs at Boise State University, about his research into how AI-powered surveillance technology is being deployed around the world, the concerns it raises, and what we can do about it.</itunes:subtitle>
      <itunes:summary>In this episode, host Amy Lehr talks with Steve Feldstein, associate professor and Frank and Bethine Church Chair of Public Affairs at Boise State University, about his research into how AI-powered surveillance technology is being deployed around the world, the concerns it raises, and what we can do about it.</itunes:summary>
      <content:encoded>
        <![CDATA[<p>In this episode, host Amy Lehr talks with Steve Feldstein, associate professor and Frank and Bethine Church Chair of Public Affairs at Boise State University, about his research into how AI-powered surveillance technology is being deployed around the world, the concerns it raises, and what we can do about it.</p>]]>
      </content:encoded>
      <itunes:duration>2028</itunes:duration>
      <guid isPermaLink="false"><![CDATA[009c007a-54ee-11ea-8a8f-6f175bc310d0]]></guid>
      <enclosure length="0" type="audio/mpeg" url="https://traffic.megaphone.fm/CSIS1056313335.mp3"/>
      <itunes:explicit>no</itunes:explicit>
    </item>
    <item>
      <title>What Happens When Online Speech Goes Offline?</title>
      <link>https://www.csis.org/podcasts/humanity-wired</link>
      <description>In this episode, host Amy Lehr talks with Brittan Heller, founding Director of the Center on Technology and Society for the Anti-Defamation League. They discuss online hate speech and how it connects to real-life violence. They also discuss the roles of governments, companies, and citizens in combating online hate speech.</description>
      <pubDate>Fri, 14 Feb 2020 18:58:00 -0000</pubDate>
      <itunes:title>What Happens When Online Speech Goes Offline?</itunes:title>
      <itunes:episodeType>full</itunes:episodeType>
      <itunes:author>Center for Strategic and International Studies</itunes:author>
      <itunes:image href="https://megaphone.imgix.net/podcasts/f7a1a2b0-4f5a-11ea-a2f7-1f1b5e4d92e1/image/uploads_2F1581706260681-aw314gsr0pr-7700f12da0b940a9b87ca6774f3b39b6_2F190612_podcasts_humanitywired_option1_v3.jpg?ixlib=rails-4.3.1&amp;max-w=3000&amp;max-h=3000&amp;fit=crop&amp;auto=format,compress"/>
      <itunes:subtitle>In this episode, host Amy Lehr talks with Brittan Heller, founding Director of the Center on Technology and Society for the Anti-Defamation League. </itunes:subtitle>
      <itunes:summary>In this episode, host Amy Lehr talks with Brittan Heller, founding Director of the Center on Technology and Society for the Anti-Defamation League. They discuss online hate speech and how it connects to real-life violence. They also discuss the roles of governments, companies, and citizens in combating online hate speech.</itunes:summary>
      <content:encoded>
        <![CDATA[<p>In this episode, host Amy Lehr talks with Brittan Heller, founding Director of the Center on Technology and Society for the Anti-Defamation League. They discuss online hate speech and how it connects to real-life violence. They also discuss the roles of governments, companies, and citizens in combating online hate speech.</p>]]>
      </content:encoded>
      <itunes:duration>1744</itunes:duration>
      <guid isPermaLink="false"><![CDATA[f7a1a2b0-4f5a-11ea-a2f7-1f1b5e4d92e1]]></guid>
      <enclosure length="0" type="audio/mpeg" url="https://traffic.megaphone.fm/CSIS5937832741.mp3"/>
      <itunes:explicit>no</itunes:explicit>
    </item>
    <item>
      <title>Preventing Radicalization: A Personal Perspective</title>
      <description>Host Amy Lehr talks with Hadiya Masieh. Hadiya joined Islamist group Hizbut Tahrir when she was in college. After a decade, she severed those ties, dedicating her time and energy to speaking out against the ideas promoted by such radical groups. She uses her insights and experience to deter young people from taking the same path, using technology as one tool in that effort.</description>
      <pubDate>Fri, 09 Aug 2019 16:30:00 -0000</pubDate>
      <itunes:episodeType>full</itunes:episodeType>
      <itunes:author>Center for Strategic and International Studies</itunes:author>
      <itunes:subtitle>Host Amy Lehr talks with Hadiya Masieh. Hadiya joined Islamist group Hizbut Tahrir when she was in college. After a decade, she severed those ties, dedicating her time and energy to speaking out against the ideas promoted by such radical groups.</itunes:subtitle>
      <itunes:summary>Host Amy Lehr talks with Hadiya Masieh. Hadiya joined Islamist group Hizbut Tahrir when she was in college. After a decade, she severed those ties, dedicating her time and energy to speaking out against the ideas promoted by such radical groups. She uses her insights and experience to deter young people from taking the same path, using technology as one tool in that effort.</itunes:summary>
      <content:encoded>
        <![CDATA[<p>Host Amy Lehr talks with Hadiya Masieh. Hadiya joined Islamist group Hizbut Tahrir when she was in college. After a decade, she severed those ties, dedicating her time and energy to speaking out against the ideas promoted by such radical groups. She uses her insights and experience to deter young people from taking the same path, using technology as one tool in that effort.</p>]]>
      </content:encoded>
      <itunes:duration>1672</itunes:duration>
      <itunes:explicit>no</itunes:explicit>
      <guid isPermaLink="false"><![CDATA[https://csis-prod.s3.amazonaws.com/s3fs-public/field_soundcloud_audio/190702_Masieh_Edit%203.mp3]]></guid>
      <enclosure length="0" type="audio/mpeg" url="https://traffic.megaphone.fm/CSIS2729937380.mp3"/>
    </item>
    <item>
      <title>A Human Rights-based Approach to A.I.</title>
      <description>Initiatives and partnerships to promote “ethical A.I.” are proliferating within the A.I. community. While ethics provides a critical framework for addressing the challenges posed by A.I., it is not a replacement for human rights. Host Amy Lehr discusses the human rights impact of A.I. and what governments and companies can do to make human rights integral to the design and use of A.I. with guests David Kaye, the UN Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression, and Vivek Krishnamurthy, Counsel in the Boston office of Foley Hoag LLP and lecturer on law at Harvard Law School as part of the Berkman Klein Center.</description>
      <pubDate>Thu, 25 Jul 2019 22:30:00 -0000</pubDate>
      <itunes:episodeType>full</itunes:episodeType>
      <itunes:author>Center for Strategic and International Studies</itunes:author>
      <itunes:subtitle>Initiatives and partnerships to promote “ethical A.I.” are proliferating within the A.I. community. While ethics provides a critical framework for addressing the challenges posed by A.I., it is not a replacement for human rights.</itunes:subtitle>
      <itunes:summary>Initiatives and partnerships to promote “ethical A.I.” are proliferating within the A.I. community. While ethics provides a critical framework for addressing the challenges posed by A.I., it is not a replacement for human rights. Host Amy Lehr discusses the human rights impact of A.I. and what governments and companies can do to make human rights integral to the design and use of A.I. with guests David Kaye, the UN Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression, and Vivek Krishnamurthy, Counsel in the Boston office of Foley Hoag LLP and lecturer on law at Harvard Law School as part of the Berkman Klein Center.</itunes:summary>
      <content:encoded>
        <![CDATA[<p>Initiatives and partnerships to promote “ethical A.I.” are proliferating within the A.I. community. While ethics provides a critical framework for addressing the challenges posed by A.I., it is not a replacement for human rights. Host Amy Lehr discusses the human rights impact of A.I. and what governments and companies can do to make human rights integral to the design and use of A.I. with guests David Kaye, the UN Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression, and Vivek Krishnamurthy, Counsel in the Boston office of Foley Hoag LLP and lecturer on law at Harvard Law School as part of the Berkman Klein Center.</p>]]>
      </content:encoded>
      <itunes:duration>2157</itunes:duration>
      <itunes:explicit>no</itunes:explicit>
      <guid isPermaLink="false"><![CDATA[https://csis-prod.s3.amazonaws.com/s3fs-public/field_soundcloud_audio/HW_Ep3_Vivek_David.mp3]]></guid>
      <enclosure length="0" type="audio/mpeg" url="https://traffic.megaphone.fm/CSIS6590375324.mp3"/>
    </item>
    <item>
      <title>AI for Good</title>
      <description>There is a preconceived notion that artificial intelligence has predominantly negative implications for human rights. However, artificial intelligence can also positively impact human rights, a point that’s often neglected and not given the attention it deserves. Humanity, Wired host Amy Lehr talks with Sherif Elsayed-Ali, Director of Partnerships at AI for Good, and Element AI, about the positive impacts of AI, and the role of business in this space. Element AI’s AI for Good lab provides dedicated, world-class AI and engineering expertise to organizations working for the public benefit. He is also co-chair of the World Economic Forum's global future council on human rights and technology, and a fellow at the Carr Center for Human Rights at Harvard Kennedy School.</description>
      <pubDate>Fri, 12 Jul 2019 15:45:00 -0000</pubDate>
      <itunes:episodeType>full</itunes:episodeType>
      <itunes:author>Center for Strategic and International Studies</itunes:author>
      <itunes:subtitle>There is a preconceived notion that artificial intelligence has predominantly negative implications for human rights. However, artificial intelligence can also positively impact human rights, a point that’s often neglected and not given the attention it deserves.</itunes:subtitle>
      <itunes:summary>There is a preconceived notion that artificial intelligence has predominantly negative implications for human rights. However, artificial intelligence can also positively impact human rights, a point that’s often neglected and not given the attention it deserves. Humanity, Wired host Amy Lehr talks with Sherif Elsayed-Ali, Director of Partnerships at AI for Good, and Element AI, about the positive impacts of AI, and the role of business in this space. Element AI’s AI for Good lab provides dedicated, world-class AI and engineering expertise to organizations working for the public benefit. He is also co-chair of the World Economic Forum's global future council on human rights and technology, and a fellow at the Carr Center for Human Rights at Harvard Kennedy School.</itunes:summary>
      <content:encoded>
        <![CDATA[<p>There is a preconceived notion that artificial intelligence has predominantly negative implications for human rights. However, artificial intelligence can also positively impact human rights, a point that’s often neglected and not given the attention it deserves. Humanity, Wired host Amy Lehr talks with Sherif Elsayed-Ali, Director of Partnerships at AI for Good, and Element AI, about the positive impacts of AI, and the role of business in this space. Element AI’s AI for Good lab provides dedicated, world-class AI and engineering expertise to organizations working for the public benefit. He is also co-chair of the World Economic Forum's global future council on human rights and technology, and a fellow at the Carr Center for Human Rights at Harvard Kennedy School.</p>]]>
      </content:encoded>
      <itunes:duration>1560</itunes:duration>
      <itunes:explicit>no</itunes:explicit>
      <guid isPermaLink="false"><![CDATA[https://csis-prod.s3.amazonaws.com/s3fs-public/field_soundcloud_audio/HWSherif02.mp3]]></guid>
      <enclosure length="0" type="audio/mpeg" url="https://traffic.megaphone.fm/CSIS6821900584.mp3"/>
    </item>
    <item>
      <title>I spy with my little eye: Spyware and Stalkerware</title>
      <description>In recent years, an industry of stalkerware—including so-called spouseware—has grown. When malicious stalkerware is installed on devices, it is well hidden. It allows the spyware’s owner to spy on everything the victim is doing. According to Eva Galperin, researcher at the Electronic Frontier Foundation, “Full access to someone’s phone is essentially full access to someone’s mind.” This spyware has serious repercussions for the right to privacy, and could pose severe risks to victims of domestic abuse.</description>
      <pubDate>Fri, 28 Jun 2019 13:30:00 -0000</pubDate>
      <itunes:episodeType>full</itunes:episodeType>
      <itunes:author>Center for Strategic and International Studies</itunes:author>
      <itunes:subtitle>In recent years, an industry of stalkerware—including so-called spouseware—has grown. When malicious stalkerware is installed on devices, it is well hidden. It allows the spyware’s owner to spy on everything the victim is doing.</itunes:subtitle>
      <itunes:summary>In recent years, an industry of stalkerware—including so-called spouseware—has grown. When malicious stalkerware is installed on devices, it is well hidden. It allows the spyware’s owner to spy on everything the victim is doing. According to Eva Galperin, researcher at the Electronic Frontier Foundation, “Full access to someone’s phone is essentially full access to someone’s mind.” This spyware has serious repercussions for the right to privacy, and could pose severe risks to victims of domestic abuse.</itunes:summary>
      <content:encoded>
        <![CDATA[<p>In recent years, an industry of stalkerware—including so-called spouseware—has grown. When malicious stalkerware is installed on devices, it is well hidden. It allows the spyware’s owner to spy on everything the victim is doing. According to Eva Galperin, researcher at the Electronic Frontier Foundation, “Full access to someone’s phone is essentially full access to someone’s mind.” This spyware has serious repercussions for the right to privacy, and could pose severe risks to victims of domestic abuse.</p>]]>
      </content:encoded>
      <itunes:duration>1568</itunes:duration>
      <itunes:explicit>no</itunes:explicit>
      <guid isPermaLink="false"><![CDATA[https://csis-prod.s3.amazonaws.com/s3fs-public/field_soundcloud_audio/HWEva01.mp3]]></guid>
      <enclosure length="0" type="audio/mpeg" url="https://traffic.megaphone.fm/CSIS4106447643.mp3"/>
    </item>
  </channel>
</rss>