<?xml version="1.0" encoding="utf-8"?>
<?xml-stylesheet type="text/xsl" href="/global/feed/rss.xslt" ?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:media="http://search.yahoo.com/mrss/" xmlns:podaccess="https://access.acast.com/schema/1.0/" xmlns:acast="https://schema.acast.com/1.0/">
    <channel>
		<ttl>60</ttl>
		<generator>acast.com</generator>
		<title>The Data Fix with Dr. Mél Hogan</title>
		<link>https://shows.acast.com/the-data-fix</link>
		<atom:link href="https://feeds.acast.com/public/shows/63997541ed122a001195e286" rel="self" type="application/rss+xml"/>
		<language>en</language>
		<copyright>Mél Hogan</copyright>
		<itunes:keywords>AI,GAN,technology,newmedia,affect,feelings,datafix,scholars,experts,art,environment,leftwing</itunes:keywords>
		<itunes:author>Mél Hogan</itunes:author>
		<itunes:subtitle>a progressive pod about perpetual tech promises</itunes:subtitle>
		<itunes:summary><![CDATA[<p>Hi everyone, my name is Mél Hogan and I’m a critical media studies scholar based in Canada. I’m working on a project called The Data Fix through a series of conversations with scholars, thinkers, and feelers. Together we explore the significance of living in a world of data, and especially the growing trend of “digital humans” in the form of chatbots, holograms, deepfakes, ai images and videos, and even tech that revives the dead. The conversations are minimally edited, and serve as an archive of the collective thinking and feeling that is going into the Data Fix project. Please see thedatafix.net for more details and show notes. Thank you so much for listening.&nbsp;</p><br><p>Cover art by <a href="https://www.instagram.com/oonaode/" rel="noopener noreferrer" target="_blank">Oona Ode</a>.</p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		<description><![CDATA[<p>Hi everyone, my name is Mél Hogan and I’m a critical media studies scholar based in Canada. I’m working on a project called The Data Fix through a series of conversations with scholars, thinkers, and feelers. Together we explore the significance of living in a world of data, and especially the growing trend of “digital humans” in the form of chatbots, holograms, deepfakes, ai images and videos, and even tech that revives the dead. The conversations are minimally edited, and serve as an archive of the collective thinking and feeling that is going into the Data Fix project. Please see thedatafix.net for more details and show notes. Thank you so much for listening.&nbsp;</p><br><p>Cover art by <a href="https://www.instagram.com/oonaode/" rel="noopener noreferrer" target="_blank">Oona Ode</a>.</p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
		<itunes:explicit>false</itunes:explicit>
		<itunes:owner>
			<itunes:name>Mél Hogan</itunes:name>
			<itunes:email>info+63997541ed122a001195e286@mg-eu.acast.com</itunes:email>
		</itunes:owner>
		<acast:showId>63997541ed122a001195e286</acast:showId>
		<acast:showUrl>the-data-fix</acast:showUrl>
		<acast:signature key="EXAMPLE" algorithm="aes-256-cbc"><![CDATA[wbG1Z7+6h9QOi+CR1Dv0uQ==]]></acast:signature>
		<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmTHg2/BXqPr07kkpFZ5JfhvEZqggcpunI6E1w81XpUaBscFc3skEQ0jWG4GCmQYJ66w6pH6P/aGd3DnpJN6h/CD4icd8kZVl4HZn12KicA2k]]></acast:settings>
		<acast:network id="63997541ed122a001195e290" slug="mel-hogan"><![CDATA[MEL HOGAN]]></acast:network>
		<itunes:type>episodic</itunes:type>
		<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
		<image>
			<url>https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg</url>
			<link>https://shows.acast.com/the-data-fix</link>
			<title>The Data Fix with Dr. Mél Hogan</title>
		</image>
		<item>
			<title>Mundane, with Wendy H. Wong</title>
			<itunes:title>Mundane, with Wendy H. Wong</itunes:title>
			<pubDate>Mon, 30 Mar 2026 10:00:00 GMT</pubDate>
			<itunes:duration>1:00:28</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/69bea28662f6c66afe01f8c0/media.mp3" length="83068960" type="audio/mpeg"/>
			<guid isPermaLink="false">69bea28662f6c66afe01f8c0</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://shows.acast.com/the-data-fix/episodes/mundane-with-wendy-h-wong</link>
			<acast:episodeId>69bea28662f6c66afe01f8c0</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>mundane-with-wendy-h-wong</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxoGjO/9lLzKylq+uibUinm6TqmagYH1deaz1DyF50tAbUxBnZpsxvilzIt8PfRUCwL1k162BO/NJHW3brGwB0BV]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>79</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>In this episode, Wendy H. Wong explains the human rights implications of datafication. We talk about how data become valuable, sticky data, big tech’s encroachment on governance, our faces/selves as datapoints, and how the mundane underlies and explains so much of how data collection happens in the first place. Recorded Mar 20, 2026. Released Mar 30, 2026.</p><br><p><strong>Wendy H. Wong We, the Data: Human Rights in the Digital Age</strong></p><p><a href="https://mitpress.mit.edu/9780262048576/we-the-data/" rel="noopener noreferrer" target="_blank">https://mitpress.mit.edu/9780262048576/we-the-data/</a></p><br><p><strong>Human Rights in the Digital Age</strong></p><p><a href="https://youtu.be/HyLV5Tf54QM?si=y08lMREdIgYVULcG" rel="noopener noreferrer" target="_blank">https://youtu.be/HyLV5Tf54QM?si=y08lMREdIgYVULcG</a></p><br><p><strong>Big Tech companies govern our lives. It’s time they’re held accountable for it</strong></p><p><a href="https://can01.safelinks.protection.outlook.com/?url=https%3A%2F%2Fwww.theglobeandmail.com%2Fopinion%2Farticle-google-amazon-apple-meta-microsoft-governance-accountability%2F&amp;data=05%7C02%7Cmel.hogan%40queensu.ca%7Cad9eb26a0d5f4ba25e9908de86a83d36%7Cd61ecb3b38b142d582c4efb2838b925c%7C1%7C0%7C639096254280454544%7CUnknown%7CTWFpbGZsb3d8eyJFbXB0eU1hcGkiOnRydWUsIlYiOiIwLjAuMDAwMCIsIlAiOiJXaW4zMiIsIkFOIjoiTWFpbCIsIldUIjoyfQ%3D%3D%7C0%7C%7C%7C&amp;sdata=yNJ3ZJy%2Fwg0nNJDLUZaZ9zC889%2BEP4WnNCkZRs5BxG8%3D&amp;reserved=0" rel="noopener noreferrer" target="_blank">https://www.theglobeandmail.com/opinion/article-google-amazon-apple-meta-microsoft-governance-accountability/</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>In this episode, Wendy H. Wong explains the human rights implications of datafication. We talk about how data become valuable, sticky data, big tech’s encroachment on governance, our faces/selves as datapoints, and how the mundane underlies and explains so much of how data collection happens in the first place. Recorded Mar 20, 2026. Released Mar 30, 2026.</p><br><p><strong>Wendy H. Wong We, the Data: Human Rights in the Digital Age</strong></p><p><a href="https://mitpress.mit.edu/9780262048576/we-the-data/" rel="noopener noreferrer" target="_blank">https://mitpress.mit.edu/9780262048576/we-the-data/</a></p><br><p><strong>Human Rights in the Digital Age</strong></p><p><a href="https://youtu.be/HyLV5Tf54QM?si=y08lMREdIgYVULcG" rel="noopener noreferrer" target="_blank">https://youtu.be/HyLV5Tf54QM?si=y08lMREdIgYVULcG</a></p><br><p><strong>Big Tech companies govern our lives. It’s time they’re held accountable for it</strong></p><p><a href="https://can01.safelinks.protection.outlook.com/?url=https%3A%2F%2Fwww.theglobeandmail.com%2Fopinion%2Farticle-google-amazon-apple-meta-microsoft-governance-accountability%2F&amp;data=05%7C02%7Cmel.hogan%40queensu.ca%7Cad9eb26a0d5f4ba25e9908de86a83d36%7Cd61ecb3b38b142d582c4efb2838b925c%7C1%7C0%7C639096254280454544%7CUnknown%7CTWFpbGZsb3d8eyJFbXB0eU1hcGkiOnRydWUsIlYiOiIwLjAuMDAwMCIsIlAiOiJXaW4zMiIsIkFOIjoiTWFpbCIsIldUIjoyfQ%3D%3D%7C0%7C%7C%7C&amp;sdata=yNJ3ZJy%2Fwg0nNJDLUZaZ9zC889%2BEP4WnNCkZRs5BxG8%3D&amp;reserved=0" rel="noopener noreferrer" target="_blank">https://www.theglobeandmail.com/opinion/article-google-amazon-apple-meta-microsoft-governance-accountability/</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Hollowed, with Olivia Guest, Iris van Rooij and Andrea Reyes Elizondo</title>
			<itunes:title>Hollowed, with Olivia Guest, Iris van Rooij and Andrea Reyes Elizondo</itunes:title>
			<pubDate>Mon, 16 Mar 2026 10:00:00 GMT</pubDate>
			<itunes:duration>1:00:56</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/69a71876db942e85cc86f00c/media.mp3" length="91492768" type="audio/mpeg"/>
			<guid isPermaLink="false">69a71876db942e85cc86f00c</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://shows.acast.com/the-data-fix/episodes/hollowed-with-olivia-guest-iris-van-rooij-and-andrea-reyes-e</link>
			<acast:episodeId>69a71876db942e85cc86f00c</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>hollowed-with-olivia-guest-iris-van-rooij-and-andrea-reyes-e</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxrssW4Ugm3ajcIksOgJFu8EApOepvCswcNo5ZwSIsQsZighwwxni++ZhK/kWbUyiVZgjRgbLCycR+Jm4mEbX4zB]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>78</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>In this episode, Olivia Guest, Iris van Rooij and Andrea Reyes Elizondo discuss why it’s important to the overall purpose and significance of the university to resist the uncritical adoption of AI in academia. The risk of AI adoption is that it’ll hollow out the institutions first, and then society at large. Recorded Feb 27, 2026. Released March 16, 2026.</p><br><p><strong>Against the Uncritical Adoption of 'AI' Technologies in Academia</strong></p><p>Olivia Guest,&nbsp;Marcela Suarez,&nbsp;Barbara Müller,&nbsp;Edwin van Meerkerk,&nbsp;Arnoud Oude Groote Beverborg,&nbsp;Ronald de Haan,&nbsp;Andrea Reyes Elizondo,&nbsp;Mark Blokpoel,&nbsp;Natalia Scharfenberg,&nbsp;Annelies Kleinherenbrink,&nbsp;Ileana Camerino,&nbsp;Marieke Woensdregt,&nbsp;Dagmar Monett,&nbsp;Jed Brown,&nbsp;Lucy Avraamidou,&nbsp;Juliette Alenda-Demoutiez,&nbsp;Felienne Hermans&nbsp;&amp;&nbsp;Iris van Rooij</p><p><a href="https://philarchive.org/rec/GUEATU" rel="noopener noreferrer" target="_blank">https://philarchive.org/rec/GUEATU</a></p><br><p><strong>*the 2nd&nbsp;quote read on the episode:</strong></p><p>Guest, O.&nbsp;(2025).&nbsp;<a href="https://can01.safelinks.protection.outlook.com/?url=http%3A%2F%2Fdoi.org%2F10.48550%2FarXiv.2507.19960&amp;data=05%7C02%7Cmel.hogan%40queensu.ca%7C6fc7e54996fd4d56681008de769d1ecb%7Cd61ecb3b38b142d582c4efb2838b925c%7C1%7C0%7C639078614368001889%7CUnknown%7CTWFpbGZsb3d8eyJFbXB0eU1hcGkiOnRydWUsIlYiOiIwLjAuMDAwMCIsIlAiOiJXaW4zMiIsIkFOIjoiTWFpbCIsIldUIjoyfQ%3D%3D%7C0%7C%7C%7C&amp;sdata=sqZf6hGZ%2FmXwu6lq1SUh%2BJx%2BYy2OSku2kdxRG5F%2Bwzo%3D&amp;reserved=0" rel="noopener noreferrer" target="_blank">What Does 'Human-Centred AI' Mean?</a>.&nbsp;<em>arXiv</em>.&nbsp;<a 
href="https://can01.safelinks.protection.outlook.com/?url=http%3A%2F%2Fdoi.org%2F10.48550%2FarXiv.2507.19960&amp;data=05%7C02%7Cmel.hogan%40queensu.ca%7C6fc7e54996fd4d56681008de769d1ecb%7Cd61ecb3b38b142d582c4efb2838b925c%7C1%7C0%7C639078614368029197%7CUnknown%7CTWFpbGZsb3d8eyJFbXB0eU1hcGkiOnRydWUsIlYiOiIwLjAuMDAwMCIsIlAiOiJXaW4zMiIsIkFOIjoiTWFpbCIsIldUIjoyfQ%3D%3D%7C0%7C%7C%7C&amp;sdata=ZlZhNyKVX7i2TZwqvN%2FhvOb%2BRFIGY%2F824nrsv%2Fgd0ws%3D&amp;reserved=0" rel="noopener noreferrer" target="_blank">https://doi.org/10.48550/arXiv.2507.19960</a></p><br><p><strong>We've been here before! What do you mean?</strong></p><p>Olivia Guest,&nbsp;18 February 2026</p><p><a href="https://olivia.science/before/" rel="noopener noreferrer" target="_blank">https://olivia.science/before/</a></p><br><p><strong>Summer School: Critical AI Literacies for Resisting and&nbsp;Reclaiming</strong></p><p><a href="https://irisvanrooijcogsci.com/2026/02/18/summer-school-critical-ai-literacies-for-resisting-and-reclaiming/" rel="noopener noreferrer" target="_blank">https://irisvanrooijcogsci.com/2026/02/18/summer-school-critical-ai-literacies-for-resisting-and-reclaiming/</a> and <a href="https://olivia.science/ai/" rel="noopener noreferrer" target="_blank">https://olivia.science/ai/</a>&nbsp;</p><br><p><strong>Academic Collaborations and Public Health: Lessons from Dutch Universities' Tobacco Industry Partnerships for Fossil Fuel Ties. </strong>Zenodo.&nbsp;</p><p>Knoester, L., Pereira, A., Vanheule, L., Reyes Elizondo, A., Littlejohn, A., &amp; Urai, A. (2025). 
</p><p><a href="https://can01.safelinks.protection.outlook.com/?url=https%3A%2F%2Fdoi.org%2F10.5281%2Fzenodo.15274865&amp;data=05%7C02%7Cmel.hogan%40queensu.ca%7C17c35ee5be4549f855a808de796b0066%7Cd61ecb3b38b142d582c4efb2838b925c%7C1%7C0%7C639081697653955943%7CUnknown%7CTWFpbGZsb3d8eyJFbXB0eU1hcGkiOnRydWUsIlYiOiIwLjAuMDAwMCIsIlAiOiJXaW4zMiIsIkFOIjoiTWFpbCIsIldUIjoyfQ%3D%3D%7C0%7C%7C%7C&amp;sdata=jS00oUp1D4K4VcbSu78vMbRu5yZsDc80tHMQ%2FP9Hhho%3D&amp;reserved=0" rel="noopener noreferrer" target="_blank">https://doi.org/10.5281/zenodo.15274865</a></p><br><p><strong>Why AI transparency is not enough</strong></p><p><a href="https://can01.safelinks.protection.outlook.com/?url=https%3A%2F%2Fwww.leidenmadtrics.nl%2Farticles%2Fwhy-ai-transparency-is-not-enough&amp;data=05%7C02%7Cmel.hogan%40queensu.ca%7C17c35ee5be4549f855a808de796b0066%7Cd61ecb3b38b142d582c4efb2838b925c%7C1%7C0%7C639081697653985478%7CUnknown%7CTWFpbGZsb3d8eyJFbXB0eU1hcGkiOnRydWUsIlYiOiIwLjAuMDAwMCIsIlAiOiJXaW4zMiIsIkFOIjoiTWFpbCIsIldUIjoyfQ%3D%3D%7C0%7C%7C%7C&amp;sdata=Nfo1OcixwoHmCP%2FFubM8HI6X2m6J9TzQfJaeMtCB6lU%3D&amp;reserved=0" rel="noopener noreferrer" target="_blank">https://www.leidenmadtrics.nl/articles/why-ai-transparency-is-not-enough</a>&nbsp;</p><br><p><br></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>In this episode, Olivia Guest, Iris van Rooij and Andrea Reyes Elizondo discuss why it’s important to the overall purpose and significance of the university to resist the uncritical adoption of AI in academia. The risk of AI adoption is that it’ll hollow out the institutions first, and then society at large. Recorded Feb 27, 2026. Released March 16, 2026.</p><br><p><strong>Against the Uncritical Adoption of 'AI' Technologies in Academia</strong></p><p>Olivia Guest,&nbsp;Marcela Suarez,&nbsp;Barbara Müller,&nbsp;Edwin van Meerkerk,&nbsp;Arnoud Oude Groote Beverborg,&nbsp;Ronald de Haan,&nbsp;Andrea Reyes Elizondo,&nbsp;Mark Blokpoel,&nbsp;Natalia Scharfenberg,&nbsp;Annelies Kleinherenbrink,&nbsp;Ileana Camerino,&nbsp;Marieke Woensdregt,&nbsp;Dagmar Monett,&nbsp;Jed Brown,&nbsp;Lucy Avraamidou,&nbsp;Juliette Alenda-Demoutiez,&nbsp;Felienne Hermans&nbsp;&amp;&nbsp;Iris van Rooij</p><p><a href="https://philarchive.org/rec/GUEATU" rel="noopener noreferrer" target="_blank">https://philarchive.org/rec/GUEATU</a></p><br><p><strong>*the 2nd&nbsp;quote read on the episode:</strong></p><p>Guest, O.&nbsp;(2025).&nbsp;<a href="https://can01.safelinks.protection.outlook.com/?url=http%3A%2F%2Fdoi.org%2F10.48550%2FarXiv.2507.19960&amp;data=05%7C02%7Cmel.hogan%40queensu.ca%7C6fc7e54996fd4d56681008de769d1ecb%7Cd61ecb3b38b142d582c4efb2838b925c%7C1%7C0%7C639078614368001889%7CUnknown%7CTWFpbGZsb3d8eyJFbXB0eU1hcGkiOnRydWUsIlYiOiIwLjAuMDAwMCIsIlAiOiJXaW4zMiIsIkFOIjoiTWFpbCIsIldUIjoyfQ%3D%3D%7C0%7C%7C%7C&amp;sdata=sqZf6hGZ%2FmXwu6lq1SUh%2BJx%2BYy2OSku2kdxRG5F%2Bwzo%3D&amp;reserved=0" rel="noopener noreferrer" target="_blank">What Does 'Human-Centred AI' Mean?</a>.&nbsp;<em>arXiv</em>.&nbsp;<a 
href="https://can01.safelinks.protection.outlook.com/?url=http%3A%2F%2Fdoi.org%2F10.48550%2FarXiv.2507.19960&amp;data=05%7C02%7Cmel.hogan%40queensu.ca%7C6fc7e54996fd4d56681008de769d1ecb%7Cd61ecb3b38b142d582c4efb2838b925c%7C1%7C0%7C639078614368029197%7CUnknown%7CTWFpbGZsb3d8eyJFbXB0eU1hcGkiOnRydWUsIlYiOiIwLjAuMDAwMCIsIlAiOiJXaW4zMiIsIkFOIjoiTWFpbCIsIldUIjoyfQ%3D%3D%7C0%7C%7C%7C&amp;sdata=ZlZhNyKVX7i2TZwqvN%2FhvOb%2BRFIGY%2F824nrsv%2Fgd0ws%3D&amp;reserved=0" rel="noopener noreferrer" target="_blank">https://doi.org/10.48550/arXiv.2507.19960</a></p><br><p><strong>We've been here before! What do you mean?</strong></p><p>Olivia Guest,&nbsp;18 February 2026</p><p><a href="https://olivia.science/before/" rel="noopener noreferrer" target="_blank">https://olivia.science/before/</a></p><br><p><strong>Summer School: Critical AI Literacies for Resisting and&nbsp;Reclaiming</strong></p><p><a href="https://irisvanrooijcogsci.com/2026/02/18/summer-school-critical-ai-literacies-for-resisting-and-reclaiming/" rel="noopener noreferrer" target="_blank">https://irisvanrooijcogsci.com/2026/02/18/summer-school-critical-ai-literacies-for-resisting-and-reclaiming/</a> and <a href="https://olivia.science/ai/" rel="noopener noreferrer" target="_blank">https://olivia.science/ai/</a>&nbsp;</p><br><p><strong>Academic Collaborations and Public Health: Lessons from Dutch Universities' Tobacco Industry Partnerships for Fossil Fuel Ties. </strong>Zenodo.&nbsp;</p><p>Knoester, L., Pereira, A., Vanheule, L., Reyes Elizondo, A., Littlejohn, A., &amp; Urai, A. (2025). 
</p><p><a href="https://can01.safelinks.protection.outlook.com/?url=https%3A%2F%2Fdoi.org%2F10.5281%2Fzenodo.15274865&amp;data=05%7C02%7Cmel.hogan%40queensu.ca%7C17c35ee5be4549f855a808de796b0066%7Cd61ecb3b38b142d582c4efb2838b925c%7C1%7C0%7C639081697653955943%7CUnknown%7CTWFpbGZsb3d8eyJFbXB0eU1hcGkiOnRydWUsIlYiOiIwLjAuMDAwMCIsIlAiOiJXaW4zMiIsIkFOIjoiTWFpbCIsIldUIjoyfQ%3D%3D%7C0%7C%7C%7C&amp;sdata=jS00oUp1D4K4VcbSu78vMbRu5yZsDc80tHMQ%2FP9Hhho%3D&amp;reserved=0" rel="noopener noreferrer" target="_blank">https://doi.org/10.5281/zenodo.15274865</a></p><br><p><strong>Why AI transparency is not enough</strong></p><p><a href="https://can01.safelinks.protection.outlook.com/?url=https%3A%2F%2Fwww.leidenmadtrics.nl%2Farticles%2Fwhy-ai-transparency-is-not-enough&amp;data=05%7C02%7Cmel.hogan%40queensu.ca%7C17c35ee5be4549f855a808de796b0066%7Cd61ecb3b38b142d582c4efb2838b925c%7C1%7C0%7C639081697653985478%7CUnknown%7CTWFpbGZsb3d8eyJFbXB0eU1hcGkiOnRydWUsIlYiOiIwLjAuMDAwMCIsIlAiOiJXaW4zMiIsIkFOIjoiTWFpbCIsIldUIjoyfQ%3D%3D%7C0%7C%7C%7C&amp;sdata=Nfo1OcixwoHmCP%2FFubM8HI6X2m6J9TzQfJaeMtCB6lU%3D&amp;reserved=0" rel="noopener noreferrer" target="_blank">https://www.leidenmadtrics.nl/articles/why-ai-transparency-is-not-enough</a>&nbsp;</p><br><p><br></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title><![CDATA[Opposition, with Daisy Maldonado, Annie Ersinghaus & Gilberto Manzanarez]]></title>
			<itunes:title><![CDATA[Opposition, with Daisy Maldonado, Annie Ersinghaus & Gilberto Manzanarez]]></itunes:title>
			<pubDate>Mon, 02 Mar 2026 11:00:00 GMT</pubDate>
			<itunes:duration>1:00:06</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/699e14e1123f974082fd2cf9/media.mp3" length="88959520" type="audio/mpeg"/>
			<guid isPermaLink="false">699e14e1123f974082fd2cf9</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://shows.acast.com/the-data-fix/episodes/opposition-with-daisy-maldonado-annie-ersinghaus-gilberto-ma</link>
			<acast:episodeId>699e14e1123f974082fd2cf9</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>opposition-with-daisy-maldonado-annie-ersinghaus-gilberto-ma</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxo7yaMCth/L31RHYSJtJnXpIidDXjksCZ85ksZ7Z4ngllc4s+5oXry6jIK+R+Pvei4CvvvOe2NR/1f1Y0HuYMZB]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>77</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>In this episode, Daisy Maldonado, Annie Ersinghaus, Gilberto Manzanarez and I discuss on-the-ground opposition to two data centers being built against the will of local residents: Project Jupiter (Oracle/OpenAI) in New Mexico, and the largest data center in California -- in Imperial Valley.&nbsp;This conversation was part of: <a href="https://cal.sdsu.edu/humtech/events" rel="noopener noreferrer" target="_blank">Powering AI from the Borderlands: Organizing Against Data Centers</a>, facilitated by Dustin Edwards. Recorded Feb 24, 2026. Released Monday March 2, 2026.</p><br><p><br></p><p><strong>Powering AI from the Borderlands: Organizing Against Data Centers</strong></p><p>https://cal.sdsu.edu/humtech/events</p><br><p><strong>Gilberto Manzanarez on Instagram</strong></p><p><a href="https://www.instagram.com/valleimperialresiste/" rel="noopener noreferrer" target="_blank">https://www.instagram.com/valleimperialresiste/</a></p><br><p><strong>Resistance to data centers rises on the border</strong></p><p><a href="https://can01.safelinks.protection.outlook.com/?url=https%3A%2F%2Fwww.hcn.org%2Farticles%2Fresistance-to-data-centers-rises-on-the-border%2F&amp;data=05%7C02%7Cmel.hogan%40queensu.ca%7C3f38cfc07eb34a8e97f508de6cf20568%7Cd61ecb3b38b142d582c4efb2838b925c%7C1%7C0%7C639067984396997288%7CUnknown%7CTWFpbGZsb3d8eyJFbXB0eU1hcGkiOnRydWUsIlYiOiIwLjAuMDAwMCIsIlAiOiJXaW4zMiIsIkFOIjoiTWFpbCIsIldUIjoyfQ%3D%3D%7C0%7C%7C%7C&amp;sdata=%2BAVJMAR6e7nfY96H%2F8ltjEA2ObTWf0fAhQjgcVsVAIo%3D&amp;reserved=0" rel="noopener noreferrer" target="_blank">https://www.hcn.org/articles/resistance-to-data-centers-rises-on-the-border/</a></p><br><p><strong>"Jupiter Watch" videos on Project Jupiter in New Mexico</strong></p><p><a 
href="https://can01.safelinks.protection.outlook.com/?url=https%3A%2F%2Fwww.youtube.com%2Fwatch%3Fv%3DPi_phIsUpRg%26t%3D5s&amp;data=05%7C02%7Cmel.hogan%40queensu.ca%7C3f38cfc07eb34a8e97f508de6cf20568%7Cd61ecb3b38b142d582c4efb2838b925c%7C1%7C0%7C639067984397014406%7CUnknown%7CTWFpbGZsb3d8eyJFbXB0eU1hcGkiOnRydWUsIlYiOiIwLjAuMDAwMCIsIlAiOiJXaW4zMiIsIkFOIjoiTWFpbCIsIldUIjoyfQ%3D%3D%7C0%7C%7C%7C&amp;sdata=B7zDofOGE4Cwtyf6SUuE7rmt8NBr8Dk6NKt4yOKBqlM%3D&amp;reserved=0" rel="noopener noreferrer" target="_blank">https://www.youtube.com/watch?v=Pi_phIsUpRg&amp;t=5s</a></p><br><p><strong>The Water Is Coming ¡Ya Viene La Agua!</strong></p><p>Annie Ersinghaus’s short doc on the politics of the Rio Grande (Project Jupiter is pulling from underground water along the Rio Grande)</p><p><a href="https://can01.safelinks.protection.outlook.com/?url=https%3A%2F%2Fwww.youtube.com%2Fwatch%3Fv%3DKgab5ICoWJI%26t%3D430s&amp;data=05%7C02%7Cmel.hogan%40queensu.ca%7C3f38cfc07eb34a8e97f508de6cf20568%7Cd61ecb3b38b142d582c4efb2838b925c%7C1%7C0%7C639067984397030324%7CUnknown%7CTWFpbGZsb3d8eyJFbXB0eU1hcGkiOnRydWUsIlYiOiIwLjAuMDAwMCIsIlAiOiJXaW4zMiIsIkFOIjoiTWFpbCIsIldUIjoyfQ%3D%3D%7C0%7C%7C%7C&amp;sdata=WYA%2FuI2ixJ%2FfKaGzVVDuCkNc3RCjE2N1tudK2JfD5hg%3D&amp;reserved=0" rel="noopener noreferrer" target="_blank">https://www.youtube.com/watch?v=Kgab5ICoWJI&amp;t=430s</a>&nbsp;</p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>In this episode, Daisy Maldonado, Annie Ersinghaus, Gilberto Manzanarez and I discuss on-the-ground opposition to two data centers being built against the will of local residents: Project Jupiter (Oracle/OpenAI) in New Mexico, and the largest data center in California -- in Imperial Valley.&nbsp;This conversation was part of: <a href="https://cal.sdsu.edu/humtech/events" rel="noopener noreferrer" target="_blank">Powering AI from the Borderlands: Organizing Against Data Centers</a>, facilitated by Dustin Edwards. Recorded Feb 24, 2026. Released Monday March 2, 2026.</p><br><p><br></p><p><strong>Powering AI from the Borderlands: Organizing Against Data Centers</strong></p><p>https://cal.sdsu.edu/humtech/events</p><br><p><strong>Gilberto Manzanarez on Instagram</strong></p><p><a href="https://www.instagram.com/valleimperialresiste/" rel="noopener noreferrer" target="_blank">https://www.instagram.com/valleimperialresiste/</a></p><br><p><strong>Resistance to data centers rises on the border</strong></p><p><a href="https://can01.safelinks.protection.outlook.com/?url=https%3A%2F%2Fwww.hcn.org%2Farticles%2Fresistance-to-data-centers-rises-on-the-border%2F&amp;data=05%7C02%7Cmel.hogan%40queensu.ca%7C3f38cfc07eb34a8e97f508de6cf20568%7Cd61ecb3b38b142d582c4efb2838b925c%7C1%7C0%7C639067984396997288%7CUnknown%7CTWFpbGZsb3d8eyJFbXB0eU1hcGkiOnRydWUsIlYiOiIwLjAuMDAwMCIsIlAiOiJXaW4zMiIsIkFOIjoiTWFpbCIsIldUIjoyfQ%3D%3D%7C0%7C%7C%7C&amp;sdata=%2BAVJMAR6e7nfY96H%2F8ltjEA2ObTWf0fAhQjgcVsVAIo%3D&amp;reserved=0" rel="noopener noreferrer" target="_blank">https://www.hcn.org/articles/resistance-to-data-centers-rises-on-the-border/</a></p><br><p><strong>"Jupiter Watch" videos on Project Jupiter in New Mexico</strong></p><p><a 
href="https://can01.safelinks.protection.outlook.com/?url=https%3A%2F%2Fwww.youtube.com%2Fwatch%3Fv%3DPi_phIsUpRg%26t%3D5s&amp;data=05%7C02%7Cmel.hogan%40queensu.ca%7C3f38cfc07eb34a8e97f508de6cf20568%7Cd61ecb3b38b142d582c4efb2838b925c%7C1%7C0%7C639067984397014406%7CUnknown%7CTWFpbGZsb3d8eyJFbXB0eU1hcGkiOnRydWUsIlYiOiIwLjAuMDAwMCIsIlAiOiJXaW4zMiIsIkFOIjoiTWFpbCIsIldUIjoyfQ%3D%3D%7C0%7C%7C%7C&amp;sdata=B7zDofOGE4Cwtyf6SUuE7rmt8NBr8Dk6NKt4yOKBqlM%3D&amp;reserved=0" rel="noopener noreferrer" target="_blank">https://www.youtube.com/watch?v=Pi_phIsUpRg&amp;t=5s</a></p><br><p><strong>The Water Is Coming ¡Ya Viene La Agua!</strong></p><p>Annie Ersinghaus’s short doc on the politics of the Rio Grande (Project Jupiter is pulling from underground water along the Rio Grande)</p><p><a href="https://can01.safelinks.protection.outlook.com/?url=https%3A%2F%2Fwww.youtube.com%2Fwatch%3Fv%3DKgab5ICoWJI%26t%3D430s&amp;data=05%7C02%7Cmel.hogan%40queensu.ca%7C3f38cfc07eb34a8e97f508de6cf20568%7Cd61ecb3b38b142d582c4efb2838b925c%7C1%7C0%7C639067984397030324%7CUnknown%7CTWFpbGZsb3d8eyJFbXB0eU1hcGkiOnRydWUsIlYiOiIwLjAuMDAwMCIsIlAiOiJXaW4zMiIsIkFOIjoiTWFpbCIsIldUIjoyfQ%3D%3D%7C0%7C%7C%7C&amp;sdata=WYA%2FuI2ixJ%2FfKaGzVVDuCkNc3RCjE2N1tudK2JfD5hg%3D&amp;reserved=0" rel="noopener noreferrer" target="_blank">https://www.youtube.com/watch?v=Kgab5ICoWJI&amp;t=430s</a>&nbsp;</p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Enough, with Adam Becker</title>
			<itunes:title>Enough, with Adam Becker</itunes:title>
			<pubDate>Mon, 16 Feb 2026 11:00:00 GMT</pubDate>
			<itunes:duration>1:11:36</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/69909a48b0cb4fc2fd5e50e2/media.mp3" length="114643072" type="audio/mpeg"/>
			<guid isPermaLink="false">69909a48b0cb4fc2fd5e50e2</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://shows.acast.com/the-data-fix/episodes/enough-with-adam-becker</link>
			<acast:episodeId>69909a48b0cb4fc2fd5e50e2</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>enough-with-adam-becker</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxo4P8y/9Q9zWazZpfWYno46T6dDx6eI7Q0DmHV4zq63N2cx6iUPA+hFIAcathsstK5ZXiyZTGFNhFhXC1waLwf4]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>76</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>In this episode, Adam Becker and I talk about our AI overlords and their “philosophical” influences — mostly eugenics-based pseudoscience and bad readings of sci-fi that make tech billionaires feel like they’ve earned their billions by being the smartest people on the planet… while ruining the planet. Recorded Feb 13, 2026. Released Feb 16, 2026.</p><br><p><strong><em>More Everything Forever: AI Overlords, Space Empires, and Silicon Valley's Crusade to Control the Fate of Humanity</em> (2025) </strong><a href="https://mitpressbookstore.mit.edu/book/9781541619593" rel="noopener noreferrer" target="_blank">https://mitpressbookstore.mit.edu/book/9781541619593</a>&nbsp;</p><br><p>BBC:&nbsp;<a href="https://www.bbc.co.uk/programmes/w172zsskmkss5ll" rel="noopener noreferrer" target="_blank">https://www.bbc.co.uk/programmes/w172zsskmkss5ll</a> (start at 39:15)</p><br><p>Rolling Stone Q&amp;A: <a href="https://can01.safelinks.protection.outlook.com/?url=https%3A%2F%2Fwww.rollingstone.com%2Fculture%2Fculture-features%2Ftech-billionaires-adam-becker-1235381649%2F&amp;data=05%7C02%7Cmel.hogan%40queensu.ca%7Cdd7efb56287c4b1e4adb08de6b470b6d%7Cd61ecb3b38b142d582c4efb2838b925c%7C1%7C0%7C639066150112348115%7CUnknown%7CTWFpbGZsb3d8eyJFbXB0eU1hcGkiOnRydWUsIlYiOiIwLjAuMDAwMCIsIlAiOiJXaW4zMiIsIkFOIjoiTWFpbCIsIldUIjoyfQ%3D%3D%7C0%7C%7C%7C&amp;sdata=nxxUkOcU0w8Js1oFFeWIkNjgBpqlQyqMDKzWPQslRvQ%3D&amp;reserved=0" rel="noopener noreferrer" target="_blank">https://www.rollingstone.com/culture/culture-features/tech-billionaires-adam-becker-1235381649/</a></p><br><p><strong><em>Dreaming Against the Machine</em></strong></p><p><a href="http://dreamingagainstthemachine.com/" rel="noopener noreferrer" target="_blank">http://dreamingagainstthemachine.com/</a></p><p>Forthcoming: Adam Becker’s new podcast website (bookmark this now for later!) 
</p><br><p>Mentioned in our conversation:</p><p>I Am An AI Hater by Anthony Moser</p><p><a href="https://anthonymoser.github.io/writing/ai/haterdom/2025/08/26/i-am-an-ai-hater.html" rel="noopener noreferrer" target="_blank">https://anthonymoser.github.io/writing/ai/haterdom/2025/08/26/i-am-an-ai-hater.html</a>&nbsp;</p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>In this episode, Adam Becker and I talk about our AI overlords and their “philosophical” influences — mostly eugenics-based pseudoscience and bad readings of sci-fi that make tech billionaires feel like they’ve earned their billions by being the smartest people on the planet… while ruining the planet. Recorded Feb 13, 2026. Released Feb 16, 2026.</p><br><p><strong><em>More Everything Forever: AI Overlords, Space Empires, and Silicon Valley's Crusade to Control the Fate of Humanity</em> (2025) </strong><a href="https://mitpressbookstore.mit.edu/book/9781541619593" rel="noopener noreferrer" target="_blank">https://mitpressbookstore.mit.edu/book/9781541619593</a>&nbsp;</p><br><p>BBC:&nbsp;<a href="https://www.bbc.co.uk/programmes/w172zsskmkss5ll" rel="noopener noreferrer" target="_blank">https://www.bbc.co.uk/programmes/w172zsskmkss5ll</a> (start at 39:15)</p><br><p>Rolling Stone Q&amp;A: <a href="https://www.rollingstone.com/culture/culture-features/tech-billionaires-adam-becker-1235381649/" rel="noopener noreferrer" target="_blank">https://www.rollingstone.com/culture/culture-features/tech-billionaires-adam-becker-1235381649/</a></p><br><p><strong><em>Dreaming Against the Machine</em></strong></p><p><a href="http://dreamingagainstthemachine.com/" rel="noopener noreferrer" target="_blank">http://dreamingagainstthemachine.com/</a></p><p>Forthcoming: Adam Becker’s new podcast website (bookmark this now for later!)</p><br><p>Mentioned in our conversation:</p><p><em>I Am An AI Hater</em> by Anthony Moser</p><p><a href="https://anthonymoser.github.io/writing/ai/haterdom/2025/08/26/i-am-an-ai-hater.html" rel="noopener noreferrer" target="_blank">https://anthonymoser.github.io/writing/ai/haterdom/2025/08/26/i-am-an-ai-hater.html</a>&nbsp;</p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Containment, with Zoë Sofoulis (Zoe Sofia) and Ingrid Richardson</title>
			<itunes:title>Containment, with Zoë Sofoulis (Zoe Sofia) and Ingrid Richardson</itunes:title>
			<pubDate>Mon, 02 Feb 2026 11:00:00 GMT</pubDate>
			<itunes:duration>59:06</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/6978e497a40f59499ea8a8d7/media.mp3" length="92504608" type="audio/mpeg"/>
			<guid isPermaLink="false">6978e497a40f59499ea8a8d7</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://shows.acast.com/the-data-fix/episodes/containment-with-zoe-sofoulis-zoe-sofia-and-ingrid-richardso</link>
			<acast:episodeId>6978e497a40f59499ea8a8d7</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>containment-with-zoe-sofoulis-zoe-sofia-and-ingrid-richardso</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxqRajFj2lygMC3rHZtnrsziG1wk9BOn/t/UwmOKNTwMJQCrOSVAAAi5XABIt2ub2VgOSl3Rdb26rlnSyP4nLl/Q]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>75</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>Great theorists make you see the world differently forever after. This is the case with my guests for this episode, Zoë Sofoulis (Zoe Sofia) and Ingrid Richardson — brilliant scholars helping us rethink <em>containers</em> and <em>containment</em> as a feminist strategy to rework worldly narratives about what holds, filters, leaks.&nbsp;This conversation was a special honour for me. Recorded January 23, 2026. Released February 2, 2026.&nbsp;</p><br><p><strong>Containment: Technologies of Holding, Filtering, Leaking</strong> (2025)</p><p><a href="https://meson.press/books/containment/" rel="noopener noreferrer" target="_blank">https://meson.press/books/containment/</a>&nbsp;</p><br><p>Zoë Sofia, <strong>Container Technologies</strong> (2000)</p><p><a href="https://www.researchgate.net/publication/227700296_Container_Technologies" rel="noopener noreferrer" target="_blank">https://www.researchgate.net/publication/227700296_Container_Technologies</a>&nbsp;</p><br><p>Shoutouts to:</p><p><strong>Re-Understanding Media: Feminist Extensions of Marshall McLuhan</strong> (Duke University Press) <a href="https://dukeupress.edu/re-understanding-media" rel="noopener noreferrer" target="_blank">https://dukeupress.edu/re-understanding-media</a>&nbsp;</p><p><strong>Insufferable Tools: Feminism Against Big Tech</strong> <a href="https://www.dukeupress.edu/insufferable-tools" rel="noopener noreferrer" target="_blank">https://www.dukeupress.edu/insufferable-tools</a>&nbsp;</p><p><strong>Understanding Media: The Extensions of Man</strong> (1994) <a href="https://mitpress.mit.edu/9780262631594/understanding-media/" rel="noopener noreferrer" target="_blank">https://mitpress.mit.edu/9780262631594/understanding-media/</a>&nbsp;</p><p>And much more…</p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>Great theorists make you see the world differently forever after. This is the case with my guests for this episode, Zoë Sofoulis (Zoe Sofia) and Ingrid Richardson — brilliant scholars helping us rethink <em>containers</em> and <em>containment</em> as a feminist strategy to rework worldly narratives about what holds, filters, leaks.&nbsp;This conversation was a special honour for me. Recorded January 23, 2026. Released February 2, 2026.&nbsp;</p><br><p><strong>Containment: Technologies of Holding, Filtering, Leaking</strong> (2025)</p><p><a href="https://meson.press/books/containment/" rel="noopener noreferrer" target="_blank">https://meson.press/books/containment/</a>&nbsp;</p><br><p>Zoë Sofia, <strong>Container Technologies</strong> (2000)</p><p><a href="https://www.researchgate.net/publication/227700296_Container_Technologies" rel="noopener noreferrer" target="_blank">https://www.researchgate.net/publication/227700296_Container_Technologies</a>&nbsp;</p><br><p>Shoutouts to:</p><p><strong>Re-Understanding Media: Feminist Extensions of Marshall McLuhan</strong> (Duke University Press) <a href="https://dukeupress.edu/re-understanding-media" rel="noopener noreferrer" target="_blank">https://dukeupress.edu/re-understanding-media</a>&nbsp;</p><p><strong>Insufferable Tools: Feminism Against Big Tech</strong> <a href="https://www.dukeupress.edu/insufferable-tools" rel="noopener noreferrer" target="_blank">https://www.dukeupress.edu/insufferable-tools</a>&nbsp;</p><p><strong>Understanding Media: The Extensions of Man</strong> (1994) <a href="https://mitpress.mit.edu/9780262631594/understanding-media/" rel="noopener noreferrer" target="_blank">https://mitpress.mit.edu/9780262631594/understanding-media/</a>&nbsp;</p><p>And much more…</p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Friendship, with Am Johal and Matt Hern</title>
			<itunes:title>Friendship, with Am Johal and Matt Hern</itunes:title>
			<pubDate>Mon, 19 Jan 2026 11:00:00 GMT</pubDate>
			<itunes:duration>57:36</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/696d39c74796fcbb57c5dd75/media.mp3" length="89433856" type="audio/mpeg"/>
			<guid isPermaLink="false">696d39c74796fcbb57c5dd75</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://shows.acast.com/the-data-fix/episodes/friendship-with-am-johal-and-matt-hern</link>
			<acast:episodeId>696d39c74796fcbb57c5dd75</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>friendship-with-am-johal-and-matt-hern</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxpJ7z3g4SPRv6T941BlfjhVYY0QGTwpjH6+lhCQlsYW/4lqGg0YTmmu4sbYNgpAe5NYrILUconSfauGJCjNSxlP]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>74</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>In this first episode of 2026, I speak with Am Johal and Matt Hern, authors of "O My Friends, There Is No Friend: The Politics of Friendship at the End of Ecology" (2025), to better understand what friendship means these days... at the "end of ecology". Is friendship political? Can an AI chatbot be a friend? Recorded Jan 16, 2026. Released Jan 19, 2026. </p><br><p>Matt Hern and Am Johal, O My Friends, There Is No Friend (PDF)</p><p><a href="https://library.oapen.org/bitstream/id/022e1f6f-f14a-4c11-a94c-c95610919f8c/9783839470268.pdf" rel="noopener noreferrer" target="_blank">https://library.oapen.org/bitstream/id/022e1f6f-f14a-4c11-a94c-c95610919f8c/9783839470268.pdf</a></p><br><p>Below the Radar podcast:</p><p><a href="https://www.sfu.ca/vancity-office-community-engagement/below-the-radar-podcast.html" rel="noopener noreferrer" target="_blank">https://www.sfu.ca/vancity-office-community-engagement/below-the-radar-podcast.html</a></p><br><p>***</p><br><p>New cover art by <a href="https://www.instagram.com/oonaode/" rel="noopener noreferrer" target="_blank">Oona Ode</a>!</p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>In this first episode of 2026, I speak with Am Johal and Matt Hern, authors of "O My Friends, There Is No Friend: The Politics of Friendship at the End of Ecology" (2025), to better understand what friendship means these days... at the "end of ecology". Is friendship political? Can an AI chatbot be a friend? Recorded Jan 16, 2026. Released Jan 19, 2026. </p><br><p>Matt Hern and Am Johal, O My Friends, There Is No Friend (PDF)</p><p><a href="https://library.oapen.org/bitstream/id/022e1f6f-f14a-4c11-a94c-c95610919f8c/9783839470268.pdf" rel="noopener noreferrer" target="_blank">https://library.oapen.org/bitstream/id/022e1f6f-f14a-4c11-a94c-c95610919f8c/9783839470268.pdf</a></p><br><p>Below the Radar podcast:</p><p><a href="https://www.sfu.ca/vancity-office-community-engagement/below-the-radar-podcast.html" rel="noopener noreferrer" target="_blank">https://www.sfu.ca/vancity-office-community-engagement/below-the-radar-podcast.html</a></p><br><p>***</p><br><p>New cover art by <a href="https://www.instagram.com/oonaode/" rel="noopener noreferrer" target="_blank">Oona Ode</a>!</p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Predictable, with José Marichal</title>
			<itunes:title>Predictable, with José Marichal</itunes:title>
			<pubDate>Mon, 22 Dec 2025 11:00:00 GMT</pubDate>
			<itunes:duration>55:30</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/6931db1d4a0500b7570fc0cd/media.mp3" length="85405120" type="audio/mpeg"/>
			<guid isPermaLink="false">6931db1d4a0500b7570fc0cd</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://shows.acast.com/the-data-fix/episodes/predictable-with-jose-marichal</link>
			<acast:episodeId>6931db1d4a0500b7570fc0cd</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>predictable-with-jose-marichal</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxq1M4ngvvOTaaZ+/LzJzl4ApA2enCU3FTJVY/xzhFCd/9kWFkOI8NHTKW7+COkmOxPmSjzfdtEkagAywn43Z6EU]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>73</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>José Marichal is exceptionally good at making the case for why we should all become algorithmic problems, and also become much more reflective about our engagement with social media. We are altered by recommendation algorithms, and we should probably start thinking now about how to renegotiate these terms. Happy holidays! Recorded Nov 27, 2025.&nbsp;Released Dec 22, 2025.</p><br><p><strong>You Must Become an Algorithmic Problem: Renegotiating the Socio-Technical Contract</strong></p><p><a href="https://bristoluniversitypress.co.uk/you-must-become-an-algorithmic-problem" rel="noopener noreferrer" target="_blank">https://bristoluniversitypress.co.uk/you-must-become-an-algorithmic-problem</a>&nbsp;</p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>José Marichal is exceptionally good at making the case for why we should all become algorithmic problems, and also become much more reflective about our engagement with social media. We are altered by recommendation algorithms, and we should probably start thinking now about how to renegotiate these terms. Happy holidays! Recorded Nov 27, 2025.&nbsp;Released Dec 22, 2025.</p><br><p><strong>You Must Become an Algorithmic Problem: Renegotiating the Socio-Technical Contract</strong></p><p><a href="https://bristoluniversitypress.co.uk/you-must-become-an-algorithmic-problem" rel="noopener noreferrer" target="_blank">https://bristoluniversitypress.co.uk/you-must-become-an-algorithmic-problem</a>&nbsp;</p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Advocates, with Meg Rintoul and Kathryn Barnwell</title>
			<itunes:title>Advocates, with Meg Rintoul and Kathryn Barnwell</itunes:title>
			<pubDate>Mon, 08 Dec 2025 11:00:00 GMT</pubDate>
			<itunes:duration>1:01:40</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/6931d5c3a63c6eaa59088d3c/media.mp3" length="91883008" type="audio/mpeg"/>
			<guid isPermaLink="false">6931d5c3a63c6eaa59088d3c</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://shows.acast.com/the-data-fix/episodes/advocates-with-meg-rintoul-and-kathryn-barnwell</link>
			<acast:episodeId>6931d5c3a63c6eaa59088d3c</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>advocates-with-meg-rintoul-and-kathryn-barnwell</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxqSZhTCyvxwcwzukt8UiscFDrHAWHI0esxqrkZe1xbVSO7IASJtIm6KWjfkfVofrI398pjPk54tk9OTQOJ4HuR+]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>72</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>In this episode I speak with friends-neighbours-advocates Meg Rintoul and Kathryn Barnwell about current plans to build an AI data centre in Nanaimo, B.C., and about allowing ourselves to write a new story about the future. Recorded Nov 27, 2025. Released Dec 8, 2025.</p><br><p><strong>'Very scary': Nanaimo neighbours have water worries about new data centre</strong></p><p><a href="https://www.reddit.com/r/nanaimo/comments/1oo24s3/very_scary_nanaimo_neighbours_have_water_worries/" rel="noopener noreferrer" target="_blank">https://www.reddit.com/r/nanaimo/comments/1oo24s3/very_scary_nanaimo_neighbours_have_water_worries/</a></p><br><p><strong>'An Island concern': Nanaimo water advocates want new data centre stopped</strong></p><p><a href="https://www.youtube.com/watch?v=0OQzc2bhtXw" rel="noopener noreferrer" target="_blank">https://www.youtube.com/watch?v=0OQzc2bhtXw</a></p><br><p><strong>CBC: AI-related data centres use vast amounts of water. But gauging how much is a murky business</strong></p><p><a href="https://www.cbc.ca/news/ai-data-centre-canada-water-use-9.6939684" rel="noopener noreferrer" target="_blank">https://www.cbc.ca/news/ai-data-centre-canada-water-use-9.6939684</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>In this episode I speak with friends-neighbours-advocates Meg Rintoul and Kathryn Barnwell about current plans to build an AI data centre in Nanaimo, B.C., and about allowing ourselves to write a new story about the future. Recorded Nov 27, 2025. Released Dec 8, 2025.</p><br><p><strong>'Very scary': Nanaimo neighbours have water worries about new data centre</strong></p><p><a href="https://www.reddit.com/r/nanaimo/comments/1oo24s3/very_scary_nanaimo_neighbours_have_water_worries/" rel="noopener noreferrer" target="_blank">https://www.reddit.com/r/nanaimo/comments/1oo24s3/very_scary_nanaimo_neighbours_have_water_worries/</a></p><br><p><strong>'An Island concern': Nanaimo water advocates want new data centre stopped</strong></p><p><a href="https://www.youtube.com/watch?v=0OQzc2bhtXw" rel="noopener noreferrer" target="_blank">https://www.youtube.com/watch?v=0OQzc2bhtXw</a></p><br><p><strong>CBC: AI-related data centres use vast amounts of water. But gauging how much is a murky business</strong></p><p><a href="https://www.cbc.ca/news/ai-data-centre-canada-water-use-9.6939684" rel="noopener noreferrer" target="_blank">https://www.cbc.ca/news/ai-data-centre-canada-water-use-9.6939684</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Tracking, with Stefanie Felsberger</title>
			<itunes:title>Tracking, with Stefanie Felsberger</itunes:title>
			<pubDate>Mon, 24 Nov 2025 11:00:00 GMT</pubDate>
			<itunes:duration>54:30</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/68fbf77c18bcdad2ab55ee2c/media.mp3" length="84491008" type="audio/mpeg"/>
			<guid isPermaLink="false">68fbf77c18bcdad2ab55ee2c</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://shows.acast.com/the-data-fix/episodes/tracking-with-stefanie-felsberger</link>
			<acast:episodeId>68fbf77c18bcdad2ab55ee2c</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>tracking-with-stefanie-felsberger</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxrfN/+tizeJD62zeUxvK3eT4YhNYHShrHWgb0IgkWOvXXrhEqVA4j+A43RbrHgnciGEF4qVQYvhXxMRXXE0Hy2z]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>71</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>In this episode, I got to walk through a recent report called “The High Stakes of Tracking Menstruation” with its author, Stefanie Felsberger, a sociologist of tech &amp; gender. I cannot express enough how much this topic can teach us about the bigger landscape of tech promises and harms. Recorded Oct 23, 2025. Released Nov 24, 2025.</p><br><p><strong>The High Stakes of Tracking Menstruation - MCTD Cambridge</strong></p><p><a href="https://www.mctd.ac.uk/femtech-high-stakes-tracking-menstruation/" rel="noopener noreferrer" target="_blank">https://www.mctd.ac.uk/femtech-high-stakes-tracking-menstruation/</a>&nbsp;</p><p><a href="https://www.mctd.ac.uk/wp-content/uploads/2025/06/The-High-Stakes-of-Tracking-Menstruation_Accessible.html" rel="noopener noreferrer" target="_blank">https://www.mctd.ac.uk/wp-content/uploads/2025/06/The-High-Stakes-of-Tracking-Menstruation_Accessible.html</a>&nbsp;</p><br><p><strong>Menstrual apps harvest data that ‘puts women’s safety at risk’</strong></p><p><a href="https://www.thetimes.com/uk/healthcare/article/menstrual-apps-harvest-data-that-puts-womens-safety-at-risk-bd0srb8mt?" rel="noopener noreferrer" target="_blank">https://www.thetimes.com/uk/healthcare/article/menstrual-apps-harvest-data-that-puts-womens-safety-at-risk-bd0srb8mt?</a>&nbsp;</p><br><p><strong>Period:&nbsp;The Real Story of Menstruation&nbsp;(by Kate Clancy)</strong></p><p><a href="https://press.princeton.edu/books/hardcover/9780691191317/period?" rel="noopener noreferrer" target="_blank">https://press.princeton.edu/books/hardcover/9780691191317/period?</a>&nbsp;</p><br><p><strong>Website</strong></p><p><a href="https://www.stefaniefelsberger.com/" rel="noopener noreferrer" target="_blank">https://www.stefaniefelsberger.com</a> </p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. 
See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>In this episode, I got to walk through a recent report called “The High Stakes of Tracking Menstruation” with its author, Stefanie Felsberger, a sociologist of tech &amp; gender. I cannot express enough how much this topic can teach us about the bigger landscape of tech promises and harms. Recorded Oct 23, 2025. Released Nov 24, 2025.</p><br><p><strong>The High Stakes of Tracking Menstruation - MCTD Cambridge</strong></p><p><a href="https://www.mctd.ac.uk/femtech-high-stakes-tracking-menstruation/" rel="noopener noreferrer" target="_blank">https://www.mctd.ac.uk/femtech-high-stakes-tracking-menstruation/</a>&nbsp;</p><p><a href="https://www.mctd.ac.uk/wp-content/uploads/2025/06/The-High-Stakes-of-Tracking-Menstruation_Accessible.html" rel="noopener noreferrer" target="_blank">https://www.mctd.ac.uk/wp-content/uploads/2025/06/The-High-Stakes-of-Tracking-Menstruation_Accessible.html</a>&nbsp;</p><br><p><strong>Menstrual apps harvest data that ‘puts women’s safety at risk’</strong></p><p><a href="https://www.thetimes.com/uk/healthcare/article/menstrual-apps-harvest-data-that-puts-womens-safety-at-risk-bd0srb8mt?" rel="noopener noreferrer" target="_blank">https://www.thetimes.com/uk/healthcare/article/menstrual-apps-harvest-data-that-puts-womens-safety-at-risk-bd0srb8mt?</a>&nbsp;</p><br><p><strong>Period:&nbsp;The Real Story of Menstruation&nbsp;(by Kate Clancy)</strong></p><p><a href="https://press.princeton.edu/books/hardcover/9780691191317/period?" rel="noopener noreferrer" target="_blank">https://press.princeton.edu/books/hardcover/9780691191317/period?</a>&nbsp;</p><br><p><strong>Website</strong></p><p><a href="https://www.stefaniefelsberger.com/" rel="noopener noreferrer" target="_blank">https://www.stefaniefelsberger.com</a> </p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. 
See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Surveillance, with Justin Hendrix</title>
			<itunes:title>Surveillance, with Justin Hendrix</itunes:title>
			<pubDate>Mon, 10 Nov 2025 11:00:00 GMT</pubDate>
			<itunes:duration>52:28</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/68fbf2d7deee754a72e9a30d/media.mp3" length="78232576" type="audio/mpeg"/>
			<guid isPermaLink="false">68fbf2d7deee754a72e9a30d</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://shows.acast.com/the-data-fix/episodes/surveillance-with-justin-hendrix</link>
			<acast:episodeId>68fbf2d7deee754a72e9a30d</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>surveillance-with-justin-hendrix</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxoFvVLJdlT1/jHdpJskaQNMgwVZiZv49EPYxjx/Df2N4/hlx3RiNtbpWAxkK51M+mnzQNUIYhgkPeefeiSHkDM0]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>70</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>In this episode, I speak with Justin Hendrix, the CEO and Editor of Tech Policy Press, a nonprofit media venture concerned with the intersection of technology and democracy.&nbsp;We talk about ICE (US Immigration and Customs Enforcement), surveillance, and AI. Recorded Oct 21, 2025. Released Nov 10, 2025.&nbsp;</p><br><p><strong>Republican Budget Bill Signals New Era in Federal Surveillance</strong></p><p>DEAN JACKSON,&nbsp;JUSTIN HENDRIX /&nbsp;JUL 2, 2025</p><p><a href="https://www.techpolicy.press/republican-budget-bill-signals-new-era-in-federal-surveillance/" rel="noopener noreferrer" target="_blank">https://www.techpolicy.press/republican-budget-bill-signals-new-era-in-federal-surveillance/</a></p><br><p><strong>Amidst Violent Immigration Raids, DHS Turns to Big Tech to Silence Dissent</strong></p><p>JENNA RUDDOCK /&nbsp;OCT 3, 2025</p><p><a href="https://www.techpolicy.press/amidst-violent-immigration-raids-dhs-turns-to-big-tech-to-silence-dissent/" rel="noopener noreferrer" target="_blank">https://www.techpolicy.press/amidst-violent-immigration-raids-dhs-turns-to-big-tech-to-silence-dissent/</a>&nbsp;</p><br><p><strong>AI Surveillance on the Rise in US, but Tactics of Repression Not New</strong></p><p>DIA KAYYALI /&nbsp;MAR 26, 2025</p><p><a href="https://www.techpolicy.press/ai-surveillance-on-the-rise-in-us-but-tactics-of-repression-not-new/" rel="noopener noreferrer" target="_blank">https://www.techpolicy.press/ai-surveillance-on-the-rise-in-us-but-tactics-of-repression-not-new/</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>In this episode, I speak with Justin Hendrix, the CEO and Editor of Tech Policy Press, a nonprofit media venture concerned with the intersection of technology and democracy.&nbsp;We talk about ICE (US Immigration and Customs Enforcement), surveillance, and AI. Recorded Oct 21, 2025. Released Nov 10, 2025.&nbsp;</p><br><p><strong>Republican Budget Bill Signals New Era in Federal Surveillance</strong></p><p>DEAN JACKSON,&nbsp;JUSTIN HENDRIX /&nbsp;JUL 2, 2025</p><p><a href="https://www.techpolicy.press/republican-budget-bill-signals-new-era-in-federal-surveillance/" rel="noopener noreferrer" target="_blank">https://www.techpolicy.press/republican-budget-bill-signals-new-era-in-federal-surveillance/</a></p><br><p><strong>Amidst Violent Immigration Raids, DHS Turns to Big Tech to Silence Dissent</strong></p><p>JENNA RUDDOCK /&nbsp;OCT 3, 2025</p><p><a href="https://www.techpolicy.press/amidst-violent-immigration-raids-dhs-turns-to-big-tech-to-silence-dissent/" rel="noopener noreferrer" target="_blank">https://www.techpolicy.press/amidst-violent-immigration-raids-dhs-turns-to-big-tech-to-silence-dissent/</a>&nbsp;</p><br><p><strong>AI Surveillance on the Rise in US, but Tactics of Repression Not New</strong></p><p>DIA KAYYALI /&nbsp;MAR 26, 2025</p><p><a href="https://www.techpolicy.press/ai-surveillance-on-the-rise-in-us-but-tactics-of-repression-not-new/" rel="noopener noreferrer" target="_blank">https://www.techpolicy.press/ai-surveillance-on-the-rise-in-us-but-tactics-of-repression-not-new/</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Sovereign, with Paris Marx</title>
			<itunes:title>Sovereign, with Paris Marx</itunes:title>
			<pubDate>Mon, 20 Oct 2025 10:00:00 GMT</pubDate>
			<itunes:duration>53:58</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/68f25fcdb5743a0a56575645/media.mp3" length="84826336" type="audio/mpeg"/>
			<guid isPermaLink="false">68f25fcdb5743a0a56575645</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://shows.acast.com/the-data-fix/episodes/sovereign-with-paris-marx</link>
			<acast:episodeId>68f25fcdb5743a0a56575645</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>sovereign-with-paris-marx</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxpRKzmqDArZOBZWwT1N9AMRvUgV6Yc2IAgpM0VTPA3HTsbkglZdcrzBhxCzyXw344guwvC/28RQQY/PdUnUpMv5]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>69</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>Host of <em>Tech Won't Save Us</em> and acclaimed tech critic, author, and international speaker, Paris Marx joins me for this episode where we discuss AI futures in a Canadian context: the idea of a "sovereign cloud", an "AI minister", and much more! Recorded Oct 15, 2025. Released Oct 20, 2025.</p><br><p><strong>Website</strong></p><p><a href="https://parismarx.com/" rel="noopener noreferrer" target="_blank">https://parismarx.com/</a></p><br><p><strong>Tech Won't Save Us</strong></p><p><a href="https://techwontsave.us/" rel="noopener noreferrer" target="_blank">https://techwontsave.us/</a></p><br><p><strong>Disconnect</strong></p><p><a href="https://disconnect.blog/im-writing-a-new-book/" rel="noopener noreferrer" target="_blank">https://disconnect.blog/im-writing-a-new-book/</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>Host of <em>Tech Won't Save Us</em> and acclaimed tech critic, author, and international speaker, Paris Marx joins me for this episode to discuss AI futures in a Canadian context: the idea of a "sovereign cloud", an "AI minister", and much more! Recorded Oct 15, 2025. Released Oct 20, 2025.</p><br><p><strong>Website</strong></p><p><a href="https://parismarx.com/" rel="noopener noreferrer" target="_blank">https://parismarx.com/</a></p><br><p><strong>Tech Won't Save Us</strong></p><p><a href="https://techwontsave.us/" rel="noopener noreferrer" target="_blank">https://techwontsave.us/</a></p><br><p><strong>Disconnect</strong></p><p><a href="https://disconnect.blog/im-writing-a-new-book/" rel="noopener noreferrer" target="_blank">https://disconnect.blog/im-writing-a-new-book/</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Witnessing, with Michael Richardson</title>
			<itunes:title>Witnessing, with Michael Richardson</itunes:title>
			<pubDate>Mon, 25 Aug 2025 12:00:00 GMT</pubDate>
			<itunes:duration>1:10:48</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/68a3508cc29d1f9af2db7835/media.mp3" length="99183712" type="audio/mpeg"/>
			<guid isPermaLink="false">68a3508cc29d1f9af2db7835</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://shows.acast.com/the-data-fix/episodes/witnessing-with-michael-richardson</link>
			<acast:episodeId>68a3508cc29d1f9af2db7835</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>witnessing-with-michael-richardson</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxrM+4E7+6zQBmY+9nLXB5yUdW/lXNc3FmdWE1Z1drwJKejbTApt1M1obEEpp/Mf1zW8iIdyOjToeGIoHEhchd/y]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>68</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>In this episode, I get to chat with the brilliant Michael Richardson about the concept of "Nonhuman Witnessing", especially as it relates to algorithms and AI. In his book, "Nonhuman Witnessing" (Duke), he argues that a "radical rethinking of what counts as witnessing is central to building frameworks for justice in an era of endless war, ecological catastrophe, and technological capture". Recorded August 13, 2025. Released August 25, 2025.</p><br><p><strong>Nonhuman Witnessing: War, Data, and Ecology after the End of the World (Duke, 2024)</strong></p><p><a href="https://read.dukeupress.edu/books/book/3310/Nonhuman-WitnessingWar-Data-and-Ecology-after-the" rel="noopener noreferrer" target="_blank">https://read.dukeupress.edu/books/book/3310/Nonhuman-WitnessingWar-Data-and-Ecology-after-the</a></p><br><p><strong>Website</strong></p><p><a href="https://www.unsw.edu.au/staff/michael-richardson" rel="noopener noreferrer" target="_blank">https://www.unsw.edu.au/staff/michael-richardson</a></p><br><p><strong>Bluesky</strong></p><p><a href="https://bsky.app/profile/richardsonma.bsky.social" rel="noopener noreferrer" target="_blank">@richardsonma.bsky.social</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>In this episode, I get to chat with the brilliant Michael Richardson about the concept of "Nonhuman Witnessing", especially as it relates to algorithms and AI. In his book, "Nonhuman Witnessing" (Duke), he argues that a "radical rethinking of what counts as witnessing is central to building frameworks for justice in an era of endless war, ecological catastrophe, and technological capture". Recorded August 13, 2025. Released August 25, 2025.</p><br><p><strong>Nonhuman Witnessing: War, Data, and Ecology after the End of the World (Duke, 2024)</strong></p><p><a href="https://read.dukeupress.edu/books/book/3310/Nonhuman-WitnessingWar-Data-and-Ecology-after-the" rel="noopener noreferrer" target="_blank">https://read.dukeupress.edu/books/book/3310/Nonhuman-WitnessingWar-Data-and-Ecology-after-the</a></p><br><p><strong>Website</strong></p><p><a href="https://www.unsw.edu.au/staff/michael-richardson" rel="noopener noreferrer" target="_blank">https://www.unsw.edu.au/staff/michael-richardson</a></p><br><p><strong>Bluesky</strong></p><p><a href="https://bsky.app/profile/richardsonma.bsky.social" rel="noopener noreferrer" target="_blank">@richardsonma.bsky.social</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Dopaminergic, with Rohit Revi</title>
			<itunes:title>Dopaminergic, with Rohit Revi</itunes:title>
			<pubDate>Mon, 11 Aug 2025 12:00:00 GMT</pubDate>
			<itunes:duration>58:48</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/687e8505a2391fe4321280c2/media.mp3" length="85054816" type="audio/mpeg"/>
			<guid isPermaLink="false">687e8505a2391fe4321280c2</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://shows.acast.com/the-data-fix/episodes/dopaminergic-with-rohit-revi</link>
			<acast:episodeId>687e8505a2391fe4321280c2</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>dopaminergic-with-rohit-revi</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxradRdVokTPYA1Nma3PM/z1IJJ4t9dR0+GLOeosWZ/1j+EOEQ9ArYKs4awxmNS3dmdKvoDS/K0170YOFI9FSbWB]]></acast:settings>
			<itunes:subtitle>Dopaminergic, with Rohit Revi</itunes:subtitle>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>67</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>Rohit Revi walks us through paranoia, care, conspiracy, capitalism, and catastrophe, in relation to technology and culture, to draw us into a deeper consideration of collective psychic resources and psychological commons. We talk about psychometry and linger on the dopaminergic. Recorded July 16, 2025. Released August 11, 2025.</p><br><p><a href="https://qspace.library.queensu.ca/browse/author?scope=28264e28-1843-437c-abca-776a363a1c1c&amp;value=Revi,%20Rohit&amp;bbm.return=1" rel="noopener noreferrer" target="_blank"><strong>Great Delirium: Culture, Technology, and Paranoia in the New Age of Catastrophe</strong></a></p><p>(2025-02-24) Revi, Rohit; Cultural Studies; Murakami Wood, David; McBlane, Angus</p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>Rohit Revi walks us through paranoia, care, conspiracy, capitalism, and catastrophe, in relation to technology and culture, to draw us into a deeper consideration of collective psychic resources and psychological commons. We talk about psychometry and linger on the dopaminergic. Recorded July 16, 2025. Released August 11, 2025.</p><br><p><a href="https://qspace.library.queensu.ca/browse/author?scope=28264e28-1843-437c-abca-776a363a1c1c&amp;value=Revi,%20Rohit&amp;bbm.return=1" rel="noopener noreferrer" target="_blank"><strong>Great Delirium: Culture, Technology, and Paranoia in the New Age of Catastrophe</strong></a></p><p>(2025-02-24) Revi, Rohit; Cultural Studies; Murakami Wood, David; McBlane, Angus</p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Con, with Emily M. Bender and Alex Hanna</title>
			<itunes:title>Con, with Emily M. Bender and Alex Hanna</itunes:title>
			<pubDate>Mon, 28 Jul 2025 12:00:00 GMT</pubDate>
			<itunes:duration>56:44</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/687e7d41fd9acfeba43be41d/media.mp3" length="82764736" type="audio/mpeg"/>
			<guid isPermaLink="false">687e7d41fd9acfeba43be41d</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://shows.acast.com/the-data-fix/episodes/con-with-emily-m-bender-and-alex-hanna</link>
			<acast:episodeId>687e7d41fd9acfeba43be41d</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>con-with-emily-m-bender-and-alex-hanna</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxouy/3A/2ZkJSi6uysgtEZ3VJywKX874E+fWQ3PYNWHVDHqClkexeBvYwqBXBvuGjD4Ul2Zli4JffnYj0c5Q0Bh]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>66</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>Emily M. Bender and Alex Hanna have been leading the charge against "AI", helping us understand it for the con that it is, and how AI companies are turning to health, education, and other social realms to try to recover their costs. In this episode we discuss LLMs vs. what the AI (and AGI) con is -- who benefits, and who loses -- and much more. Recorded July 15, 2025. Released July 28, 2025.</p><br><p><strong>The AI Con</strong></p><p><a href="https://thecon.ai/" rel="noopener noreferrer" target="_blank">https://thecon.ai/</a></p><br><p><strong>Mystery AI Hype Theater 3000</strong></p><p><a href="https://www.dair-institute.org/maiht3k/" rel="noopener noreferrer" target="_blank">https://www.dair-institute.org/maiht3k/</a></p><br><p><a href="https://tisjune.github.io/papers/aarhus_2025_skills.pdf" rel="noopener noreferrer" target="_blank"><strong>The predatory fantasy of worker empowerment in AI marketing</strong></a></p><p>Justine Zhang, Su Lin Blodgett, Nina Markl</p><p>AI x Crisis: Tracing New Directions Beyond Deployment and Use workshop, Aarhus 2025.</p><br><p><strong>AI isn’t replacing student writing – but it is reshaping it</strong></p><p><a href="https://theconversation.com/ai-isnt-replacing-student-writing-but-it-is-reshaping-it-254878" rel="noopener noreferrer" target="_blank">https://theconversation.com/ai-isnt-replacing-student-writing-but-it-is-reshaping-it-254878</a></p><br><p><strong>Sparks of Artificial General Intelligence: Early experiments with GPT-4</strong></p><p><a href="https://arxiv.org/abs/2303.12712" rel="noopener noreferrer" target="_blank">https://arxiv.org/abs/2303.12712</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>Emily M. Bender and Alex Hanna have been leading the charge against "AI", helping us understand it for the con that it is, and how AI companies are turning to health, education, and other social realms to try to recover their costs. In this episode we discuss LLMs vs. what the AI (and AGI) con is -- who benefits, and who loses -- and much more. Recorded July 15, 2025. Released July 28, 2025.</p><br><p><strong>The AI Con</strong></p><p><a href="https://thecon.ai/" rel="noopener noreferrer" target="_blank">https://thecon.ai/</a></p><br><p><strong>Mystery AI Hype Theater 3000</strong></p><p><a href="https://www.dair-institute.org/maiht3k/" rel="noopener noreferrer" target="_blank">https://www.dair-institute.org/maiht3k/</a></p><br><p><a href="https://tisjune.github.io/papers/aarhus_2025_skills.pdf" rel="noopener noreferrer" target="_blank"><strong>The predatory fantasy of worker empowerment in AI marketing</strong></a></p><p>Justine Zhang, Su Lin Blodgett, Nina Markl</p><p>AI x Crisis: Tracing New Directions Beyond Deployment and Use workshop, Aarhus 2025.</p><br><p><strong>AI isn’t replacing student writing – but it is reshaping it</strong></p><p><a href="https://theconversation.com/ai-isnt-replacing-student-writing-but-it-is-reshaping-it-254878" rel="noopener noreferrer" target="_blank">https://theconversation.com/ai-isnt-replacing-student-writing-but-it-is-reshaping-it-254878</a></p><br><p><strong>Sparks of Artificial General Intelligence: Early experiments with GPT-4</strong></p><p><a href="https://arxiv.org/abs/2303.12712" rel="noopener noreferrer" target="_blank">https://arxiv.org/abs/2303.12712</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Control, with Paul Schütze</title>
			<itunes:title>Control, with Paul Schütze</itunes:title>
			<pubDate>Mon, 14 Jul 2025 12:00:00 GMT</pubDate>
			<itunes:duration>55:42</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/6849e980bb8239057482b19f/media.mp3" length="85749952" type="audio/mpeg"/>
			<guid isPermaLink="false">6849e980bb8239057482b19f</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://shows.acast.com/the-data-fix/episodes/control-with-paul-schutze</link>
			<acast:episodeId>6849e980bb8239057482b19f</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>control-with-paul-schutze</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxoVBYNJRkhaDYCpdwuxTe178zqY6+4kXwHZOLVL3s7l5DLpn/Gqc3trWFzApWzVxvSYGqc3r4OJ0W1Z4TbctVue]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>65</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>In this episode, Paul Schütze and I pick apart the inherent contradictions of “sustainable AI”, marketing language that aims to convince the public that one of the most extractive industries can be used to solve climate change. We delve into the layers of control embedded in the logics of AI, when technology becomes the fix that needs fixing. Recorded May 20, 2025. Released July 7, 2025.</p><br><p><strong>The impacts of AI Futurism</strong></p><p><a href="https://link.springer.com/article/10.1007/s10676-024-09758-6" rel="noopener noreferrer" target="_blank">https://link.springer.com/article/10.1007/s10676-024-09758-6</a></p><br><p><strong>The Problem of Sustainable AI</strong></p><p><a href="https://doi.org/10.34669/WI.WJDS/4.1.4" rel="noopener noreferrer" target="_blank">https://doi.org/10.34669/WI.WJDS/4.1.4</a></p><br><p>Contact: <a href="mailto:paul.schuetze@uos.de" rel="noopener noreferrer" target="_blank">paul.schuetze@uos.de</a>. Website: <a href="http://paulschuetze.de" rel="noopener noreferrer" target="_blank">paulschuetze.de</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>In this episode, Paul Schütze and I pick apart the inherent contradictions of “sustainable AI”, marketing language that aims to convince the public that one of the most extractive industries can be used to solve climate change. We delve into the layers of control embedded in the logics of AI, when technology becomes the fix that needs fixing. Recorded May 20, 2025. Released July 7, 2025.</p><br><p><strong>The impacts of AI Futurism</strong></p><p><a href="https://link.springer.com/article/10.1007/s10676-024-09758-6" rel="noopener noreferrer" target="_blank">https://link.springer.com/article/10.1007/s10676-024-09758-6</a></p><br><p><strong>The Problem of Sustainable AI</strong></p><p><a href="https://doi.org/10.34669/WI.WJDS/4.1.4" rel="noopener noreferrer" target="_blank">https://doi.org/10.34669/WI.WJDS/4.1.4</a></p><br><p>Contact: <a href="mailto:paul.schuetze@uos.de" rel="noopener noreferrer" target="_blank">paul.schuetze@uos.de</a>. Website: <a href="http://paulschuetze.de" rel="noopener noreferrer" target="_blank">paulschuetze.de</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Water, with Rebecca Kilberg, Mary-Clare Bosco and Jonathan Gilmour</title>
			<itunes:title>Water, with Rebecca Kilberg, Mary-Clare Bosco and Jonathan Gilmour</itunes:title>
			<pubDate>Mon, 16 Jun 2025 12:00:00 GMT</pubDate>
			<itunes:duration>52:46</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/682bc8d0bc0e7581523cb29b/media.mp3" length="76943488" type="audio/mpeg"/>
			<guid isPermaLink="false">682bc8d0bc0e7581523cb29b</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://shows.acast.com/the-data-fix/episodes/water-with-rebecca-kilberg-mary-clare-bosco-jonathan-gilmour</link>
			<acast:episodeId>682bc8d0bc0e7581523cb29b</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>water-with-rebecca-kilberg-mary-clare-bosco-jonathan-gilmour</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxq7u/UYlUU18Z3XTcun0M/gwFXc0zz0OK1jyrCKrTLyF6Spd6b8WIJb1elRldT1rV3H6Rw7VlffSBlPVjCJaoHW]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>64</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>In this episode I speak with Rebecca Kilberg, Mary-Clare Bosco and Jonathan Gilmour, who together use policy approaches to solve problems related to data center water usage and the various planetary and health outcomes that emerge from water consumption and extraction. They talk about how you get such data and what to do with it, and the importance of creating many local sites of resistance for a more sustainable future. Want in? <a href="https://aspenpolicyacademy.org/project/reducing-data-centers-water-consumption/" rel="noopener noreferrer" target="_blank">Get in touch</a>. Recorded May 19, 2025. Released June 16, 2025.</p><br><p><strong>Voices: Data centers must be transparent about water usage — for the sake of the Great Salt Lake</strong></p><p><a href="https://www.sltrib.com/opinion/commentary/2024/12/31/voices-utah-data-centers-must-be/" rel="noopener noreferrer" target="_blank">https://www.sltrib.com/opinion/commentary/2024/12/31/voices-utah-data-centers-must-be</a></p><br><p><strong>Reducing Data Centers’ Water Consumption</strong></p><p><a href="https://aspenpolicyacademy.org/project/reducing-data-centers-water-consumption/" rel="noopener noreferrer" target="_blank">https://aspenpolicyacademy.org/project/reducing-data-centers-water-consumption</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>In this episode I speak with Rebecca Kilberg, Mary-Clare Bosco and Jonathan Gilmour, who together use policy approaches to solve problems related to data center water usage and the various planetary and health outcomes that emerge from water consumption and extraction. They talk about how you get such data and what to do with it, and the importance of creating many local sites of resistance for a more sustainable future. Want in? <a href="https://aspenpolicyacademy.org/project/reducing-data-centers-water-consumption/" rel="noopener noreferrer" target="_blank">Get in touch</a>. Recorded May 19, 2025. Released June 16, 2025.</p><br><p><strong>Voices: Data centers must be transparent about water usage — for the sake of the Great Salt Lake</strong></p><p><a href="https://www.sltrib.com/opinion/commentary/2024/12/31/voices-utah-data-centers-must-be/" rel="noopener noreferrer" target="_blank">https://www.sltrib.com/opinion/commentary/2024/12/31/voices-utah-data-centers-must-be</a></p><br><p><strong>Reducing Data Centers’ Water Consumption</strong></p><p><a href="https://aspenpolicyacademy.org/project/reducing-data-centers-water-consumption/" rel="noopener noreferrer" target="_blank">https://aspenpolicyacademy.org/project/reducing-data-centers-water-consumption</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Solidarity, with Shannon Wait</title>
			<itunes:title>Solidarity, with Shannon Wait</itunes:title>
			<pubDate>Mon, 02 Jun 2025 12:00:00 GMT</pubDate>
			<itunes:duration>52:58</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/681d1137ad1a4a43505b8df4/media.mp3" length="76429888" type="audio/mpeg"/>
			<guid isPermaLink="false">681d1137ad1a4a43505b8df4</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://shows.acast.com/the-data-fix/episodes/solidarity-with-shannon-wait</link>
			<acast:episodeId>681d1137ad1a4a43505b8df4</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>solidarity-with-shannon-wait</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxouMBcfZ0qJ59rWiPTWY4uRU3Wzt//9G0JWkZ7kfb1F/nPUIyLW7FhlJHrRIOrmYoLAvngOIhC/fusiQcY2xNwR]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>63</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>In this episode, Shannon Wait — Alphabet Workers Union-CWA organizer — speaks with me about the labour conditions for data center and AI workers. We talk about contracts, sub-contracts, sub-sub-contracts, NDAs, invisible labour -- and how all of this leads to unions, solidarity, and a fight for tech workers’ rights globally. Recorded May 8, 2025. Released June 2, 2025.&nbsp;</p><br><p><strong>Interview with Shannon Wait, Alphabet Workers Union-CWA Organizer (2024)</strong></p><p><a href="https://poweratwork.us/shannon-wait-interview" rel="noopener noreferrer" target="_blank">https://poweratwork.us/shannon-wait-interview</a></p><br><p><strong>A union of Alphabet workers in the U.S. and Canada</strong></p><p><a href="https://www.alphabetworkersunion.org/" rel="noopener noreferrer" target="_blank">https://www.alphabetworkersunion.org/</a></p><br><p><strong>Google Raters Participated in Historic Action at Google HQ to Demand Google End Poverty Wages for 5,000 Workers</strong></p><p><a href="https://code-cwa.org/news/google-raters-participated-historic-action" rel="noopener noreferrer" target="_blank">https://code-cwa.org/news/google-raters-participated-historic-action</a></p><br><p><strong>The woman who took on Google and won (2021)</strong></p><p><a href="https://www.bbc.com/news/technology-56659212" rel="noopener noreferrer" target="_blank">https://www.bbc.com/news/technology-56659212</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>In this episode, Shannon Wait — Alphabet Workers Union-CWA organizer — speaks with me about the labour conditions for data center and AI workers. We talk about contracts, sub-contracts, sub-sub-contracts, NDAs, invisible labour -- and how all of this leads to unions, solidarity, and a fight for tech workers’ rights globally. Recorded May 8, 2025. Released June 2, 2025.&nbsp;</p><br><p><strong>Interview with Shannon Wait, Alphabet Workers Union-CWA Organizer (2024)</strong></p><p><a href="https://poweratwork.us/shannon-wait-interview" rel="noopener noreferrer" target="_blank">https://poweratwork.us/shannon-wait-interview</a></p><br><p><strong>A union of Alphabet workers in the U.S. and Canada</strong></p><p><a href="https://www.alphabetworkersunion.org/" rel="noopener noreferrer" target="_blank">https://www.alphabetworkersunion.org/</a></p><br><p><strong>Google Raters Participated in Historic Action at Google HQ to Demand Google End Poverty Wages for 5,000 Workers</strong></p><p><a href="https://code-cwa.org/news/google-raters-participated-historic-action" rel="noopener noreferrer" target="_blank">https://code-cwa.org/news/google-raters-participated-historic-action</a></p><br><p><strong>The woman who took on Google and won (2021)</strong></p><p><a href="https://www.bbc.com/news/technology-56659212" rel="noopener noreferrer" target="_blank">https://www.bbc.com/news/technology-56659212</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Remade, with Allison Carruth </title>
			<itunes:title>Remade, with Allison Carruth </itunes:title>
			<pubDate>Mon, 19 May 2025 12:00:00 GMT</pubDate>
			<itunes:duration>1:02:12</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/681bce0f24b1daf01a3283db/media.mp3" length="94764544" type="audio/mpeg"/>
			<guid isPermaLink="false">681bce0f24b1daf01a3283db</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://shows.acast.com/the-data-fix/episodes/remade-with-allison-carruth</link>
			<acast:episodeId>681bce0f24b1daf01a3283db</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>remade-with-allison-carruth</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxrEWiHrNpj6L7wIvdDWNHPBOhwNF8PhU4GYPenlHd3FhHWaawPXQKMGjCNvNysaI3437wrmE8uBm8plpgRe0ALL]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>62</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>Allison Carruth and I talk about her new book, which gets at some of the material infrastructure and social systems that have made the US a settler state ever obsessed with new frontiers, including space. We talk about tech imaginaries, worlds remade, and better futures — a vision that invites confronting the state of things head-on, a slower redoing, and is based on connection, love, and friendship (maybe with aliens, too). Recorded May 7, 2025. Released May 19, 2025.</p><br><p><strong><em>Novel Ecologies: Nature Remade and the Illusions of Tech</em></strong> (2025)</p><p><a href="https://press.uchicago.edu/ucp/books/book/chicago/N/bo239362741.html" rel="noopener noreferrer" target="_blank">https://press.uchicago.edu/ucp/books/book/chicago/N/bo239362741.html</a></p><br><p><strong>Allison Carruth</strong></p><p><a href="https://allisoncarruth.com/" rel="noopener noreferrer" target="_blank">https://allisoncarruth.com/</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>Allison Carruth and I talk about her new book, which gets at some of the material infrastructure and social systems that have made the US a settler state ever obsessed with new frontiers, including space. We talk about tech imaginaries, worlds remade, and better futures — a vision that invites confronting the state of things head-on, a slower redoing, and is based on connection, love, and friendship (maybe with aliens, too). Recorded May 7, 2025. Released May 19, 2025.</p><br><p><strong><em>Novel Ecologies: Nature Remade and the Illusions of Tech</em></strong> (2025)</p><p><a href="https://press.uchicago.edu/ucp/books/book/chicago/N/bo239362741.html" rel="noopener noreferrer" target="_blank">https://press.uchicago.edu/ucp/books/book/chicago/N/bo239362741.html</a></p><br><p><strong>Allison Carruth</strong></p><p><a href="https://allisoncarruth.com/" rel="noopener noreferrer" target="_blank">https://allisoncarruth.com/</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Hype, with Dani Shanley and Gemma Milne</title>
			<itunes:title>Hype, with Dani Shanley and Gemma Milne</itunes:title>
			<pubDate>Mon, 05 May 2025 12:00:00 GMT</pubDate>
			<itunes:duration>1:00:06</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/6813e17d6ac0e5213bff9eac/media.mp3" length="93481408" type="audio/mpeg"/>
			<guid isPermaLink="false">6813e17d6ac0e5213bff9eac</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://shows.acast.com/the-data-fix/episodes/hype-with-dani-shanley-and-gemma-milne</link>
			<acast:episodeId>6813e17d6ac0e5213bff9eac</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>hype-with-dani-shanley-and-gemma-milne</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxrmJ2iHcWI6BPrQ+3gKpF4F3ZG9MHg7X1/+bE5lCCVFE4I4dUc7S3YemrzXC90ZAFig7aaqw5MPgCkKVCkr+cXr]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>61</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>In this episode, <a href="https://bsky.app/profile/danishanley.bsky.social" rel="noopener noreferrer" target="_blank">Dani Shanley</a> and <a href="https://bsky.app/profile/gemmamilne.bsky.social" rel="noopener noreferrer" target="_blank">Gemma Milne</a> walk me through "hype" -- what it means in various technological contexts, how it works, what it is definitionally, how it feels in the body, who it serves, who it harms, and how we might need to nuance our relationship to it, especially as critical (tech) scholars. Recorded May 1, 2025. Released May 5, 2025.</p><p><br></p><ul><li><a href="https://www.hachette.co.uk/titles/gemma-milne/smoke-mirrors/9781472143655/" rel="noopener noreferrer" target="_blank">https://www.hachette.co.uk/titles/gemma-milne/smoke-mirrors/9781472143655/</a></li><li><a href="https://radicalsciencepodcast.com/" rel="noopener noreferrer" target="_blank">https://radicalsciencepodcast.com/</a></li><li><a href="https://brainreel.substack.com/" rel="noopener noreferrer" target="_blank">https://brainreel.substack.com/</a></li><li><a href="https://hypestudies.org/" rel="noopener noreferrer" target="_blank">https://hypestudies.org/</a></li><li><a href="https://www.biss-institute.com/en/about" rel="noopener noreferrer" target="_blank">https://www.biss-institute.com/en/about</a></li><li><a href="https://www.maastrichtuniversity.nl/news/new-technologies-heroes-or-villains" rel="noopener noreferrer" target="_blank">https://www.maastrichtuniversity.nl/news/new-technologies-heroes-or-villains</a></li></ul><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>In this episode, <a href="https://bsky.app/profile/danishanley.bsky.social" rel="noopener noreferrer" target="_blank">Dani Shanley</a> and <a href="https://bsky.app/profile/gemmamilne.bsky.social" rel="noopener noreferrer" target="_blank">Gemma Milne</a> walk me through "hype" -- what it means in various technological contexts, how it works, what it is definitionally, how it feels in the body, who it serves, who it harms, and how we might need to nuance our relationship to it, especially as critical (tech) scholars. Recorded May 1, 2025. Released May 5, 2025.</p><p><br></p><ul><li><a href="https://www.hachette.co.uk/titles/gemma-milne/smoke-mirrors/9781472143655/" rel="noopener noreferrer" target="_blank">https://www.hachette.co.uk/titles/gemma-milne/smoke-mirrors/9781472143655/</a></li><li><a href="https://radicalsciencepodcast.com/" rel="noopener noreferrer" target="_blank">https://radicalsciencepodcast.com/</a></li><li><a href="https://brainreel.substack.com/" rel="noopener noreferrer" target="_blank">https://brainreel.substack.com/</a></li><li><a href="https://hypestudies.org/" rel="noopener noreferrer" target="_blank">https://hypestudies.org/</a></li><li><a href="https://www.biss-institute.com/en/about" rel="noopener noreferrer" target="_blank">https://www.biss-institute.com/en/about</a></li><li><a href="https://www.maastrichtuniversity.nl/news/new-technologies-heroes-or-villains" rel="noopener noreferrer" target="_blank">https://www.maastrichtuniversity.nl/news/new-technologies-heroes-or-villains</a></li></ul><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Damage, with Dustin Edwards</title>
			<itunes:title>Damage, with Dustin Edwards</itunes:title>
			<pubDate>Mon, 28 Apr 2025 12:00:00 GMT</pubDate>
			<itunes:duration>57:24</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/67f6d4f002787319786b049d/media.mp3" length="43085056" type="audio/mpeg"/>
			<guid isPermaLink="false">67f6d4f002787319786b049d</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://shows.acast.com/the-data-fix/episodes/damage-with-dustin-edwards</link>
			<acast:episodeId>67f6d4f002787319786b049d</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>damage-with-dustin-edwards</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxo/r/6K9ioRAbpaYuQAR6W+J9/znw4Hq5XaQMVufSXGG9MlsZLe0SkD77DZLo0QJ9bE2ONRIxsJeRh5VKnPCQZX]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>60</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>Dustin Edwards and I discuss the damage caused by digital infrastructure and its extractive requirements. We talk about data centers and copper mines, but more than this, we delve into what a decolonial, feminist, anti-racist approach can look like for white settler scholars grappling with their inheritances and obligations to the landscapes and to the stories they tell themselves, as we make (new) worlds. Recorded Apr 8, 2025. Released April 28, 2025.</p><br><p><strong>Enduring Digital Damage: Rhetorical Reckonings for Planetary Survival</strong></p><p><a href="https://www.uapress.ua.edu/9780817322472/enduring-digital-damage/" rel="noopener noreferrer" target="_blank">https://www.uapress.ua.edu/9780817322472/enduring-digital-damage/</a>&nbsp;</p><br><p><strong>The making of critical data center studies</strong></p><p>Dustin&nbsp;Edwards,&nbsp;Zane Griffin Talley&nbsp;Cooper&nbsp;and&nbsp;Mél&nbsp;Hogan</p><p><a href="https://journals.sagepub.com/doi/full/10.1177/13548565231224157" rel="noopener noreferrer" target="_blank">https://journals.sagepub.com/doi/full/10.1177/13548565231224157</a>&nbsp;</p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>Dustin Edwards and I discuss the damage caused by digital infrastructure and its extractive requirements. We talk about data centers and copper mines, but more than this, we delve into what a decolonial, feminist, anti-racist approach can look like for white settler scholars grappling with their inheritances and obligations to the landscapes and to the stories they tell themselves, as we make (new) worlds. Recorded Apr 8, 2025. Released April 28, 2025.</p><br><p><strong>Enduring Digital Damage: Rhetorical Reckonings for Planetary Survival</strong></p><p><a href="https://www.uapress.ua.edu/9780817322472/enduring-digital-damage/" rel="noopener noreferrer" target="_blank">https://www.uapress.ua.edu/9780817322472/enduring-digital-damage/</a>&nbsp;</p><br><p><strong>The making of critical data center studies</strong></p><p>Dustin&nbsp;Edwards,&nbsp;Zane Griffin Talley&nbsp;Cooper&nbsp;and&nbsp;Mél&nbsp;Hogan</p><p><a href="https://journals.sagepub.com/doi/full/10.1177/13548565231224157" rel="noopener noreferrer" target="_blank">https://journals.sagepub.com/doi/full/10.1177/13548565231224157</a>&nbsp;</p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Consent, with Jasmine McNealy</title>
			<itunes:title>Consent, with Jasmine McNealy</itunes:title>
			<pubDate>Mon, 14 Apr 2025 12:00:00 GMT</pubDate>
			<itunes:duration>52:10</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/67f6d188027873197868e9d2/media.mp3" length="36089632" type="audio/mpeg"/>
			<guid isPermaLink="false">67f6d188027873197868e9d2</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://shows.acast.com/the-data-fix/episodes/consent-with-jasmine-mcnealy</link>
			<acast:episodeId>67f6d188027873197868e9d2</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>consent-with-jasmine-mcnealy</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxp4QEaq/cJh5y7o/AcgZVZN0SxISmRKuD20C2GujUygEIZxapntz7fMsNFmTw5gPUwv1CA4rlZf+MRstAXjJIVY]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>59</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>In this episode, I ask Jasmine McNealy about the role of consent online, from social media exchanges to the circulation of deep fakes. Who gets to define harm? Who is responsible for the damage? Does anyone have to take accountability? We also talk about surveillance, sonic privacy, and the many data trails the body leaves behind. Recorded Apr 4, 2025. Released April 14, 2025.</p><br><p><strong>Sonic Privacy.</strong>&nbsp;</p><p><em>Yale Journal of Law &amp; Technology/Yale ISP-Knight Foundation Public Sphere Series</em>.</p><p><a href="https://law.yale.edu/sites/default/files/area/center/isp/documents/mcnealy.pdf" rel="noopener noreferrer" target="_blank">https://law.yale.edu/sites/default/files/area/center/isp/documents/mcnealy.pdf</a></p><br><p><strong>Consent (Still) Won’t Save Us </strong></p><p><em>Chapter from:&nbsp;Feminist Cyberlaw </em></p><p><a href="https://uplopen.com/chapters/e/10.1525/luminos.190.p" rel="noopener noreferrer" target="_blank">https://uplopen.com/chapters/e/10.1525/luminos.190.p</a>&nbsp;</p><p><br></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>In this episode, I ask Jasmine McNealy about the role of consent online, from social media exchanges to the circulation of deep fakes. Who gets to define harm? Who is responsible for the damage? Does anyone have to take accountability? We also talk about surveillance, sonic privacy, and the many data trails the body leaves behind. Recorded Apr 4, 2025. Released April 14, 2025.</p><br><p><strong>Sonic Privacy.</strong>&nbsp;</p><p><em>Yale Journal of Law &amp; Technology/Yale ISP-Knight Foundation Public Sphere Series</em>.</p><p><a href="https://law.yale.edu/sites/default/files/area/center/isp/documents/mcnealy.pdf" rel="noopener noreferrer" target="_blank">https://law.yale.edu/sites/default/files/area/center/isp/documents/mcnealy.pdf</a></p><br><p><strong>Consent (Still) Won’t Save Us </strong></p><p><em>Chapter from:&nbsp;Feminist Cyberlaw </em></p><p><a href="https://uplopen.com/chapters/e/10.1525/luminos.190.p" rel="noopener noreferrer" target="_blank">https://uplopen.com/chapters/e/10.1525/luminos.190.p</a>&nbsp;</p><p><br></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Together, with Collin Bjork</title>
			<itunes:title>Together, with Collin Bjork</itunes:title>
			<pubDate>Mon, 24 Mar 2025 12:00:00 GMT</pubDate>
			<itunes:duration>58:38</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/67ad24e99c6f7f7f287d555a/media.mp3" length="90954688" type="audio/mpeg"/>
			<guid isPermaLink="false">67ad24e99c6f7f7f287d555a</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://shows.acast.com/the-data-fix/episodes/together-with-collin-bjork</link>
			<acast:episodeId>67ad24e99c6f7f7f287d555a</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>together-with-collin-bjork</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxpQl+vhDqnhGtJti3i+mVhskdWrghikTrtq/DZp9qlMBLJyKlKxNIsrm5KpaeULtOXH8iuC3HmMIimdkOMd0Ppu]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>58</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p><a href="http://www.collinbjork.com/" rel="noopener noreferrer" target="_blank">Collin Bjork</a> and I discuss the double (triple?) meaning of "extractive AI". Collin explains Otter.ai -- AI-powered voice-to-text transcription software, and the capitalist logics that enable it -- vs. Māori-led Te Hiku Media, based on principles of stewardship, community, and collaboration. Collin also explains how rhetoric is about <em>togetherness</em> more than persuasion. Recorded Feb 12, 2025. Released March 24, 2025.</p><br><p><strong>Mentioned in ep: </strong></p><p><a href="https://textbooks.lib.wvu.edu/badideas/badideasaboutwriting-book.pdf" rel="noopener noreferrer" target="_blank"><strong><em>Bad Ideas About Writing</em></strong></a><strong>&nbsp;</strong>(Ball &amp; Loewe); <strong>Extractive AI and Its Challenge to Technical Communication</strong> (Bjork; forthcoming,&nbsp;<em>Journal of Business and Technical Communication</em>, October 2025); <a href="https://sterneworks.org/wp-content/uploads/2024/01/Sterne-Sawhney-AcousmaticQuestion.pdf" rel="noopener noreferrer" target="_blank"><strong>"The Acousmatic Question and the Will to Datafy"</strong></a><strong>&nbsp;</strong>(Sterne &amp; Sawhney); <a href="https://www.insidehighered.com/opinion/views/2024/12/02/universities-must-beware-reliance-big-ai-opinion" rel="noopener noreferrer" target="_blank"><strong>"Big AI Companies Need Higher Ed...But Does Higher Ed Need Them?"</strong></a><strong>&nbsp;</strong>(Bjork); <a href="https://theconversation.com/chatgpt-threatens-language-diversity-more-needs-to-be-done-to-protect-our-differences-in-the-age-of-ai-198878" rel="noopener noreferrer" target="_blank"><strong>"ChatGPT Threatens Language Diversity"</strong></a>&nbsp;(Bjork); <a href="https://tehiku.nz/te-hiku-tech/papa-reo/14135/te-reo-maori-speech-recognition" rel="noopener noreferrer" target="_blank"><strong>Te Reo Māori Speech Recognition</strong></a>&nbsp;(Te Hiku Media); <a href="https://www.indigenous-ai.net/abundant/" rel="noopener noreferrer" target="_blank"><strong>Abundant Intelligences - Indigenous AI</strong></a><strong>&nbsp;</strong>(Jason Edward Lewis, Hemi Whaanga, et al.)</p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p><a href="http://www.collinbjork.com/" rel="noopener noreferrer" target="_blank">Collin Bjork</a> and I discuss the double (triple?) meaning of "extractive AI". Collin explains Otter.ai -- AI-powered voice-to-text transcription software, and the capitalist logics that enable it -- vs. Māori-led Te Hiku Media, based on principles of stewardship, community, and collaboration. Collin also explains how rhetoric is about <em>togetherness</em> more than persuasion. Recorded Feb 12, 2025. Released March 24, 2025.</p><br><p><strong>Mentioned in ep: </strong></p><p><a href="https://textbooks.lib.wvu.edu/badideas/badideasaboutwriting-book.pdf" rel="noopener noreferrer" target="_blank"><strong><em>Bad Ideas About Writing</em></strong></a><strong>&nbsp;</strong>(Ball &amp; Loewe); <strong>Extractive AI and Its Challenge to Technical Communication</strong> (Bjork; forthcoming,&nbsp;<em>Journal of Business and Technical Communication</em>, October 2025); <a href="https://sterneworks.org/wp-content/uploads/2024/01/Sterne-Sawhney-AcousmaticQuestion.pdf" rel="noopener noreferrer" target="_blank"><strong>"The Acousmatic Question and the Will to Datafy"</strong></a><strong>&nbsp;</strong>(Sterne &amp; Sawhney); <a href="https://www.insidehighered.com/opinion/views/2024/12/02/universities-must-beware-reliance-big-ai-opinion" rel="noopener noreferrer" target="_blank"><strong>"Big AI Companies Need Higher Ed...But Does Higher Ed Need Them?"</strong></a><strong>&nbsp;</strong>(Bjork); <a href="https://theconversation.com/chatgpt-threatens-language-diversity-more-needs-to-be-done-to-protect-our-differences-in-the-age-of-ai-198878" rel="noopener noreferrer" target="_blank"><strong>"ChatGPT Threatens Language Diversity"</strong></a>&nbsp;(Bjork); <a href="https://tehiku.nz/te-hiku-tech/papa-reo/14135/te-reo-maori-speech-recognition" rel="noopener noreferrer" target="_blank"><strong>Te Reo Māori Speech Recognition</strong></a>&nbsp;(Te Hiku Media); <a href="https://www.indigenous-ai.net/abundant/" rel="noopener noreferrer" target="_blank"><strong>Abundant Intelligences - Indigenous AI</strong></a><strong>&nbsp;</strong>(Jason Edward Lewis, Hemi Whaanga, et al.)</p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Deskill, with Hagen Blix</title>
			<itunes:title>Deskill, with Hagen Blix</itunes:title>
			<pubDate>Mon, 17 Mar 2025 12:00:00 GMT</pubDate>
			<itunes:duration>1:04:59</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/67c89b73fc5f88b98d4eee51/media.mp3" length="99371584" type="audio/mpeg"/>
			<guid isPermaLink="false">67c89b73fc5f88b98d4eee51</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://shows.acast.com/the-data-fix/episodes/deskill-with-hagen-blix</link>
			<acast:episodeId>67c89b73fc5f88b98d4eee51</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>deskill-with-hagen-blix</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxpVhfagycsyO47Lbn9A5LPfsuoJhlPnuf1MtMxgzucQtgZFtrZ09qL2WNjdFyidG0l+BroIYEmOXO0WRjHuqtCA]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>57</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>In this episode, Hagen Blix and I talk about how the fear of AI, from the non-billionaire CEO class, comes from the threat of deskilling workers. Recorded Mar 5, 2025. Released March 17, 2025.</p><br><p><strong>Tech Workers Can Still Fight Silicon Valley’s Overlords</strong></p><p>by&nbsp;Hagen Blix&nbsp;and&nbsp;Ingeborg Glimmer&nbsp;</p><p><a href="https://jacobin.com/2025/02/tech-workers-silicon-valley-trump/" rel="noopener noreferrer" target="_blank">https://jacobin.com/2025/02/tech-workers-silicon-valley-trump/</a></p><br><p><strong>Why We Fear AI: On the Interpretation of Nightmares&nbsp;Paperback – March 21, 2025</strong></p><p>by&nbsp;Hagen Blix&nbsp;and&nbsp;Ingeborg Glimmer&nbsp;</p><p><a href="https://www.commonnotions.org/why-we-fear-ai" rel="noopener noreferrer" target="_blank">https://www.commonnotions.org/why-we-fear-ai</a>&nbsp;</p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>In this episode, Hagen Blix and I talk about how the fear of AI, from the non-billionaire CEO class, comes from the threat of deskilling workers. Recorded Mar 5, 2025. Released March 17, 2025.</p><br><p><strong>Tech Workers Can Still Fight Silicon Valley’s Overlords</strong></p><p>by&nbsp;Hagen Blix&nbsp;and&nbsp;Ingeborg Glimmer&nbsp;</p><p><a href="https://jacobin.com/2025/02/tech-workers-silicon-valley-trump/" rel="noopener noreferrer" target="_blank">https://jacobin.com/2025/02/tech-workers-silicon-valley-trump/</a></p><br><p><strong>Why We Fear AI: On the Interpretation of Nightmares&nbsp;Paperback – March 21, 2025</strong></p><p>by&nbsp;Hagen Blix&nbsp;and&nbsp;Ingeborg Glimmer&nbsp;</p><p><a href="https://www.commonnotions.org/why-we-fear-ai" rel="noopener noreferrer" target="_blank">https://www.commonnotions.org/why-we-fear-ai</a>&nbsp;</p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Unlearning, with Kane Murdoch</title>
			<itunes:title>Unlearning, with Kane Murdoch</itunes:title>
			<pubDate>Mon, 10 Mar 2025 12:00:00 GMT</pubDate>
			<itunes:duration>1:02:39</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/67ad1f67505cb2a0a3778efb/media.mp3" length="98319040" type="audio/mpeg"/>
			<guid isPermaLink="false">67ad1f67505cb2a0a3778efb</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://shows.acast.com/the-data-fix/episodes/unlearning-with-kane-murdoch</link>
			<acast:episodeId>67ad1f67505cb2a0a3778efb</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>unlearning-with-kane-murdoch</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxp311vsfGtq9Z4AK0hTH0Ix6OQwKWcX3nJM+OemTbyt2aDcThqR8v5T8o3fmbSfDyY1wvuR1AxIODbheiySWm4A]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>56</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>In this episode I speak with Kane Murdoch about the perils of contract cheating. As an integrity officer, he frames what's happening with "cheating" as an unlearning that we should all be paying attention to if we care about education. Recorded Feb 6, 2025. Released March 10, 2025.</p><br><p><strong>Guerilla Warfare</strong></p><p><a href="https://www.guerillawarfare.net/" rel="noopener noreferrer" target="_blank">https://www.guerillawarfare.net/</a></p><br><p>Ellis, C., &amp; Murdoch, K. (2024). The educational integrity enforcement pyramid: a new framework for challenging and responding to student cheating. <em>Assessment &amp; Evaluation in Higher Education</em>, 49(7), 924–934. <a href="https://doi.org/10.1080/02602938.2024.2329167" rel="noopener noreferrer" target="_blank">https://doi.org/10.1080/02602938.2024.2329167</a></p><br><p><strong>Lures and violent threats: old school cheating still rampant at Australian universities, even as AI rises — The Guardian</strong></p><p><a href="https://www.theguardian.com/australia-news/article/2024/aug/01/lures-and-violent-threats-old-school-cheating-still-rampant-at-australian-universities-even-as-ai-rises" rel="noopener noreferrer" target="_blank">https://www.theguardian.com/australia-news/article/2024/aug/01/lures-and-violent-threats-old-school-cheating-still-rampant-at-australian-universities-even-as-ai-rises</a></p><br><p><strong>Ghost writers helping UNSW students to cheat on assessments, leaked report reveals</strong></p><p><a href="https://www.smh.com.au/national/nsw/cheating-unsw-students-hire-ghost-writers-from-messaging-site-wechat-to-complete-work-20200505-p54q3f.html" rel="noopener noreferrer" target="_blank">https://www.smh.com.au/national/nsw/cheating-unsw-students-hire-ghost-writers-from-messaging-site-wechat-to-complete-work-20200505-p54q3f.html</a></p><br><p><strong>Cheating found at UNSW up by 2000% as new detection methods used</strong></p><p><a 
href="https://www.smh.com.au/education/cheating-found-at-unsw-up-by-2000-percent-as-new-detection-methods-used-20190814-p52gz4.html" rel="noopener noreferrer" target="_blank">https://www.smh.com.au/education/cheating-found-at-unsw-up-by-2000-percent-as-new-detection-methods-used-20190814-p52gz4.html</a></p><br><p><strong>University students caught paying others to do their work at record levels</strong></p><p><a href="https://www.smh.com.au/national/nsw/university-students-caught-paying-others-to-do-their-work-at-record-levels-20221025-p5bsrx.html" rel="noopener noreferrer" target="_blank">https://www.smh.com.au/national/nsw/university-students-caught-paying-others-to-do-their-work-at-record-levels-20221025-p5bsrx.html</a></p><br><p><strong>The Rise of Plagiarism: Contract Cheating&nbsp;</strong></p><p><a href="https://www.turnitin.ca/products/originality/contract-cheating" rel="noopener noreferrer" target="_blank">https://www.turnitin.ca/products/originality/contract-cheating</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>In this episode I speak with Kane Murdoch about the perils of contract cheating. As an integrity officer, he frames what's happening with "cheating" as an unlearning that we should all be paying attention to if we care about education. Recorded Feb 6, 2025. Released March 10, 2025.</p><br><p><strong>Guerilla Warfare</strong></p><p><a href="https://www.guerillawarfare.net/" rel="noopener noreferrer" target="_blank">https://www.guerillawarfare.net/</a></p><br><p>Ellis, C., &amp; Murdoch, K. (2024). The educational integrity enforcement pyramid: a new framework for challenging and responding to student cheating. <em>Assessment &amp; Evaluation in Higher Education</em>, 49(7), 924–934. <a href="https://doi.org/10.1080/02602938.2024.2329167" rel="noopener noreferrer" target="_blank">https://doi.org/10.1080/02602938.2024.2329167</a></p><br><p><strong>Lures and violent threats: old school cheating still rampant at Australian universities, even as AI rises — The Guardian</strong></p><p><a href="https://www.theguardian.com/australia-news/article/2024/aug/01/lures-and-violent-threats-old-school-cheating-still-rampant-at-australian-universities-even-as-ai-rises" rel="noopener noreferrer" target="_blank">https://www.theguardian.com/australia-news/article/2024/aug/01/lures-and-violent-threats-old-school-cheating-still-rampant-at-australian-universities-even-as-ai-rises</a></p><br><p><strong>Ghost writers helping UNSW students to cheat on assessments, leaked report reveals</strong></p><p><a href="https://www.smh.com.au/national/nsw/cheating-unsw-students-hire-ghost-writers-from-messaging-site-wechat-to-complete-work-20200505-p54q3f.html" rel="noopener noreferrer" target="_blank">https://www.smh.com.au/national/nsw/cheating-unsw-students-hire-ghost-writers-from-messaging-site-wechat-to-complete-work-20200505-p54q3f.html</a></p><br><p><strong>Cheating found at UNSW up by 2000% as new detection methods used</strong></p><p><a 
href="https://www.smh.com.au/education/cheating-found-at-unsw-up-by-2000-percent-as-new-detection-methods-used-20190814-p52gz4.html" rel="noopener noreferrer" target="_blank">https://www.smh.com.au/education/cheating-found-at-unsw-up-by-2000-percent-as-new-detection-methods-used-20190814-p52gz4.html</a></p><br><p><strong>University students caught paying others to do their work at record levels</strong></p><p><a href="https://www.smh.com.au/national/nsw/university-students-caught-paying-others-to-do-their-work-at-record-levels-20221025-p5bsrx.html" rel="noopener noreferrer" target="_blank">https://www.smh.com.au/national/nsw/university-students-caught-paying-others-to-do-their-work-at-record-levels-20221025-p5bsrx.html</a></p><br><p><strong>The Rise of Plagiarism: Contract Cheating&nbsp;</strong></p><p><a href="https://www.turnitin.ca/products/originality/contract-cheating" rel="noopener noreferrer" target="_blank">https://www.turnitin.ca/products/originality/contract-cheating</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Indexicality, with Roland Meyer and Gillian Rose</title>
			<itunes:title>Indexicality, with Roland Meyer and Gillian Rose</itunes:title>
			<pubDate>Mon, 24 Feb 2025 13:00:00 GMT</pubDate>
			<itunes:duration>1:02:39</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/67a3b56a340a5590cd17bd9a/media.mp3" length="95781664" type="audio/mpeg"/>
			<guid isPermaLink="false">67a3b56a340a5590cd17bd9a</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://shows.acast.com/the-data-fix/episodes/indexicality-with-roland-meyer-and-gillian-rose</link>
			<acast:episodeId>67a3b56a340a5590cd17bd9a</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>indexicality-with-roland-meyer-and-gillian-rose</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxpsmdAbeIRlj7KuAr9T0vVeq7mRUq1fV7s3R8O3GsqnFd2c1oUWSV79Lr/liYIyq+8pC5bAvzGvQ+jYfLdy0vrn]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>55</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>AI images are circulating more and more online and sometimes we can't tell the 'real' from AI-generated. But as I discuss with the inimitable Gillian Rose and Roland Meyer, we need to think about AI images beyond their indexicality, i.e. the idea that a photograph is a direct representation of the subject it captures. In this episode, we grapple with AI generators that are part of extractive, colonial industries, and how that shapes the affect of AI visuals. Recorded Jan 29, 2025. Released Feb 24, 2025.</p><br><p><strong>It’s a flat world. The Synthetic Realities of Sora</strong></p><p><a href="https://rrrreflect.org/special-issue-1/its-a-flat-world-the-synthetic-realities-of-sora" rel="noopener noreferrer" target="_blank">https://rrrreflect.org/special-issue-1/its-a-flat-world-the-synthetic-realities-of-sora</a></p><br><p><strong>“It’s a flat world. The Synthetic Realities of AI Video” by Roland Meyer at Hidden Layers 24</strong></p><p><a href="https://vimeo.com/1011342969" rel="noopener noreferrer" target="_blank">https://vimeo.com/1011342969</a></p><br><p><strong>The New Value of the Archive</strong></p><p><strong>AI Image Generation and the Visual Economy of ‘Style’</strong></p><p><a href="https://image-journal.de/the-new-value-of-the-archive/" rel="noopener noreferrer" target="_blank">https://image-journal.de/the-new-value-of-the-archive/</a></p><br><p><strong>“Generic Pastness. 
AI Image Synthesis and the Virtualization of the Archive”</strong></p><p><a href="https://vimeo.com/873978726" rel="noopener noreferrer" target="_blank">https://vimeo.com/873978726</a></p><br><p><strong>Models All The Way Down by Christo Buschek &amp; Jer Thorp</strong></p><p><a href="https://knowingmachines.org/models-all-the-way" rel="noopener noreferrer" target="_blank">https://knowingmachines.org/models-all-the-way</a></p><br><p><strong>Gillian Rose, Professor of Human Geography</strong></p><p><strong>Fellow of the British Academy and Academy of Social Sciences</strong></p><p><strong>webpage: </strong>https://www.geog.ox.ac.uk/staff/grose.html</p><p><strong>bluesky: </strong>https://bsky.app/profile/profgillian.bsky.social</p><p><strong>blog: </strong>visualmethodculture.wordpress.com</p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>AI images are circulating more and more online and sometimes we can't tell the 'real' from AI-generated. But as I discuss with the inimitable Gillian Rose and Roland Meyer, we need to think about AI images beyond their indexicality, i.e. the idea that a photograph is a direct representation of the subject it captures. In this episode, we grapple with AI generators that are part of extractive, colonial industries, and how that shapes the affect of AI visuals. Recorded Jan 29, 2025. Released Feb 24, 2025.</p><br><p><strong>It’s a flat world. The Synthetic Realities of Sora</strong></p><p><a href="https://rrrreflect.org/special-issue-1/its-a-flat-world-the-synthetic-realities-of-sora" rel="noopener noreferrer" target="_blank">https://rrrreflect.org/special-issue-1/its-a-flat-world-the-synthetic-realities-of-sora</a></p><br><p><strong>“It’s a flat world. The Synthetic Realities of AI Video” by Roland Meyer at Hidden Layers 24</strong></p><p><a href="https://vimeo.com/1011342969" rel="noopener noreferrer" target="_blank">https://vimeo.com/1011342969</a></p><br><p><strong>The New Value of the Archive</strong></p><p><strong>AI Image Generation and the Visual Economy of ‘Style’</strong></p><p><a href="https://image-journal.de/the-new-value-of-the-archive/" rel="noopener noreferrer" target="_blank">https://image-journal.de/the-new-value-of-the-archive/</a></p><br><p><strong>“Generic Pastness. 
AI Image Synthesis and the Virtualization of the Archive”</strong></p><p><a href="https://vimeo.com/873978726" rel="noopener noreferrer" target="_blank">https://vimeo.com/873978726</a></p><br><p><strong>Models All The Way Down by Christo Buschek &amp; Jer Thorp</strong></p><p><a href="https://knowingmachines.org/models-all-the-way" rel="noopener noreferrer" target="_blank">https://knowingmachines.org/models-all-the-way</a></p><br><p><strong>Gillian Rose, Professor of Human Geography</strong></p><p><strong>Fellow of the British Academy and Academy of Social Sciences</strong></p><p><strong>webpage: </strong>https://www.geog.ox.ac.uk/staff/grose.html</p><p><strong>bluesky: </strong>https://bsky.app/profile/profgillian.bsky.social</p><p><strong>blog: </strong>visualmethodculture.wordpress.com</p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Asymmetries, with Jathan Sadowski</title>
			<itunes:title>Asymmetries, with Jathan Sadowski</itunes:title>
			<pubDate>Mon, 10 Feb 2025 13:00:00 GMT</pubDate>
			<itunes:duration>1:00:36</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/678fcbeb16bc7a854535e4bc/media.mp3" length="92392768" type="audio/mpeg"/>
			<guid isPermaLink="false">678fcbeb16bc7a854535e4bc</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://shows.acast.com/the-data-fix/episodes/asymmetries-with-jathan-sadowski</link>
			<acast:episodeId>678fcbeb16bc7a854535e4bc</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>asymmetries-with-jathan-sadowski</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxrTon4KeQ+x2lFQ0/Dl6hZtZkZwH4ckMKj+DSWcl3WjvinRnbhw4GOCs/8ir5dntaidxCfsUz3U16d/LwX+P1P7]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>54</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>In this episode Jathan Sadowski discusses the 'risk industry' as imagined by FIRE (finance, insurance and real estate) and the asymmetries they create. Recorded January 15, 2025. Released February 10, 2025.</p><br><p><strong>The Mechanic and the Luddite: A Ruthless Criticism of Technology and Capitalism</strong></p><p><a href="https://www.ucpress.edu/books/the-mechanic-and-the-luddite/paper" rel="noopener noreferrer" target="_blank">https://www.ucpress.edu/books/the-mechanic-and-the-luddite/paper</a></p><br><p><strong>This Machine Kills: A podcast about technology and political economy&nbsp;</strong></p><p><a href="https://soundcloud.com/thismachinekillspod" rel="noopener noreferrer" target="_blank">https://soundcloud.com/thismachinekillspod</a>&nbsp;</p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>In this episode Jathan Sadowski discusses the 'risk industry' as imagined by FIRE (finance, insurance and real estate) and the asymmetries they create. Recorded January 15, 2025. Released February 10, 2025.</p><br><p><strong>The Mechanic and the Luddite: A Ruthless Criticism of Technology and Capitalism</strong></p><p><a href="https://www.ucpress.edu/books/the-mechanic-and-the-luddite/paper" rel="noopener noreferrer" target="_blank">https://www.ucpress.edu/books/the-mechanic-and-the-luddite/paper</a></p><br><p><strong>This Machine Kills: A podcast about technology and political economy&nbsp;</strong></p><p><a href="https://soundcloud.com/thismachinekillspod" rel="noopener noreferrer" target="_blank">https://soundcloud.com/thismachinekillspod</a>&nbsp;</p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Lessons, with Charles Logan</title>
			<itunes:title>Lessons, with Charles Logan</itunes:title>
			<pubDate>Mon, 27 Jan 2025 13:00:00 GMT</pubDate>
			<itunes:duration>55:08</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/6786b1763ceecdbe85e23614/media.mp3" length="80602240" type="audio/mpeg"/>
			<guid isPermaLink="false">6786b1763ceecdbe85e23614</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://shows.acast.com/the-data-fix/episodes/lessons-with-charles-logan</link>
			<acast:episodeId>6786b1763ceecdbe85e23614</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>lessons-with-charles-logan</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxo1GAIkyK0QmkQ26biOItdKEec91cjIABlWcX9Yw/50rMXQKmg+7zeNoK4vHFLX/Z+VfTHYnZbKd9RAryoNS2JI]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>53</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>Charles Logan is the go-to person to talk to about how AI is infiltrating the many layers of education, from K-12 to universities. In this conversation, we learn our lessons; we talk about what Ed Tech is, its promise and hype, and (ultimately) how to refuse it as professors and teach students to resist it as well. We also wonder about 'AI-proofing' the classroom and whether this is the way to deal with its onslaught. Recorded January 14, 2025. Released January 27, 2025.</p><br><p><strong>Applying the Baldwin Test to Ed-Tech</strong></p><p><a href="https://www.civicsoftechnology.org/blog/applying-the-baldwin-test-to-ed-tech" rel="noopener noreferrer" target="_blank">https://www.civicsoftechnology.org/blog/applying-the-baldwin-test-to-ed-tech</a></p><br><p><strong>The Captivating Creature from Educaria and Other Scary Stories</strong></p><p><a href="https://www.civicsoftechnology.org/blog/the-captivating-creature-from-educaria-and-other-scary-stories" rel="noopener noreferrer" target="_blank">https://www.civicsoftechnology.org/blog/the-captivating-creature-from-educaria-and-other-scary-stories</a>&nbsp;</p><br><p><strong>Iggy Peck, Architect Is an AI Doomer and Other Things I Struggle to Talk with My Kids About</strong></p><p><a href="https://www.civicsoftechnology.org/blog/iggy-peck-architect-is-an-ai-doomer-and-other-things-i-struggle-to-talk-with-my-kids-about" rel="noopener noreferrer" target="_blank">https://www.civicsoftechnology.org/blog/iggy-peck-architect-is-an-ai-doomer-and-other-things-i-struggle-to-talk-with-my-kids-about</a>&nbsp;</p><br><p><strong>Lessons on How to Practice Everyday Resistance and Refusal</strong></p><p><a href="https://www.civicsoftechnology.org/blog/lessons-on-how-to-practice-everyday-resistance-and-refusal" rel="noopener noreferrer" target="_blank">https://www.civicsoftechnology.org/blog/lessons-on-how-to-practice-everyday-resistance-and-refusal</a>&nbsp;</p><br><p><strong>You need to talk to your kid 
about AI. Here are 6 things you should say.</strong></p><p><a href="https://www.technologyreview.com/2023/09/05/1079009/you-need-to-talk-to-your-kid-about-ai-here-are-6-things-you-should-say" rel="noopener noreferrer" target="_blank">https://www.technologyreview.com/2023/09/05/1079009/you-need-to-talk-to-your-kid-about-ai-here-are-6-things-you-should-say</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>Charles Logan is the go-to person to talk to about how AI is infiltrating the many layers of education, from K-12 to universities. In this conversation, we learn our lessons; we talk about what Ed Tech is, its promise and hype, and (ultimately) how to refuse it as professors and teach students to resist it as well. We also wonder about 'AI-proofing' the classroom and whether this is the way to deal with its onslaught. Recorded January 14, 2025. Released January 27, 2025.</p><br><p><strong>Applying the Baldwin Test to Ed-Tech</strong></p><p><a href="https://www.civicsoftechnology.org/blog/applying-the-baldwin-test-to-ed-tech" rel="noopener noreferrer" target="_blank">https://www.civicsoftechnology.org/blog/applying-the-baldwin-test-to-ed-tech</a></p><br><p><strong>The Captivating Creature from Educaria and Other Scary Stories</strong></p><p><a href="https://www.civicsoftechnology.org/blog/the-captivating-creature-from-educaria-and-other-scary-stories" rel="noopener noreferrer" target="_blank">https://www.civicsoftechnology.org/blog/the-captivating-creature-from-educaria-and-other-scary-stories</a>&nbsp;</p><br><p><strong>Iggy Peck, Architect Is an AI Doomer and Other Things I Struggle to Talk with My Kids About</strong></p><p><a href="https://www.civicsoftechnology.org/blog/iggy-peck-architect-is-an-ai-doomer-and-other-things-i-struggle-to-talk-with-my-kids-about" rel="noopener noreferrer" target="_blank">https://www.civicsoftechnology.org/blog/iggy-peck-architect-is-an-ai-doomer-and-other-things-i-struggle-to-talk-with-my-kids-about</a>&nbsp;</p><br><p><strong>Lessons on How to Practice Everyday Resistance and Refusal</strong></p><p><a href="https://www.civicsoftechnology.org/blog/lessons-on-how-to-practice-everyday-resistance-and-refusal" rel="noopener noreferrer" target="_blank">https://www.civicsoftechnology.org/blog/lessons-on-how-to-practice-everyday-resistance-and-refusal</a>&nbsp;</p><br><p><strong>You need to talk to your kid 
about AI. Here are 6 things you should say.</strong></p><p><a href="https://www.technologyreview.com/2023/09/05/1079009/you-need-to-talk-to-your-kid-about-ai-here-are-6-things-you-should-say" rel="noopener noreferrer" target="_blank">https://www.technologyreview.com/2023/09/05/1079009/you-need-to-talk-to-your-kid-about-ai-here-are-6-things-you-should-say</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Storying, with Dillon Mahmoudi and Anthony Levenda</title>
			<itunes:title>Storying, with Dillon Mahmoudi and Anthony Levenda</itunes:title>
			<pubDate>Mon, 20 Jan 2025 13:00:00 GMT</pubDate>
			<itunes:duration>56:46</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/6786adac45dea7883626f04b/media.mp3" length="80474272" type="audio/mpeg"/>
			<guid isPermaLink="false">6786adac45dea7883626f04b</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://shows.acast.com/the-data-fix/episodes/storying-with-dillon-mahmoudi-and-anthony-levenda</link>
			<acast:episodeId>6786adac45dea7883626f04b</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>storying-with-dillon-mahmoudi-and-anthony-levenda</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxr3fmE+76G/K9m59WaOdalVc5v5ZmYeYd/JfWZQEaJfA+E1YjKZVGJ6sgdsOWm1yPYCYoNT2DRnuk9IHV4C4PGC]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>52</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>In this episode I speak with Dillon Mahmoudi and Anthony Levenda about the relationship (feedback loop) between data and urban planning. We focus on the idea of 'storying' data to make it compelling and to get past the inertia of data delivered as mere stats or numbers that have little resonance and don't (or no longer) move people to action, towards better living conditions. Recorded January 13, 2025. Released January 20, 2025. </p><br><p><strong>The urban-tech feedback loop: a surveillance and development data-walk in South Lake Union</strong></p><p><a href="https://dillonm.io/papers/the-urban-tech-feedback-loop/" rel="noopener noreferrer" target="_blank">https://dillonm.io/papers/the-urban-tech-feedback-loop/</a>&nbsp;</p><br><p><strong>The Amazon Warehouse&nbsp;</strong></p><p><a href="https://dillonm.io/papers/the-amazon-warehouse/" rel="noopener noreferrer" target="_blank">https://dillonm.io/papers/the-amazon-warehouse/</a>&nbsp;</p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>In this episode I speak with Dillon Mahmoudi and Anthony Levenda about the relationship (feedback loop) between data and urban planning. We focus on the idea of 'storying' data to make it compelling and to get past the inertia of data delivered as mere stats or numbers that have little resonance and don't (or no longer) move people to action, towards better living conditions. Recorded January 13, 2025. Released January 20, 2025. </p><br><p><strong>The urban-tech feedback loop: a surveillance and development data-walk in South Lake Union</strong></p><p><a href="https://dillonm.io/papers/the-urban-tech-feedback-loop/" rel="noopener noreferrer" target="_blank">https://dillonm.io/papers/the-urban-tech-feedback-loop/</a>&nbsp;</p><br><p><strong>The Amazon Warehouse&nbsp;</strong></p><p><a href="https://dillonm.io/papers/the-amazon-warehouse/" rel="noopener noreferrer" target="_blank">https://dillonm.io/papers/the-amazon-warehouse/</a>&nbsp;</p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Defining, with Ali Alkhatib</title>
			<itunes:title>Defining, with Ali Alkhatib</itunes:title>
			<pubDate>Mon, 13 Jan 2025 13:00:15 GMT</pubDate>
			<itunes:duration>55:02</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/67709121c82b0a6413e52e3a/media.mp3" length="81949216" type="audio/mpeg"/>
			<guid isPermaLink="false">67709121c82b0a6413e52e3a</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://shows.acast.com/the-data-fix/episodes/defining-with-ali-alkhabid</link>
			<acast:episodeId>67709121c82b0a6413e52e3a</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>defining-with-ali-alkhabid</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxoQX/JIldOHkG1h85jlFB/n/Y3xepaPq4PJR819Wu8Bbgi5+eTUuXxAR+CyPB5JEvr2g3A+2ASaJrZO8PapRW6F]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>51</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>I got to speak with the brilliant Ali Alkhatib&nbsp;about his blog post "defining AI" -- an object, subject, metaphor, and discursive formation used amongst all of us trying to figure out how to grapple with AI's ownership, deployments, and impacts. Who gets to define AI? Is it just computer scientists? What are the stakes of having it defined only technologically? Recorded December 23, 2024. Released January 13, 2025.</p><br><p><br></p><p><strong>Ali Alkhatib&nbsp;(website)</strong></p><p><a href="https://ali-alkhatib.com/" rel="noopener noreferrer" target="_blank">https://ali-alkhatib.com/</a></p><br><p><strong>Defining AI</strong></p><p><a href="https://ali-alkhatib.com/blog/defining-ai" rel="noopener noreferrer" target="_blank">https://ali-alkhatib.com/blog/defining-ai</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>I got to speak with the brilliant Ali Alkhatib&nbsp;about his blog post "defining AI" -- an object, subject, metaphor, and discursive formation used amongst all of us trying to figure out how to grapple with AI's ownership, deployments, and impacts. Who gets to define AI? Is it just computer scientists? What are the stakes of having it defined only technologically? Recorded December 23, 2024. Released January 13, 2025.</p><br><p><br></p><p><strong>Ali Alkhatib&nbsp;(website)</strong></p><p><a href="https://ali-alkhatib.com/" rel="noopener noreferrer" target="_blank">https://ali-alkhatib.com/</a></p><br><p><strong>Defining AI</strong></p><p><a href="https://ali-alkhatib.com/blog/defining-ai" rel="noopener noreferrer" target="_blank">https://ali-alkhatib.com/blog/defining-ai</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Colonialism, with Ulises A. Mejias and Nick Couldry</title>
			<itunes:title>Colonialism, with Ulises A. Mejias and Nick Couldry</itunes:title>
			<pubDate>Mon, 06 Jan 2025 13:00:10 GMT</pubDate>
			<itunes:duration>1:00:14</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/677089b1310557bf4f28382d/media.mp3" length="86961664" type="audio/mpeg"/>
			<guid isPermaLink="false">677089b1310557bf4f28382d</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://shows.acast.com/the-data-fix/episodes/colonialism-with-ulises-a-mejias-and-nick-couldry</link>
			<acast:episodeId>677089b1310557bf4f28382d</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>colonialism-with-ulises-a-mejias-and-nick-couldry</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxpN5CGhzM/Avn6Knk41cBqfYmFZsYp97MzK/nQQH6tjfrOB9TntdOG96lcl9PAuUPwZWyONHdJaoU/7PdxTZmnP]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>50</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>I start the new year with an episode on "data colonialism". I had the great pleasure of speaking with Ulises A. Mejias and Nick Couldry about our contemporary relationship to corporations, about the idea that there’s no capitalism without colonialism (and vice versa), about how human lives are being exploited these days, and about data being a cheap resource. Recorded December 16, 2024. Released January 6, 2025.&nbsp;</p><br><p><br></p><p><strong>Data Grab: The New Colonialism of Big Tech and How to Fight Back</strong></p><p><a href="https://press.uchicago.edu/ucp/books/book/chicago/D/bo216184200.html" rel="noopener noreferrer" target="_blank">https://press.uchicago.edu/ucp/books/book/chicago/D/bo216184200.html</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>I start the new year with an episode on "data colonialism". I had the great pleasure of speaking with Ulises A. Mejias and Nick Couldry about our contemporary relationship to corporations, about the idea that there’s no capitalism without colonialism (and vice versa), about how human lives are being exploited these days, and about data being a cheap resource. Recorded December 16, 2024. Released January 6, 2025.&nbsp;</p><br><p><br></p><p><strong>Data Grab: The New Colonialism of Big Tech and How to Fight Back</strong></p><p><a href="https://press.uchicago.edu/ucp/books/book/chicago/D/bo216184200.html" rel="noopener noreferrer" target="_blank">https://press.uchicago.edu/ucp/books/book/chicago/D/bo216184200.html</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Investigative, with Déborah López and Hadin Charbel</title>
			<itunes:title>Investigative, with Déborah López and Hadin Charbel</itunes:title>
			<pubDate>Mon, 30 Dec 2024 13:00:30 GMT</pubDate>
			<itunes:duration>1:02:00</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/675b361db8324e024f9789fc/media.mp3" length="94094272" type="audio/mpeg"/>
			<guid isPermaLink="false">675b361db8324e024f9789fc</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://shows.acast.com/the-data-fix/episodes/investigative-with-deborah-lopez-and-hadin-charbel</link>
			<acast:episodeId>675b361db8324e024f9789fc</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>investigative-with-deborah-lopez-and-hadin-charbel</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxrVbEnzR0cuqsCLUNvURcCl/bcNTgkWeLRncZn0tdOqcOCdGUxjAQqOyFptIzKaWiWU2YxRKBtWDguWfDwRimpK]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>49</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p><br></p><p>Such a delight speaking with Déborah López and Hadin Charbel, incredible artists-architects-scholars who investigate future possibilities as climate change rapidly transforms Arctic (and other) landscapes. We discuss a range of art projects, from large installations to projections to speculative fiction, and how these modes and conditions can help us think and feel about alternate endings -- in our teaching and in our day-to-day embodied, lived realities.&nbsp;Recorded Dec 12, 2024. Released Dec 30, 2024. </p><br><p>Artists' website</p><p><a href="https://pareid.com/" rel="noopener noreferrer" target="_blank">https://pareid.com/</a></p><br><p>Artists' Instagram</p><p><a href="https://www.instagram.com/pareid.architecture/" rel="noopener noreferrer" target="_blank">https://www.instagram.com/pareid.architecture/</a>&nbsp;</p><br><p>Pareid creates organ-like installation from corrugated plastic tubes in Madrid</p><p><a href="https://www.dezeen.com/2022/04/20/pareid-everywhere-nowhere-installation-urvanity/" rel="noopener noreferrer" target="_blank">https://www.dezeen.com/2022/04/20/pareid-everywhere-nowhere-installation-urvanity/</a></p><br><p>Re: Arctic</p><p><a href="https://vimeo.com/469736816?&amp;login=true" rel="noopener noreferrer" target="_blank">https://vimeo.com/469736816?&amp;login=true</a>&nbsp;</p><p><br></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p><br></p><p>Such a delight speaking with Déborah López and Hadin Charbel, incredible artist-architect-scholars investigating future possibilities as climate change rapidly transforms Arctic (and other) landscapes. We discuss a range of art projects, from large installations to projections to speculative fiction, and how these modes and conditions can help us think and feel about alternate endings -- in our teaching and in our day-to-day embodied, lived realities.&nbsp;Recorded Dec 12, 2024. Released Dec 30, 2024. </p><br><p>Artists' website</p><p><a href="https://pareid.com/" rel="noopener noreferrer" target="_blank">https://pareid.com/</a></p><br><p>Artists' Instagram</p><p><a href="https://www.instagram.com/pareid.architecture/" rel="noopener noreferrer" target="_blank">https://www.instagram.com/pareid.architecture/</a>&nbsp;</p><br><p>Pareid creates organ-like installation from corrugated plastic tubes in Madrid</p><p><a href="https://www.dezeen.com/2022/04/20/pareid-everywhere-nowhere-installation-urvanity/" rel="noopener noreferrer" target="_blank">https://www.dezeen.com/2022/04/20/pareid-everywhere-nowhere-installation-urvanity/</a></p><br><p>Re: Arctic</p><p><a href="https://vimeo.com/469736816?&amp;login=true" rel="noopener noreferrer" target="_blank">https://vimeo.com/469736816?&amp;login=true</a>&nbsp;</p><p><br></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Collaborative, with Chris Gilliard </title>
			<itunes:title>Collaborative, with Chris Gilliard </itunes:title>
			<pubDate>Mon, 16 Dec 2024 13:00:12 GMT</pubDate>
			<itunes:duration>1:03:24</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/6759f1adc2a496b7a353508d/media.mp3" length="98052448" type="audio/mpeg"/>
			<guid isPermaLink="false">6759f1adc2a496b7a353508d</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://shows.acast.com/the-data-fix/episodes/collaborative-with-chris-gilliard</link>
			<acast:episodeId>6759f1adc2a496b7a353508d</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>collaborative-with-chris-gilliard</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxpYEBOH2FnAVTHaTBLe/CG3TsNuATHoGRXNpkQSxKWMZ43jYlLLk2Hfd+db4fGkMmPJWWAjrcwjILXzpgtccPl8]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>48</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>In this episode, I spoke with Chris Gilliard&nbsp;(@hypervisible) about AI’s encroachment on universities and what this means for collaboration — i.e. learning, writing, thinking and feeling. This conversation puts out a warning of sorts to universities adopting AI given that, as a technology, it is built on stolen materials, relies on extraction and colonial labour practices, is racist, misogynist and transphobic in its outputs, and is terrible for the environment — all issues that universities claim to value and fight against. Recorded Dec 11, 2024. Released Dec 16, 2024. </p><br><p>“ChatGPT Should Not Exist” by David Golumbia (Dec 14, 2022)</p><p><a href="https://davidgolumbia.medium.com/chatgpt-should-not-exist-aab0867abace" rel="noopener noreferrer" target="_blank">https://davidgolumbia.medium.com/chatgpt-should-not-exist-aab0867abace</a></p><br><p>“Practico-inertia” by Rob Horning (March 1, 2024)</p><p><a href="https://robhorning.substack.com/p/practico-inertia" rel="noopener noreferrer" target="_blank">https://robhorning.substack.com/p/practico-inertia</a>&nbsp;</p><br><p>“Critical keywords of AI in&nbsp;education”&nbsp;by&nbsp;Ben Williamson (November 8, 2024)</p><p><a href="https://codeactsineducation.wordpress.com/" rel="noopener noreferrer" target="_blank">https://codeactsineducation.wordpress.com/</a>&nbsp;</p><br><p>“Big AI Companies Need Higher Ed … but Does Higher Ed Need Them?” by&nbsp;Collin Bjork (Dec 2, 2024)</p><p><a href="https://www.insidehighered.com/opinion/views/2024/12/02/universities-must-beware-reliance-big-ai-opinion" rel="noopener noreferrer" target="_blank">https://www.insidehighered.com/opinion/views/2024/12/02/universities-must-beware-reliance-big-ai-opinion</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>In this episode, I spoke with Chris Gilliard&nbsp;(@hypervisible) about AI’s encroachment on universities and what this means for collaboration — i.e. learning, writing, thinking and feeling. This conversation puts out a warning of sorts to universities adopting AI given that, as a technology, it is built on stolen materials, relies on extraction and colonial labour practices, is racist, misogynist and transphobic in its outputs, and is terrible for the environment — all issues that universities claim to value and fight against. Recorded Dec 11, 2024. Released Dec 16, 2024. </p><br><p>“ChatGPT Should Not Exist” by David Golumbia (Dec 14, 2022)</p><p><a href="https://davidgolumbia.medium.com/chatgpt-should-not-exist-aab0867abace" rel="noopener noreferrer" target="_blank">https://davidgolumbia.medium.com/chatgpt-should-not-exist-aab0867abace</a></p><br><p>“Practico-inertia” by Rob Horning (March 1, 2024)</p><p><a href="https://robhorning.substack.com/p/practico-inertia" rel="noopener noreferrer" target="_blank">https://robhorning.substack.com/p/practico-inertia</a>&nbsp;</p><br><p>“Critical keywords of AI in&nbsp;education”&nbsp;by&nbsp;Ben Williamson (November 8, 2024)</p><p><a href="https://codeactsineducation.wordpress.com/" rel="noopener noreferrer" target="_blank">https://codeactsineducation.wordpress.com/</a>&nbsp;</p><br><p>“Big AI Companies Need Higher Ed … but Does Higher Ed Need Them?” by&nbsp;Collin Bjork (Dec 2, 2024)</p><p><a href="https://www.insidehighered.com/opinion/views/2024/12/02/universities-must-beware-reliance-big-ai-opinion" rel="noopener noreferrer" target="_blank">https://www.insidehighered.com/opinion/views/2024/12/02/universities-must-beware-reliance-big-ai-opinion</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Regulated, with Jennifer Holt</title>
			<itunes:title>Regulated, with Jennifer Holt</itunes:title>
			<pubDate>Mon, 25 Nov 2024 11:00:15 GMT</pubDate>
			<itunes:duration>59:10</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/673e818293bec8617f5ef320/media.mp3" length="91054432" type="audio/mpeg"/>
			<guid isPermaLink="false">673e818293bec8617f5ef320</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://shows.acast.com/the-data-fix/episodes/regulated-with-jennifer-holt</link>
			<acast:episodeId>673e818293bec8617f5ef320</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>regulated-with-jennifer-holt</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxquvaUM8OIj94yhKXM2WPP1PraRLgGW9EqEwEjM+fn10VWylCUa1dt8NHhonJXIyWmvfdXbyiLjJso0wTvow5S4]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>47</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>It was a real honour and joy to speak with someone whose work has so significantly shaped my own (and many of us writing about data centers): Jennifer Holt joined me for a chat about US cloud policy. The Cloud is understood in this episode through the lens of policy, which means we grapple with who owns data, its infrastructures and our data futures. We also talked a bit about what the latest US elections might mean for Big Tech... Recorded Nov 20, 2024. Released Nov 25, 2024.</p><br><p><strong>Cloud Policy: A History of Regulating Pipelines, Platforms, and Data</strong></p><p><a href="https://mitpress.mit.edu/9780262548069/cloud-policy/ " rel="noopener noreferrer" target="_blank">https://mitpress.mit.edu/9780262548069/cloud-policy/ </a></p><br><p><strong>CMSW podcast: Jennifer Holt, “Cloud Policy: Anatomy of a Regulatory Crisis”</strong></p><p><a href="https://cmsw.mit.edu/podcast-jennifer-holt-cloud-policy-anatomy-regulatory-crisis/ " rel="noopener noreferrer" target="_blank">https://cmsw.mit.edu/podcast-jennifer-holt-cloud-policy-anatomy-regulatory-crisis/ </a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>It was a real honour and joy to speak with someone whose work has so significantly shaped my own (and many of us writing about data centers): Jennifer Holt joined me for a chat about US cloud policy. The Cloud is understood in this episode through the lens of policy, which means we grapple with who owns data, its infrastructures and our data futures. We also talked a bit about what the latest US elections might mean for Big Tech... Recorded Nov 20, 2024. Released Nov 25, 2024.</p><br><p><strong>Cloud Policy: A History of Regulating Pipelines, Platforms, and Data</strong></p><p><a href="https://mitpress.mit.edu/9780262548069/cloud-policy/ " rel="noopener noreferrer" target="_blank">https://mitpress.mit.edu/9780262548069/cloud-policy/ </a></p><br><p><strong>CMSW podcast: Jennifer Holt, “Cloud Policy: Anatomy of a Regulatory Crisis”</strong></p><p><a href="https://cmsw.mit.edu/podcast-jennifer-holt-cloud-policy-anatomy-regulatory-crisis/ " rel="noopener noreferrer" target="_blank">https://cmsw.mit.edu/podcast-jennifer-holt-cloud-policy-anatomy-regulatory-crisis/ </a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Frequencies, with Trent Wintermeier </title>
			<itunes:title>Frequencies, with Trent Wintermeier </itunes:title>
			<pubDate>Mon, 11 Nov 2024 11:00:39 GMT</pubDate>
			<itunes:duration>47:57</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/6706d3ff47b414d1546f1c93/media.mp3" length="69016384" type="audio/mpeg"/>
			<guid isPermaLink="false">6706d3ff47b414d1546f1c93</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://shows.acast.com/the-data-fix/episodes/frequencies-with-trent-wintermeier</link>
			<acast:episodeId>6706d3ff47b414d1546f1c93</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>frequencies-with-trent-wintermeier</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxqBvYH353Xr5Jeg5xptb3e/e5FGq9Hgu1CB+LP7zj8H+wWmMoitpKUATDTAaIaedaV6j/LRoXCNyHXyaTDo4yh1]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>46</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>Listen to the data center's hum with your feet first... on this episode, Trent Wintermeier and I discuss what it means to absorb sound through the body and "hear" vibrations with and through your limbs and ears. We discuss what this means for folks living near data centers, especially in places imagined as kinds of sacrifice zones. Recorded Oct 9, 2024. Released Nov 11, 2024.</p><br><p><strong>Trent Wintermeier</strong></p><p><a href="https://trentwintermeier.cargo.site/ " rel="noopener noreferrer" target="_blank">https://trentwintermeier.cargo.site</a></p><br><p><strong>Affective Footprints</strong></p><p><a href="https://www.heliotropejournal.net/helio/affective-footprints" rel="noopener noreferrer" target="_blank">https://www.heliotropejournal.net/helio/affective-footprints</a></p><br><p><br></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>Listen to the data center's hum with your feet first... on this episode, Trent Wintermeier and I discuss what it means to absorb sound through the body and "hear" vibrations with and through your limbs and ears. We discuss what this means for folks living near data centers, especially in places imagined as kinds of sacrifice zones. Recorded Oct 9, 2024. Released Nov 11, 2024.</p><br><p><strong>Trent Wintermeier</strong></p><p><a href="https://trentwintermeier.cargo.site/ " rel="noopener noreferrer" target="_blank">https://trentwintermeier.cargo.site</a></p><br><p><strong>Affective Footprints</strong></p><p><a href="https://www.heliotropejournal.net/helio/affective-footprints" rel="noopener noreferrer" target="_blank">https://www.heliotropejournal.net/helio/affective-footprints</a></p><br><p><br></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Safety, with Remmelt Ellen</title>
			<itunes:title>Safety, with Remmelt Ellen</itunes:title>
			<pubDate>Mon, 28 Oct 2024 10:00:46 GMT</pubDate>
			<itunes:duration>58:17</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/6702ab486f369dd03574e310/media.mp3" length="83656960" type="audio/mpeg"/>
			<guid isPermaLink="false">6702ab486f369dd03574e310</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://shows.acast.com/the-data-fix/episodes/safety-with-remmelt-ellen</link>
			<acast:episodeId>6702ab486f369dd03574e310</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>safety-with-remmelt-ellen</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxpfwZNtkFhviszSVTf+kO9ElNDkOdbqhgLtQSze8OPX8/BCc1KlFSs0mOnApPJcQt+j9+8INUhRv1JHvesxDrxx]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>45</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>In this episode, I have a conversation with Remmelt Ellen from AI Safety Camp. We discuss AI safety and his 44-page book <em>Artificial Bodies</em> outlining AI harms from the perspective of someone really grappling with the ethics, hype, and harms of the industry and beyond. Recorded Oct 4, 2024. Released Oct 28, 2024.</p><br><p><strong>Artificial Bodies </strong></p><p>https://workflowy.com/s/artificial-bodies/znDloerXJaEQvKF6#/846236876b45</p><br><p><strong>AI Safety Camp</strong></p><p>https://www.aisafety.camp/</p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>In this episode, I have a conversation with Remmelt Ellen from AI Safety Camp. We discuss AI safety and his 44-page book <em>Artificial Bodies</em> outlining AI harms from the perspective of someone really grappling with the ethics, hype, and harms of the industry and beyond. Recorded Oct 4, 2024. Released Oct 28, 2024.</p><br><p><strong>Artificial Bodies </strong></p><p>https://workflowy.com/s/artificial-bodies/znDloerXJaEQvKF6#/846236876b45</p><br><p><strong>AI Safety Camp</strong></p><p>https://www.aisafety.camp/</p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Deep, with Lisa Yin Han</title>
			<itunes:title>Deep, with Lisa Yin Han</itunes:title>
			<pubDate>Mon, 14 Oct 2024 10:00:25 GMT</pubDate>
			<itunes:duration>59:24</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/66f72830224b00387d032516/media.mp3" length="91010560" type="audio/mpeg"/>
			<guid isPermaLink="false">66f72830224b00387d032516</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://shows.acast.com/the-data-fix/episodes/deep-with-lisa-yin-han</link>
			<acast:episodeId>66f72830224b00387d032516</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>deep-with-lisa-yin-han</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxr8dCJ5PWYj31Ugcog55AyQNqKAp1gKIkSFwThebAeYSNnJNo0FG32hjiOSuXjTPBHVyDcgvXniwwG8q1wMakPA]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>44</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>Everyone should read Lisa Yin Han's <em>Deepwater Alchemy</em>! It's a stunningly well-written book about how we come to value the ocean through various extractive mediations. Recorded Sept 27, 2024. Released Oct 14, 2024.</p><br><p><strong>Deepwater Alchemy: Extractive Mediation and the Taming of the Seafloor</strong></p><p>How underwater mediation has transformed deep-sea spaces into resource-rich frontiers</p><p><a href="https://www.upress.umn.edu/9781517915940/deepwater-alchemy/ " rel="noopener noreferrer" target="_blank">https://www.upress.umn.edu/9781517915940/deepwater-alchemy</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>Everyone should read Lisa Yin Han's <em>Deepwater Alchemy</em>! It's a stunningly well-written book about how we come to value the ocean through various extractive mediations. Recorded Sept 27, 2024. Released Oct 14, 2024.</p><br><p><strong>Deepwater Alchemy: Extractive Mediation and the Taming of the Seafloor</strong></p><p>How underwater mediation has transformed deep-sea spaces into resource-rich frontiers</p><p><a href="https://www.upress.umn.edu/9781517915940/deepwater-alchemy/ " rel="noopener noreferrer" target="_blank">https://www.upress.umn.edu/9781517915940/deepwater-alchemy</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Dancing, with Joana Chicau</title>
			<itunes:title>Dancing, with Joana Chicau</itunes:title>
			<pubDate>Mon, 23 Sep 2024 10:00:07 GMT</pubDate>
			<itunes:duration>58:04</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/66e45524c09a244b8c0aff9a/media.mp3" length="84528352" type="audio/mpeg"/>
			<guid isPermaLink="false">66e45524c09a244b8c0aff9a</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://shows.acast.com/the-data-fix/episodes/dancing-with-joana-chicau</link>
			<acast:episodeId>66e45524c09a244b8c0aff9a</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>dancing-with-joana-chicau</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxpUVpPVJvcuvzdqVxRJgRoutOzjeZ1VcaKEmJ30XhL+/1wji04LNjOjdqDVuZeUjZvb/8wPDmpv3HnWDF40qWGu]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>43</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>Joana Chicau is a designer, researcher and coder, with a background in choreography and performance. We had a truly delightful chat about how dance can make you understand data differently. Recorded Sept 13, 2024. Released Sept 23, 2024.</p><br><p><strong>Website</strong></p><p><a href="https://joanachicau.com/about.html " rel="noopener noreferrer" target="_blank">https://joanachicau.com/about.html </a></p><br><p><strong>Publications</strong></p><p><a href="https://researchers.arts.ac.uk/2383-joana-chicau/publications&nbsp;" rel="noopener noreferrer" target="_blank">https://researchers.arts.ac.uk/2383-joana-chicau/publications&nbsp;</a></p><br><p><strong>Choreographing You</strong></p><p><a href="https://re-coding.technology/choreographing-you/" rel="noopener noreferrer" target="_blank">https://re-coding.technology/choreographing-you/</a></p><br><p><strong>From Individual Discomfort to Collective Solidarity: Choreographic Exploration of Extractivist Technology&nbsp;</strong></p><p><a href="https://www.researchgate.net/publication/378139744_From_Individual_Discomfort_to_Collective_Solidarity_Choreographic_Exploration_of_Extractivist_Technology" rel="noopener noreferrer" target="_blank">https://www.researchgate.net/publication/378139744_From_Individual_Discomfort_to_Collective_Solidarity_Choreographic_Exploration_of_Extractivist_Technology</a></p><p><br></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>Joana Chicau is a designer, researcher and coder, with a background in choreography and performance. We had a truly delightful chat about how dance can make you understand data differently. Recorded Sept 13, 2024. Released Sept 23, 2024.</p><br><p><strong>Website</strong></p><p><a href="https://joanachicau.com/about.html " rel="noopener noreferrer" target="_blank">https://joanachicau.com/about.html </a></p><br><p><strong>Publications</strong></p><p><a href="https://researchers.arts.ac.uk/2383-joana-chicau/publications&nbsp;" rel="noopener noreferrer" target="_blank">https://researchers.arts.ac.uk/2383-joana-chicau/publications&nbsp;</a></p><br><p><strong>Choreographing You</strong></p><p><a href="https://re-coding.technology/choreographing-you/" rel="noopener noreferrer" target="_blank">https://re-coding.technology/choreographing-you/</a></p><br><p><strong>From Individual Discomfort to Collective Solidarity: Choreographic Exploration of Extractivist Technology&nbsp;</strong></p><p><a href="https://www.researchgate.net/publication/378139744_From_Individual_Discomfort_to_Collective_Solidarity_Choreographic_Exploration_of_Extractivist_Technology" rel="noopener noreferrer" target="_blank">https://www.researchgate.net/publication/378139744_From_Individual_Discomfort_to_Collective_Solidarity_Choreographic_Exploration_of_Extractivist_Technology</a></p><p><br></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Geologica, with Siobhan Angus</title>
			<itunes:title>Geologica, with Siobhan Angus</itunes:title>
			<pubDate>Mon, 09 Sep 2024 10:00:00 GMT</pubDate>
			<itunes:duration>46:44</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/66db320be3cb6d8da99b2246/media.mp3" length="67339168" type="audio/mpeg"/>
			<guid isPermaLink="false">66db320be3cb6d8da99b2246</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://shows.acast.com/the-data-fix/episodes/geologica-with-siobhan-angus</link>
			<acast:episodeId>66db320be3cb6d8da99b2246</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>geologica-with-siobhan-angus</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxoEKBXLDp+/BGcjb3/eZNq0//IyODEsxybWuvBgTABLxF1EDOnapLMGNCvPsTHL785VSge6BvqwxmN7p5GIXYlf]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>42</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>It was such an honour to be in conversation with Siobhan Angus about what can only be described as a masterpiece: her book <em>Camera Geologica</em>. Recorded August 8, 2024. Released Sept 9, 2024.</p><br><p><strong>Camera Geologica: </strong>An Elemental History of Photography</p><p><a href="https://www.dukeupress.edu/camera-geologica" rel="noopener noreferrer" target="_blank">https://www.dukeupress.edu/camera-geologica</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>It was such an honour to be in conversation with Siobhan Angus about what can only be described as a masterpiece: her book <em>Camera Geologica</em>. Recorded August 8, 2024. Released Sept 9, 2024.</p><br><p><strong>Camera Geologica: </strong>An Elemental History of Photography</p><p><a href="https://www.dukeupress.edu/camera-geologica" rel="noopener noreferrer" target="_blank">https://www.dukeupress.edu/camera-geologica</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Reform, with Leslie R. Shade</title>
			<itunes:title>Reform, with Leslie R. Shade</itunes:title>
			<pubDate>Mon, 26 Aug 2024 12:00:30 GMT</pubDate>
			<itunes:duration>51:40</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/66ac0e6c5019f4489133507e/media.mp3" length="75896320" type="audio/mpeg"/>
			<guid isPermaLink="false">66ac0e6c5019f4489133507e</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://shows.acast.com/the-data-fix/episodes/reform-with-leslie-r-shade</link>
			<acast:episodeId>66ac0e6c5019f4489133507e</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>reform-with-leslie-r-shade</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxpBLvwwE+j0ciFLDNX7EXqlVdPNedg93hR5SdDVZyXdARfXcxSfPpftBkwCnVQ6R6BDoM4R2SBs7gGu7ZeRGw+z]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>41</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>In this episode, I speak with my dear friend and colleague, Leslie R. Shade about the importance of media reform from an intersectional feminist political economic perspective! Recorded Aug 1. Released Aug 26, 2024.</p><br><p><strong>Chapter 5: From Media Reform to Data Justice: Situating Women's Rights as Human Rights </strong>from<strong> </strong><em>The Handbook of Gender, Communication, and Women's Human Rights </em>Margaret Gallagher (Editor), Aimee Vega Montiel (Editor)&nbsp;ISBN: 978-1-119-80068-2 November 2023, Wiley-Blackwell <a href="https://www.wiley.com/en-us/The+Handbook+of+Gender%2C+Communication%2C+and+Women's+Human+Rights-p-9781119800682#tableofcontents-section " rel="noopener noreferrer" target="_blank">https://www.wiley.com/en-us/The+Handbook+of+Gender%2C+Communication%2C+and+Women's+Human+Rights-p-9781119800682#tableofcontents-section </a></p><br><p>Read all her work here: <a href="https://discover.research.utoronto.ca/2541-leslie-shade/publications" rel="noopener noreferrer" target="_blank">https://discover.research.utoronto.ca/2541-leslie-shade/publications </a></p><p><br></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>In this episode, I speak with my dear friend and colleague, Leslie R. Shade about the importance of media reform from an intersectional feminist political economic perspective! Recorded Aug 1. Released Aug 26, 2024.</p><br><p><strong>Chapter 5: From Media Reform to Data Justice: Situating Women's Rights as Human Rights </strong>from<strong> </strong><em>The Handbook of Gender, Communication, and Women's Human Rights </em>Margaret Gallagher (Editor), Aimee Vega Montiel (Editor)&nbsp;ISBN: 978-1-119-80068-2 November 2023, Wiley-Blackwell <a href="https://www.wiley.com/en-us/The+Handbook+of+Gender%2C+Communication%2C+and+Women's+Human+Rights-p-9781119800682#tableofcontents-section " rel="noopener noreferrer" target="_blank">https://www.wiley.com/en-us/The+Handbook+of+Gender%2C+Communication%2C+and+Women's+Human+Rights-p-9781119800682#tableofcontents-section </a></p><br><p>Read all her work here: <a href="https://discover.research.utoronto.ca/2541-leslie-shade/publications" rel="noopener noreferrer" target="_blank">https://discover.research.utoronto.ca/2541-leslie-shade/publications </a></p><p><br></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Territorial, with Alina Utrata</title>
			<itunes:title>Territorial, with Alina Utrata</itunes:title>
			<pubDate>Mon, 24 Jun 2024 12:00:02 GMT</pubDate>
			<itunes:duration>51:42</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/6654beafbcc0a700124d4614/media.mp3" length="76500160" type="audio/mpeg"/>
			<guid isPermaLink="false">6654beafbcc0a700124d4614</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://shows.acast.com/the-data-fix/episodes/territorial-with-alina-utrata</link>
			<acast:episodeId>6654beafbcc0a700124d4614</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>territorial-with-alina-utrata</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxo4Z4d0FiGGqDxGnI7NxZ/fa7m4I/YgavnXOAcBZ04slZivCN6rFlHBX7OEGYl39MA9YC9ZTHaZlprJM+w8mXU7]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>40</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>Alina Utrata and I have a conversation about billionaires conquering space for personal pleasure, in the pursuit of energy sources or minerals, or to push forward a longtermist interplanetary movement. Alina explains how, when we think about outer space as "empty", we are unwittingly thinking <em>territorially</em> -- an incredibly valuable contribution to critical space scholarship. Recorded May 20. Released June 24, 2024.</p><br><p><strong>Engineering Territory: Space and Colonies in Silicon Valley</strong></p><p><a href="https://www.cambridge.org/core/journals/american-political-science-review/article/engineering-territory-space-and-colonies-in-silicon-valley/5D6EA4D306E8F3E0465F4A05C89454D6 " rel="noopener noreferrer" target="_blank">https://www.cambridge.org/core/journals/american-political-science-review/article/engineering-territory-space-and-colonies-in-silicon-valley/5D6EA4D306E8F3E0465F4A05C89454D6 </a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>Alina Utrata and I have a conversation about billionaires conquering space for personal pleasure, in the pursuit of energy sources or minerals, or to push forward a longtermist interplanetary movement. Alina explains how, when we think about outer space as "empty", we are unwittingly thinking <em>territorially</em> -- an incredibly valuable contribution to critical space scholarship. Recorded May 20. Released June 24, 2024.</p><br><p><strong>Engineering Territory: Space and Colonies in Silicon Valley</strong></p><p><a href="https://www.cambridge.org/core/journals/american-political-science-review/article/engineering-territory-space-and-colonies-in-silicon-valley/5D6EA4D306E8F3E0465F4A05C89454D6 " rel="noopener noreferrer" target="_blank">https://www.cambridge.org/core/journals/american-political-science-review/article/engineering-territory-space-and-colonies-in-silicon-valley/5D6EA4D306E8F3E0465F4A05C89454D6 </a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Futures, with Lee Vinsel</title>
			<itunes:title>Futures, with Lee Vinsel</itunes:title>
			<pubDate>Mon, 10 Jun 2024 12:00:09 GMT</pubDate>
			<itunes:duration>51:32</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/6622de6e0b12320012b7cf12/media.mp3" length="80591872" type="audio/mpeg"/>
			<guid isPermaLink="false">6622de6e0b12320012b7cf12</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://shows.acast.com/the-data-fix/episodes/futures-with-lee-vinsel</link>
			<acast:episodeId>6622de6e0b12320012b7cf12</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>futures-with-lee-vinsel</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxrWgNhn7K0gXl20F+xGdfw3BVbVWWuNcT9/SQY/hLNEI9yvUT0IMNysXiF/Qtt0qFANx3ISJpzFujAWkWWfrUX5]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>39</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>I invited Lee Vinsel to discuss a post he wrote following a workshop on "Politics of Controlling Powerful Technologies". In this episode we discuss how futures are (imagined to be) predicted through data modelling and number crunching, and how various alternatives to these statistical imaginaries also fall short of knowing what awaits us. Can we stand to not know? And if we don't know what the future holds, how do we plan politically? Recorded April 19. Released June 10, 2024.</p><br><p><strong>How to Be a Better Reactionary: Time and Knowledge in Technology Regulation</strong></p><p><a href="https://sts-news.medium.com/how-to-be-a-better-reactionary-1630b5098fbc" rel="noopener noreferrer" target="_blank">https://sts-news.medium.com/how-to-be-a-better-reactionary-1630b5098fbc</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>I invited Lee Vinsel to discuss a post he wrote following a workshop on "Politics of Controlling Powerful Technologies". In this episode we discuss how futures are (imagined to be) predicted through data modelling and number crunching, and how various alternatives to these statistical imaginaries also fall short of knowing what awaits us. Can we stand to not know? And if we don't know what the future holds, how do we plan politically? Recorded April 19. Released June 10, 2024.</p><br><p><strong>How to Be a Better Reactionary: Time and Knowledge in Technology Regulation</strong></p><p><a href="https://sts-news.medium.com/how-to-be-a-better-reactionary-1630b5098fbc" rel="noopener noreferrer" target="_blank">https://sts-news.medium.com/how-to-be-a-better-reactionary-1630b5098fbc</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Objective, with Lisa Messeri and M. J. Crockett</title>
			<itunes:title>Objective, with Lisa Messeri and M. J. Crockett</itunes:title>
			<pubDate>Mon, 27 May 2024 12:00:38 GMT</pubDate>
			<itunes:duration>53:42</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/6622dd2209a9320012ce2999/media.mp3" length="79114336" type="audio/mpeg"/>
			<guid isPermaLink="false">6622dd2209a9320012ce2999</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://shows.acast.com/the-data-fix/episodes/objective-with-lisa-messeri-and-m-j-crockett</link>
			<acast:episodeId>6622dd2209a9320012ce2999</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>objective-with-lisa-messeri-and-m-j-crockett</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxqkjE+r/PhKrlXcvtXuhiFtDII1Tp8TuZoQ2Hk+ukaUzYwcQTlOMzUXj73Dug0V6/ZHoA4jB/4Ak3Lan7eDlQnS]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>38</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>In this episode, Lisa Messeri and M. J. Crockett discuss how scientists are in danger of overlooking AI tools’ limitations, and how science is made stronger by questioning its obsession with objectivity. Recorded April 18, 2024. Released May 27, 2024.</p><br><p><strong>Artificial intelligence and illusions of understanding in scientific research</strong></p><p>Lisa Messeri &amp; M. J. Crockett, <em>Nature</em>, volume 627, pages 49–58 (2024)</p><p><a href="https://www.nature.com/articles/s41586-024-07146-0" rel="noopener noreferrer" target="_blank">https://www.nature.com/articles/s41586-024-07146-0</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>In this episode, Lisa Messeri and M. J. Crockett discuss how scientists are in danger of overlooking AI tools’ limitations, and how science is made stronger by questioning its obsession with objectivity. Recorded April 18, 2024. Released May 27, 2024.</p><br><p><strong>Artificial intelligence and illusions of understanding in scientific research</strong></p><p>Lisa Messeri &amp; M. J. Crockett, <em>Nature</em>, volume 627, pages 49–58 (2024)</p><p><a href="https://www.nature.com/articles/s41586-024-07146-0" rel="noopener noreferrer" target="_blank">https://www.nature.com/articles/s41586-024-07146-0</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Thirsty, with Shaolei Ren</title>
			<itunes:title>Thirsty, with Shaolei Ren</itunes:title>
			<pubDate>Mon, 13 May 2024 12:00:23 GMT</pubDate>
			<itunes:duration>48:22</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/6622db6d09a9320012cdcda0/media.mp3" length="67833376" type="audio/mpeg"/>
			<guid isPermaLink="false">6622db6d09a9320012cdcda0</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://shows.acast.com/the-data-fix/episodes/thirsty-with</link>
			<acast:episodeId>6622db6d09a9320012cdcda0</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>thirsty-with</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxoimUXqIiGOGNPz9ZEwl7VP4YIRcMz+0VHyeTAqhrvCRJnCSAu9ZCwA1/8yIGITa1qDqErIu22YAqai2P87S7FI]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>37</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[In this episode, Shaolei Ren and I discuss the relationship between water and generative AI. We delve into what happens to water in the (very thirsty) data center, what it's used for, and how much fresh water the AI revolution will ask of the planet in the future, and at what cost. Big Tech doesn't yet disclose its water withdrawal or consumption, so researchers like Shaolei Ren take up the work and propose solutions for a more sustainable future for AI. Recorded April 19, 2024. Released May 13, 2024.<hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[In this episode, Shaolei Ren and I discuss the relationship between water and generative AI. We delve into what happens to water in the (very thirsty) data center, what it's used for, and how much fresh water the AI revolution will ask of the planet in the future, and at what cost. Big Tech doesn't yet disclose its water withdrawal or consumption, so researchers like Shaolei Ren take up the work and propose solutions for a more sustainable future for AI. Recorded April 19, 2024. Released May 13, 2024.<hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Diversity, with Catherine Stinson and Sophie Vlaad</title>
			<itunes:title>Diversity, with Catherine Stinson and Sophie Vlaad</itunes:title>
			<pubDate>Mon, 22 Apr 2024 10:00:53 GMT</pubDate>
			<itunes:duration>53:32</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/65fdb2be3402060016d43660/media.mp3" length="83495104" type="audio/mpeg"/>
			<guid isPermaLink="false">65fdb2be3402060016d43660</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://shows.acast.com/the-data-fix/episodes/diversity-with-catherine-stinson-and-sophie-vlaad</link>
			<acast:episodeId>65fdb2be3402060016d43660</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>diversity-with-catherine-stinson-and-sophie-vlaad</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxpa1ZaSEQ8AHvKnVOFMC11K1JEzJnfWYOoRcIL+w+UMHoEA84DSrxR4fDF+da73H6bfJUb2m7oMTr/m3LgIf8X6]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>36</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>With Catherine Stinson and Sophie Vlaad, we discuss what diversity means in the context of AI -- its applications, conceptualizations, teams, institutions, networks, members, and ideals. As they ask in a recent article, "diversity" is often proposed as a solution to ethical problems in artificial intelligence (AI), but what exactly is meant by "diversity", and how can it solve those problems?&nbsp;Recorded March 22, 2024. Released April 22, 2024.</p><br><p><strong>A feeling for the algorithm: Diversity, expertise, and artificial intelligence</strong></p><p>Stinson, C., &amp; Vlaad, S. (2024). A feeling for the algorithm: Diversity, expertise, and artificial intelligence. Big Data &amp; Society, 11(1). https://doi.org/10.1177/20539517231224247</p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>With Catherine Stinson and Sophie Vlaad, we discuss what diversity means in the context of AI -- its applications, conceptualizations, teams, institutions, networks, members, and ideals. As they ask in a recent article, "diversity" is often proposed as a solution to ethical problems in artificial intelligence (AI), but what exactly is meant by "diversity", and how can it solve those problems?&nbsp;Recorded March 22, 2024. Released April 22, 2024.</p><br><p><strong>A feeling for the algorithm: Diversity, expertise, and artificial intelligence</strong></p><p>Stinson, C., &amp; Vlaad, S. (2024). A feeling for the algorithm: Diversity, expertise, and artificial intelligence. Big Data &amp; Society, 11(1). https://doi.org/10.1177/20539517231224247</p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Unsustainable, with Matthew Archer</title>
			<itunes:title>Unsustainable, with Matthew Archer</itunes:title>
			<pubDate>Mon, 08 Apr 2024 10:00:18 GMT</pubDate>
			<itunes:duration>47:52</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/65bd416d8639490016efcb2f/media.mp3" length="70933408" type="audio/mpeg"/>
			<guid isPermaLink="false">65bd416d8639490016efcb2f</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://shows.acast.com/the-data-fix/episodes/unsustainable-with-matthew-archer</link>
			<acast:episodeId>65bd416d8639490016efcb2f</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>unsustainable-with-matthew-archer</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxpx5akmCTsASjIa8X4Jvm+CWKj18lzsemt8WyCd7ZASq2YOvVRu3DBjX412C6TuJJCNPhxsAIvRLS/6yX5twZfJ]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>35</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>Listen to my conversation with Matthew Archer, author of <em>Unsustainable: Measurement, Reporting, and the Limits of Corporate Sustainability. </em>In his beautifully written book, Matthew makes a case for being highly skeptical of corporate sustainability initiatives, especially as they've become increasingly grounded in metrics of all kinds that measure exactly what the companies themselves determine to be worthy of measuring. Framing sustainability as a technical issue has been and continues to be a failure, and so we ask: what might it mean to take this criticism seriously? Recorded Feb 2, 2024. Released Apr 8, 2024.</p><br><p><strong><em>Unsustainable: Measurement, Reporting, and the Limits of Corporate Sustainability</em> (Feb 2024, Published by NYU Press)</strong></p><p>https://nyupress.org/9781479822027/unsustainable/</p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>Listen to my conversation with Matthew Archer, author of <em>Unsustainable: Measurement, Reporting, and the Limits of Corporate Sustainability. </em>In his beautifully written book, Matthew makes a case for being highly skeptical of corporate sustainability initiatives, especially as they've become increasingly grounded in metrics of all kinds that measure exactly what the companies themselves determine to be worthy of measuring. Framing sustainability as a technical issue has been and continues to be a failure, and so we ask: what might it mean to take this criticism seriously? Recorded Feb 2, 2024. Released Apr 8, 2024.</p><br><p><strong><em>Unsustainable: Measurement, Reporting, and the Limits of Corporate Sustainability</em> (Feb 2024, Published by NYU Press)</strong></p><p>https://nyupress.org/9781479822027/unsustainable/</p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Change, with Sireesh Gururaja, Amanda Bertsch and Clara Na</title>
			<itunes:title>Change, with Sireesh Gururaja, Amanda Bertsch and Clara Na</itunes:title>
			<pubDate>Mon, 25 Mar 2024 10:00:45 GMT</pubDate>
			<itunes:duration>1:00:28</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/65b2c71d291f120017409fc7/media.mp3" length="88748992" type="audio/mpeg"/>
			<guid isPermaLink="false">65b2c71d291f120017409fc7</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://shows.acast.com/the-data-fix/episodes/change-with-sireesh-gururaja-amanda-bertsch-and-clara-na</link>
			<acast:episodeId>65b2c71d291f120017409fc7</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>change-with-sireesh-gururaja-amanda-bertsch-and-clara-na</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxqRlPNm4OY2V4NiiSevSbOX1aMbHW4Yo6eMaGbAuE0MPfHzZ6wfjvNzNDrPrq1OMkz/Iez98zLUDZgfK6rJoyYo]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>34</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>Together, Sireesh Gururaja, Amanda Bertsch and Clara Na explain the paradigm shifts in Natural Language Processing that they've noticed themselves, observed in the community, and documented through a series of interviews with NLP researchers. They share their hopes for the NLP field moving forward -- less focused on benchmarks, more self-reflexive and ethically driven.&nbsp;Recorded Jan 19, 2024. Released March 25, 2024.</p><br><p><strong>To Build Our Future, We Must Know Our Past: Contextualizing Paradigm Shifts in Natural Language Processing</strong></p><p>by Sireesh Gururaja, Amanda Bertsch, Clara Na, David Gray Widder, Emma Strubell</p><p><a href="https://arxiv.org/abs/2310.07715 " rel="noopener noreferrer" target="_blank">https://arxiv.org/abs/2310.07715 </a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>Together, Sireesh Gururaja, Amanda Bertsch and Clara Na explain the paradigm shifts in Natural Language Processing that they've noticed themselves, observed in the community, and documented through a series of interviews with NLP researchers. They share their hopes for the NLP field moving forward -- less focused on benchmarks, more self-reflexive and ethically driven.&nbsp;Recorded Jan 19, 2024. Released March 25, 2024.</p><br><p><strong>To Build Our Future, We Must Know Our Past: Contextualizing Paradigm Shifts in Natural Language Processing</strong></p><p>by Sireesh Gururaja, Amanda Bertsch, Clara Na, David Gray Widder, Emma Strubell</p><p><a href="https://arxiv.org/abs/2310.07715 " rel="noopener noreferrer" target="_blank">https://arxiv.org/abs/2310.07715 </a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Adversarial, with Steph Maj Swanson</title>
			<itunes:title>Adversarial, with Steph Maj Swanson</itunes:title>
			<pubDate>Mon, 11 Mar 2024 10:00:19 GMT</pubDate>
			<itunes:duration>50:21</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/65b2c476c88e8800164427b6/media.mp3" length="73874848" type="audio/mpeg"/>
			<guid isPermaLink="false">65b2c476c88e8800164427b6</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://shows.acast.com/the-data-fix/episodes/adversarial-with-steph-maj-swanson</link>
			<acast:episodeId>65b2c476c88e8800164427b6</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>adversarial-with-steph-maj-swanson</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxpm62a2ZVktK4l34/WmpjfnPLiLObrpDzyH1ek0zDqScrgzIu3VvdoSvsdzB4s85HDJ9Pnk/aGoB/6bb4KC93F6]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>33</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>Steph Maj Swanson, aka <em>Supercomposite</em>, and I discuss the spooky Loab phenomenon, generative adversarial networks, negative prompts, and the demons (maybe?) lurking in large datasets. Recorded Jan 19, 2024. Released March 11, 2024.</p><br><p>What I Learned from Loab: AI as a creative adversary</p><p>The artist behind the viral cryptid "Loab" reflects on her critical relationship to AI art tools</p><p><a href="https://media.ccc.de/v/37c3-12052-what_i_learned_from_loab_ai_as_a_creative_adversary" rel="noopener noreferrer" target="_blank">https://media.ccc.de/v/37c3-12052-what_i_learned_from_loab_ai_as_a_creative_adversary</a>&nbsp;</p><br><p>Original Twitter thread:</p><p><a href="https://twitter.com/supercomposite/status/1567162288087470081?lang=en" rel="noopener noreferrer" target="_blank">https://twitter.com/supercomposite/status/1567162288087470081?lang=en</a></p><br><p>Insta</p><p><a href="https://www.instagram.com/supercomposite/ " rel="noopener noreferrer" target="_blank">https://www.instagram.com/supercomposite/ </a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>Steph Maj Swanson, aka <em>Supercomposite</em>, and I discuss the spooky Loab phenomenon, generative adversarial networks, negative prompts, and the demons (maybe?) lurking in large datasets. Recorded Jan 19, 2024. Released March 11, 2024.</p><br><p>What I Learned from Loab: AI as a creative adversary</p><p>The artist behind the viral cryptid "Loab" reflects on her critical relationship to AI art tools</p><p><a href="https://media.ccc.de/v/37c3-12052-what_i_learned_from_loab_ai_as_a_creative_adversary" rel="noopener noreferrer" target="_blank">https://media.ccc.de/v/37c3-12052-what_i_learned_from_loab_ai_as_a_creative_adversary</a>&nbsp;</p><br><p>Original Twitter thread:</p><p><a href="https://twitter.com/supercomposite/status/1567162288087470081?lang=en" rel="noopener noreferrer" target="_blank">https://twitter.com/supercomposite/status/1567162288087470081?lang=en</a></p><br><p>Insta</p><p><a href="https://www.instagram.com/supercomposite/ " rel="noopener noreferrer" target="_blank">https://www.instagram.com/supercomposite/ </a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Mirrored, with Kyriaki Goni </title>
			<itunes:title>Mirrored, with Kyriaki Goni </itunes:title>
			<pubDate>Mon, 26 Feb 2024 11:00:40 GMT</pubDate>
			<itunes:duration>1:00:09</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/65a177444e159400173471e0/media.mp3" length="82646560" type="audio/mpeg"/>
			<guid isPermaLink="false">65a177444e159400173471e0</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://shows.acast.com/the-data-fix/episodes/mirrored-with-kyriaki-goni</link>
			<acast:episodeId>65a177444e159400173471e0</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>mirrored-with-kyriaki-goni</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxrYxni3bLb7JzdbfUDqn/RXwzyO/zgswLT/+ggxgPrfxSQxQFe3svY7tSj7+hj1cy7y2gEM5Yt6Q3CUFpiH/0fe]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>32</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>Kyriaki Goni - an artist with a background in social and cultural anthropology -&nbsp;and I start our conversation reflecting on the lockdowns of April 2020 in Athens: what they signified, and how they shaped her art, which ultimately manifested as "The Portal or Let’s Stand Still for the Whales", a reflection on the tensions between the darkness of pandemic realities and the quiet restoration of natural things in her surroundings, and beyond. We also talk about "Perfect Love #couplegoals #AIgenerated, 2020,2022" as an exploration of intimacy, doomscrolling and isolation. We finish our conversation on "Not Allowed for Algorithmic Audiences, 2021", which focuses more specifically on 'audio assistant' tech, and the way algorithms pull audio from social media and various corners of the internet to then classify and reorganize the way we're listened to and heard. One of the (MANY) things I love about Kyriaki's work is that it is decidedly not preachy -- instead it holds a mirror to the audience to reflect gently, and in their own time, on the significance of technology in various contexts. Recorded Jan 12, 2024. Released Feb 26, 2024.</p><br><p>KYRIAKI GONI</p><p><a href="https://kyriakigoni.com/" rel="noopener noreferrer" target="_blank">https://kyriakigoni.com/</a></p><br><p>ANTHROPOCENE ON HOLD</p><p><a href="https://www.pcai.gr/anthroposcene-on-hold" rel="noopener noreferrer" target="_blank">https://www.pcai.gr/anthroposcene-on-hold</a></p><br><p>NOT ALLOWED FOR ALGORITHMIC AUDIENCES</p><p>March 23, 2023–April 29, 2023</p><p>The Breeder Feeder</p><p><a href="https://thebreedersystem.com/uncategorized/kyriaki-goni_-not-allowed-for-algorithmic-audiences/ " rel="noopener noreferrer" target="_blank">https://thebreedersystem.com/uncategorized/kyriaki-goni_-not-allowed-for-algorithmic-audiences/ </a></p><br><p><em>studio international</em>: Kyriaki Goni – interview: ‘For me, technology is an existential discussion’</p><p><a href="https://www.studiointernational.com/index.php/kyriaki-goni-interview-for-me-technology-is-existential-discussion-data-garden-blenheim-walk-gallery-leeds-arts-university" rel="noopener noreferrer" target="_blank">https://www.studiointernational.com/index.php/kyriaki-goni-interview-for-me-technology-is-existential-discussion-data-garden-blenheim-walk-gallery-leeds-arts-university</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>Kyriaki Goni - an artist with a background in social and cultural anthropology -&nbsp;and I start our conversation reflecting on the lockdowns of April 2020 in Athens: what they signified, and how they shaped her art, which ultimately manifested as "The Portal or Let’s Stand Still for the Whales", a reflection on the tensions between the darkness of pandemic realities and the quiet restoration of natural things in her surroundings, and beyond. We also talk about "Perfect Love #couplegoals #AIgenerated, 2020,2022" as an exploration of intimacy, doomscrolling and isolation. We finish our conversation on "Not Allowed for Algorithmic Audiences, 2021", which focuses more specifically on 'audio assistant' tech, and the way algorithms pull audio from social media and various corners of the internet to then classify and reorganize the way we're listened to and heard. One of the (MANY) things I love about Kyriaki's work is that it is decidedly not preachy -- instead it holds a mirror to the audience to reflect gently, and in their own time, on the significance of technology in various contexts. Recorded Jan 12, 2024. Released Feb 26, 2024.</p><br><p>KYRIAKI GONI</p><p><a href="https://kyriakigoni.com/" rel="noopener noreferrer" target="_blank">https://kyriakigoni.com/</a></p><br><p>ANTHROPOCENE ON HOLD</p><p><a href="https://www.pcai.gr/anthroposcene-on-hold" rel="noopener noreferrer" target="_blank">https://www.pcai.gr/anthroposcene-on-hold</a></p><br><p>NOT ALLOWED FOR ALGORITHMIC AUDIENCES</p><p>March 23, 2023–April 29, 2023</p><p>The Breeder Feeder</p><p><a href="https://thebreedersystem.com/uncategorized/kyriaki-goni_-not-allowed-for-algorithmic-audiences/ " rel="noopener noreferrer" target="_blank">https://thebreedersystem.com/uncategorized/kyriaki-goni_-not-allowed-for-algorithmic-audiences/ </a></p><br><p><em>studio international</em>: Kyriaki Goni – interview: ‘For me, technology is an existential discussion’</p><p><a href="https://www.studiointernational.com/index.php/kyriaki-goni-interview-for-me-technology-is-existential-discussion-data-garden-blenheim-walk-gallery-leeds-arts-university" rel="noopener noreferrer" target="_blank">https://www.studiointernational.com/index.php/kyriaki-goni-interview-for-me-technology-is-existential-discussion-data-garden-blenheim-walk-gallery-leeds-arts-university</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Efficient, with Anne Pasek</title>
			<itunes:title>Efficient, with Anne Pasek</itunes:title>
			<pubDate>Mon, 12 Feb 2024 13:00:55 GMT</pubDate>
			<itunes:duration>1:06:59</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/6580a97302852500179a433a/media.mp3" length="80389663" type="audio/mpeg"/>
			<guid isPermaLink="false">6580a97302852500179a433a</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://shows.acast.com/the-data-fix/episodes/efficient-with-anne-pasek</link>
			<acast:episodeId>6580a97302852500179a433a</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>efficient-with-anne-pasek</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxoRUAPdgG7pTZewNWjg5LiaaBOOOEky1/5kq+menMgErOEXTI+8h2cQbQyFhkGc0cowfi9nJrQBBYGniAjGp0ud]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>31</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>Most of us researching data centers have come to rely on various figures and stats telling us how environmentally impactful the internet has become: How big is the footprint? How much energy is used? In this episode, Anne Pasek and I discuss just how these things get tallied, by whom, and to what ends. We also discuss what gets omitted in these calculations, and how a "relational footprinting" approach might help us situate our knowledge about this topic. We also briefly talk about open access publishing, and the power of zines in particular. Recorded Dec 12, 2023. Released Feb 12, 2024.</p><br><p>Digital Energetics</p><p><a href="https://meson.press/books/digital-energetics/ " rel="noopener noreferrer" target="_blank">https://meson.press/books/digital-energetics/ </a></p><br><p>Getting Into Fights with Data Centers (zine)</p><p><a href="https://emmlab.info/Resources_page/Data%20Center%20Fights_digital.pdf" rel="noopener noreferrer" target="_blank">https://emmlab.info/Resources_page/Data%20Center%20Fights_digital.pdf</a></p><br><p>Pasek, A., Vaughan, H., &amp; Starosielski, N. (2023). The world wide web of carbon: Toward a relational footprinting of information and communications technology’s climate impacts. Big Data &amp; Society, 10(1).</p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>Most of us researching data centers have come to rely on various figures and stats telling us how environmentally impactful the internet has become: How big is the footprint? How much energy is used? In this episode, Anne Pasek and I discuss just how these things get tallied, by whom, and to what ends. We also discuss what gets omitted in these calculations, and how a "relational footprinting" approach might help us situate our knowledge about this topic. We also briefly talk about open access publishing, and the power of zines in particular. Recorded Dec 12, 2023. Released Feb 12, 2024.</p><br><p>Digital Energetics</p><p><a href="https://meson.press/books/digital-energetics/ " rel="noopener noreferrer" target="_blank">https://meson.press/books/digital-energetics/ </a></p><br><p>Getting Into Fights with Data Centers (zine)</p><p><a href="https://emmlab.info/Resources_page/Data%20Center%20Fights_digital.pdf" rel="noopener noreferrer" target="_blank">https://emmlab.info/Resources_page/Data%20Center%20Fights_digital.pdf</a></p><br><p>Pasek, A., Vaughan, H., &amp; Starosielski, N. (2023). The world wide web of carbon: Toward a relational footprinting of information and communications technology’s climate impacts. Big Data &amp; Society, 10(1).</p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Conflicted, with Tobias Williams</title>
			<itunes:title>Conflicted, with Tobias Williams</itunes:title>
			<pubDate>Mon, 25 Dec 2023 13:00:15 GMT</pubDate>
			<itunes:duration>53:51</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/652ef8b70a6d2b001228d88c/media.mp3" length="64629467" type="audio/mpeg"/>
			<guid isPermaLink="false">652ef8b70a6d2b001228d88c</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://www.thedatafix.net/episodes/030</link>
			<acast:episodeId>652ef8b70a6d2b001228d88c</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>conflicted-with-tobias-williams</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsoxhINu4Ad7VkAnsB5MGv7cvr5WFMEwdDai3D1TxCsNIzuDGND56VIRAxBLWU/NJxcf74KD0hAaABLr2CtxFyestvwZKt1T/JJGpvvv85yDo=]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>30</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>Teaching in times of generative AI is weird, and sometimes wonderful. Tobias Williams and I discuss what it means to make and teach art at this juncture, and the conflicted feelings that emerge from resisting with the tools of creation. Recorded Oct 10, 2023. Released Dec 25, 2023.</p><br><p>Profile</p><p><a href="http://tobiasjwilliams.com/#/face-filters/" rel="noopener noreferrer" target="_blank">http://tobiasjwilliams.com</a></p><br><p>Instagram</p><p><a href="https://www.instagram.com/getrichnever/" rel="noopener noreferrer" target="_blank">https://www.instagram.com/getrichnever/</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>Teaching in times of generative AI is weird, and sometimes wonderful. Tobias Williams and I discuss what it means to make and teach art at this juncture, and the conflicted feelings that emerge from resisting with the tools of creation. Recorded Oct 10, 2023. Released Dec 25, 2023.</p><br><p>Profile</p><p><a href="http://tobiasjwilliams.com/#/face-filters/" rel="noopener noreferrer" target="_blank">http://tobiasjwilliams.com</a></p><br><p>Instagram</p><p><a href="https://www.instagram.com/getrichnever/" rel="noopener noreferrer" target="_blank">https://www.instagram.com/getrichnever/</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Impacts, with Irene Niet</title>
			<itunes:title>Impacts, with Irene Niet</itunes:title>
			<pubDate>Mon, 11 Dec 2023 13:00:30 GMT</pubDate>
			<itunes:duration>51:59</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/652ef5428ff4510012600441/media.mp3" length="62389728" type="audio/mpeg"/>
			<guid isPermaLink="false">652ef5428ff4510012600441</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://www.thedatafix.net/episodes/029</link>
			<acast:episodeId>652ef5428ff4510012600441</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>impacts-with-irene-niet</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsoxhINu4Ad7VkAnsB5MGv7cZ/SpMiyVOMk/S/fWa+7t99qXYLBM94p1azhDgeHQJIORgAw08+/sIxF9CLeFB6lYNj/SCd9ZXT+2BE4gJr168=]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>29</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>Irene Niet and I have a conversation about how researchers might consider the environmental impacts of AI in relation to their social consequences and their impacts on democracy. Recorded Oct 16, 2023. Released Dec 11, 2023.</p><br><p>Research profile</p><p><a href="https://research.tue.nl/en/persons/irene-a-niet" rel="noopener noreferrer" target="_blank">https://research.tue.nl/en/persons/irene-a-niet</a></p><br><p>Digital (Un)sustainability - Routledge</p><p><a href="https://crowdusg.net/2022/06/06/digital-unsustainabilities-call-for-chapters/" rel="noopener noreferrer" target="_blank">https://crowdusg.net/2022/06/06/digital-unsustainabilities-call-for-chapters/</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>Irene Niet and I have a conversation about how researchers might consider the environmental impacts of AI in relation to their social consequences and their impacts on democracy. Recorded Oct 16, 2023. Released Dec 11, 2023.</p><br><p>Research profile</p><p><a href="https://research.tue.nl/en/persons/irene-a-niet" rel="noopener noreferrer" target="_blank">https://research.tue.nl/en/persons/irene-a-niet</a></p><br><p>Digital (Un)sustainability - Routledge</p><p><a href="https://crowdusg.net/2022/06/06/digital-unsustainabilities-call-for-chapters/" rel="noopener noreferrer" target="_blank">https://crowdusg.net/2022/06/06/digital-unsustainabilities-call-for-chapters/</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Chipified, with MC Forelle</title>
			<itunes:title>Chipified, with MC Forelle</itunes:title>
			<pubDate>Mon, 27 Nov 2023 13:00:07 GMT</pubDate>
			<itunes:duration>45:43</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/652450537c5f6b00120acf1b/media.mp3" length="54869597" type="audio/mpeg"/>
			<guid isPermaLink="false">652450537c5f6b00120acf1b</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://www.thedatafix.net/episodes/028</link>
			<acast:episodeId>652450537c5f6b00120acf1b</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>chipified-with-mc-forelle</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsoxhINu4Ad7VkAnsB5MGv7apDSp9+X/6NXamDG1e4KDDX83EnlENL1mbg8BOpe3To/Fv2EAiiY+Gv2vDlHuPSFGG0s0Ak5TV6B8WHAIEhh1c=]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>28</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>Ever wonder about the microchips in your car? MC Forelle and I talk about the “chipification” of cars since the 1970s and ’80s, and how these processes and logics lead to increased corporate control and surveillance, while making opting out and DIY tinkering more difficult. We briefly touch on subscription models for automotive features — like BMW did for its heated seats not long ago — remember that? Recorded Oct 5, 2023. Released Nov 27, 2023.</p><br><p>The material consequences of “chipification”: The case of software-embedded cars</p><p><a href="https://journals-sagepub-com.ezproxy.lib.ucalgary.ca/doi/full/10.1177/20539517221095429" rel="noopener noreferrer" target="_blank">https://journals-sagepub-com.ezproxy.lib.ucalgary.ca/doi/full/10.1177/20539517221095429</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>Ever wonder about the microchips in your car? MC Forelle and I talk about the “chipification” of cars since the 1970s and ’80s, and how these processes and logics lead to increased corporate control and surveillance, while making opting out and DIY tinkering more difficult. We briefly touch on subscription models for automotive features — like BMW did for its heated seats not long ago — remember that? Recorded Oct 5, 2023. Released Nov 27, 2023.</p><br><p>The material consequences of “chipification”: The case of software-embedded cars</p><p><a href="https://journals-sagepub-com.ezproxy.lib.ucalgary.ca/doi/full/10.1177/20539517221095429" rel="noopener noreferrer" target="_blank">https://journals-sagepub-com.ezproxy.lib.ucalgary.ca/doi/full/10.1177/20539517221095429</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Empathy, with Steven Gonzalez Monserrate</title>
			<itunes:title>Empathy, with Steven Gonzalez Monserrate</itunes:title>
			<pubDate>Mon, 13 Nov 2023 13:00:49 GMT</pubDate>
			<itunes:duration>1:00:29</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/65244b1de2e00600123e87cf/media.mp3" length="72589499" type="audio/mpeg"/>
			<guid isPermaLink="false">65244b1de2e00600123e87cf</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://www.thedatafix.net/episodes/027</link>
			<acast:episodeId>65244b1de2e00600123e87cf</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>empathy-with-steven-gonzalez-monserrate</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsoxhINu4Ad7VkAnsB5MGv7W+RBsY6MTuSPyk71ItYynTWnwSAj2+ZfyYF5f/9Rvq7hzk/ICp3BKSIgSLEaOCJT3f8O094zvRQKtz9Sl/fSu0=]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>27</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>In this absolutely delightful hour, Steven Gonzalez, an anthropologist of data centers, walks us through various sites of ethnographic research — from places abandoned after climate disaster, to the desert, to the suburbs. While much has been written and discussed about data centers in the last decade, nobody does it quite as thoughtfully as Steven — showing empathy and feeling in and through his writing. Recorded Oct 2, 2023. Released Nov 13, 2023.</p><br><p>Sordidez</p><p><a href="https://egconde.com/books/sordidez/" rel="noopener noreferrer" target="_blank">https://egconde.com/books/sordidez/</a>&nbsp;</p><br><p>Q&amp;A: Steven Gonzalez on Indigenous futurist science fiction</p><p><a href="https://news.mit.edu/2023/qa-steven-gonzalez-indigenous-futurist-science-fiction-sordidez-0821" rel="noopener noreferrer" target="_blank">https://news.mit.edu/2023/qa-steven-gonzalez-indigenous-futurist-science-fiction-sordidez-0821</a>&nbsp;</p><br><p>The Staggering Ecological Impacts of Computation and the Cloud</p><p><a href="https://thereader.mitpress.mit.edu/the-staggering-ecological-impacts-of-computation-and-the-cloud/" rel="noopener noreferrer" target="_blank">https://thereader.mitpress.mit.edu/the-staggering-ecological-impacts-of-computation-and-the-cloud/</a>&nbsp;</p><br><p>Steven Gonzalez Monserrate</p><p><a href="https://www.stevengonzalezm.com/" rel="noopener noreferrer" target="_blank">https://www.stevengonzalezm.com/</a></p><br><p>The people of the cloud</p><p><a href="https://aeon.co/essays/downtime-is-not-an-option-meet-the-stewards-of-the-cloud" rel="noopener noreferrer" target="_blank">https://aeon.co/essays/downtime-is-not-an-option-meet-the-stewards-of-the-cloud</a></p><br><p>La Nube sin Apagón</p><p><a href="https://www.anthropology-news.org/articles/la-nube-sin-apagon/" rel="noopener noreferrer" target="_blank">https://www.anthropology-news.org/articles/la-nube-sin-apagon/</a></p><hr><p style='color:grey; font-size:0.75em;'> 
Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>In this absolutely delightful hour, Steven Gonzalez, an anthropologist of data centers, walks us through various sites of ethnographic research — from places abandoned after climate disaster, to the desert, to the suburbs. While much has been written and discussed about data centers in the last decade, nobody does it quite as thoughtfully as Steven — showing empathy and feeling in and through his writing. Recorded Oct 2, 2023. Released Nov 13, 2023.</p><br><p>Sordidez</p><p><a href="https://egconde.com/books/sordidez/" rel="noopener noreferrer" target="_blank">https://egconde.com/books/sordidez/</a>&nbsp;</p><br><p>Q&amp;A: Steven Gonzalez on Indigenous futurist science fiction</p><p><a href="https://news.mit.edu/2023/qa-steven-gonzalez-indigenous-futurist-science-fiction-sordidez-0821" rel="noopener noreferrer" target="_blank">https://news.mit.edu/2023/qa-steven-gonzalez-indigenous-futurist-science-fiction-sordidez-0821</a>&nbsp;</p><br><p>The Staggering Ecological Impacts of Computation and the Cloud</p><p><a href="https://thereader.mitpress.mit.edu/the-staggering-ecological-impacts-of-computation-and-the-cloud/" rel="noopener noreferrer" target="_blank">https://thereader.mitpress.mit.edu/the-staggering-ecological-impacts-of-computation-and-the-cloud/</a>&nbsp;</p><br><p>Steven Gonzalez Monserrate</p><p><a href="https://www.stevengonzalezm.com/" rel="noopener noreferrer" target="_blank">https://www.stevengonzalezm.com/</a></p><br><p>The people of the cloud</p><p><a href="https://aeon.co/essays/downtime-is-not-an-option-meet-the-stewards-of-the-cloud" rel="noopener noreferrer" target="_blank">https://aeon.co/essays/downtime-is-not-an-option-meet-the-stewards-of-the-cloud</a></p><br><p>La Nube sin Apagón</p><p><a href="https://www.anthropology-news.org/articles/la-nube-sin-apagon/" rel="noopener noreferrer" target="_blank">https://www.anthropology-news.org/articles/la-nube-sin-apagon/</a></p><hr><p style='color:grey; 
font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Open, with David Gray Widder</title>
			<itunes:title>Open, with David Gray Widder</itunes:title>
			<pubDate>Mon, 23 Oct 2023 12:00:01 GMT</pubDate>
			<itunes:duration>57:24</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/65008f338c35840011e7c6a9/media.mp3" length="68899442" type="audio/mpeg"/>
			<guid isPermaLink="false">65008f338c35840011e7c6a9</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://www.thedatafix.net/episodes/026</link>
			<acast:episodeId>65008f338c35840011e7c6a9</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>open-with-david-gray-widder</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxogHghVrs72LQUqzBfNdjM+DSb5HAOKCHTWfvFGnZdVfFsMYreiIk4r2DOfl1dCuPThzB0ESYrwiM3rQ+Cum1P7]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>26</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>While efforts to make “AI” more “open” have gained momentum lately, both concepts are worth scrutinizing and historicizing so that we can better understand how these marketing terms become a focus (and a distraction) while material conditions are downplayed. David Gray Widder and I discuss where “ethics” are located, and how “AI” workers of all kinds imagine responsibility to be someone else’s problem, or somewhere else down the chain. Recorded Sept 12, 2023. Released Oct 23, 2023.</p><br><p>The Myth of ‘Open Source’ AI</p><p><a href="https://www.wired.com/story/the-myth-of-open-source-ai/" rel="noopener noreferrer" target="_blank">https://www.wired.com/story/the-myth-of-open-source-ai/</a> </p><br><p>Open (For Business): Big Tech, Concentrated Power, and the Political Economy of Open AI</p><p><a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4543807" rel="noopener noreferrer" target="_blank">https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4543807</a> </p><br><p>Limits and Possibilities for “Ethical AI” in Open Source: A Study of Deepfakes</p><p><a href="https://davidwidder.me/deepfakes.pdf" rel="noopener noreferrer" target="_blank">https://davidwidder.me/deepfakes.pdf</a> </p><br><p>Dislocated accountabilities in the “AI supply chain”: Modularity and developers’ notions of responsibility</p><p><a href="https://davidwidder.me/supply-chain.pdf" rel="noopener noreferrer" target="_blank">https://davidwidder.me/supply-chain.pdf</a> </p><br><p>Computer scientists designing the future can’t agree on what privacy means</p><p><a href="https://www.technologyreview.com/2023/04/03/1070665/cmu-university-privacy-battle-smart-building-sensors-mites/" rel="noopener noreferrer" target="_blank">https://www.technologyreview.com/2023/04/03/1070665/cmu-university-privacy-battle-smart-building-sensors-mites/</a> </p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. 
See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>While efforts to make “AI” more “open” have gained momentum lately, both concepts are worth scrutinizing and historicizing so that we can better understand how these marketing terms become a focus (and a distraction) while material conditions are downplayed. David Gray Widder and I discuss where “ethics” are located, and how “AI” workers of all kinds imagine responsibility to be someone else’s problem, or somewhere else down the chain. Recorded Sept 12, 2023. Released Oct 23, 2023.</p><br><p>The Myth of ‘Open Source’ AI</p><p><a href="https://www.wired.com/story/the-myth-of-open-source-ai/" rel="noopener noreferrer" target="_blank">https://www.wired.com/story/the-myth-of-open-source-ai/</a> </p><br><p>Open (For Business): Big Tech, Concentrated Power, and the Political Economy of Open AI</p><p><a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4543807" rel="noopener noreferrer" target="_blank">https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4543807</a> </p><br><p>Limits and Possibilities for “Ethical AI” in Open Source: A Study of Deepfakes</p><p><a href="https://davidwidder.me/deepfakes.pdf" rel="noopener noreferrer" target="_blank">https://davidwidder.me/deepfakes.pdf</a> </p><br><p>Dislocated accountabilities in the “AI supply chain”: Modularity and developers’ notions of responsibility</p><p><a href="https://davidwidder.me/supply-chain.pdf" rel="noopener noreferrer" target="_blank">https://davidwidder.me/supply-chain.pdf</a> </p><br><p>Computer scientists designing the future can’t agree on what privacy means</p><p><a href="https://www.technologyreview.com/2023/04/03/1070665/cmu-university-privacy-battle-smart-building-sensors-mites/" rel="noopener noreferrer" target="_blank">https://www.technologyreview.com/2023/04/03/1070665/cmu-university-privacy-battle-smart-building-sensors-mites/</a> </p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. 
See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Chemical, with Josh Lepawsky</title>
			<itunes:title>Chemical, with Josh Lepawsky</itunes:title>
			<pubDate>Mon, 09 Oct 2023 12:00:13 GMT</pubDate>
			<itunes:duration>56:51</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/64cbdceb7a102900115392ab/media.mp3" length="68229663" type="audio/mpeg"/>
			<guid isPermaLink="false">64cbdceb7a102900115392ab</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://www.thedatafix.net/episodes/025</link>
			<acast:episodeId>64cbdceb7a102900115392ab</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>chemical-with-josh-lepawsky</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxoRQqmAiJj2l/yxpNIic1QoAbL3OHd5K4L2qgVFLz8o5ckE4eTTPbfnlqzs8/8Lesaynn6cwO2IrFY6FVvE9pM+]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>25</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>If you think you know anything about e-waste from what’s normally reported in the news, please have a listen to this episode with the brilliant Josh Lepawsky — it will really open up everything you thought you understood! We discuss global waste industries, shipping waste, the ideals of recycling, plastics and mining waste, and the need for collective action. And chemicals. Recorded Aug 3, 2023. Released Oct 9, 2023.</p><br><p>What E-Waste Journalism Gets Wrong</p><p><a href="https://thereader.mitpress.mit.edu/what-e-waste-journalism-gets-wrong/" rel="noopener noreferrer" target="_blank">https://thereader.mitpress.mit.edu/what-e-waste-journalism-gets-wrong/</a> </p><br><p>Sources and Streams of Electronic Waste</p><p><a href="https://www.cell.com/one-earth/fulltext/S2590-3322(20)30307-9" rel="noopener noreferrer" target="_blank">https://www.cell.com/one-earth/fulltext/S2590-3322(20)30307-9</a> </p><br><p>Materials in Flux - A Symposium curated by Formafantasma Session 2: Contextualizing</p><p><a href="https://youtu.be/ChKchFQGcyc?t=1506" rel="noopener noreferrer" target="_blank">https://youtu.be/ChKchFQGcyc?t=1506</a> </p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>If you think you know anything about e-waste from what’s normally reported in the news, please have a listen to this episode with the brilliant Josh Lepawsky — it will really open up everything you thought you understood! We discuss global waste industries, shipping waste, the ideals of recycling, plastics and mining waste, and the need for collective action. And chemicals. Recorded Aug 3, 2023. Released Oct 9, 2023.</p><br><p>What E-Waste Journalism Gets Wrong</p><p><a href="https://thereader.mitpress.mit.edu/what-e-waste-journalism-gets-wrong/" rel="noopener noreferrer" target="_blank">https://thereader.mitpress.mit.edu/what-e-waste-journalism-gets-wrong/</a> </p><br><p>Sources and Streams of Electronic Waste</p><p><a href="https://www.cell.com/one-earth/fulltext/S2590-3322(20)30307-9" rel="noopener noreferrer" target="_blank">https://www.cell.com/one-earth/fulltext/S2590-3322(20)30307-9</a> </p><br><p>Materials in Flux - A Symposium curated by Formafantasma Session 2: Contextualizing</p><p><a href="https://youtu.be/ChKchFQGcyc?t=1506" rel="noopener noreferrer" target="_blank">https://youtu.be/ChKchFQGcyc?t=1506</a> </p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Intimacy, with Mirabelle Jones</title>
			<itunes:title>Intimacy, with Mirabelle Jones</itunes:title>
			<pubDate>Mon, 25 Sep 2023 12:00:51 GMT</pubDate>
			<itunes:duration>1:00:15</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/64ca7ee729e742001139f6d7/media.mp3" length="72309467" type="audio/mpeg"/>
			<guid isPermaLink="false">64ca7ee729e742001139f6d7</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://www.thedatafix.net/episodes/024</link>
			<acast:episodeId>64ca7ee729e742001139f6d7</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>intimacy-with-mirabelle-jones</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxpLA8PIbpCACpzBAwqYY9Xm2ybIMgzhJ6Ff07E/Sb4ttOdAQH6GcK50zVCBfWs/r4laGgmlDhOHJ/nxpEyLoSPe]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>24</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>A truly delightful conversation with creative technologist and artist, Mirabelle Jones. Their work uses generative AI – early iterations of it – to make compelling observations about intimacy. Jones’s work is iterative, where reactions to projects invite new forms of self-reflection, and it makes us wonder if we even have a ‘true’ self, if continuity is real, and what quantum alternatives might be out there. Recorded Aug 2, 2023. Released Sept 25, 2023.</p><br><p>Translating Trauma</p><p><a href="https://www.instagram.com/p/CvF3Nkqrc_H/" rel="noopener noreferrer" target="_blank">https://www.instagram.com/p/CvF3Nkqrc_H/</a> </p><br><p>Mirabelle Jones</p><p><a href="https://www.mirabellejones.com/" rel="noopener noreferrer" target="_blank">https://www.mirabellejones.com/</a> </p><br><p>It’s time We Talked </p><p><a href="https://www.mirabellejones.com/digital-alchemy-its-time-we-talked/" rel="noopener noreferrer" target="_blank">https://www.mirabellejones.com/digital-alchemy-its-time-we-talked/</a> </p><br><p>Artificial Intimacy</p><p><a href="https://www.mirabellejones.com/artificial-intimacy/" rel="noopener noreferrer" target="_blank">https://www.mirabellejones.com/artificial-intimacy/</a> </p><br><p>Embodying the Algorithm</p><p><a href="https://aiperformance.space/mirabelle-jones-ill-be-very-nervous/" rel="noopener noreferrer" target="_blank">https://aiperformance.space/mirabelle-jones-ill-be-very-nervous/</a> </p><br><p>Zoom Reads You</p><p><a href="https://www.mirabellejones.com/zoom-reads-you/" rel="noopener noreferrer" target="_blank">https://www.mirabellejones.com/zoom-reads-you/</a> </p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>A truly delightful conversation with creative technologist and artist, Mirabelle Jones. Their work uses generative AI – early iterations of it – to make compelling observations about intimacy. Jones’s work is iterative, where reactions to projects invite new forms of self-reflection, and it makes us wonder if we even have a ‘true’ self, if continuity is real, and what quantum alternatives might be out there. Recorded Aug 2, 2023. Released Sept 25, 2023.</p><br><p>Translating Trauma</p><p><a href="https://www.instagram.com/p/CvF3Nkqrc_H/" rel="noopener noreferrer" target="_blank">https://www.instagram.com/p/CvF3Nkqrc_H/</a> </p><br><p>Mirabelle Jones</p><p><a href="https://www.mirabellejones.com/" rel="noopener noreferrer" target="_blank">https://www.mirabellejones.com/</a> </p><br><p>It’s time We Talked </p><p><a href="https://www.mirabellejones.com/digital-alchemy-its-time-we-talked/" rel="noopener noreferrer" target="_blank">https://www.mirabellejones.com/digital-alchemy-its-time-we-talked/</a> </p><br><p>Artificial Intimacy</p><p><a href="https://www.mirabellejones.com/artificial-intimacy/" rel="noopener noreferrer" target="_blank">https://www.mirabellejones.com/artificial-intimacy/</a> </p><br><p>Embodying the Algorithm</p><p><a href="https://aiperformance.space/mirabelle-jones-ill-be-very-nervous/" rel="noopener noreferrer" target="_blank">https://aiperformance.space/mirabelle-jones-ill-be-very-nervous/</a> </p><br><p>Zoom Reads You</p><p><a href="https://www.mirabellejones.com/zoom-reads-you/" rel="noopener noreferrer" target="_blank">https://www.mirabellejones.com/zoom-reads-you/</a> </p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Alien, with Gregory Betts</title>
			<itunes:title>Alien, with Gregory Betts</itunes:title>
			<pubDate>Mon, 11 Sep 2023 12:00:54 GMT</pubDate>
			<itunes:duration>47:17</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/64c888b08ad4d40011fd5814/media.mp3" length="56749369" type="audio/mpeg"/>
			<guid isPermaLink="false">64c888b08ad4d40011fd5814</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://www.thedatafix.net/episodes/023</link>
			<acast:episodeId>64c888b08ad4d40011fd5814</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>alien-with-gregory-betts</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxpzMrp2E6kBVuT21bsYQ5WWgyJmzV5lyFcZ7rs3B5GiJrcX/LCtvRr3ESqV/UYeWu5rl2GmT1ll4P/UCqDVwV0W]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>23</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>This is a fun conversation with scholar-poet Gregory Betts about communicating with aliens! How are humans on earth communicating with aliens? What technology does it require? Why? What void does it fill? And do aliens want to be in contact with us? What relationships are made possible by thinking poetically about aliens? On the episode we ponder the cultural contexts for thinking about alien life in a hostile [to human life] universe. Recorded Jul 21, 2023. Released Sept 11, 2023.</p><br><p>A Sign in Space</p><p><a href="https://asignin.space/listening-to-space/" rel="noopener noreferrer" target="_blank">https://asignin.space/listening-to-space/</a>   </p><br><p>Extraterrestrial Signal Test (<em>NYT</em>)</p><p><a href="https://www.nytimes.com/2023/05/24/science/extraterrestrial-signal-test.html" rel="noopener noreferrer" target="_blank">https://www.nytimes.com/2023/05/24/science/extraterrestrial-signal-test.html</a>  </p><br><p>Aliens in the Void: Writing Beyond the Limits of Language in bpNichol’s The Martyrology (and (Luigi Serafini’s ((Code)x Seriphian(us))))</p><p><a href="https://www.academia.edu/99536790/Aliens_in_the_Void_Writing_Beyond_the_Limits_of_Language_in_bpNichol_s_The_Martyrology_and_Luigi_Serafini_s_Code_x_Seriphian_us_%20and" rel="noopener noreferrer" target="_blank">https://www.academia.edu/99536790/</a> </p><br><p>Here’s the Discord Channel:</p><p><a href="https://discord.com/channels/1066055437457297469/1110258553689739276/1111325509993898096" rel="noopener noreferrer" target="_blank">https://discord.com/channels/1066055437457297469/1110258553689739276/1111325509993898096</a></p><br><p>WRETI workshop:</p><p><a href="https://www.youtube.com/watch?v=f9LLwepou8Q&amp;list=LLDkmzBShjBSqUTvnlWaaKlQ&amp;ab_channel=SETIInstitute" rel="noopener noreferrer" target="_blank">https://www.youtube.com/watch?v=f9LLwepou8Q&amp;list=LLDkmzBShjBSqUTvnlWaaKlQ&amp;ab_channel=SETIInstitute</a></p><p>&nbsp;</p><hr><p style='color:grey; 
font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>This is a fun conversation with scholar-poet Gregory Betts about communicating with aliens! How are humans on earth communicating with aliens? What technology does it require? Why? What void does it fill? And do aliens want to be in contact with us? What relationships are made possible by thinking poetically about aliens? On the episode we ponder the cultural contexts for thinking about alien life in a hostile [to human life] universe. Recorded Jul 21, 2023. Released Sept 11, 2023.</p><br><p>A Sign in Space</p><p><a href="https://asignin.space/listening-to-space/" rel="noopener noreferrer" target="_blank">https://asignin.space/listening-to-space/</a>   </p><br><p>Extraterrestrial Signal Test (<em>NYT</em>)</p><p><a href="https://www.nytimes.com/2023/05/24/science/extraterrestrial-signal-test.html" rel="noopener noreferrer" target="_blank">https://www.nytimes.com/2023/05/24/science/extraterrestrial-signal-test.html</a>  </p><br><p>Aliens in the Void: Writing Beyond the Limits of Language in bpNichol’s The Martyrology (and (Luigi Serafini’s ((Code)x Seriphian(us))))</p><p><a href="https://www.academia.edu/99536790/Aliens_in_the_Void_Writing_Beyond_the_Limits_of_Language_in_bpNichol_s_The_Martyrology_and_Luigi_Serafini_s_Code_x_Seriphian_us_%20and" rel="noopener noreferrer" target="_blank">https://www.academia.edu/99536790/</a> </p><br><p>Here’s the Discord Channel:</p><p><a href="https://discord.com/channels/1066055437457297469/1110258553689739276/1111325509993898096" rel="noopener noreferrer" target="_blank">https://discord.com/channels/1066055437457297469/1110258553689739276/1111325509993898096</a></p><br><p>WRETI workshop:</p><p><a href="https://www.youtube.com/watch?v=f9LLwepou8Q&amp;list=LLDkmzBShjBSqUTvnlWaaKlQ&amp;ab_channel=SETIInstitute" rel="noopener noreferrer" target="_blank">https://www.youtube.com/watch?v=f9LLwepou8Q&amp;list=LLDkmzBShjBSqUTvnlWaaKlQ&amp;ab_channel=SETIInstitute</a></p><p>&nbsp;</p><hr><p 
style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Cycles, with Ana Valdivia</title>
			<itunes:title>Cycles, with Ana Valdivia</itunes:title>
			<pubDate>Mon, 28 Aug 2023 12:00:28 GMT</pubDate>
			<itunes:duration>45:43</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/64c3eb140a9bc70011e2d926/media.mp3" length="54869597" type="audio/mpeg"/>
			<guid isPermaLink="false">64c3eb140a9bc70011e2d926</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://www.thedatafix.net/episodes/022</link>
			<acast:episodeId>64c3eb140a9bc70011e2d926</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>cycles-with-ana-valdivia</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxoozozoCXNA4njBxaz5GAA0Iy3Tk+imeeFCAhOB6C5KUrS09jw8B4yBc5sll8XNUfO1yYd5lgNY72QiTuOHBCHV]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>22</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>Ana Valdivia walks me through her research on the connections between data centers, AI, and mining. We discuss what it means to be a researcher looking at controversial, problematic, and difficult-to-access sites, and what resistance to the AI industry — which gobbles up water, minerals, land, and electricity at incredible rates — can look like. Recorded Jul 28, 2023. Released Aug 28, 2023.</p><br><p>Rural Spain could end up hosting infrastructure hubs for AI – here’s what the environmental cost could&nbsp;be</p><p><a href="https://theconversation.com/rural-spain-could-end-up-hosting-infrastructure-hubs-for-ai-heres-what-the-environmental-cost-could-be-205504" rel="noopener noreferrer" target="_blank">https://theconversation.com/rural-spain-could-end-up-hosting-infrastructure-hubs-for-ai-heres-what-the-environmental-cost-could-be-205504</a> </p><br><p>Machines in Flames [Full Documentary]</p><p><a href="https://www.youtube.com/watch?v=qGVMu5OPu7E&amp;ab_channel=DestructionistInternational" rel="noopener noreferrer" target="_blank">https://www.youtube.com/watch?v=qGVMu5OPu7E&amp;ab_channel=DestructionistInternational</a> </p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>Ana Valdivia walks me through her research on the connections between data centers, AI, and mining. We discuss what it means to be a researcher looking at controversial, problematic, and difficult-to-access sites, and what resistance to the AI industry — which gobbles up water, minerals, land, and electricity at incredible rates — can look like. Recorded Jul 28, 2023. Released Aug 28, 2023.</p><br><p>Rural Spain could end up hosting infrastructure hubs for AI – here’s what the environmental cost could&nbsp;be</p><p><a href="https://theconversation.com/rural-spain-could-end-up-hosting-infrastructure-hubs-for-ai-heres-what-the-environmental-cost-could-be-205504" rel="noopener noreferrer" target="_blank">https://theconversation.com/rural-spain-could-end-up-hosting-infrastructure-hubs-for-ai-heres-what-the-environmental-cost-could-be-205504</a> </p><br><p>Machines in Flames [Full Documentary]</p><p><a href="https://www.youtube.com/watch?v=qGVMu5OPu7E&amp;ab_channel=DestructionistInternational" rel="noopener noreferrer" target="_blank">https://www.youtube.com/watch?v=qGVMu5OPu7E&amp;ab_channel=DestructionistInternational</a> </p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Contradictions, with Melissa Gregg</title>
			<itunes:title>Contradictions, with Melissa Gregg</itunes:title>
			<pubDate>Mon, 14 Aug 2023 12:00:39 GMT</pubDate>
			<itunes:duration>56:43</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/64beb6cd4fe1da0012fef20b/media.mp3" length="68069793" type="audio/mpeg"/>
			<guid isPermaLink="false">64beb6cd4fe1da0012fef20b</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://www.thedatafix.net/episodes/021</link>
			<acast:episodeId>64beb6cd4fe1da0012fef20b</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>contradictions-with-melissa-gregg</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxpIKGt4ApHdg76tANy9JmT4hqCCKwqItXBOfHbGtuIlfERI08bIHbia8tPvbSMNcTop5Boj4D2oXt+Vh3UeYRT0]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>21</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1768409801246-2f3adf3e-8650-4b5c-adc0-777c6a9cecbe.jpeg"/>
			<description><![CDATA[<p>Melissa Gregg and I discuss what it means to think ecologically, in, through, and with technology, at a time of perpetually new things compelling us into a sense of urgency about the climate crisis. Together we untangle the affective investments and contradictions embedded in this moment of reckoning with our place in the world and on this planet. Recorded Jul 12, 2023. Released Aug 14, 2023.</p><br><p>Keynote: The Ecological Impact of an Automated Society (2022 ADM+S Symposium)</p><p><a href="https://www.youtube.com/watch?v=TWaBkHsUsqM" rel="noopener noreferrer" target="_blank">https://www.youtube.com/watch?v=TWaBkHsUsqM</a></p><br><p>Circularity conference report</p><p><a href="https://melgregg.com/2023/06/14/talking-in-circles-about-e-waste/" rel="noopener noreferrer" target="_blank">https://melgregg.com/2023/06/14/talking-in-circles-about-e-waste/</a></p><br><p>Where to donate your computer (US)</p><p><a href="https://digitunity.org/get-involved/give-equipment/donate-your-computer/" rel="noopener noreferrer" target="_blank">https://digitunity.org/get-involved/give-equipment/donate-your-computer/</a></p><p>Right to repair association</p><p><a href="https://www.repair.org/" rel="noopener noreferrer" target="_blank">https://www.repair.org/</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>Melissa Gregg and I discuss what it means to think ecologically, in, through, and with technology, at a time of perpetually new things compelling us into a sense of urgency about the climate crisis. Together we untangle the affective investments and contradictions embedded in this moment of reckoning with our place in the world and on this planet. Recorded Jul 12, 2023. Released Aug 14, 2023.</p><br><p>Keynote: The Ecological Impact of an Automated Society (2022 ADM+S Symposium)</p><p><a href="https://www.youtube.com/watch?v=TWaBkHsUsqM" rel="noopener noreferrer" target="_blank">https://www.youtube.com/watch?v=TWaBkHsUsqM</a></p><br><p>Circularity conference report</p><p><a href="https://melgregg.com/2023/06/14/talking-in-circles-about-e-waste/" rel="noopener noreferrer" target="_blank">https://melgregg.com/2023/06/14/talking-in-circles-about-e-waste/</a></p><br><p>Where to donate your computer (US)</p><p><a href="https://digitunity.org/get-involved/give-equipment/donate-your-computer/" rel="noopener noreferrer" target="_blank">https://digitunity.org/get-involved/give-equipment/donate-your-computer/</a></p><p>Right to repair association</p><p><a href="https://www.repair.org/" rel="noopener noreferrer" target="_blank">https://www.repair.org/</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Nuance, with Tega Brain</title>
			<itunes:title>Nuance, with Tega Brain</itunes:title>
			<pubDate>Mon, 24 Jul 2023 12:00:48 GMT</pubDate>
			<itunes:duration>54:52</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/646e89912ebbc000114c3da0/media.mp3" length="65849385" type="audio/mpeg"/>
			<guid isPermaLink="false">646e89912ebbc000114c3da0</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://www.thedatafix.net/episodes/020</link>
			<acast:episodeId>646e89912ebbc000114c3da0</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>nuance-with-tega-brain</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxrDwZyKJ2orSIqVlGxc0ZsrPGvrh5NgVhQnhJNmCUkE79QthbSWexOoFRIjGMJ26iHpPXsVusqQ2yVTMcoh777a]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>20</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1671481599523-356f38441c9a3ffc416c1e8c237f365e.jpeg"/>
			<description><![CDATA[<p>Tega Brain, whom I finally got to meet over Zoom after many recommendations, joins me to discuss using bots to distort clicks related to climate, and how the concept of ecology both informs and distorts our ideas about human-nature relationships. Art addressing these concerns requires a lot of nuance. Recorded May 23, 2023. Released July 24, 2023.</p><br><p>The Battle to Control the Carbon Media Cycle, Ding Magazine.</p><p><a href="https://dingdingding.org/issue-4/the-battle-to-control-the-carbon-media-cycle/" rel="noopener noreferrer" target="_blank">https://dingdingding.org/issue-4/the-battle-to-control-the-carbon-media-cycle/</a> </p><br><p>Synthetic Messenger</p><p><a href="https://syntheticmessenger.labr.io/#about" rel="noopener noreferrer" target="_blank">https://syntheticmessenger.labr.io/#about</a> </p><p>(on YouTube): </p><p><a href="https://www.youtube.com/watch?v=RlFW1gL6AGU&amp;ab_channel=SyntheticMessenger" rel="noopener noreferrer" target="_blank">https://www.youtube.com/watch?v=RlFW1gL6AGU&amp;ab_channel=SyntheticMessenger</a> </p><br><p>The Environment Is Not A System </p><p><a href="https://aprja.net/article/view/116062" rel="noopener noreferrer" target="_blank">https://aprja.net/article/view/116062</a>  </p><br><p>The viral false claim that nearly 200 arsonists are behind the Australia fires, explained</p><p><a href="https://www.vox.com/2020/1/9/21058332/australia-fires-arson-lightning-explained" rel="noopener noreferrer" target="_blank">https://www.vox.com/2020/1/9/21058332/australia-fires-arson-lightning-explained</a> </p><p><br></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>Tega Brain, whom I finally got to meet over Zoom after many recommendations, joins me to discuss using bots to distort clicks related to climate, and how the concept of ecology both informs and distorts our ideas about human-nature relationships. Art addressing these concerns requires a lot of nuance. Recorded May 23, 2023. Released July 24, 2023.</p><br><p>The Battle to Control the Carbon Media Cycle, Ding Magazine.</p><p><a href="https://dingdingding.org/issue-4/the-battle-to-control-the-carbon-media-cycle/" rel="noopener noreferrer" target="_blank">https://dingdingding.org/issue-4/the-battle-to-control-the-carbon-media-cycle/</a> </p><br><p>Synthetic Messenger</p><p><a href="https://syntheticmessenger.labr.io/#about" rel="noopener noreferrer" target="_blank">https://syntheticmessenger.labr.io/#about</a> </p><p>(on YouTube): </p><p><a href="https://www.youtube.com/watch?v=RlFW1gL6AGU&amp;ab_channel=SyntheticMessenger" rel="noopener noreferrer" target="_blank">https://www.youtube.com/watch?v=RlFW1gL6AGU&amp;ab_channel=SyntheticMessenger</a> </p><br><p>The Environment Is Not A System </p><p><a href="https://aprja.net/article/view/116062" rel="noopener noreferrer" target="_blank">https://aprja.net/article/view/116062</a>  </p><br><p>The viral false claim that nearly 200 arsonists are behind the Australia fires, explained</p><p><a href="https://www.vox.com/2020/1/9/21058332/australia-fires-arson-lightning-explained" rel="noopener noreferrer" target="_blank">https://www.vox.com/2020/1/9/21058332/australia-fires-arson-lightning-explained</a> </p><p><br></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Resurrected, with Tonia Sutherland</title>
			<itunes:title>Resurrected, with Tonia Sutherland</itunes:title>
			<pubDate>Mon, 10 Jul 2023 12:00:30 GMT</pubDate>
			<itunes:duration>56:13</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/646bf726bcb3130011a14b0d/media.mp3" length="67469499" type="audio/mpeg"/>
			<guid isPermaLink="false">646bf726bcb3130011a14b0d</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://www.thedatafix.net/episodes/019</link>
			<acast:episodeId>646bf726bcb3130011a14b0d</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>resurrected-with-tonia-sutherland</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxqRAp76zjmqFsZwAcTO//AF+qsfONAjvk1xilk35EiFMD5nKTcDVDG80Qy5prcyYaU+fUgflkpbljcGWsofaITP]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>19</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1671481599523-356f38441c9a3ffc416c1e8c237f365e.jpeg"/>
			<description><![CDATA[<p>“Resurrecting” means bringing back to life, raising from the dead, restoring to vibrancy, bringing into public view, reanimating… Tonia Sutherland does just this for the concept of the black body in the digital afterlife, through a critique of the archive, the HeLa cells legacy, holograms of entertainers, and emergent AI. An absolute honour to host this episode on an early read of Sutherland’s new book. Recorded May 22, 2023. Released July 10, 2023.</p><br><p>Resurrecting the Black Body: Race and the Digital Afterlife</p><p><a href="https://www.ucpress.edu/book/9780520383876/resurrecting-the-black-body" rel="noopener noreferrer" target="_blank">https://www.ucpress.edu/book/9780520383876/resurrecting-the-black-body</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>“Resurrecting” means bringing back to life, raising from the dead, restoring to vibrancy, bringing into public view, reanimating… Tonia Sutherland does just this for the concept of the black body in the digital afterlife, through a critique of the archive, the HeLa cells legacy, holograms of entertainers, and emergent AI. An absolute honour to host this episode on an early read of Sutherland’s new book. Recorded May 22, 2023. Released July 10, 2023.</p><br><p>Resurrecting the Black Body: Race and the Digital Afterlife</p><p><a href="https://www.ucpress.edu/book/9780520383876/resurrecting-the-black-body" rel="noopener noreferrer" target="_blank">https://www.ucpress.edu/book/9780520383876/resurrecting-the-black-body</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Neural, with Théo Lepage-Richer and Ranjodh Singh Dhaliwal</title>
			<itunes:title>Neural, with Théo Lepage-Richer and Ranjodh Singh Dhaliwal</itunes:title>
			<pubDate>Mon, 26 Jun 2023 12:00:57 GMT</pubDate>
			<itunes:duration>56:05</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/6467a4ff2c59a50011dc60e7/media.mp3" length="67309630" type="audio/mpeg"/>
			<guid isPermaLink="false">6467a4ff2c59a50011dc60e7</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://www.thedatafix.net/episodes/018</link>
			<acast:episodeId>6467a4ff2c59a50011dc60e7</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>neural-with-theo-lepage-richer-and-ranjodh-singh-dhaliwal</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxqoDT5V1I9petf7D3suwWMmWCVF7DkkXUdhzMC2wNtQh2Yre+rVJ/9SwApnQN0bsozh7oCuT7YMeZnX3o8Ck7Db]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>18</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1671481599523-356f38441c9a3ffc416c1e8c237f365e.jpeg"/>
			<description><![CDATA[<p>In this episode, I have a truly delightful conversation about neural networks with Théo Lepage-Richer and Ranjodh Singh Dhaliwal. We go through the many ways that the body gets ‘technologized’ and technology gets ‘biologized’, how and why these conceptualizations happen in specific historical and political contexts, and why they matter, maybe even more than neural networks themselves. You’ll learn about neural media, parascientific media, and the various legacies of the brain-as-site and -model for various things. Recorded May 19, 2023. Released June 26, 2023.</p><br><p><em>Neural Networks</em> (by Ranjodh Singh Dhaliwal, Théo Lepage-Richer and Lucy Suchman)</p><p><a href="https://www.upress.umn.edu/book-division/books/neural-networks" rel="noopener noreferrer" target="_blank">https://www.upress.umn.edu/book-division/books/neural-networks</a></p><br><p>Mind-reading technology has arrived</p><p><a href="https://www.vox.com/future-perfect/2023/5/4/23708162/neurotechnology-mind-reading-brain-neuralink-brain-computer-interface" rel="noopener noreferrer" target="_blank">https://www.vox.com/future-perfect/2023/5/4/23708162/neurotechnology-mind-reading-brain-neuralink-brain-computer-interface</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>In this episode, I have a truly delightful conversation about neural networks with Théo Lepage-Richer and Ranjodh Singh Dhaliwal. We go through the many ways that the body gets ‘technologized’ and technology gets ‘biologized’, how and why these conceptualizations happen in specific historical and political contexts, and why they matter, maybe even more than neural networks themselves. You’ll learn about neural media, parascientific media, and the various legacies of the brain-as-site and -model for various things. Recorded May 19, 2023. Released June 26, 2023.</p><br><p><em>Neural Networks</em> (by Ranjodh Singh Dhaliwal, Théo Lepage-Richer and Lucy Suchman)</p><p><a href="https://www.upress.umn.edu/book-division/books/neural-networks" rel="noopener noreferrer" target="_blank">https://www.upress.umn.edu/book-division/books/neural-networks</a></p><br><p>Mind-reading technology has arrived</p><p><a href="https://www.vox.com/future-perfect/2023/5/4/23708162/neurotechnology-mind-reading-brain-neuralink-brain-computer-interface" rel="noopener noreferrer" target="_blank">https://www.vox.com/future-perfect/2023/5/4/23708162/neurotechnology-mind-reading-brain-neuralink-brain-computer-interface</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Refusal, with Blair Attard-Frost</title>
			<itunes:title>Refusal, with Blair Attard-Frost</itunes:title>
			<pubDate>Mon, 12 Jun 2023 12:00:21 GMT</pubDate>
			<itunes:duration>48:11</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/645ab0ec21ff9000119b640a/media.mp3" length="57829793" type="audio/mpeg"/>
			<guid isPermaLink="false">645ab0ec21ff9000119b640a</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://www.thedatafix.net/episodes/017</link>
			<acast:episodeId>645ab0ec21ff9000119b640a</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>refusal-with-blair-attard-frost</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxp0VUU3FXeUFt57rhMR8ke8syefemknDwxuyI9jRZ6BIwmUok9/KA7msaYCbeKqwWBFdA7nLHaRpsMPXGWFxM5j]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>17</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1671481599523-356f38441c9a3ffc416c1e8c237f365e.jpeg"/>
			<description><![CDATA[<p>In this episode, Blair Attard-Frost and I discuss AI value &amp; supply chains, AI policies and strategies in Canada, and how queer and trans theories can help inform (and refuse) how we think and feel about AI governance. We also discuss the role of creative writing as an outlet and a way to reach audiences that might be compelled by “AI”. Recorded May 9, 2023. Released June 12, 2023.</p><br><p>The Ethics of AI Business Practices: A Review of 47 AI Ethics Guidelines</p><p><a href="https://montrealethics.ai/the-ethics-of-ai-business-practices-a-review-of-47-ai-ethics-guidelines/" rel="noopener noreferrer" target="_blank">https://montrealethics.ai/the-ethics-of-ai-business-practices-a-review-of-47-ai-ethics-guidelines/</a></p><br><p>Once a promising leader, Canada’s artificial-intelligence strategy is now a fragmented laggard <a href="https://www.theglobeandmail.com/opinion/article-once-a-promising-leader-canadas-artificial-intelligence-strategy-is/" rel="noopener noreferrer" target="_blank">https://www.theglobeandmail.com/opinion/article-once-a-promising-leader-canadas-artificial-intelligence-strategy-is/</a></p><br><p>Object Type 3</p><p><a href="https://objecttype3.com/" rel="noopener noreferrer" target="_blank">https://objecttype3.com/</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>In this episode, Blair Attard-Frost and I discuss AI value &amp; supply chains, AI policies and strategies in Canada, and how queer and trans theories can help inform (and refuse) how we think and feel about AI governance. We also discuss the role of creative writing as an outlet and a way to reach audiences that might be compelled by “AI”. Recorded May 9, 2023. Released June 12, 2023.</p><br><p>The Ethics of AI Business Practices: A Review of 47 AI Ethics Guidelines</p><p><a href="https://montrealethics.ai/the-ethics-of-ai-business-practices-a-review-of-47-ai-ethics-guidelines/" rel="noopener noreferrer" target="_blank">https://montrealethics.ai/the-ethics-of-ai-business-practices-a-review-of-47-ai-ethics-guidelines/</a></p><br><p>Once a promising leader, Canada’s artificial-intelligence strategy is now a fragmented laggard <a href="https://www.theglobeandmail.com/opinion/article-once-a-promising-leader-canadas-artificial-intelligence-strategy-is/" rel="noopener noreferrer" target="_blank">https://www.theglobeandmail.com/opinion/article-once-a-promising-leader-canadas-artificial-intelligence-strategy-is/</a></p><br><p>Object Type 3</p><p><a href="https://objecttype3.com/" rel="noopener noreferrer" target="_blank">https://objecttype3.com/</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Imaginaries, with Fenwick McKelvey</title>
			<itunes:title>Imaginaries, with Fenwick McKelvey</itunes:title>
			<pubDate>Tue, 30 May 2023 12:00:49 GMT</pubDate>
			<itunes:duration>51:08</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/644948def904a30010e7722c/media.mp3" length="61369385" type="audio/mpeg"/>
			<guid isPermaLink="false">644948def904a30010e7722c</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://www.thedatafix.net/episodes/016</link>
			<acast:episodeId>644948def904a30010e7722c</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>imaginaries-with-fenwick-mckelvey</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxq4p9COmKoPD8cZPQDn9AoKo+Ni4AJa69Go8BaUnWzt8LmT7vztK2RsL2WJEbH+o0LGc7HwDhuDXAPZRix9g3zu]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>16</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1671481599523-356f38441c9a3ffc416c1e8c237f365e.jpeg"/>
			<description><![CDATA[<p>A recent open letter by the Future of Life Institute calling for a six-month ban on AI development has received widespread attention online. Fenwick McKelvey and I discuss the hyped-up ideas from tech leaders about “powerful digital minds” and how fear of AI functions as a marketing tool. We also go over regulation and policy initiatives in Canada. Recorded April 25, 2023. Released May 30, 2023.</p><br><p>Let’s base AI debates on reality, not extreme fears about the future</p><p><a href="https://theconversation.com/lets-base-ai-debates-on-reality-not-extreme-fears-about-the-future-203030" rel="noopener noreferrer" target="_blank">https://theconversation.com/lets-base-ai-debates-on-reality-not-extreme-fears-about-the-future-203030</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>A recent open letter by the Future of Life Institute calling for a six-month ban on AI development has received widespread attention online. Fenwick McKelvey and I discuss the hyped-up ideas from tech leaders about “powerful digital minds” and how fear of AI functions as a marketing tool. We also go over regulation and policy initiatives in Canada. Recorded April 25, 2023. Released May 30, 2023.</p><br><p>Let’s base AI debates on reality, not extreme fears about the future</p><p><a href="https://theconversation.com/lets-base-ai-debates-on-reality-not-extreme-fears-about-the-future-203030" rel="noopener noreferrer" target="_blank">https://theconversation.com/lets-base-ai-debates-on-reality-not-extreme-fears-about-the-future-203030</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Chatter, with David M. Berry, Peggy Weil, Arthur Schwarz, Mark Marino and Jeff Shrager</title>
			<itunes:title>Chatter, with David M. Berry, Peggy Weil, Arthur Schwarz, Mark Marino and Jeff Shrager</itunes:title>
			<pubDate>Sat, 20 May 2023 12:00:45 GMT</pubDate>
			<itunes:duration>1:09:49</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/644947ae18e15100119ba113/media.mp3" length="83792373" type="audio/mpeg"/>
			<guid isPermaLink="false">644947ae18e15100119ba113</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://www.thedatafix.net/episodes/015</link>
			<acast:episodeId>644947ae18e15100119ba113</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>chatter-with-david-m-berry-peggy-weil-arthur-schwarz-mark-ma</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxo2dmyIichXCW5RRuSLs3Jbj+zhn3pJipXtNcmJ8YvmyomhI1W9MH3HJrysxLVhCliaGIam7UQ4VhQ2I7xxISKX]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>15</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1671481599523-356f38441c9a3ffc416c1e8c237f365e.jpeg"/>
			<description><![CDATA[<p>In this episode I speak with some members of the ELIZAGEN project: David M. Berry, Peggy Weil, Arthur Schwarz, Mark Marino and Jeff Shrager. ELIZA was the first “chatterbot”, emerging in the 1960s. Listen to our conversation to learn more about finding the code for ELIZA in the archives, building various iterations of the bot, and how it helps us think about AI like ChatGPT today. Recorded April 24, 2023. Released May 20, 2023.</p><br><p>pr0c3ss1ng</p><p><a href="https://www.reddit.com/r/pr0c3ss1ng/" rel="noopener noreferrer" target="_blank">https://www.reddit.com/r/pr0c3ss1ng/</a></p><br><p>Mr. Mind</p><p><a href="http://82.223.169.27/MrMind/index.html" rel="noopener noreferrer" target="_blank">http://82.223.169.27/MrMind/index.html</a></p><br><p>ELIZAGEN</p><p><a href="https://sites.google.com/view/elizagen-org/team-eliza?authuser=0" rel="noopener noreferrer" target="_blank">https://sites.google.com/view/elizagen-org/team-eliza?authuser=0</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>In this episode I speak with some members of the ELIZAGEN project: David M. Berry, Peggy Weil, Arthur Schwarz, Mark Marino and Jeff Shrager. ELIZA was the first “chatterbot”, emerging in the 1960s. Listen to our conversation to learn more about finding the code for ELIZA in the archives, building various iterations of the bot, and how it helps us think about AI like ChatGPT today. Recorded April 24, 2023. Released May 20, 2023.</p><br><p>pr0c3ss1ng</p><p><a href="https://www.reddit.com/r/pr0c3ss1ng/" rel="noopener noreferrer" target="_blank">https://www.reddit.com/r/pr0c3ss1ng/</a></p><br><p>Mr. Mind</p><p><a href="http://82.223.169.27/MrMind/index.html" rel="noopener noreferrer" target="_blank">http://82.223.169.27/MrMind/index.html</a></p><br><p>ELIZAGEN</p><p><a href="https://sites.google.com/view/elizagen-org/team-eliza?authuser=0" rel="noopener noreferrer" target="_blank">https://sites.google.com/view/elizagen-org/team-eliza?authuser=0</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Risk, with Émile P. Torres</title>
			<itunes:title>Risk, with Émile P. Torres</itunes:title>
			<pubDate>Wed, 10 May 2023 12:00:01 GMT</pubDate>
			<itunes:duration>54:42</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/643da45bae2be60011b38e5a/media.mp3" length="65645630" type="audio/mpeg"/>
			<guid isPermaLink="false">643da45bae2be60011b38e5a</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://www.thedatafix.net/episodes/014</link>
			<acast:episodeId>643da45bae2be60011b38e5a</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>risk-with-emile-p-torres</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxqq9jMLbFChnVTsg2e4hVVee+cMeXHIGFZJJV3W8cxeJP1vxP9+uzRz2LKXJAw1/dpAPC2WGRR93K6hiQD5prdI]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>14</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1671481599523-356f38441c9a3ffc416c1e8c237f365e.jpeg"/>
			<description><![CDATA[<p>Ever wonder what longtermism is and how it’s come to be such a big influence over the tech world? I speak with Émile P. Torres about all things connecting AI hype, ideologies about future humans, big tech’s concentration of power, the idea of human potential (or humanity’s potential) and the idea of moral consequence. Recorded Apr 17, 2023. Released May 10, 2023.</p><br><p>Longtermism Hub</p><p><a href="https://www.longtermism-hub.com/critiques" rel="noopener noreferrer" target="_blank">https://www.longtermism-hub.com/critiques</a> </p><br><p>Against longtermism </p><p><a href="https://aeon.co/essays/why-longtermism-is-the-worlds-most-dangerous-secular-credo" rel="noopener noreferrer" target="_blank">https://aeon.co/essays/why-longtermism-is-the-worlds-most-dangerous-secular-credo</a> </p><br><p>Understanding "longtermism": Why this suddenly influential philosophy is so toxic</p><p><a href="https://www.salon.com/2022/08/20/understanding-longtermism-why-this-suddenly-influential-philosophy-is-so/" rel="noopener noreferrer" target="_blank">https://www.salon.com/2022/08/20/understanding-longtermism-why-this-suddenly-influential-philosophy-is-so/</a> </p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>Ever wonder what longtermism is and how it’s come to be such a big influence over the tech world? I speak with Émile P. Torres about all things connecting AI hype, ideologies about future humans, big tech’s concentration of power, the idea of human potential (or humanity’s potential) and the idea of moral consequence. Recorded Apr 17, 2023. Released May 10, 2023.</p><br><p>Longtermism Hub</p><p><a href="https://www.longtermism-hub.com/critiques" rel="noopener noreferrer" target="_blank">https://www.longtermism-hub.com/critiques</a> </p><br><p>Against longtermism </p><p><a href="https://aeon.co/essays/why-longtermism-is-the-worlds-most-dangerous-secular-credo" rel="noopener noreferrer" target="_blank">https://aeon.co/essays/why-longtermism-is-the-worlds-most-dangerous-secular-credo</a> </p><br><p>Understanding "longtermism": Why this suddenly influential philosophy is so toxic</p><p><a href="https://www.salon.com/2022/08/20/understanding-longtermism-why-this-suddenly-influential-philosophy-is-so/" rel="noopener noreferrer" target="_blank">https://www.salon.com/2022/08/20/understanding-longtermism-why-this-suddenly-influential-philosophy-is-so/</a> </p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Nostalgia, with Grafton Tanner</title>
			<itunes:title>Nostalgia, with Grafton Tanner</itunes:title>
			<pubDate>Sun, 30 Apr 2023 12:00:01 GMT</pubDate>
			<itunes:duration>43:41</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/6433482dd5ecd600119a3da7/media.mp3" length="52435508" type="audio/mpeg"/>
			<guid isPermaLink="false">6433482dd5ecd600119a3da7</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://www.thedatafix.net/episodes/013</link>
			<acast:episodeId>6433482dd5ecd600119a3da7</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>nostalgia-with-grafton-tanner</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxqyYb+X10Rn64eBC9oHrUQ6htQnMNnm5JAXdfPE4fapfcQlV1hHA9xFLMOCDAm+6C6ccFva+yYWuxSLVPh1zgH0]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>13</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1671481599523-356f38441c9a3ffc416c1e8c237f365e.jpeg"/>
			<description><![CDATA[<p>What is nostalgia? A memory? A feeling? A weapon?? Tune in to my conversation with Grafton Tanner about the relationship between tech, nostalgia, retrobate and much more! Recorded Mar 31, 2023. Released April 30, 2023.</p><br><p>Grafton Tanner</p><p><a href="https://graftontanner.com/" rel="noopener noreferrer" target="_blank">https://graftontanner.com/</a> </p><br><p>Yesterday Once More</p><p><a href="https://reallifemag.com/yesterday-once-more/" rel="noopener noreferrer" target="_blank">https://reallifemag.com/yesterday-once-more/</a> </p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>What is nostalgia? A memory? A feeling? A weapon?? Tune in to my conversation with Grafton Tanner about the relationship between tech, nostalgia, retrobate and much more! Recorded Mar 31, 2023. Released April 30, 2023.</p><br><p>Grafton Tanner</p><p><a href="https://graftontanner.com/" rel="noopener noreferrer" target="_blank">https://graftontanner.com/</a> </p><br><p>Yesterday Once More</p><p><a href="https://reallifemag.com/yesterday-once-more/" rel="noopener noreferrer" target="_blank">https://reallifemag.com/yesterday-once-more/</a> </p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Flesh, with Maya Indira Ganesh</title>
			<itunes:title>Flesh, with Maya Indira Ganesh</itunes:title>
			<pubDate>Thu, 20 Apr 2023 12:00:08 GMT</pubDate>
			<itunes:duration>52:25</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/64334529545d9b0011cb3fd7/media.mp3" length="62915312" type="audio/mpeg"/>
			<guid isPermaLink="false">64334529545d9b0011cb3fd7</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://www.thedatafix.net/episodes/012</link>
			<acast:episodeId>64334529545d9b0011cb3fd7</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:episodeUrl>flesh-with-maya-indira-ganesh</acast:episodeUrl>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxogWaKMczUdKG/FVPBNOLk3W4OHt86PUA4aSQd3/qVK0PJwFcp4NfzP3z51OWfel1KBTOrxRRuPL3K32zPQFrrz]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>12</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1671481599523-356f38441c9a3ffc416c1e8c237f365e.jpeg"/>
			<description><![CDATA[<p>Maya and I talk about [digital] flesh, [digital] intimacies, and how writing in and out of academic contexts helps us <em>feel</em> through our best ideas. We wonder what we owe the world in terms of representation and identity, and whether it’s okay to offer up an altered version of ourselves.  Recorded Mar 30, 2023. Released April 20, 2023.</p><p>Between Flesh: Tech Degrees of Separation</p><p><a href="https://13thgwangjubiennale.org/minds-rising/ganesh/" rel="noopener noreferrer" target="_blank">https://13thgwangjubiennale.org/minds-rising/ganesh/</a> </p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>Maya and I talk about [digital] flesh, [digital] intimacies, and how writing in and out of academic contexts helps us <em>feel</em> through our best ideas. We wonder what we owe the world in terms of representation and identity, and whether it’s okay to offer up an altered version of ourselves.  Recorded Mar 30, 2023. Released April 20, 2023.</p><p>Between Flesh: Tech Degrees of Separation</p><p><a href="https://13thgwangjubiennale.org/minds-rising/ganesh/" rel="noopener noreferrer" target="_blank">https://13thgwangjubiennale.org/minds-rising/ganesh/</a> </p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Unsafe, with Gina Neff</title>
			<itunes:title>Unsafe, with Gina Neff</itunes:title>
			<pubDate>Mon, 10 Apr 2023 12:00:51 GMT</pubDate>
			<itunes:duration>49:13</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/63f3ba54cc3d9200112d084c/media.mp3" length="59069042" type="audio/mpeg"/>
			<guid isPermaLink="false">63f3ba54cc3d9200112d084c</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://www.thedatafix.net/episodes/011</link>
			<acast:episodeId>63f3ba54cc3d9200112d084c</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxqahlm4ADT6lQnATyXtXffm5RxP4dCZpNEYtlCEbPRRTHUu914ygfCv4VB+tfboDyv8/vDwtbtWpgBC+Yoc3J3u]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>11</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1671481599523-356f38441c9a3ffc416c1e8c237f365e.jpeg"/>
			<description><![CDATA[<p>Gina and I talk about her recent article(s) on Wired.com that track violence against women and girls on the internet. We discuss how the internet should be regulated for safety, and question the limits of technological solutions. Recorded Feb 20, 2023. Released April 10, 2023.</p><br><p><strong>Minderoo Centre for Technology and Democracy</strong></p><p><a href="https://www.mctd.ac.uk/" rel="noopener noreferrer" target="_blank">https://www.mctd.ac.uk/</a> </p><br><p><strong>The Internet Is at Risk of Driving Women Away</strong></p><p><a href="https://www.wired.com/story/online-harassment-women-internet/" rel="noopener noreferrer" target="_blank">https://www.wired.com/story/online-harassment-women-internet/</a> </p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>Gina and I talk about her recent article(s) on Wired.com that track violence against women and girls on the internet. We discuss how the internet should be regulated for safety, and question the limits of technological solutions. Recorded Feb 20, 2023. Released April 10, 2023.</p><br><p><strong>Minderoo Centre for Technology and Democracy</strong></p><p><a href="https://www.mctd.ac.uk/" rel="noopener noreferrer" target="_blank">https://www.mctd.ac.uk/</a> </p><br><p><strong>The Internet Is at Risk of Driving Women Away</strong></p><p><a href="https://www.wired.com/story/online-harassment-women-internet/" rel="noopener noreferrer" target="_blank">https://www.wired.com/story/online-harassment-women-internet/</a> </p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Stale, with Zane Griffin Talley Cooper</title>
			<itunes:title>Stale, with Zane Griffin Talley Cooper</itunes:title>
			<pubDate>Thu, 30 Mar 2023 12:00:53 GMT</pubDate>
			<itunes:duration>59:46</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/63db1cfe02c87c0011fefb51/media.mp3" length="71725369" type="audio/mpeg"/>
			<guid isPermaLink="false">63db1cfe02c87c0011fefb51</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://www.thedatafix.net/episodes/010</link>
			<acast:episodeId>63db1cfe02c87c0011fefb51</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxpSgKqhIEgV7uHUHnVnj8jYHOTWofAwnvcQnuX0ZYKv47urzyYaHepPXuhsGU0yVdOJQnP2E6H1RhSp8+r7Xv9p]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>10</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1671481599523-356f38441c9a3ffc416c1e8c237f365e.jpeg"/>
			<description><![CDATA[<p>Zane and I delve deep into the materialities of the virtual world, and try to imagine who the Metaverse (in all its possible iterations) is for — with some attention to our uncomfortable, sweaty bodies, fussing with headsets and sensory overload. We conclude that Big Tech’s visions of the future are a bit stale. Recorded Jan 20, 2023. Released March 30, 2023.</p><br><p>The Metaverse</p><p><a href="https://about.meta.com/what-is-the-metaverse/" rel="noopener noreferrer" target="_blank">https://about.meta.com/what-is-the-metaverse/</a></p><br><p>Decentraland</p><p><a href="https://decentraland.org/download/" rel="noopener noreferrer" target="_blank">https://decentraland.org/download/</a></p><br><p>Zane’s work</p><p><a href="https://www.zanegriffintalleycooper.com/" rel="noopener noreferrer" target="_blank">https://www.zanegriffintalleycooper.com/</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>Zane and I delve deep into the materialities of the virtual world, and try to imagine who the Metaverse (in all its possible iterations) is for — with some attention to our uncomfortable, sweaty bodies, fussing with headsets and sensory overload. We conclude that Big Tech’s visions of the future are a bit stale. Recorded Jan 20, 2023. Released March 30, 2023.</p><br><p>The Metaverse</p><p><a href="https://about.meta.com/what-is-the-metaverse/" rel="noopener noreferrer" target="_blank">https://about.meta.com/what-is-the-metaverse/</a></p><br><p>Decentraland</p><p><a href="https://decentraland.org/download/" rel="noopener noreferrer" target="_blank">https://decentraland.org/download/</a></p><br><p>Zane’s work</p><p><a href="https://www.zanegriffintalleycooper.com/" rel="noopener noreferrer" target="_blank">https://www.zanegriffintalleycooper.com/</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Greener, with Sasha Luccioni</title>
			<itunes:title>Greener, with Sasha Luccioni</itunes:title>
			<pubDate>Mon, 20 Mar 2023 12:00:13 GMT</pubDate>
			<itunes:duration>49:57</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/63e404569468540011147024/media.mp3" length="59959818" type="audio/mpeg"/>
			<guid isPermaLink="false">63e404569468540011147024</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://www.thedatafix.net/episodes/009</link>
			<acast:episodeId>63e404569468540011147024</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxplJnOtwq6IAvjXsstc9pVs6ZgtaafPNLoBcefnL4CiJWwVcDiZwnmUhH+vmXD+w+6SAc5kCIBjmT3AocD4faxI]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>9</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1671481599523-356f38441c9a3ffc416c1e8c237f365e.jpeg"/>
			<description><![CDATA[<p>Sasha Luccioni explains LLMs and their parameters, and why these matter when thinking about the social and environmental ramifications of AI. Recorded Feb 8, 2022. Released March 20, 2023.</p><br><p>Publications (under “Alexandra Luccioni”)</p><p><a href="https://www.sashaluccioni.com/publications/" rel="noopener noreferrer" target="_blank">https://www.sashaluccioni.com/publications/</a></p><br><p>ICLR 2023 Workshop: Tackling Climate Change with Machine Learning <a href="https://www.climatechange.ai/events/iclr2023" rel="noopener noreferrer" target="_blank">https://www.climatechange.ai/events/iclr2023</a></p><br><p>We’re getting a better idea of AI’s true carbon footprint by Melissa Heikkilä <a href="https://www.technologyreview.com/2022/11/14/1063192/were-getting-a-better-idea-of-ais-true-carbon-footprint" rel="noopener noreferrer" target="_blank">https://www.technologyreview.com/2022/11/14/1063192/were-getting-a-better-idea-of-ais-true-carbon-footprint</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>Sasha Luccioni explains LLMs and their parameters, and why these matter when thinking about the social and environmental ramifications of AI. Recorded Feb 8, 2022. Released March 20, 2023.</p><br><p>Publications (under “Alexandra Luccioni”)</p><p><a href="https://www.sashaluccioni.com/publications/" rel="noopener noreferrer" target="_blank">https://www.sashaluccioni.com/publications/</a></p><br><p>ICLR 2023 Workshop: Tackling Climate Change with Machine Learning <a href="https://www.climatechange.ai/events/iclr2023" rel="noopener noreferrer" target="_blank">https://www.climatechange.ai/events/iclr2023</a></p><br><p>We’re getting a better idea of AI’s true carbon footprint by Melissa Heikkilä <a href="https://www.technologyreview.com/2022/11/14/1063192/were-getting-a-better-idea-of-ais-true-carbon-footprint" rel="noopener noreferrer" target="_blank">https://www.technologyreview.com/2022/11/14/1063192/were-getting-a-better-idea-of-ais-true-carbon-footprint</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Handled, with Hank Gerba</title>
			<itunes:title>Handled, with Hank Gerba</itunes:title>
			<pubDate>Fri, 10 Mar 2023 13:00:59 GMT</pubDate>
			<itunes:duration>51:04</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/63db1c3302c87c0011fed41f/media.mp3" length="61285271" type="audio/mpeg"/>
			<guid isPermaLink="false">63db1c3302c87c0011fed41f</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://www.thedatafix.net/episodes/008</link>
			<acast:episodeId>63db1c3302c87c0011fed41f</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxrp8qIHqvYF5pVn5UWyiSGLUoOaUFkjWASrKcxQEmznCP5OYN88aVcuJoTfBvJsGU6709C1hG29UPH8QPE5Q5WY]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>8</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1671481599523-356f38441c9a3ffc416c1e8c237f365e.jpeg"/>
			<description><![CDATA[<p>Hank Gerba was once one of Sophia The Robot’s handlers, and in this episode he reveals the magic (deception) behind social robots that promise to one day become autonomous. Recorded Jan 24, 2023. Released March 10, 2023. </p><br><p>Sophia</p><p><a href="https://www.hansonrobotics.com/sophia/" rel="noopener noreferrer" target="_blank">https://www.hansonrobotics.com/sophia/</a> </p><br><p>The Most Realistic Humanoid Robots in The World</p><p><a href="https://www.youtube.com/watch?v=QjHNFP_773k&amp;ab_channel=Motech" rel="noopener noreferrer" target="_blank">https://www.youtube.com/watch?v=QjHNFP_773k&amp;ab_channel=Motech</a> </p><br><p>Creators of famous Sophia robot reveal AI robotics for children, elderly | Nightline</p><p><a href="https://www.youtube.com/watch?v=JRHdnkUjcZg&amp;ab_channel=ABCNews" rel="noopener noreferrer" target="_blank">https://www.youtube.com/watch?v=JRHdnkUjcZg&amp;ab_channel=ABCNews</a> </p><br><p>Sophia the Robot and Jimmy Sing a Duet of "Say Something"</p><p><a href="https://www.youtube.com/watch?v=G-zyTlZQYpE&amp;ab_channel=TheTonightShowStarringJimmyFallon" rel="noopener noreferrer" target="_blank">https://www.youtube.com/watch?v=G-zyTlZQYpE&amp;ab_channel=TheTonightShowStarringJimmyFallon</a> </p><p><br></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>Hank Gerba was once one of Sophia The Robot’s handlers, and in this episode he reveals the magic (deception) behind social robots that promise to one day become autonomous. Recorded Jan 24, 2023. Released March 10, 2023. </p><br><p>Sophia</p><p><a href="https://www.hansonrobotics.com/sophia/" rel="noopener noreferrer" target="_blank">https://www.hansonrobotics.com/sophia/</a> </p><br><p>The Most Realistic Humanoid Robots in The World</p><p><a href="https://www.youtube.com/watch?v=QjHNFP_773k&amp;ab_channel=Motech" rel="noopener noreferrer" target="_blank">https://www.youtube.com/watch?v=QjHNFP_773k&amp;ab_channel=Motech</a> </p><br><p>Creators of famous Sophia robot reveal AI robotics for children, elderly | Nightline</p><p><a href="https://www.youtube.com/watch?v=JRHdnkUjcZg&amp;ab_channel=ABCNews" rel="noopener noreferrer" target="_blank">https://www.youtube.com/watch?v=JRHdnkUjcZg&amp;ab_channel=ABCNews</a> </p><br><p>Sophia the Robot and Jimmy Sing a Duet of "Say Something"</p><p><a href="https://www.youtube.com/watch?v=G-zyTlZQYpE&amp;ab_channel=TheTonightShowStarringJimmyFallon" rel="noopener noreferrer" target="_blank">https://www.youtube.com/watch?v=G-zyTlZQYpE&amp;ab_channel=TheTonightShowStarringJimmyFallon</a> </p><p><br></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Promised, with Jake Pitre</title>
			<itunes:title>Promised, with Jake Pitre</itunes:title>
			<pubDate>Wed, 01 Mar 2023 13:00:05 GMT</pubDate>
			<itunes:duration>47:38</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/63db1b8aad3be6001014de72/media.mp3" length="57174642" type="audio/mpeg"/>
			<guid isPermaLink="false">63db1b8aad3be6001014de72</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://www.thedatafix.net/episodes/007</link>
			<acast:episodeId>63db1b8aad3be6001014de72</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxo34S9QhwoRoeRMWW8eQSUCr2A0CWkmFdvsMV12wjSRkZBmM/u++FziKQJhYc0YCB19l6B+oG3U4tOzVHsdOips]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>7</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1671481599523-356f38441c9a3ffc416c1e8c237f365e.jpeg"/>
			<description><![CDATA[<p>Jake Pitre and I discuss the concept of the Metaverse and the future it wants. Recorded Jan 18, 2023. Released March 1, 2023. </p><br><p>Who Wants the Metaverse?</p><p><a href="https://daily.jstor.org/who-wants-the-metaverse/" rel="noopener noreferrer" target="_blank">https://daily.jstor.org/who-wants-the-metaverse/</a> </p><br><p>Awkward Meetings</p><p><a href="https://twitter.com/satyanadella/status/1455624165201887234" rel="noopener noreferrer" target="_blank">https://twitter.com/satyanadella/status/1455624165201887234</a> </p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>Jake Pitre and I discuss the concept of the Metaverse and the future it wants. Recorded Jan 18, 2023. Released March 1, 2023. </p><br><p>Who Wants the Metaverse?</p><p><a href="https://daily.jstor.org/who-wants-the-metaverse/" rel="noopener noreferrer" target="_blank">https://daily.jstor.org/who-wants-the-metaverse/</a> </p><br><p>Awkward Meetings</p><p><a href="https://twitter.com/satyanadella/status/1455624165201887234" rel="noopener noreferrer" target="_blank">https://twitter.com/satyanadella/status/1455624165201887234</a> </p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Fake, with Gabriele de Seta</title>
			<itunes:title>Fake, with Gabriele de Seta</itunes:title>
			<pubDate>Mon, 20 Feb 2023 13:00:21 GMT</pubDate>
			<itunes:duration>45:55</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/639d6730a006590011da59a3/media.mp3" length="55109402" type="audio/mpeg"/>
			<guid isPermaLink="false">639d6730a006590011da59a3</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://www.thedatafix.net/episodes/006</link>
			<acast:episodeId>639d6730a006590011da59a3</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxpUglcc4YTnNQ/6IBlD6aIllNhaXhZgeAXsecT8xqqyzDL7zkDjdO9jNCEl8MNjPB+g//gMqPJ0K7HZJFHhSB+V]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>6</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1671481809246-af9bc624219883cb52df3efa2ac64397.jpeg"/>
			<description><![CDATA[<p>Gabriele de Seta and I have a great conversation about deepfakes in a Chinese context. One of the big insights of this episode is that we may soon have to consider the agency of replicas in digital (human) form. Recorded Dec 15, 2022. </p><br><p>Huanlian, or changing faces: Deepfakes on Chinese digital media platforms</p><p><a href="https://journals.sagepub.com/doi/full/10.1177/13548565211030185" rel="noopener noreferrer" target="_blank">https://journals.sagepub.com/doi/full/10.1177/13548565211030185</a> </p><br><p>&nbsp;危 特朗普与蓬佩奥向中国示爱的绝密视频流出 我爱你中国</p><p><a href="https://www.bilibili.com/video/av754473414/" rel="noopener noreferrer" target="_blank">https://www.bilibili.com/video/av754473414/</a> </p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>Gabriele de Seta and I have a great conversation about deepfakes in a Chinese context. One of the big insights of this episode is that we may soon have to consider the agency of replicas in digital (human) form. Recorded Dec 15, 2022. </p><br><p>Huanlian, or changing faces: Deepfakes on Chinese digital media platforms</p><p><a href="https://journals.sagepub.com/doi/full/10.1177/13548565211030185" rel="noopener noreferrer" target="_blank">https://journals.sagepub.com/doi/full/10.1177/13548565211030185</a> </p><br><p>&nbsp;危 特朗普与蓬佩奥向中国示爱的绝密视频流出 我爱你中国</p><p><a href="https://www.bilibili.com/video/av754473414/" rel="noopener noreferrer" target="_blank">https://www.bilibili.com/video/av754473414/</a> </p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Soulless, with Eryk Salvaggio</title>
			<itunes:title>Soulless, with Eryk Salvaggio</itunes:title>
			<pubDate>Fri, 10 Feb 2023 13:00:27 GMT</pubDate>
			<itunes:duration>1:05:56</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/639d6648b0a821001040e0e2/media.mp3" length="79125336" type="audio/mpeg"/>
			<guid isPermaLink="false">639d6648b0a821001040e0e2</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://www.thedatafix.net/episodes/005</link>
			<acast:episodeId>639d6648b0a821001040e0e2</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxobSDv/r6A21I0Xvil5rfX48eOoopWWBUWuBkQ6uVpA6EJ5RedSJT4e9UDD83GKa4TqmKxnceles/5kVytDAhOb]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>5</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1671481781078-5306b164ffd81a97b5f6327afb4a6822.jpeg"/>
			<description><![CDATA[<p>Eryk Salvaggio and I discuss what GANs, stable diffusion and neural networks are, and we take listeners through the process of “reading” an AI image. Recorded Dec 13, 2022.</p><br><p>How to Read an AI Image: The Datafication of a Kiss </p><p><a href="https://cyberneticforests.substack.com/p/how-to-read-an-ai-image" rel="noopener noreferrer" target="_blank">https://cyberneticforests.substack.com/p/how-to-read-an-ai-image</a></p><p> </p><p>This person does not exist</p><p><a href="https://thispersondoesnotexist.com/" rel="noopener noreferrer" target="_blank">https://thispersondoesnotexist.com</a> </p><br><p>@Suhail They say it's "soulless" </p><p><a href="https://twitter.com/Suhail/status/1575527122449231872?s=20&amp;t=NDTtKG0ET9c42oRmf3Q5-A" rel="noopener noreferrer" target="_blank">https://twitter.com/Suhail/status/1575527122449231872?s=20&amp;t=NDTtKG0ET9c42oRmf3Q5-A</a> </p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>Eryk Salvaggio and I discuss what GANs, stable diffusion and neural networks are, and we take listeners through the process of “reading” an AI image. Recorded Dec 13, 2022.</p><br><p>How to Read an AI Image: The Datafication of a Kiss </p><p><a href="https://cyberneticforests.substack.com/p/how-to-read-an-ai-image" rel="noopener noreferrer" target="_blank">https://cyberneticforests.substack.com/p/how-to-read-an-ai-image</a></p><p> </p><p>This person does not exist</p><p><a href="https://thispersondoesnotexist.com/" rel="noopener noreferrer" target="_blank">https://thispersondoesnotexist.com</a> </p><br><p>@Suhail They say it's "soulless" </p><p><a href="https://twitter.com/Suhail/status/1575527122449231872?s=20&amp;t=NDTtKG0ET9c42oRmf3Q5-A" rel="noopener noreferrer" target="_blank">https://twitter.com/Suhail/status/1575527122449231872?s=20&amp;t=NDTtKG0ET9c42oRmf3Q5-A</a> </p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Death, with Tamara Kneese</title>
			<itunes:title>Death, with Tamara Kneese</itunes:title>
			<pubDate>Mon, 30 Jan 2023 13:00:28 GMT</pubDate>
			<itunes:duration>45:50</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/639c090e124e4c0010157b83/media.mp3" length="55005434" type="audio/mpeg"/>
			<guid isPermaLink="false">639c090e124e4c0010157b83</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://www.thedatafix.net/episodes/004</link>
			<acast:episodeId>639c090e124e4c0010157b83</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxrQ7KTEJQsRm9LzWoMKbTb+9FBlVXln3q5j97Ndf/AFmjZdXdo/zSFV6tO3kRIF/JZDhR/7pgBKkkltqbGy/dPt]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>4</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1671481701756-d1dd6c65033c8a64d1a9789765432c6f.jpeg"/>
			<description><![CDATA[<p>Tamara Kneese and I have a truly delightful conversation about death and the personal data that complicates estate planning and digital doubles. The main takeaway for me from this episode — and there were many — is that we scrutinize our (and others’) likeness in digital form in ways that we didn't have to with memorabilia in material form.</p><br><p>Data Infrastructures of the Dead</p><p><a href="https://www.heliotropejournal.net/helio/data-infrastructures-of-the-dead" rel="noopener noreferrer" target="_blank">https://www.heliotropejournal.net/helio/data-infrastructures-of-the-dead</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>Tamara Kneese and I have a truly delightful conversation about death and the personal data that complicates estate planning and digital doubles. The main takeaway for me from this episode — and there were many — is that we scrutinize our (and others’) likeness in digital form in ways that we didn't have to with memorabilia in material form.</p><br><p>Data Infrastructures of the Dead</p><p><a href="https://www.heliotropejournal.net/helio/data-infrastructures-of-the-dead" rel="noopener noreferrer" target="_blank">https://www.heliotropejournal.net/helio/data-infrastructures-of-the-dead</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Terror, with Olivia Snow</title>
			<itunes:title>Terror, with Olivia Snow</itunes:title>
			<pubDate>Fri, 20 Jan 2023 13:00:56 GMT</pubDate>
			<itunes:duration>47:42</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/639c029d6f529f0010232a28/media.mp3" length="57245173" type="audio/mpeg"/>
			<guid isPermaLink="false">639c029d6f529f0010232a28</guid>
			<itunes:explicit>true</itunes:explicit>
			<link>https://www.thedatafix.net/episodes/003</link>
			<acast:episodeId>639c029d6f529f0010232a28</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxpU1AtsGWnRccVH7lB1kacI6R3yMoqzrilm2yl7EBCkhxXdCmuq0omB6jWUScc3V5S3sH4J0sy8BW39Ap4UXGuw]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>3</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1671481675423-1d2c8b58a4ccfc6ffe21848bdc0a6e40.jpeg"/>
			<description><![CDATA[Olivia Snow and I chat about the social implications of <a href="https://prisma-ai.com/lensa" rel="noopener noreferrer" target="_blank"><em>Lensa</em></a>, technically a picture editor for selfies and photo retouching, or “an all-in-one image editing app that takes your photos to the next level”. We unpack how AI image apps like <em>Lensa</em> amplify racism, misogyny, transphobia and hatred of sex workers, while also providing a potential way to thwart expectations of ‘real’ representation. Recorded Dec 10, 2022.<hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[Olivia Snow and I chat about the social implications of <a href="https://prisma-ai.com/lensa" rel="noopener noreferrer" target="_blank"><em>Lensa</em></a>, technically a picture editor for selfies and photo retouching, or “an all-in-one image editing app that takes your photos to the next level”. We unpack how AI image apps like <em>Lensa</em> amplify racism, misogyny, transphobia and hatred of sex workers, while also providing a potential way to thwart expectations of ‘real’ representation. Recorded Dec 10, 2022.<hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Allure, with Luke Munn</title>
			<itunes:title>Allure, with Luke Munn</itunes:title>
			<pubDate>Tue, 10 Jan 2023 13:00:24 GMT</pubDate>
			<itunes:duration>48:34</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/639a344cb090f900127ba6d7/media.mp3" length="58285369" type="audio/mpeg"/>
			<guid isPermaLink="false">639a344cb090f900127ba6d7</guid>
			<itunes:explicit>false</itunes:explicit>
			<link>https://www.thedatafix.net/episodes/002</link>
			<acast:episodeId>639a344cb090f900127ba6d7</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxoaccNrJeeFHvbY0Jn9SWe870eY3mj7mbJx0AyqEjRf6uzsyDv1vAuVJGEDbKSKSUL0e6uHw5K3FWmguKLXHTwJ]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>2</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1671481635718-93848d28a22f574d3c35c1733489c226.jpeg"/>
			<description><![CDATA[<p>Luke Munn and I discuss the allure of ChatGPT technology. ChatGPT is a natural language processing tool driven by AI technology that allows you to have human-like conversations and much more with a chatbot. For me, one of the great insights from our conversation is that media scholars should pay attention to the meaning-making that happens culturally, even as technologies and their underlying logics are being/have been debunked. Recorded Dec 7, 2022.</p><br><p>Munn,&nbsp;<em>Ferocious Logics: Unmaking the Algorithm&nbsp;</em>(2018)</p><p><a href="https://meson.press/books/ferocious-logics/" rel="noopener noreferrer" target="_blank">https://meson.press/books/ferocious-logics/</a></p><br><p>Munn,&nbsp;<em>Logic of Feeling: Technology's Quest to Capitalize Emotion</em>&nbsp;(2020)</p><p><a href="https://rowman.com/ISBN/9781538148358/Logic-of-Feeling-Technology%27s-Quest-to-Capitalize-Emotion" rel="noopener noreferrer" target="_blank">https://rowman.com/ISBN/9781538148358/Logic-of-Feeling-Technology%27s-Quest-to-Capitalize-Emotion</a></p><br><p>Munn,&nbsp;<em>Automation is a Myth</em>&nbsp;(2022)</p><p><a href="https://www.sup.org/books/title/?id=34899&amp;bottom_ref=subject" rel="noopener noreferrer" target="_blank">https://www.sup.org/books/title/?id=34899&amp;bottom_ref=subject</a></p><br><p>Magee, Arora, and Munn, "Structured Like a Language Model: Analysing AI as an Automated Subject" (preprint)</p><p><a href="https://arxiv.org/abs/2212.05058" rel="noopener noreferrer" target="_blank">https://arxiv.org/abs/2212.05058</a></p><br><p>Munn, "The Uselessness of AI Ethics"&nbsp;</p><p><a href="https://link.springer.com/article/10.1007/s43681-022-00209-w" rel="noopener noreferrer" target="_blank">https://link.springer.com/article/10.1007/s43681-022-00209-w</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. 
See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>Luke Munn and I discuss the allure of ChatGPT technology. ChatGPT is a natural language processing tool driven by AI technology that allows you to have human-like conversations and much more with a chatbot. For me, one of the great insights from our conversation is that media scholars should pay attention to the meaning-making that happens culturally, even as technologies and their underlying logics are being/have been debunked. Recorded Dec 7, 2022.</p><br><p>Munn,&nbsp;<em>Ferocious Logics: Unmaking the Algorithm&nbsp;</em>(2018)</p><p><a href="https://meson.press/books/ferocious-logics/" rel="noopener noreferrer" target="_blank">https://meson.press/books/ferocious-logics/</a></p><br><p>Munn,&nbsp;<em>Logic of Feeling: Technology's Quest to Capitalize Emotion</em>&nbsp;(2020)</p><p><a href="https://rowman.com/ISBN/9781538148358/Logic-of-Feeling-Technology%27s-Quest-to-Capitalize-Emotion" rel="noopener noreferrer" target="_blank">https://rowman.com/ISBN/9781538148358/Logic-of-Feeling-Technology%27s-Quest-to-Capitalize-Emotion</a></p><br><p>Munn,&nbsp;<em>Automation is a Myth</em>&nbsp;(2022)</p><p><a href="https://www.sup.org/books/title/?id=34899&amp;bottom_ref=subject" rel="noopener noreferrer" target="_blank">https://www.sup.org/books/title/?id=34899&amp;bottom_ref=subject</a></p><br><p>Magee, Arora, and Munn, "Structured Like a Language Model: Analysing AI as an Automated Subject" (preprint)</p><p><a href="https://arxiv.org/abs/2212.05058" rel="noopener noreferrer" target="_blank">https://arxiv.org/abs/2212.05058</a></p><br><p>Munn, "The Uselessness of AI Ethics"&nbsp;</p><p><a href="https://link.springer.com/article/10.1007/s43681-022-00209-w" rel="noopener noreferrer" target="_blank">https://link.springer.com/article/10.1007/s43681-022-00209-w</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. 
See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
		<item>
			<title>Grievance, with Sarah T. Roberts</title>
			<itunes:title>Grievance, with Sarah T. Roberts</itunes:title>
			<pubDate>Sun, 01 Jan 2023 13:00:59 GMT</pubDate>
			<itunes:duration>59:42</itunes:duration>
			<enclosure url="https://sphinx.acast.com/p/open/s/63997541ed122a001195e286/e/639976ac59c4120011c6a5d3/media.mp3" length="71645434" type="audio/mpeg"/>
			<guid isPermaLink="false">639976ac59c4120011c6a5d3</guid>
			<itunes:explicit>true</itunes:explicit>
			<link>https://www.thedatafix.net/episodes/001</link>
			<acast:episodeId>639976ac59c4120011c6a5d3</acast:episodeId>
			<acast:showId>63997541ed122a001195e286</acast:showId>
			<acast:settings><![CDATA[FYjHyZbXWHZ7gmX8Pp1rmbKbhgrQiwYShz70Q9/ffXZMTtedvdcRQbP4eiLMjXzCKLPjEYLpGj+NMVKa+5C8pL4u/EOj1Vw4h5MMJYp0lCcFAe0fnxBJy/1ju4Qxy1fh8gO4DvlGA40yms2g0/hOkcrfHIopjTygHFqGwwOPKFIai4SuTvs86Lx3UYCyl6ZsEjdJwbhFywfGIuucL0kw0jFyw1Bi5rf6ZgaOokeAhxp2NzbHpWs7SyZr8zGHs9fFe3S04pgAjt9FHAKT1TQkcaGbuuBypSiuqzzricxxAyUHVovA3tT+EO/UrKOTwmUt]]></acast:settings>
			<itunes:episodeType>full</itunes:episodeType>
			<itunes:episode>1</itunes:episode>
			<itunes:image href="https://assets.pippa.io/shows/63997541ed122a001195e286/1672452347273-8113ffdc2de7d45a81130bbda5cf1b01.jpeg"/>
			<description><![CDATA[<p>In this first episode of <em>The Data Fix</em>, I speak with Dr. Sarah T. Roberts, an expert in commercial content moderation and THE cultural critic we need right now, on all things tech &amp; society related. We begin our discussion about how moderation on social media works, and what it presumes to parse out or let through, and explore for whose sake moderation is done. Because we focus on affect and feeling(s) in this series, we also discuss what it is about technology (and the Internet in particular) that has created such divisions in our worlds — specifically, what can we learn by asking about the legitimate grievances of the (lie-filled, meme-driven, bot-happy,) political right in the US and Canadian contexts? Recorded Dec 7, 2022.</p><br><p><em>Behind the Screen: Content Moderation in the Shadows of Social Media</em> </p><p><a href="https://yalebooks.yale.edu/book/9780300261479/behind-the-screen/" rel="noopener noreferrer" target="_blank">https://yalebooks.yale.edu/book/9780300261479/behind-the-screen/</a> </p><br><p>Modulating Moderation (Overton window mentioned) </p><p><a href="https://mediarxiv.org/wvp8c" rel="noopener noreferrer" target="_blank">https://mediarxiv.org/wvp8c</a></p><br><p>Algorithmic amplification of politics on Twitter </p><p><a href="https://blog.twitter.com/en_us/topics/company/2021/rml-politicalcontent" rel="noopener noreferrer" target="_blank">https://blog.twitter.com/en_us/topics/company/2021/rml-politicalcontent</a></p><br><p>Future Fetishists&nbsp;</p><p><a href="https://www.boundary2.org/2019/08/sarah-t-roberts-and-mel-hogan-left-behind-futurist-fetishists-prepping-and-the-abandonment-of-earth/" rel="noopener noreferrer" target="_blank">https://www.boundary2.org/2019/08/sarah-t-roberts-and-mel-hogan-left-behind-futurist-fetishists-prepping-and-the-abandonment-of-earth/</a></p><br><p>Digital Detritus </p><p><a href="https://firstmonday.org/ojs/index.php/fm/article/view/8283" rel="noopener noreferrer" 
target="_blank">https://firstmonday.org/ojs/index.php/fm/article/view/8283</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></description>
			<itunes:summary><![CDATA[<p>In this first episode of <em>The Data Fix</em>, I speak with Dr. Sarah T. Roberts, an expert in commercial content moderation and THE cultural critic we need right now, on all things tech &amp; society related. We begin our discussion about how moderation on social media works, and what it presumes to parse out or let through, and explore for whose sake moderation is done. Because we focus on affect and feeling(s) in this series, we also discuss what it is about technology (and the Internet in particular) that has created such divisions in our worlds — specifically, what can we learn by asking about the legitimate grievances of the (lie-filled, meme-driven, bot-happy,) political right in the US and Canadian contexts? Recorded Dec 7, 2022.</p><br><p><em>Behind the Screen: Content Moderation in the Shadows of Social Media</em> </p><p><a href="https://yalebooks.yale.edu/book/9780300261479/behind-the-screen/" rel="noopener noreferrer" target="_blank">https://yalebooks.yale.edu/book/9780300261479/behind-the-screen/</a> </p><br><p>Modulating Moderation (Overton window mentioned) </p><p><a href="https://mediarxiv.org/wvp8c" rel="noopener noreferrer" target="_blank">https://mediarxiv.org/wvp8c</a></p><br><p>Algorithmic amplification of politics on Twitter </p><p><a href="https://blog.twitter.com/en_us/topics/company/2021/rml-politicalcontent" rel="noopener noreferrer" target="_blank">https://blog.twitter.com/en_us/topics/company/2021/rml-politicalcontent</a></p><br><p>Future Fetishists&nbsp;</p><p><a href="https://www.boundary2.org/2019/08/sarah-t-roberts-and-mel-hogan-left-behind-futurist-fetishists-prepping-and-the-abandonment-of-earth/" rel="noopener noreferrer" target="_blank">https://www.boundary2.org/2019/08/sarah-t-roberts-and-mel-hogan-left-behind-futurist-fetishists-prepping-and-the-abandonment-of-earth/</a></p><br><p>Digital Detritus </p><p><a href="https://firstmonday.org/ojs/index.php/fm/article/view/8283" rel="noopener noreferrer" target="_blank">https://firstmonday.org/ojs/index.php/fm/article/view/8283</a></p><hr><p style='color:grey; font-size:0.75em;'> Hosted on Acast. See <a style='color:grey;' target='_blank' rel='noopener noreferrer' href='https://acast.com/privacy'>acast.com/privacy</a> for more information.</p>]]></itunes:summary>
		</item>
    	<itunes:category text="Education"/>
    	<itunes:category text="Society &amp; Culture"/>
    	<itunes:category text="Technology"/>
    </channel>
</rss>
