Wendy Hui Kyong Chun | Discriminating Data
2021, PFL 2021-2022, Media + Information, President's Faculty Lectures
In this President's Faculty Lecture, Wendy Hui Kyong Chun (SFU's Canada 150 Research Chair in New Media) will discuss themes from her forthcoming book Discriminating Data about how big data and predictive machine learning currently encode discrimination and create agitated clusters of comforting rage.
Chun will explore how polarization is a goal, not an error, within current practices of predictive data analysis and machine learning, for these methods encode segregation, eugenics, and identity politics through their default assumptions and conditions. Correlation, which grounds big data's predictive potential, stems from twentieth-century eugenic attempts to breed a better future. Recommender systems foster angry clusters of sameness through homophily. Users are trained to become authentically predictable via a politics and technology of recognition. These predictive programs thus seek to disrupt the future by making disruption impossible.
6:30 p.m. (PT)
Online Event
Closed captioning in English will be available at this event.
The President's Faculty Lectures
The President's Faculty Lectures shine a light on the research excellence at SFU. Hosted by SFU president Joy Johnson, these free public lectures celebrate cutting-edge research and faculty who engage with communities and mobilize knowledge to make real-world impacts.
Each short lecture by an SFU researcher will be followed by a conversation with Joy Johnson and an audience Q&A session livestreamed from the Djavad Mowafaghian World Art Centre at SFU's Goldcorp Centre for the Arts.
This year, lecturers will approach the themes of equity and justice from a variety of disciplines.
Wendy Hui Kyong Chun is SFU's Canada 150 Research Chair in New Media and leads the Digital Democracies Institute. She is the author of several works including Discriminating Data (forthcoming from MIT Press, 2021), Updating to Remain the Same: Habitual New Media (2016), Programmed Visions: Software and Memory (2011), and Control and Freedom: Power and Paranoia in the Age of Fiber Optics (2005). She was Professor and Chair of the Department of Modern Culture and Media at Brown University, and has held numerous visiting chairs and fellowships at institutions such as Harvard University, the Annenberg School for Communication at the University of Pennsylvania, the Institute for Advanced Study (Princeton, N.J.), the Guggenheim, the American Council of Learned Societies, and the American Academy in Berlin.
Discriminating Data with Wendy Hui Kyong Chun
By Kayla Hilstob, PhD Student
In the first President's Faculty Lecture of the 2021/2022 series, Dr. Wendy Hui Kyong Chun discussed her forthcoming book, Discriminating Data. The book reveals how algorithms encode legacies of segregation, eugenics and multiculturalism through proxies, latent factors, principal components and network neighbourhoods to create agitated clusters of comforting rage. This event was the first of six talks in the series, which approaches the theme of equity and justice from a variety of disciplinary perspectives.
Sxwpilemaát Siyám (Chief Leanne Joe) from the Squamish Nation opened the event with a welcome to the territories of the Sḵwx̱wú7mesh Úxwumixw (Squamish), səlilwətaɬ (Tsleil-Waututh) and xʷməθkʷəy̓əm (Musqueam) Nations. She invited us to open our hearts and minds and emphasized the importance of creating spaces of curiosity and transformation with the knowledge being shared at this event. SFU President Dr. Joy Johnson noted that this talk was taking place the day before our first ever National Day for Truth and Reconciliation, during this collective reckoning with Indian Residential Schools and a time to stand in solidarity.
Dr. Chun opened with the premise that the equity and justice challenges we face today feel overwhelming, to the point that they threaten our vision of ourselves as a multicultural society. Dr. Chun's forthcoming book, Discriminating Data, offers a timely exploration of these monumental questions in the context of social media and predictive algorithms. She reminded us that for this problem we have blamed the internet, social media, polarization, misinformation, and the hate they espouse. In the view presented in the well-known Netflix documentary The Social Dilemma, the challenges social media present are new and unprecedented because recommender systems have "hacked the human psyche. And, we have been manipulated like marionettes by evil tech dudes... and it's game over for humanity," remarked Dr. Chun. On the other hand, she contrasted, others point out this is not new, and blaming tech for everything keeps us from taking on fundamental issues. For Dr. Chun, it is not either/or: both are part of the same chorus, and through social media networks, we continue to relive this history.
What Dr. Chun means by this provocative statement is that, in their current form, our predictive programs embed past mistakes: segregation, internment and eugenics, which make it very difficult to imagine and live a different future. For example, images of mostly white U.S. celebrities make up the training data for facial recognition systems, not civil rights activists or interned Japanese children. She asked us to imagine a world where nothing could change from this deepfake past premised on the white ideal, because learning and truth equaled repeating this past. For Dr. Chun, this is the nightmare of predictive machine learning. Applying Ariella Azoulay's concept, we need instead to unlearn, not by ignoring history but by engaging it differently.
Overall, Dr. Chun sees Discriminating Data in dialogue with scholars such as Cathy O'Neil, Safiya Umoja Noble, Ruha Benjamin, Meredith Broussard, Kate Crawford, Virginia Eubanks, and others. In conversation with their work, Dr. Chun gave us a sample of the five main things that Discriminating Data does:
- Expose and investigate how ignoring difference amplifies discrimination, both currently and historically. Dr. Chun explained how ignoring race promotes racism using the classic example of COMPAS, an algorithm used by some U.S. courts to predict the risk of recidivism. She critiqued the idea that pretending visual markers do not exist solves the problem of racism. This program has been shown by researchers to discriminate against racial minorities because "[if] a program can't see race, it can't see racism."
- Interrogate default assumptions that ground algorithms and data structures. Her book explains how overblown promises about correlation tie 21st-century data analytics to 20th-century eugenics. She also shows how homophily, the notion that similarity breeds connection, ties social networks to U.S. segregation due to its emergence from studies on residential segregation. Thus, for Dr. Chun, echo chambers aren't an accident, but the goal. Segregation is the default.
- Grasp the future machine learning algorithms put in place, focusing on when, why and how their predictions work. Dr. Chun explained that for machine learning, truth equals consistency, therefore the future equals the past. Programs like COMPAS are tested on their ability to predict the past, which reinforces inequalities. Therefore, according to Dr. Chun, they do not offer new and unforeseen futures: "they close the future [and] they automate past mistakes so we can no longer learn from them."
- Use existing AI systems to diagnose current inequalities. Dr. Chun suggests that we can read AI systems against the grain to understand discrimination, to use them as evidence. She returned to COMPAS, where the program showed that racial bias is institutional. For Dr. Chun, COMPAS shows us how racism works.
- Devise different algorithms and ways to verify machine learning programs that displace the eugenic and segregationist network structures of the present. As they currently stand, our networks rely on two studies with serious ethical and methodological issues, which Dr. Chun pointed out. The first is the homophily study, in which an overselection of white illiberal responses led to the conclusion that similarity breeds connection. The second is sentiment analysis, a method developed in Japanese internment camps to manage people in occupied lands. She pointed out, powerfully, that these people "were meant to disappear."
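Dr. Chun's claim that segregation is the default of homophilous networks can be illustrated with a small simulation. This sketch is not from the lecture or the book; the function name and parameters are illustrative. It grows a toy two-group network in which each tie is drawn with a tunable preference for similarity, and reports what share of ties stay within a group:

```python
import random

def homophily_share(n=100, p_same=0.9, links_per_node=4, seed=42):
    """Toy network growth: each node draws links, choosing a same-group
    partner with probability p_same and a cross-group partner otherwise.
    Returns the fraction of links that stay within a group."""
    rng = random.Random(seed)
    group = [i % 2 for i in range(n)]  # two equal-sized groups
    same_links, total_links = 0, 0
    for node in range(n):
        for _ in range(links_per_node):
            want_same = rng.random() < p_same
            # pick a partner matching the same/cross-group choice
            partner = rng.choice([j for j in range(n)
                                  if j != node
                                  and (group[j] == group[node]) == want_same])
            same_links += group[partner] == group[node]
            total_links += 1
    return same_links / total_links
```

Under these assumptions, even a moderate preference for similarity makes within-group ties dominate, so recommender systems that reinforce that preference produce the clustered, segregated "neighbourhoods" the lecture describes by default, not by accident.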
She ended with an earnest question as she showed us photos of Japanese internees: "We live in their spaces when we reside in social media; how might we reside together with them?"
In the question and answer period, Dr. Johnson and Dr. Chun conversed over some fundamental yet controversial questions of the contemporary networked world. From that, Dr. Chun's message to the audience was to not accept what is given to you as a default, but rather to constantly question, as there will always be biases in what is presented. The most salient question of the night came from an audience member and was relayed by Dr. Johnson: "Is it possible to have a better internet?" Dr. Chun replied, "We can have a better world, and then a better internet!" This answer lies at the heart of Discriminating Data: undoing the legacies of eugenics and segregation all around us will build a better world offline first, in order to build better worlds online.
Tammara Soma | Setting the Table for Food Justice
When it comes to issues like food insecurity, who gets to shape the solutions? Tammara Soma will share how SFU's Food Systems Lab applies community-engaged research methods to achieving sustainable, decolonized and just food systems for all.
Taco Niet | Just Climate Policies
How do we address the climate crisis effectively and equitably? Taco Niet of SFU's School of Sustainable Energy Engineering will discuss how evidence-based modelling tools are essential for making urgent climate policy decisions grounded in justice and equity.
Vaibhav Saria | Care and Crisis in India
Vaibhav Saria, assistant professor of gender, sexuality and women's studies at SFU, will explore how the complex history of health care in India has led to a valorization of care providers' work during COVID-19, but also to increased violence against them.
Kanna Hayashi | Harm Reduction in an Unprecedented Overdose Crisis
Kanna Hayashi, the St. Paul's Hospital Chair in Substance Use Research, will explain how harm reduction interventions grounded in lived experiences, scientific evidence and health equity are desperately needed to address B.C.'s unprecedented drug toxicity crisis.
June Francis | Becoming an Anti-Racist, Decolonized University
June Francis, director of the SFU Institute for Diaspora Research & Engagement, will challenge whether traditional universities, which have been key pillars in constructing racism, are prepared to truly decolonize and become anti-racist.