Hong Qu, Adjunct Lecturer in Public Policy, taught Data Visualization virtually last spring to over 70 students from different Harvard Schools, levels of experience, and corners of the world.

To foster a close-knit community among students from diverse backgrounds, Qu intentionally curated a set of online tools and learning exercises to generate an "ambient telepresence." For instance, he assigned group data visualization projects to promote peer learning and used VoiceThread for assigned peer critiques. During synchronous class time, students were invited to sketch with Qu using Jamboard on the shared screen, a novel form of participation meant to draw out the inner artist and designer in every student.

"I wanted to give them a sense that we're spending time with each other in this very challenging period to learn as a community, to work together on group projects, and to achieve organic connections and authentic relationships between all our unique places during this pandemic," Qu said.

As artificial intelligence becomes increasingly common in several areas of public life, from policing to hiring to healthcare, AI researchers Timnit Gebru, Michael Hind, James Zou and Hong Qu came together to criticize Silicon Valley's lack of transparency and advocate for greater diversity and inclusion in decision making.

The event, titled "Race, Tech & Civil Society: Tools for Combating Bias in Datasets and Models," was sponsored by the Stanford Center on Philanthropy and Civil Society, the Center for Comparative Studies in Race and Ethnicity and the Stanford Institute for Human-Centered Artificial Intelligence.

Qu, the moderator of the panel and a CCSRE Race & Technology Practitioner Fellow who developed a tool called AI Blindspot that identifies bias in AI systems, opened the conversation by noting that combating bias has two senses: ensuring that algorithms are "de-biased," and striving for equity in a historical and cultural context.

Gebru ’15 said that she was introduced to these issues as they relate to machine learning and AI after seeing a ProPublica article about recidivism algorithms, and a TED Talk by Joy Buolamwini, a graduate researcher at MIT who discovered that an open-source facial recognition software did not detect her face unless she wore a white mask.

As Gebru moved higher up in the tech world, she noticed a severe lack of representation, a pressing inequity she is working to redress. "The moment I went to grad school, I didn't see any Black people at all," Gebru said. "What's important for me in my work is to make sure that these voices of these different groups of people come to the fore," she added.

Michael Hind is an IBM Distinguished Researcher who leads work on AI FactSheets and AI Fairness 360, which promote transparency in algorithms.