BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Memento EPFL//
BEGIN:VEVENT
SUMMARY:AI & Gender: A Practical Human Rights Toolbox
DTSTART:20200213T100000
DTEND:20200213T140000
DTSTAMP:20260315T144941Z
UID:d2653ac6c659121f8992f0427d52d251c5d3793d55173339156d37eb
CATEGORIES:Conferences - Seminars
DESCRIPTION:Caitlin Kraft-Buchman\, CEO/Founder of Women at the Table\, an
 d Asako Hattori\, human rights officer at the Women’s Human Rights and G
 ender Section\, the Office of the United Nations High Commissioner for H
 uman Rights (UN Human Rights – OHCHR)\n“Bias is to AI what rust is to ste
 el. It corrupts decisions\, leaving us unsure of the integrity of our syst
 ems\, dooming them to failure.”  MIT Technology Review Dec 2019\n\nWorks
 hop:\n\nBackground\nWhy and where can algorithms be gender biased? How can
  a human rights-based approach be applied to computer science\, engineerin
 g and innovation? Research shows that bias is in play in every aspect of m
 odern life and has substantial\, far-reaching impacts on our work environm
 ents\, private life and culture.\nWe are at a critical turning point. In o
 rder to innovate and thrive in a rapidly changing global environment\, new
  norms are needed. The “standardized male” is the default of flawed sy
 stems and cultural standards that currently control how we live and work -
  defaults so normalized we don’t even notice. From 20th century drug tri
 als\, international standards\, city transit systems and global trading ru
 les to 21st century algorithmic decision making and machine learning syste
 ms\, this default has proven to harm people - and the bottom line.\nIn thi
 s crucial moment when AI is transforming every aspect of our lives and the
  very fabric of our society - potentially the greatest global paradigm shi
 ft yet - it is crystal clear that the design and deployment of AI must be 
 grounded in human rights. Similarly\, gender equality - the very heart of 
 human rights - must be included in AI design and deployment.\nGiven the sc
 ale at which Automated Decision-Making (ADM) systems and machine learning a
 re being deployed\, it is particularly urgent that scientists and engineer
 s understand the gender dimensions of their work and the implications thei
 r work has for all citizens\, so that we all can thrive.\n\nWho is this
  workshop for?\nEPFL undergraduate and graduate students\n\nWorkshop\nThe 
 Digital Humanities Institute in collaboration with the Equal Opportunities
  Office will host a 3-hour practical\, agile and interactive workshop ‘AI & Gen
 der: A Practical Human Rights Toolbox’ for students on the EPFL Campus. Usi
 ng gender as a prism to understand a human rights framework that underpin
 s AI\, the interactive workshop will foster reflection on the stereotypes
 \, biases and gendered roles of both women and men\, with the intention of
  understanding what real-life constraints hinder equality in the working e
 nvironments and the output of computer scientists and engineers. The works
 hop will increase participant awareness of the relevance of gender and bia
 s in their work and to their workplace\, and provide a unique opportunity t
 o develop\, deepen\, and apply gender equality learnings\, putting learnin
 g into action\, ultimately leading to better decision-making\, excellence 
 in science\, and improved practices.\n\nObjectives\nApplying a human righ
 ts-based approach\, this workshop will develop and strengthen awareness as w
 ell as the understanding of gender equality and gender bias as a first ste
 p towards behavioural change\, and the integration of a gender perspective
  into everyday work of computer science and engineering.\nThroughout the w
 orkshop\, participants will complete a variety of interactive exercises\, 
 discussions and activities. The workshop will be supported by specific trai
 ning materials including a gender-responsive checklist tailored for comput
 er science and engineering students\, faculty and staff for use to embed g
 ender across their research and day-to-day work.\nFollowing the workshop (
 after 4-6 weeks)\, participants will be invited to attend a voluntary addi
 tional 1.5 hour session to focus on the application of the checklists to r
 eal-life research and design scenarios. This follow-up session will allow 
 participants to reflect on the initial training and lessons learnt and hav
 e the opportunity to share insights that have come up in their research\, 
 design\, development and learning environments.\n\nLearning outcomes:\nUpo
 n completion\, EPFL students will have the knowledge and skills to:\n\n	Ex
 plain a human rights-based approach to AI\;\n	Identify the relevance of di
 fferent biases and the importance of gender equality to computer science a
 nd engineering / institutional objectives\;\n	Analyze how gender bias has o
 ccurred or can occur in the research\, design and development of AI\;\n	Ap
 ply gender-inclusive tools and techniques to mitigate gender bias in AI\, kno
 wing how and when to use them\;\n	Evaluate concrete methods to integrate g
 ender into the design\, planning and implementation of AI projects.\n
LOCATION:BI A0 448 https://plan.epfl.ch/?room==BI%20A0%20448
STATUS:CONFIRMED
END:VEVENT
END:VCALENDAR
