BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Memento EPFL//
BEGIN:VEVENT
SUMMARY:AI & Gender: A Practical Human Rights Toolbox
DTSTART:20200213T100000
DTEND:20200213T140000
DTSTAMP:20260415T200423Z
UID:ba2a8c212cbb10105fe9dd0e96da35196eca52de6d26329c258b75b0
CATEGORIES:Conferences - Seminars
DESCRIPTION:Caitlin Kraft-Buchman is the CEO/Founder of Women at the Table
 .\nAsako Hattori serves as a human rights officer at the Women’s Human Ri
 ghts and Gender Section of the Office of the United Nations High Commissi
 oner for Human Rights (UN Human Rights – OHCHR).\nDescription\n“B
 ias is to AI what rust is to steel. It corrupts decisions\, leaving us uns
 ure of the integrity of our systems\, dooming them to failure.” (MIT Tech
 nology Review\, Dec 2019)\n\nWorkshop\nBackground\nWhy and where can algorit
 hms be gender-biased? How can a human rights-based approach be applied to 
 computer science\, engineering and innovation? Research shows that bias is
  in play in every aspect of modern life and has substantial\, far-reaching
  impacts on our work environments\, private life and culture.\nWe are at a
  critical turning point. In order to innovate and thrive in a rapidly chan
 ging global environment\, new norms are needed. The “standardized male
 ” is the default of flawed systems and cultural standards that currently
  control how we live and work - defaults so normalized we don’t even not
 ice. From 20th century drug trials\, international standards\, city transi
 t systems and global trading rules to 21st century algorithmic decision ma
 king and machine learning systems\, this default has proven to harm people
  - and the bottom line.\nIn this crucial moment when AI is transforming ev
 ery aspect of our lives and the very fabric of our society - potentially t
 he greatest global paradigm shift yet - it is crystal clear that the desig
 n and deployment of AI must be grounded in human rights. Similarly\, gende
 r equality - the very heart of human rights - must be included in AI desig
 n and deployment.\nGiven the scale at which Automated Decision-Making (AD
 M) systems and machine learning are being deployed\, we urgently need sci
 entists and engineers who understand the gender dimensions of their work 
 and its implications for all citizens\, so that we all can thrive.\n\nWho
  is this workshop for?\nEPFL undergraduate and grad
 uate students\n\nWorkshop\nThe Digital Humanities Institute in collaborati
 on with the Equal Opportunities Office will host a 3-hour practical\, agil
 e and interactive workshop ‘AI & Gender: A Human Rights Toolbox’ for s
 tudents on the EPFL Campus. Using gender as a prism to understand a human 
 rights framework that underpins AI\, the interactive workshop will foste
 r reflection on the stereotypes\, biases and gendered roles of both women 
 and men\, with the intention of understanding what real-life constraints h
 inder equality in the working environments and the output of computer scie
 ntists and engineers. The workshop will increase participants’ awareness 
 of the relevance of gender and bias to their work and workplace\, and pro
 vide a unique opportunity to develop\, deepen\, and apply gender equal
 ity learnings\, putting learning into action\, ultimately leading to bette
 r decision-making\, excellence in science\, and improved practices.\n\nObj
 ectives\nApplying a human rights-based approach\, this workshop will deve
 lop and strengthen awareness and understanding of gender equality and gen
 der bias as a first step towards behavioural change and the integration o
 f a gender perspective into the everyday work of computer science and eng
 ineering.\nThroughout the workshop\, participants will complete a variety
  of interactive exercises\, discussions and activities. The workshop wil
 l be supported by specific training materials including a gender-responsiv
 e checklist tailored for computer science and engineering students\, facu
 lty and staff\, to help embed gender considerations across their research
  and day-to-day work.\nFollowing the workshop (after 4–6 weeks)\, partici
 pants will be inv
 ited to attend a voluntary additional 1.5-hour session to focus on the app
 lication of the checklists to real-life research and design scenarios. Thi
 s follow-up session will allow participants to reflect on the initial trai
 ning and lessons learnt and have the opportunity to share insights that ha
 ve come up in their research\, design\, development and learning environme
 nts.\n\nLearning outcomes:\nUpon completion\, EPFL students will have the 
 knowledge and skills to:\n\n	Explain a human rights-based approach to AI\;
 \n	Identify the relevance of different biases and the importance of gender
  equality to computer science and engineering / institutional objectives\;
 \n	Analyze how gender bias has occurred or can occur in the research\, des
 ign and development of AI\;\n	Apply gender-inclusive tools and techniques\
 , where appropriate\, to mitigate gender bias in AI\;\n	Evaluate concrete 
 methods to integrate gender into the design\, planning and implementation 
 of AI projects.\n
LOCATION:BI A0 448 https://plan.epfl.ch/?room==BI%20A0%20448
STATUS:CONFIRMED
END:VEVENT
END:VCALENDAR
