AI & Gender: A Practical Human Rights Toolbox

Event details

Date 13.02.2020
Hour 10:00 - 14:00
Speaker Caitlin Kraft-Buchman, CEO/Founder of Women at the Table, & Asako Hattori, Human Rights Officer at the Women’s Human Rights and Gender Section, Office of the United Nations High Commissioner for Human Rights (UN Human Rights - OHCHR)
Category Conferences - Seminars
“Bias is to AI what rust is to steel. It corrupts decisions, leaving us unsure of the integrity of our systems, dooming them to failure.” (MIT Technology Review, December 2019)

Workshop:

Background
Why and where can algorithms be gender-biased? How can a human rights-based approach be applied to computer science, engineering and innovation? Research shows that bias is at play in every aspect of modern life and has substantial, far-reaching impacts on our work environments, private life and culture.
We are at a critical turning point. In order to innovate and thrive in a rapidly changing global environment, new norms are needed. The “standardized male” is the default of flawed systems and cultural standards that currently control how we live and work - defaults so normalized we don’t even notice. From 20th century drug trials, international standards, city transit systems and global trading rules to 21st century algorithmic decision making and machine learning systems, this default has proven to harm people - and the bottom line.
In this crucial moment when AI is transforming every aspect of our lives and the very fabric of our society - potentially the greatest global paradigm shift yet - it is crystal clear that the design and deployment of AI must be grounded in human rights. Similarly, gender equality - the very heart of human rights - must be included in AI design and deployment.
Given the scale at which Automated Decision-Making (ADM) systems and machine learning are being deployed, it is particularly urgent that we have scientists and engineers who understand the gender dimensions of their work and the implications that work has for all citizens, so that we all can thrive.

Who is this workshop for?
EPFL undergraduate and graduate students

Workshop
The Digital Humanities Institute, in collaboration with the Equal Opportunities Office, will host a three-hour practical, agile and interactive workshop, ‘AI & Gender: A Human Rights Toolbox’, for students on the EPFL campus. Using gender as a prism to understand the human rights framework that underpins AI, the interactive workshop will foster reflection on the stereotypes, biases and gendered roles of both women and men, with the intention of understanding what real-life constraints hinder equality in the working environments and output of computer scientists and engineers. The workshop will increase participants’ awareness of the relevance of gender and bias to their work and workplace, and will provide a unique opportunity to develop, deepen and apply gender equality learnings, putting learning into action and ultimately leading to better decision-making, excellence in science and improved practices.

Objectives
Applying a human rights-based approach, this workshop will develop and strengthen awareness and understanding of gender equality and gender bias as a first step towards behavioural change and the integration of a gender perspective into the everyday work of computer science and engineering.
Throughout the workshop, participants will complete a variety of interactive exercises, discussions and activities. The workshop will be supported by specific training materials, including a gender-responsive checklist tailored for computer science and engineering students, faculty and staff to use in embedding gender across their research and day-to-day work.
Four to six weeks after the workshop, participants will be invited to attend a voluntary additional 1.5-hour session focused on applying the checklist to real-life research and design scenarios. This follow-up session will allow participants to reflect on the initial training and lessons learnt, and to share insights that have come up in their research, design, development and learning environments.

Learning outcomes:
Upon completion, EPFL students will have the knowledge and skills to:
  • Explain a human rights-based approach to AI;
  • Identify the relevance of different biases and the importance of gender equality to computer science and engineering and to institutional objectives;
  • Analyze how gender bias has occurred or can occur in the research, design and development of AI;
  • Apply gender-inclusive tools and techniques, knowing how and when to use them, to mitigate gender bias in AI;
  • Evaluate concrete methods to integrate gender into the design, planning and implementation of AI projects.

Practical information

  • Informed public
  • Registration required

Organizer

  • EPFL Digital Humanities Institute & Equal Opportunities Office, Women at the Table, and the Office of the United Nations High Commissioner for Human Rights (OHCHR)
