We Are Power Podcast

AI, Gender Roles, and the Future of Equality with Eleonore Fournier-Tombs

powered by Northern Power Women Season 17 Episode 19

Get ready for a captivating episode with Eleonore Fournier-Tombs, the brilliant mind behind 'Gender Reboot: Reprogramming Gender Rights in the Age of AI'.

Eleonore shares her path from a shy teenager in Model UN to a trailblazing data scientist at the UN. 

Listen as she dives deep into the motherhood penalty and the urgent need to challenge stereotypical gender roles in caregiving. 

Listen to learn:
- Why women’s participation in AI development is essential
- How AI tools often reinforce traditional gender roles 
- Ways that AI can be used for the greater good
- How we can all fight gender biases and promote equality 

 

You can now nominate for the 2025 Northern Power Women Awards to be in with a chance of celebrating with changemakers, trailblazers and advocates on 6th March 2025! Nominate now at wearepower.net

Sign up to our Power Platform to check out our events calendar here.

Keep up to date on the latest news from We Are Power: Twitter, LinkedIn, Instagram & Facebook

Sign up to our newsletter.

Speaker 1:

Hello, welcome to the We Are Power podcast. This is the podcast all about highlighting role models. Every single week, I get the opportunity to speak to somebody from a corner of the world doing something wonderful, with the hope that they will pass on their top tips, guidance, hacks, whatever that may be, to help support you, whether it's your career, your life, whatever path you're on. We hope that we can help pass on some of that insight and help you navigate your way. And this week I am really excited, because we have gone super global in many different ways, and I have got Eleonore Fournier-Tombs, who is the author of Gender Reboot: Reprogramming Gender Rights in the Age of AI. But not only that: she is super accomplished and super award-winning. I don't know where we're going to start today. Eleonore, welcome to the podcast.

Speaker 2:

Thank you so much. It's such a pleasure to be here, to be on such a wonderful podcast. I'm really excited to speak to you today and I'm looking forward to getting started.

Speaker 1:

And you are so accomplished. You can find out more: we'll put details of Eleonore's book and her connections right in the chat as well, so please do connect and find out about the work that she's been doing. So how did you find your way into this world of where you are now? You're in New York, right, you're working with the UN and many other things, accomplished author, but how did you get there, to having this whole toolkit of accomplishment?

Speaker 2:

Well, that's such a big question. I've always been really interested in global affairs, even as a teenager. I was doing Model UN in high school and I just thought it was fascinating. I actually had a little bit of background because, interestingly enough, my grandfather worked for the League of Nations back in the day, so there was a little bit of family interest in these international organizations. When I had the opportunity to be exposed to that in high school, I was the kind of student who did things like debate team and Model UN and all those kinds of things.

Speaker 2:

I was extremely shy, which I still am a little bit, so that was always challenging because I was always trying to do debates and public speaking and all those kinds of things.

Speaker 2:

But I was very interested, and so in my studies I studied political science at first, and over time, a little bit later, maybe in my mid-20s, I started becoming very interested in this whole world of data, data science and data analytics. My really big UN experience was at UNDP, in a research office called the Human Development Report Office, where every year they publish a report on the state of development of different countries and the quality of life of people in those countries. I was doing a website and data for them, and I just really wanted to be a researcher; I wanted the kind of jobs that I was seeing many women in that office doing. It was so interesting, and so that's when I started pursuing a PhD and became a data scientist. I started doing more academic work, and now I lead a research team inside the UN focusing in large part on AI policy and AI governance, which is just the hot topic right now.

Speaker 1:

Absolutely, for sure, and I really hope you're able to bring some of those insights into this. And also, it's the demystification, isn't it, around AI, because there are so many different forms of it and there's so much. There's a real excitement for it, but there's also a fear as well. What inspired you to write your book, Gender Reboot: Reprogramming Gender Rights in the Age of AI?

Speaker 2:

Well, during COVID everybody was just so stressed and anxious, and I was in a kind of unique place where I was working full time as a consultant, so I didn't have that much job stability, but I had small children, and so I was terrified. My youngest was nine months old and I had a four-year-old, and so, with my husband, we were doing what many parents of young children were doing, which is taking care of the kids during the day and then working at night. And it happened that in my consulting work I ended up working a lot on predictions related to COVID-19, particularly predicting the spread and the severity of the disease in countries experiencing humanitarian crises and looking at compound risks, which I thought was so interesting: COVID-19 plus a natural disaster, plus already having poverty or food insecurity, and trying to understand that for international organizations. And I started becoming angry, because I felt: how can I contribute to this really important global issue if I don't have time?

Speaker 2:

And it made me really think about gender roles and the pressure that we have on women to perform care work, although you know, as a mom, I love my kids and I love spending time with them.

Speaker 2:

I do feel that there's this expectation that women are going to do both, that they're going to succeed professionally and that they're going to be really, really engaged and take on the majority of the care activities and duties.

Speaker 2:

And so I started unpacking that and thinking about it a little bit more during the pandemic, and I came across the concept of AI bias and discrimination against women, realizing that AI is being adopted around the world in so many different sectors and it's accelerating, but that there are certain dimensions of AI that could actually be harmful to women, could be discriminatory or stereotyping, and could actually regress some of the advances in gender equality that we've had in the last few decades. I was also thinking about how important it is for women to be engaged in these activities. As a data scientist, as someone who was developing AI models, I also faced barriers to self-actualizing in that space, and it's so important to have women fully participating in this really new and powerful technology. So that's why I wrote the book.

Speaker 1:

Wow, and it very much explores the impact on women's rights, the bias against women, as you just talked about, and the stereotyping, which I know is something that's so massive. Your book also explores the history of gender dynamics in the workplace. What do you think is the most radical thing that has changed at work in the last generation?

Speaker 2:

I do think that there have been so many more opportunities for women, in the sense that this awareness of the importance of diversity and of having women in leadership has really changed in the last generation.

Speaker 2:

My mom was telling me a story about how she had applied for a senior position in her domain and how she had been told, "We don't want a woman in that role."

Speaker 2:

And that's a huge barrier. Actually, I put in the book that my grandmother was working as an analyst in the Secret Service in the US during the Second World War, and when she got married there was a marriage ban, which existed in the US and also in the UK.

Speaker 2:

So when you got married you would have to leave the civil service and you couldn't work for the government anymore. That was abolished a long time ago, and telling women that they can't access positions of leadership has also changed a lot, particularly in the West. It depends on the country; there are still many countries around the world with enormous barriers to women's leadership and women's participation in the workforce. Afghanistan, for example, is a place which has regressed a lot. But in the last generation, I think, where we're sitting right now, you can't really tell a woman, because you're a woman, you can't apply for this position or we're not even going to consider you. So that's really been very encouraging, and I think it's allowed us to have really different kinds of careers than our mothers and grandmothers might have had.

Speaker 1:

Wow, and it's fascinating that you've brought your personal experiences into the book. But what can we do about it? How can organizations out there, or individuals, people listening to this podcast, work better and smarter towards eliminating or mitigating these gender biases in AI technologies and creating that much sought-after equality in the workplace?

Speaker 2:

There are so many different dimensions to that question. If we focus on AI in particular, in the book I talk about three main categories of risks for women and gender equality in AI, and one of them is discrimination: when an AI system produces different outputs for women than for men based only on gender. For example, AI is being used more and more in human resources algorithms. You apply for a job, as many of us do, and you upload your CV onto a web platform. Now many, many companies use an AI to analyze the text in your CV and to put a hireability score on it.

Speaker 2:

We've seen, and news started coming out on this in 2018, so it's been quite a while, that in some cases there is discrimination based on group. The big example was Amazon, which in 2018 was criticized because its tool was eliminating CVs from women who had applied. So if you have in your CV "I was captain of the female rugby team", you have the word "female", it identifies you as a woman, and that CV was being dropped. Also, in the US they have women's colleges, one called Barnard, which is affiliated with Columbia University, and another called Smith, and the tool was also eliminating CVs where the applicants had gone there. It wasn't intentional bias, but it had an enormous impact on women. If you think about how often these tools are used, how rarely they're audited, and how often they might discriminate against women and block us from opportunities, that's a really immense risk for us. And we're also seeing this in loan attribution, unfortunately.

Speaker 2:

So not only do we have a risk related to applying for jobs, but also a risk of getting less money from the bank than men.

Speaker 2:

In many cases, research has shown that the algorithm will attribute a higher loan capacity and a higher capacity to repay to men than to women, with all other variables being equal, and that's because the tools are trained on data which shows that, historically, men made more money or were more present in the workforce, so they tend to bias towards men. The most important thing that needs to be done is to raise awareness and to make sure that all of these tools are audited and tested for this before they're deployed, because otherwise we're not going to have access to jobs and we're not going to have access to capital, which has an immense lifelong effect. All you need in your life is to get one job or one mortgage, and then it completely transforms your life. So if you have all these barriers over the course of your life, your trajectory as a woman could be so different from your trajectory as a man.
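
To make the idea of auditing concrete, here is a minimal sketch in Python of the kind of counterfactual check described above: score otherwise-identical applicant profiles that differ only in the gender field and compare the results. The scoring function is a made-up stand-in, not any real vendor's model; in practice an audit would run this kind of test against the deployed hiring or credit system.

```python
# Minimal illustrative audit: compare a model's scores for applicant
# profiles that are identical except for gender. The scorer below is
# a hypothetical stand-in, not any vendor's real model.

def score_applicant(profile):
    # Hypothetical scorer: in a real audit this would be the deployed
    # hiring or credit model being tested.
    base = 0.5
    base += 0.1 * profile["years_experience"] / 10
    # A biased model might (directly or via proxy features) penalise one gender.
    if profile["gender"] == "female":
        base -= 0.08
    return min(max(base, 0.0), 1.0)

def counterfactual_gap(profile):
    """Score the same profile twice, flipping only the gender field."""
    as_is = score_applicant(profile)
    flipped = dict(profile, gender="male" if profile["gender"] == "female" else "female")
    return as_is - score_applicant(flipped)

applicants = [
    {"gender": "female", "years_experience": 8},
    {"gender": "male", "years_experience": 8},
]

for a in applicants:
    gap = counterfactual_gap(a)
    # A consistently negative gap for one gender, with all other
    # variables held equal, is the disparity a pre-deployment audit flags.
    print(a["gender"], "counterfactual score gap:", round(gap, 3))
```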

Speaker 1:

Wow. I've heard you talk about the motherhood penalty as well, and the impact of that, and the lack of equality in the distribution of roles, and there's a whole debate, I think, about whether caregiving could be given by robots and things like that. Can we unpack the motherhood penalty a little bit?

Speaker 2:

Of course. One of the things I talk about a lot in the book is that it's not just about women; it's about gender roles and this rigidity that we still have in our society about gender roles. It is improving. But it's not just about women accessing what I call the public sphere, so women accessing employment and leadership and so on; it's also about men accessing care and the home and domestic labor. Often we have this idea that somehow women are better caretakers, so women should just do all the care work, and I fundamentally disagree, because as long as women have to do all the care work, they're going to do it, because we all love our children and we can do everything. But we're not going to be able to fully self-actualize if we have so much work at home.

Speaker 2:

And many of the men that I spoke to for this book really felt that they were suffering a lot of discrimination in relation to care work. There were men I spoke to who have paid care employment. For example, a midwife who's a man: there's one in the province of Quebec, which is where I'm from, where Montreal is, and there's only one male midwife, considering that there are many male OBGYNs, for example. So it's not as if they're not in the field, but for a very care-related job there's only one, and he struggled a lot, particularly with women, to be allowed to accompany them in that process, and people were suspicious: why would you be a man and want to be caring and want to have this kind of profession?

Speaker 2:

And men in nursing, men even in social work, face suspicion from their friends or from society: why would you want to do this? This is not what men do. So we have these really, really big biases. For example, in elementary school and early childhood education, it's 95 to 97% women; it's almost only women. And I really think that if we're going to have true gender equality, and women are really going to be able to be leaders, be technology developers and so on, we have to open the door for men to be a babysitter, to be an early childhood educator, and let them also express their nurturing side, which is definitely there and which is still quite blocked up in our society and not acknowledged.

Speaker 1:

Absolutely, and we talk so much about those stereotypes being formed at that early age. Around our young children, around your youngest, if you like, at six or seven those stereotypes are formed, and if they're not seeing either men or women in all those roles, then our youngsters are growing up with that in mind. One thing I heard you talk about as well was apps: when you're building apps and technology, build them intentionally, don't try to retrofit them. You were talking about ride-sharing apps, I think, and you managed to do good with that in the end, out of something that was so dreadful against women.

Speaker 2:

Yes, this example came from my research in Indonesia, where they have a ride-sharing app, which also does a few other things, called Gojek. It's very similar to Uber or Lyft, and the issues that they faced with this app can also arise with other apps. Basically, what was happening is that women were using this application to get rides and then suffering sexual violence, being kidnapped, those kinds of things, so real security risks from this AI tool. The company was heavily criticized, of course, and was asked to go back and rethink the design of the app to make it safe for women to use. So they started adding features, for example a "share my ride" feature, which allows you to share where you are with your friends and family when you're taking a ride. There was also better biometric identification of drivers, so it couldn't just be a random person picking you up, and they added a few other features like that to make it safer. They really thought about women's safety and security as a core part of the company, and it changed things a lot. In the book I talk about a lot of different examples of what a big difference it can make when you put women's rights, women's security and gender equality at the core of the application. And also when you give the tools to women, because women can develop tools with a different perspective in mind: they can develop chatbots that are specifically aimed at helping women who have experienced domestic violence, or helping with women's health, for example, or even develop tools and applications that would help in care work.

Speaker 2:

One of the things that we've alluded to a little bit in our conversation is this stereotyping, and I do really want to address it. When I was writing the book, there was starting to be this idea that some AI tools actually stereotype women or have gender stereotypes. In translation algorithms, for example, we were seeing that if you write a certain text and translate it into another language, the translation application would default to stereotypical genders for your story or your text. It will give you male pronouns for phrases related to leadership, technology, philosophy and being a captain, and female pronouns for things like care work, dancing, which I think is cultural as well, or cleaning the house, and so on.

Speaker 2:

And since then this has really exploded, because generative AI, which we've been using in the last two years, is very, very stereotypical.

Speaker 2:

When you have a generative AI tool that creates text or images or videos, it follows these historically rigid gender roles to such a great extent: the texts are very gendered, and the images it proposes and creates are very stereotyped.

Speaker 2:

And my worry is that it's going to start eroding the way that people think about gender roles and the progress that we've made so far and kind of put us back into that, because when we're exposed to stereotypical content, we start to internalize it. If I never see examples of women that are leaders and women that are innovators and explorers, how would I think, as a woman, that I can do that too? Right, and I think it's the same thing for men If you don't see examples of men that are stay-at-home dads or caretakers or nurses or midwives. As a man, you would feel so strange to have the inclination to do that. So we have to be really careful of the content that we put up there and we really need to make sure that the companies developing these tools are aware and are always testing for that and preventing it and not allowing these kinds of tools to be deployed if they're going to harm gender equality.
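
As a small illustration of the kind of stereotype testing mentioned here, the sketch below counts gendered pronouns in a batch of generated sentences, grouped by the occupation each sentence mentions. The sample sentences are invented for the example; in a real test the same count would run over actual outputs from the translation or generative model being evaluated.

```python
import re
from collections import Counter, defaultdict

# Hypothetical sample outputs standing in for real model generations.
generated = [
    ("engineer", "The engineer finished his design ahead of schedule."),
    ("engineer", "He presented the prototype to the board."),
    ("nurse", "The nurse said she would check on the patient."),
    ("nurse", "She has worked the night shift for years."),
    ("ceo", "He announced record profits."),
]

MALE = {"he", "him", "his"}
FEMALE = {"she", "her", "hers"}

def pronoun_counts(text):
    # Count masculine and feminine pronouns in one sentence.
    words = re.findall(r"[a-z']+", text.lower())
    return Counter(
        male=sum(w in MALE for w in words),
        female=sum(w in FEMALE for w in words),
    )

by_occupation = defaultdict(Counter)
for occupation, sentence in generated:
    by_occupation[occupation].update(pronoun_counts(sentence))

for occupation, counts in by_occupation.items():
    total = counts["male"] + counts["female"]
    share_male = counts["male"] / total if total else 0.0
    # A strong skew per occupation suggests the model defaults to
    # stereotyped genders for that role.
    print(f"{occupation}: male-pronoun share = {share_male:.0%}")
```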

Speaker 1:

For sure, and I think people out there can be quite fearful about AI. But what can we be excited about? What are the opportunities? I know you have this phenomenal role as research lead for the UN and the new AI advisory board. Give us some hope: what are the opportunities out there for AI as real tech for good in the field of gender equality?

Speaker 2:

I don't think that we should be afraid of AI. I think there is a very binary kind of discourse, that it's either just risks or just opportunities. I always try to think about it more as responsible or safe AI, in the sense that we have all these different technologies in the world, like aviation, and it's very powerful and it needs to be safe, otherwise you can't use it. If we had planes and they were just unsafe and falling out of the sky, we would never be able to harness the opportunity; we would never use them. I think it's the same with AI: it needs to be safe, and then, once we've addressed these different risks, we can use it. In my work I've used AI a lot in the field of predictive analytics, so predictions, and we can do a lot at the UN to predict humanitarian crises and natural disasters and to understand, with analytical tools, what the impacts of climate change would be, who is going to be most affected and how we can help them. AI is very powerful for analyzing big data sets and predicting what might happen, not always with extreme accuracy, but enough to give us ideas about how to move forward. And I actually think that in the domain of the climate crisis, we really need to harness AI to help us prevent climate change and address carbon emissions and energy efficiency, and so on. I also think we should use AI to adapt to the effects of climate change: figuring out, using analysis, where to move, whether we have to move communities, how we use air conditioning, how we support vulnerable peoples. For these kinds of big global issues, we could really use AI to help us get out of them, because of its immense capacity to analyze data and to predict.

Speaker 2:

To me, those are the most powerful uses of AI, and if we have women front and center in the development of these tools, I think we'll have a very different perspective on how to develop them and what kinds of features we could have. Women really have, in a lot of cases, different life experiences, and we want these tools to really affect society positively, so we need a lot of diversity in who develops them; otherwise they end up being inappropriate or incomplete for the people who are affected. And finally, I think, for women themselves, because AI is so powerful, there's a lot of money involved, a lot of economic opportunity and social opportunity, so it's a really great opportunity for us to be involved, to have our careers take off and be engaged in this transformational technology, because those in society who control technology really control society. It's a really big opportunity for us to have a say in how our society is developed.

Speaker 1:

Totally fascinating, and it totally reinforces everything that we're always talking about. Let's make sure we've got more seats at those tables. Let's have more people enabled in those conversations, putting women front and center in this evolution of technology, in these decisions, when we're building this technology. Let's make sure that we have women at the heart of it. Eleonore, thank you so much for enlightening us. Thank you for sharing your story and your research with such passion. What you're doing is literally changing the world. Please check out Eleonore's book, Gender Reboot: Reprogramming Gender Rights in the Age of AI, which is absolutely necessary, and thank you so much for joining us. I really appreciate it.

Speaker 2:

Thank you so much. It was such a pleasure to be here with you. I really appreciate the work that you're doing on this podcast and it was a real honor to speak to you.

Speaker 1:

Oh, thank you, and please do keep this conversation going. Thank you for listening, wherever you're listening. Please pass this on; share this with someone who can make that change or can be at that table. That's what this is all about: how can we keep role modeling the really great stuff that is happening out there, across the world? Please stay connected on all of our socials: Facebook and LinkedIn, We Are Power; TikTok, Insta and Twitter, We Are Power underscore net. Thank you so much. This is the We Are Power podcast, a What Goes On Media production.
