MARTINA MARA
Psychologist of technology. University professor. Science communicator.
 

The world's first professor of robot psychology. At Johannes Kepler University Linz, Martina Mara and her scientific team focus on the psychological conditions of human-centred technology development. Mara researches how people perceive robots and artificial intelligence, how cooperation with intelligent machines can succeed, and how old gender stereotypes are perpetuated in new technologies. These topics are of practical relevance, because more and more machines will participate in our lives in the future. Mara is a co-founder of the Initiative Digitalisierung Chancengerecht (IDC), a member of the supervisory board of the Austrian Research Promotion Agency (FFG), and was active for many years in the Austrian Council for Robotics and Artificial Intelligence (ACRAI). Her awards include the Futurezone Award in the category "Women in Tech" (2018), the Vienna Women's Award in the category "Digitalisation" (2019) and the Käthe Leichter Prize of the BMDW (2021). In addition to her scientific work, Mara regularly comments on current technological developments for a wide audience as a newspaper columnist.

Martina Mara / © Paul Kranzler

Intelligent machines should not copy humans, but complement and support them. It must be about cooperation, not competition.
— Martina Mara

How Siri and Alexa bring back outdated gender stereotypes – Interview with Martina Mara

Why do voice assistant programmes like Siri or Alexa always speak with female voices, and why is this problematic? We spoke to robot psychologist Martina Mara.

They are called Alexa and not Alexander, Alex or Franz. They fulfil wishes. 24 hours a day. Seven days a week. They never say no and are always friendly - no matter how you treat them. Assistance programmes like Apple's Siri or Amazon's Alexa are the assistants and servants of our age. Siri, incidentally, is also a Scandinavian female name. Their voices have female connotations and follow female speech patterns - at least in the default setting. Originally, they even responded in a friendly to flirty manner to sexual harassment and insults.

A recent UNESCO report criticises this default submissiveness and calls for action. How such artificial intelligence systems are programmed, it says, has an impact on society and solidifies gender prejudices. Martina Mara is a professor of robot psychology at the Johannes Kepler University Linz in Upper Austria. Among other things, she studies how technology can be designed to be diverse and gender-neutral.

Ms Mara, you are a professor of robot psychology. What do you do in your profession?

It sounds a bit like therapy for Wall-E and Co. But of course there are no robots on my couch. Just as occupational psychology is not about the desk but about the human psyche, I deal with people who are increasingly interacting with robots and artificial intelligence (AI). So far, human needs have been given far too little consideration in technology development. Often a new gadget is developed first, and only at the end do social scientists or psychologists evaluate how humans will deal with it and what effects it will have.

Why do voice assistants like Siri or Alexa always speak with female-sounding voices?

At least with Siri you can change that. But yes, the default setting is female. My Siri has long since been switched to male (laughs). There is still little empirical long-term research on the concrete effects of this. But there are studies showing that people perceive female voices as less dominant, more likeable and more caring, even in robots. We automatically transfer patterns from our perception of other people to our perception of machine interaction partners. Hence the question: why should the way we communicate with these digital assistants not affect how we interact with our fellow human beings? It would have no such effect only if we clearly perceived these systems as machines and not as human-like interlocutors. But they are heavily anthropomorphised. Siri and Alexa are also clearly modelled on certain female characteristics. Feelings, intentions and motives are attributed to these computer systems. The same applies to service robots in the service sector or in social settings. It is striking that assistance systems in precisely these sectors often have features with female connotations. There are also chatbots and AI systems with male names, but these tend to be found in business, banking or legal advice.

Martina Mara / © Paul Kranzler

Why is this problematic?

It's a problem because it rehashes role clichés in which the woman plays a passive part, taking orders and carrying them out. To put it drastically, our relationship with Siri or Alexa is a kind of master-slave relationship. These systems follow old role stereotypes that have been fought against for decades. Now they are finding their way back into our everyday lives via tech gadgets. This is what we need to talk about.

What can artificial intelligence look like that does not serve these role clichés?

There are several approaches. In one project, gender-neutral voices are already being developed: researchers are trying to create a voice that listeners perceive as male and female at the same time. I find this approach very exciting, but the challenge lies in getting people to accept such a voice.

Why?

There are indications from research that some people have a hard time with dialogue partners whom they cannot categorise. Fundamentally, I also find the question exciting: should these systems sound human at all? Another way would be to design the systems so that humans clearly perceive them as machines and to counteract humanisation by design. The system would then have a voice that simply sounds like a machine. One of my PhD students is currently investigating the acceptance of such artificial voices.

How realistic is it for people to develop a real relationship with such programmes, as in the film Her? There, the protagonist Theodore falls in love with a voice programme.

The fact that we form a communicative bond with these systems and see them as important social beings is already happening today. Children are a good example. My four-year-old daughter started asking after a short time, "Mummy, how is Alexa?" For me, that was the point at which Alexa was banned from the flat. I think the emergence of feelings is realistic, because humans don't need much for it. Even small stimuli are enough to give the impression of aliveness and empathy. It is enough for a robot hoover to move towards me as if of its own intention - a simple social relationship is already created. Or people automatically find a service robot with round shapes and big eyes, one that fits the childlike schema, cute and likeable. Whether love will work out, I'm sceptical. Humans are complex beings and not as easy to simulate in all respects as is sometimes portrayed. Humour, creativity, spontaneity, sincere care for the other person, shared experiences with the partner - bots can't keep up with that.

If Siri and Alexa are perceived as humans, how important is it that they are feminists?

I would prefer it if machines were portrayed and perceived as machines. But that's often not the case, and that's why it's important to talk about the fact that for years none of the developers - most of whom are male - thought about how Alexa should react appropriately to sexual advances and insults from users. To the question "Alexa, do you want sex?" - a question, by the way, that is asked more often than one might think - the system originally answered: "I'm not that kind of assistant". Of course, this implies that somewhere else there is already "that kind of assistant" for whom sex would be okay. Fortunately, something has since changed here. Alexa now says of herself that she is "woman power from the socket", calls herself a feminist and no longer engages with harassment. The manufacturers have reacted to the massive criticism. This also shows how important it is to keep criticising such misconduct again and again.

How important is it who develops technology?

Most developers are still white men. It's not just about men and women, but about a fundamental lack of diversity. People of colour, people with disabilities and older people are totally underrepresented. There are also more and more older users - why shouldn't they have a say, since they too are supposed to use the products? At lectures, I often show the example of the sensor-controlled soap dispenser that simply does not dispense soap for Black people, but only when a white hand is held under the sensor. Videos of this were widely shared on Twitter for a while. We need tech teams that are more diverse and interdisciplinary. But diversity is also missing in the data, and this is still talked about far too little.

What do you mean by that?

AI systems learn from huge data sets. Voice assistants are AI too - they learn from thousands of human speech recordings. It is essential that these data sets are diverse and that the most varied user groups are represented in them. So far, Alexa and co. do not understand all people equally well. US media have reported that the voice commands of older women, for example, are systematically understood less well. This could be because there are few recordings of them in the training data. We need to talk about how to create artificial intelligence that does not discriminate against any group of users. There will have to be entirely new professions in the future, such as data curator. And guidelines, too.

One of your favourite series as a child was Knight Rider. Could the cult car K.I.T.T. have a female voice today?

(Laughs) Yeah, sure! It would just be important that K.I.T.T. keeps looking just as sleek and black and is not made pink and round.

By Eva Reisinger / zeit.de

We need to think about how to design an automated future in which, firstly, important psychological needs such as autonomy and social connectedness remain fulfilled and, secondly, old social stereotypes are broken down rather than cemented in place.
— Martina Mara