
Not OK, computer: music streaming's diversity problem

'IF YOU LOOK AT SPOTIFY'S TOP 10 MOST POPULAR ARTISTS OF 2020, ONLY TWO ARE WOMEN'

BY GILLIAN TETT

1 Sexism can be a subtle problem. In the music industry, for example, we have not just had #MeToo scandals, exposing the abuses of male singers, musicians and producers, but have also seen less obvious ways in which women seem to be disadvantaged.

2 Take people's listening patterns on streaming services. If you look at Spotify's top 10 most streamed artists of 2020, for example, only two are women — and Billie Eilish is the highest, in seventh place. This might not seem a case of discrimination, but the way we got here raises some important questions.

3 Now a team of European computer scientists have explored this tendency by looking at streaming services' algorithms. More specifically, Christine Bauer of Utrecht University in the Netherlands and Xavier Serra and Andres Ferraro of Universitat Pompeu Fabra in Spain analysed the publicly available listening records of 330,000 users of one service. This showed that female artists represented only 25 per cent of the music listened to by users. The authors wrote on The Conversation platform that "on average, the first recommended track was by a man, along with the next six. Users had to wait until song seven or eight to hear one by a woman."

4 People come to their musical tastes in all kinds of ways, but how most of us listen to music now poses specific problems of embedded bias. When a streaming service offers music recommendations, it does so by studying what music has been listened to before. That creates a vicious feedback loop: if the service already offers more music by men, that has startling consequences — which most of us listeners are unaware of.

5 Is there any solution? The researchers offered one: they ran a simulation of the algorithm and tweaked it a few times to raise the rankings of female artists (ie they get more exposure by being recommended earlier) and lower those of male artists. When they let this system run, a new feedback loop emerged: the AI indeed recommended female artists earlier, making listeners more aware of that option; and when the AI platform learnt that the music was being chosen, it was recommended more often.
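To make the mechanism concrete, here is a minimal sketch, assuming a recommender that scores tracks and a post-processing tweak of the kind described above: tracks by male artists are nudged a few positions down the ranked list so that tracks by female artists surface earlier, and whatever listeners then choose feeds back into future scores. The Track class, the fixed demotion offset and the feedback_update step are illustrative assumptions, not the study's actual algorithm.

```python
# Illustrative sketch only: a toy recommender list, a re-ranking nudge of the
# kind described above, and a crude feedback step. Names and parameters here
# (Track, demotion, feedback_update) are assumptions for illustration.
from dataclasses import dataclass


@dataclass
class Track:
    title: str
    artist_gender: str  # "female" or "male" -- hypothetical metadata label
    score: float        # relevance score from the base recommender


def rerank(tracks: list[Track], demotion: int = 3) -> list[Track]:
    """Order tracks by score, then push each male-artist track `demotion`
    positions further down so that female-artist tracks appear earlier."""
    ranked = sorted(tracks, key=lambda t: t.score, reverse=True)
    keyed = [
        (position + (demotion if track.artist_gender == "male" else 0), track)
        for position, track in enumerate(ranked)
    ]
    return [track for _, track in sorted(keyed, key=lambda pair: pair[0])]


def feedback_update(chosen: Track, boost: float = 0.1) -> None:
    """Crude stand-in for the feedback loop: whatever the listener picks is
    scored higher next round, so earlier exposure compounds over time."""
    chosen.score += boost


if __name__ == "__main__":
    playlist = [
        Track("Song A", "male", 0.9),
        Track("Song B", "female", 0.8),
        Track("Song C", "male", 0.7),
        Track("Song D", "female", 0.6),
    ]
    reordered = rerank(playlist)
    print([t.title for t in reordered])  # female-artist tracks now rank earlier
    feedback_update(reordered[0])        # the chosen track gains score for next round
```

Run over many rounds, that small re-ranking nudge plus the feedback step can reproduce, in miniature, the kind of new feedback loop the researchers report.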
6 Bauer tells me it was "a positive surprise" to change the streaming service's apparent bias so much with a few tweaks to the algorithm. "Of course, it's always easier to fix something in theory rather than in practice," she says, "but if this effect was similar in the real world, that would be great." She adds that the group is now exploring how to use the same approach to address ethnic and other forms of discrimination in media platforms.

7 The team stress that this work is still at an early stage, but the study is thought-provoking for two reasons. First, and most obviously, it shows why it pays to have a wider debate on how now-pervasive AI programs work and, above all, whether we want them to extrapolate from our collective pasts into our futures. "We are at a critical juncture, one that requires us to ask hard questions about the way AI is produced and adopted," writes Kate Crawford, who co-founded an AI centre at New York University, in a powerful new book, Atlas of AI.

8 Second, music streaming should also make us ponder the thorny issue of positive discrimination. Personally, I have often felt wary of this concept, since I have built my career trying to avoid defining myself by gender. But today, after years working in the media, I also realise the power of the "demonstration effect": if a society only ever sees white men in positions of power (or on the pages of newspapers), it creates a cultural feedback loop, not unlike those streaming services.

9 This affects many corners of business. Consider venture capital: research from a multitude of groups shows that diverse teams outperform homogeneous ones. Yet according to Deloitte, 77 per cent of venture capitalists are male and 72 per cent white, while black and Latino investors received just 2.4 per cent of funding between 2015 and 2020, according to Crunchbase.

10 This pattern has not arisen primarily because powerful people are overtly sexist or racist; the subtler issue is that financiers prefer to work with colleagues who are a good "cultural fit" (ie are like them) and to back entrepreneurs with a proven track record — except most of those entrepreneurs happen to look like them.

11 "Mainstream investors generally consider funds led by people of colour and women as higher risk, despite widely available evidence that diversity actually mitigates risk," point out financiers Tracy Gray and Emilie Cortes in the Stanford Social Innovation Review. You could address this by using something akin to a music algorithm rejig: foundations could deliberately elevate diverse employees and overinvest in funds run by diverse groups to change the feedback loop.

12 Would this work? Nobody knows yet, since it has never been done at scale, or at least not yet in finance. The reality is that it is probably even harder to shift human bias than it is to tweak an algorithm.

13 But if you want a reason to feel hopeful, consider this: while computer programs might entrench existing bias, the amazing levels of transparency that Big Data can provide are able to illuminate the problem with clarity. That, in turn, can galvanise action, if we choose it — in music and elsewhere.

Gillian Tett, 7 April 2021. ©The Financial Times


