
Discrimination by Correlation

show that the algorithm did not discriminate, which incentivizes companies
to avoid discrimination in the first place. Previously, the CJEU was reluctant
to grant access to information when, in Meister43, it excluded a “right […]
to have access to information indicating whether the employer has recruited
another applicant” even where the job applicant “claims plausibly that he
meets the requirements listed in a job advertisement and whose application
was rejected”44. This represented an obstacle for job applicants who were
refused by algorithms and sought access to the underlying data that influenced
the decision outcome, making algorithmic discrimination more difficult to
prove than classic discrimination. The CJEU has not (yet) had the opportunity
to clarify its interpretation in a case involving AI45, but it would have tools
at its disposal to facilitate confidential access to data, for example in camera
procedures to protect business secrets or the consultation of AI experts to
give expert evidence.
Statistical analysis is used for risk assessment by insurance companies
to deal with complexity, sometimes to the detriment of accuracy. Insurance
companies used gender to distinguish between different risks and to
establish price differentiation by gender in car insurance contracts46. The
“Test-Achats” case concerned the practice of using gender to calculate
insurance premiums47. The CJEU ruled that considering gender in the
calculation of insurance premiums is discriminatory, obliging firms to
introduce gender-neutral insurance contracts. Although not directly linked
to AI, the case gives guidance for assessing potential discrimination in
situations involving statistical data and data sets used by algorithms,
where a similar process of generalization exists. Even if the CJEU “banned”
using gender-specific insurance contracts, algorithms can easily circumvent
this prohibition by using criteria, so-called proxies, to infer the gender of a
person. Consequently, it remains to be seen how courts would decide a case
involving algorithms and whether the concept of discrimination is still well
equipped to “grasp” the essence of algorithmic discrimination.

43 CJEU, C-415/10 Galina Meister v Speech Design Carrier Systems GmbH EU:C:2012:217.
44 Ibid, para. 49.
45 Some guidance was received from the CJEU in Seymour-Smith, that “mere generalizations
concerning the capacity of a specific measure to encourage recruitment are
not enough to show that the aim of the disputed rule is unrelated to any discrimination
based on sex nor to provide evidence on the basis of which it could reasonably
be considered that the means chosen were suitable for achieving that aim.” CJEU,
Case C-167/97, Seymour-Smith, EU:C:1999:60.
46 CJEU, C-236/09 Association belge des Consommateurs Test-Achats ASBL and Others v Conseil des ministres EU:C:2011:100.
47 Ibid.

