ALGORITHMIC MANAGEMENT - A TRADE UNION GUIDE
UNI GLOBAL UNION PROFESSIONALS & MANAGERS
EXECUTIVE SUMMARY
BACKGROUND
RECRUITMENT ALGORITHMS
WORKPLACE DECISION ALGORITHMS
PERFORMANCE MANAGEMENT ALGORITHMS
KEY DEMANDS FOR UNIONS
REFERENCES
EXECUTIVE SUMMARY

This report looks at the growing use of algorithmic management tools in workplaces across the globe. Its aim is to provide guidance for trade unions on how to approach negotiations over the development and application of these tools at work.
BACKGROUND
For several years algorithmic management has been growing around the world, as employers invest in digital monitoring, analysis, and decision-making tools to inform, advise and in some cases completely replace decision-making by human managers. Algorithmic management tools range from very simple software that tracks employee working time or scans CVs for keywords, to much more sophisticated tools that use machine learning or other forms of Artificial Intelligence (A.I.) to predict customer footfall in shops, allocate shift patterns, allocate tasks to workers, or even decide who to hire, promote or reassign based on a potentially huge amount of collected data.

These algorithms have for many years been widespread in the disintermediated gig economy, where they allowed platforms such as Uber, TaskRabbit or Deliveroo to manage a workforce of purportedly 'self-employed' workers without the need for a traditional line manager relationship. In recent years, however, these practices have been increasingly spreading to the regular economy. Over 40% of international companies' human resources (HR) functions now use A.I. to assist their recruitment,i with estimates suggesting over two-thirds of CVs in the USA are no longer seen by humans.ii Furthermore, with the coronavirus pandemic having led to an unprecedented rise in remote working across the globe, the desire of managers to find digital tools that can support them in leading their teams will only increase.

There are widespread concerns from defenders of workplace rights that this trend towards workplace analytics and automating the management process heralds a new age of Digital Taylorism. This suggests a return to the command and control 'scientific management' style of early 20th Century Taylorism that emphasises analysis, efficiency, elimination of waste and standardisation, but with powerful new algorithmic tools that can raise these principles to new heights for a digital age. While the prophets of this new digital management creed promise vast potential gains in productivity, critics fear that it will come at a huge cost in terms of increased surveillance, erosion of personal privacy, loss of autonomy, decline of workplace relationships and a resulting dehumanising of work. There are also concerns that some algorithmic management tools can be prone to serious biases and discrimination.

Furthermore, the promised productivity gains might not even materialise. Many of the algorithmic management tools on the market today have been criticised as untested, and in some cases uninformed corporate executives, with little understanding of the technology they are buying, are investing in over-hyped products that are based on little more than pseudoscience. As one example, which is further developed below, the facial scanning algorithms marketed by HireVue to advise companies such as Hilton and Unilever on recruitment decisions have been criticised by tech experts as duping companies into buying a product that has little basis in science. Similarly, technology marketed by AC Global Risk purports to use voice analysis of a 10-minute interview to determine the level of trustworthiness and 'risk' of potential employees with 97 percent accuracy – something described as impossible by independent experts.iii The founder of the A.I. Now Institute comments that "it's a profoundly disturbing development that we have proprietary technology that claims to" do this.

Despite the hype and the dangers, there are real opportunities as well: to reduce rather than increase bias and discrimination; to improve worker flexibility and autonomy – for instance in giving workers more control over their shift patterns or annual leave – and to improve the quality and fairness of management decisions by providing managers with independent, data-driven advice rather than simply having them trust their gut instincts. It is vital to remember that algorithmic management tools are just that – tools. Like most tools, they are neither inherently good nor bad; from a hammer to a steam engine, what matters is that they are both of good quality and used well.

This guide looks in turn at the main areas in which algorithmic management tools are being used, and the key risks and opportunities that unions need to bear in mind. By the term 'algorithm' here we mean any kind of digital, data-driven system – from simple keyword recognition to very complex machine learning systems – used to carry out tasks such as sorting, filtering, ranking or otherwise converting inputs to outputs in a systematic way according to a set of internal rules. The main three areas of algorithmic use in management – recruitment, performance management and other kinds of workplace decisions such as task or shift allocation – are outlined in more detail below.
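As a concrete illustration of 'converting inputs to outputs according to a set of internal rules', here is a deliberately minimal keyword-based CV filter – a hypothetical sketch, not any vendor's actual product; the keywords, names and threshold are all invented:

```python
# A deliberately simple 'algorithm' in the sense used in this guide:
# it converts an input (a CV's text) to an output (a score, and a
# shortlist decision) by a fixed internal rule - here, counting
# matches against a keyword list. All values are invented.

KEYWORDS = {"python", "negotiation", "logistics", "customer service"}

def score_cv(cv_text: str) -> int:
    """Count how many target keywords appear in the CV text."""
    text = cv_text.lower()
    return sum(1 for kw in KEYWORDS if kw in text)

def shortlist(cvs: dict[str, str], threshold: int = 2) -> list[str]:
    """Return applicants whose keyword score meets the threshold."""
    return [name for name, text in cvs.items() if score_cv(text) >= threshold]

applicants = {
    "A": "Warehouse logistics background, strong customer service record.",
    "B": "Ten years' experience; skills described in words the rule ignores.",
}
print(shortlist(applicants))  # ['A'] - B is rejected however good they are
```

Even this toy rule shows why the choice of keywords and threshold matters: applicant B is filtered out not for lacking skills, but for describing them in words the rule does not contain.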
RECRUITMENT
ALGORITHMS
One of the fastest growing areas of algorithmic management in all countries and across all sectors of the economy is in recruitment. There are several quite distinct kinds of recruitment algorithm in use, including:

• Marketing algorithms used in the targeted placement of job adverts online
Perhaps the biggest concern that has been expressed about automated recruitment tools is the danger of algorithmic biases. Some companies that have trained CV-screening algorithms on their own past hiring decisions have found that the algorithms learn to reproduce those biases. Humans are perfectly capable of demonstrating terrible biases (whether conscious or unconscious) in our own recruitment decisions. In one 2004 experiment using human recruitment in the USA, researchers found that test CVs sent out with Anglo-Saxon sounding names on the top received 50 percent more interview offers than identical CVs with black-sounding names on them.vii If used responsibly, an algorithmic approach to deciding which candidates to put through to an interview can actually help reduce overall bias and give candidates from disadvantaged backgrounds a better chance of a fair hearing. However, to achieve this companies need to take extraordinary care when developing or purchasing new algorithmic tools: pick the right model for the job, use good training data and avoid giving the algorithm irrelevant or too many variables to work with. Companies that work closely with trade unions in making these decisions will be much more likely to come to the right decisions, both for them and for the workers.

It is also worth considering that there are competing measures of bias in any automated decision making. Considering gender bias in recruitment, for example, there are a whole host of types of fairness you could assess: from the ratio of unqualified men/women being wrongly recruited (the 'false positive rate'), to the ratio of properly qualified men/women being wrongly rejected (the 'false negative rate'), to simply the overall gender balance of those ultimately recruited ('demographic parity'). Minimising all of these biases is certainly desirable, but it has been proven mathematically impossible to make decisions, for most real-world characteristics, that are equally 'fair' by all of these definitions. Companies need to have open debates about how they are going to assess the fairness of their algorithms' decisions, including workers and their union representatives in these discussions.
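These competing measures can be made concrete with a small worked example. The numbers below are invented purely for illustration; the point is that the three measures named above can give different verdicts on the same set of decisions:

```python
# Toy illustration of the three competing fairness measures named above,
# applied to a made-up set of recruitment decisions. 'qualified' is the
# ground truth, 'hired' is the algorithm's decision; numbers are invented.

def rates(group):
    """False positive rate, false negative rate and hire rate for one group."""
    fp = sum(1 for q, h in group if h and not q)   # unqualified but hired
    fn = sum(1 for q, h in group if q and not h)   # qualified but rejected
    neg = sum(1 for q, _ in group if not q) or 1
    pos = sum(1 for q, _ in group if q) or 1
    return fp / neg, fn / pos, sum(h for _, h in group) / len(group)

# (qualified, hired) pairs for two applicant groups
men   = [(True, True), (True, True), (False, True), (False, False)]
women = [(True, True), (True, False), (False, False), (False, False)]

for name, group in (("men", men), ("women", women)):
    fpr, fnr, hire_rate = rates(group)
    print(name, fpr, fnr, hire_rate)
# men:   FPR 0.5, FNR 0.0, hire rate 0.75
# women: FPR 0.0, FNR 0.5, hire rate 0.25
```

Here each individual measure looks 'fixable' on its own, but equalising all three at once for both groups is generally impossible – which measure to prioritise is a policy choice, not a technical one.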
The risk of discrimination is perhaps harder to overcome when it comes to the use of personality testing (sometimes described as psychometric testing) – another recruitment trend that is becoming more popular. Various kinds of personality tests are on offer, claiming to help companies to find a good 'fit' with their corporate culture. Third party vendors like Pure Matching are selling services that claim to 'map your neuro-personality as to gain an overall picture of your biological identity', taking inherently subjective concepts and dressing them up in scientific language to make them appear like objective measures of candidate quality. In these cases, the discrimination against those with the 'wrong' personality types is baked into the premise behind the technology.

While it may on the face of it seem reasonable for companies to want to prioritise candidates who appear conscientious, organised or calm under pressure, these kinds of assessments can seriously harm the employment prospects of people with, for example, certain kinds of mental health conditions who struggle to pass the personality tests – a potentially unlawful form of disability discrimination in some countries, which has led to lawsuits in the USA.viii These kinds of tests are also likely to be culturally insensitive, potentially discriminating against applicants from backgrounds different from those who wrote the software. This is a general problem with many kinds of management algorithms – if they are designed by and tested on a group of homogenous race, culture, age or gender, they might not function as intended when applied to a more diverse population, as seen with technology like automatic bathroom soap dispensers that don't detect non-white skin.ix

Some companies are also using what they describe as 'game-based assessments' which supposedly test problem solving skills. While these might correlate to applicant suitability for certain jobs, there are risks of companies rushing to embrace these tools for roles where these kinds of tests aren't really relevant and only serve to discriminate against those who might struggle with the particular format of the test.

Another flaw with this kind of algorithmic approach to CV screening is that applicants can game it as it becomes better understood – for instance by padding their CVs with additional skills and key words in white text, invisible to human readers but picked up by algorithmic filtering tools that accordingly rate the CV more highly.

The last few years have also seen the beginnings of commercial algorithms for conducting background checks, with vendors such as Fama scanning applicants' social media histories for a wide range of companies from finance to retail to tech. Obviously, this kind of scanning of people's social media histories might seem uncomfortably intrusive to some, even where vendors like Fama take care to stress an ethical, consent-based approach.
SUMMARY OF KEY POINTS FOR UNIONS TO BEAR IN MIND:

• Algorithms used early in the recruitment process, particularly in helping write and advertise jobs and guide candidates through the process, are most likely to improve the experience for job applicants.

• The use of algorithms for CV screening can be necessary when the volume of applications is high. When done well this can reduce bias, but careless use can embed and increase bias instead.

• Where machine learning algorithms are used in recruitment, remember that if the training data used to determine who good applicants are is the result of biased real-world human judgements, the algorithm will learn to display the same biases.

• Online tests can be a valuable component of the process for some jobs, but it is important to make sure that they are competency- or skill-based tests relevant to the job at hand. Personality and psychometric tests are inherently subjective and risk discrimination.

• There should be transparency for both job applicants and unions as to where recruitment algorithms are being used. Unions should also have access to information about the criteria used to assess candidates and the data sources employed.

• Data from job applicants should only be collected so far as it is relevant for the application. It should always be stored and processed securely, and deleted once a final decision is made.

• Human managers should be involved, monitor and be able to intervene at any stage of the application process. Any ultimate decisions on accepting candidates should always be made by a human being. Automatically rejecting candidates via an algorithmic CV screen may be permissible in some jurisdictions but could contravene GDPR in the EU, and regardless should still be supervised by human managers.
WORKPLACE
DECISION
ALGORITHMS
Perhaps the broadest category of algorithmic management is that which covers day-to-day workplace decisions – what might be considered typical line manager activity, but where the line manager is now either being supported, advised or entirely replaced by computer algorithms. The full list of what these kinds of tools can do is long, and includes:

• Routine self-service HR activity such as algorithms that approve/deny requests for annual leave, log sick leave or process work expense claims.

• Using algorithms to help redesign workplace structures, such as allocating workers between teams or to new roles.

• Task allocation algorithms, from directing workers around a distribution warehouse to assigning new jobs to delivery or taxi drivers.
One of the great strengths of algorithms is that they can process far more data more quickly than human managers ever could. This allows not just faster analysis but also whole new kinds of analyses to become possible. Shift allocation algorithms can use weather forecasts, economic data, their own observations about past consumer activity, and knowledge about the availability of workers to generate shift patterns far beyond anything a human planner could calculate.

Taken to extremes, however, this optimisation can leave workers with no chance to think or reflect for a few moments outside their officially scheduled breaks during the day. Similarly, most distribution warehouse workers increasingly receive their tasks allocated by a tablet or other device that tells them step by step where to walk and which shelf to reach for, optimising to ensure they are constantly moving at maximum speed and never having to stop and think about where to go. Amazon has also begun trialling the deployment of wearable haptic feedback devices for its warehouse workers, using vibrations to guide their arm movements to the correct shelf as quickly as possible in order to be even more efficient. Not only can this level of hyper-efficiency be extremely dehumanising, it also places enormous stress on the human workers, who are left with no decisions to make even about their own limb movements or about which box size to use or which length of tape to cut to seal it.
Some of the shift allocation software in particular can be quite empowering for frontline workers, as it enables them to do things like swap shifts with one another directly through the software without needing to run such requests by a line manager first. Handing over such routine decisions can also change the line manager's role in positive ways. Managers may well welcome the opportunity to spend more time on people-focused activities rather than routine tasks such as filling out duty rosters and spreadsheets.
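A rule of this kind can be sketched in a few lines. This is a hypothetical illustration – the field names and the 48-hour limit are invented, not taken from any real product – of the design principle that routine swaps are approved automatically while anything unusual is routed to a human manager rather than auto-rejected:

```python
# Hypothetical self-service shift-swap rule. Routine swaps are approved
# automatically; edge cases go to a human with authority to make
# exceptions, instead of being refused outright. All limits invented.

MAX_WEEKLY_HOURS = 48  # illustrative working-time limit

def decide_swap(acceptor_hours: float, shift_hours: float,
                same_role: bool) -> str:
    """Approve a routine swap, or escalate edge cases to a human."""
    if not same_role:
        return "escalate to human manager"  # needs judgement, not a rule
    if acceptor_hours + shift_hours > MAX_WEEKLY_HOURS:
        return "escalate to human manager"  # working-time limit in play
    return "approve"

print(decide_swap(36, 8, same_role=True))   # approve
print(decide_swap(44, 8, same_role=True))   # escalate to human manager
```

The key design choice is the fallback: where the rule cannot say yes, the answer is a human conversation, not an automatic no.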
SUMMARY OF KEY POINTS FOR UNIONS TO BEAR IN MIND:

• Self-service software for scheduling shifts, annual leave or similar, providing it functions well, can greatly improve convenience for workers and avoid favouritism.

• Such software, however, needs to be transparent in its workings and allow room for human compassion. It should always be possible to appeal to a human manager with the authority to override the software and make an exception.

• Unions should be wary of gamification techniques that use algorithms to manipulate workers into making choices about when and how they work that might not be in their long-term interests.

• Workers and their union representatives need to be informed about and have access to data being used by employers to make decisions affecting the workforce, and workers should have the opportunity to challenge any inferences that algorithms are making about them and their behaviour.

• Using automation to replace the role of the human manager would be ill-advised; algorithms should be used to advise managers but not to replace them.
PERFORMANCE
MANAGEMENT
ALGORITHMS
The final kind of management algorithms that are seeing widespread use are those which cover surveillance and evaluation of the workforce; what might broadly be termed performance management algorithms. These can include:

• Algorithms that track physical or digital activity at work.

• Algorithms that measure and assess workers against output or performance targets or other benchmarks.

• Algorithms that use customer ratings to measure employee performance.

• Algorithms that take all of the above and use it to inform decisions about individual workers.
Workers and unions will have their own views about what level of surveillance they are prepared to accept and what they consider excessive and unjustified. In order to come to these decisions though, workers and unions first need to make sure they are aware of what surveillance tools are actually in use in their workplace. In some jurisdictions, particularly in the USA, regulations allow employers in many cases to surveil and to monitor employees covertly without their awareness, let alone their consent. In other areas such as the EU, regulations are stricter as a result of both GDPR and Article 8 of the European Convention of Human Rights concerning the right to respect for private life and correspondence. As a result, employees must be informed about data gathering about them, and requirements of proportionality and data minimisation apply in law.xv xvi Unions should be aware, however, that many off-the-shelf employee surveillance tools and other management algorithms are designed with the US market in mind and may come packaged with features that are not lawful for use in other jurisdictions such as the EU – something that employers and the retailers themselves might not be as aware of as they should be.

Performance monitoring algorithms are common in call-centres, which have clear metrics for productivity and success that can be tracked. While older systems would simply track the number of calls each worker made per hour, modern algorithmic tools such as Cogito or Voci use voice analytics A.I. to provide real-time feedback to workers about whether they are speaking too fast, sound too tired or insufficiently empathetic, or to offer other advice of the kind that human managers would traditionally have given.xvii These kinds of call-monitoring systems come with serious flaws, however. Complaints have been made about the Call Miner software used to monitor Santander's US call centres, noting that it fails to recognise the words of employees with certain accents or speech impediments, hurting their ratings or forcing them to adopt unnatural speech patterns.xviii Other call centre workers have complained of "feeling like the only appropriate way to display emotion is the way that the computer says".

Worse, these kinds of software are often used in a so-called 'rank and yank' strategy of ranking all employees' performance against one another and then automatically disciplining or firing those in the bottom percentiles every month, with little regard for personal circumstances and without making any effort to invest in training and support to correct shortcomings. This use of relative rather than absolute rankings is particularly pernicious – half of the workforce by definition will always be below average, and someone is always going to be at the bottom no matter how hard they work. This kind of constant pressure, of knowing that you are being constantly monitored and risk losing your job if those around you work faster than you, has major risks for both your psychological and physical health.

This practice is particularly notorious in Amazon distribution warehouses, where the ease of replacing workers who burn out from overwork has led to the target rates of items that workers have to process per hour being ratcheted up relentlessly, under the watchful eye of algorithmic monitors on the lookout for those who fall behind. The result: increased stress, repetitive strain injuries, back and knee pain and an overall serious injury rate affecting 10% of the full-time workforce – double the national average in the US for similar kinds of work.xx The level of injuries is so great that Amazon has installed painkiller vending machines in some warehouses, and in the UK, ambulances were reportedly called to warehouses once every two days in 2018 due to workers regularly collapsing on the job.xxi
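The arithmetic behind relative rankings can be shown with a small, invented example: every worker beats the absolute target, yet the bottom-ranked worker is still selected for dismissal:

```python
# A hypothetical illustration of the 'rank and yank' arithmetic described
# above: with a relative cut-off, someone is dismissed every period even
# when every worker beats the absolute target. All numbers are invented.

def rank_and_yank(items_per_hour: dict[str, int], yank_fraction: float = 0.1):
    """Return the workers falling in the bottom fraction of the relative ranking."""
    ranked = sorted(items_per_hour, key=items_per_hour.get)  # slowest first
    n_yanked = max(1, int(len(ranked) * yank_fraction))      # someone always goes
    return ranked[:n_yanked]

target = 300  # absolute items-per-hour target
team = {"W1": 310, "W2": 325, "W3": 305, "W4": 340, "W5": 315}

assert all(rate > target for rate in team.values())  # everyone beats the target
print(rank_and_yank(team))  # ['W3'] - dismissed anyway, purely for ranking last
```

However high the whole team's output climbs, the relative cut-off guarantees a casualty – which is exactly why unions treat relative rankings as more pernicious than absolute targets.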
These kinds of real-time continuous feedback and performance review tools are spreading to more types of workers in different sectors, beyond the traditional call-centre and warehouse deployment. In some cases, they are now applied to home-based workers and self-employed contractors as well. WorkSmart, Time Doctor, and Microsoft's Workplace Analytics are examples of the kind of productivity software that is increasingly used to monitor remote as well as office-based workers, tracking things such as mouse clicks, keystrokes and other computer activity, and taking regular webcam photos to ensure workers are at their desks. For some workers, the use of such software by their employers has been intolerably oppressive, with the system docking pay automatically every time they are detected to be idle, or discouraging workers from listening to music or taking bathroom breaks in their own homes for fear that the algorithm will mark them down for what it considers unproductive activity.xxii

This technology can also be used in positive ways if properly deployed and regulated. Where data collected is aggregated and anonymised it can be used to improve systems and processes without individual employees being pressured to meet the algorithm's expectations out of fear for their jobs. Insofar as this continuous data gathering on employee performance supports a shift away from the single annual appraisal towards a model of continuous constructive feedback and development it may be welcomed – but only if the role of algorithms is limited to advising and supporting human line managers in having those conversations. Where the human line manager is taken out of the performance review process altogether there is a serious risk that workers become disengaged, grow to feel that their employer is cold and impersonal, and skew their behaviour towards hitting only the targets measured by the algorithm rather than caring about job performance and personal development in a more holistic way. In the Santander case mentioned above, for example, workers reported that the excessive focus on performance metrics limited customer interaction and put consumers at risk.

What algorithms can do, however, is to collect factual data about workplace performance and present it in an impartial way that might help line managers overcome their own unconscious biases about their team members and deliver their feedback in a fairer and more informed way. Algorithms can also have a role in spotting patterns that help to uncover issues line managers might not have noticed – if for example an employee is regularly having difficulties with a particular task because they don't have the right equipment, or if a certain employee's performance and punctuality always suffers on dates that correspond to school holidays, which might point to childcare struggles at home for which an employer could make allowances or suggest more flexible or remote working options.

Another positive use of performance management algorithms can be to identify workers who need extra training or assistance. Using algorithms to identify workers who are struggling is not necessarily a threat to those workers, providing the response of management is to provide extra support rather than attempt to get rid of them. Once again, the ethics question is determined not by the tool, but the purpose for which it is used. IBM's Watson computer system has been used to identify with high accuracy workers likely to quit in the near future, so they can be targeted for extra training, promotions or pay rises to improve staff retention rates.xxiii Team Space, an in-house app at Cisco, is used to help managers understand how their direct reports work best and what management style or coaching tips are most likely to be welcomed by them.xxiv The role for unions is to make sure that this data-collection and analysis is always proportionate, transparent, and leads to constructive human-led feedback from the employer.

Unions should strongly resist algorithms where the criteria used to assess workers are not transparent. The claim that algorithms are unfathomable black boxes should not be permitted for important decisions about workers' lives. All too often this can lead to employers adopting an 'information alibi', where they claim that it is not the employer criticising an employee, firing them, or denying them an opportunity, but simply the algorithm saying so. Attempts to pass the buck of responsibility for decisions in this way should be strenuously resisted – algorithms are not responsible agents, and there should always be a human manager able to both explain and take responsibility for any ultimate decisions. 'Because the algorithm said so' should never be an acceptable explanation for why an employee has been fired, denied a promotion or pay rise, or faced any other significant consequences.
SUMMARY OF KEY POINTS FOR UNIONS TO BEAR IN MIND:

• Unions need to make sure they are aware of what surveillance and performance monitoring tools are in use at their workplaces, bearing in mind that some might be covert. Check the legality of data collection without awareness or consent in your jurisdiction.

• Make sure that algorithms are never used by employers to avoid responsibility for their decisions. Decisions affecting employees should always be for an explainable reason, according to transparent criteria and open to appeal.
KEY DEMANDS FOR UNIONS

Below is a final summary of key demands that unions should make when negotiating about algorithmic management with employers. Ideally unions should aim to secure an 'algorithmic use agreement' with employers that covers all of these points.
5. Anyone programming or purchasing management algorithms needs to be properly aware of the risks of bias and discrimination and take all possible steps to mitigate them. Algorithms should also be regularly audited by independent third parties, chosen jointly by both employers and unions, to check them for biases or discriminatory outcomes. The results of such audits should be available to anyone affected by algorithmic decisions, including union representatives.

6. Any data collection or surveillance of the workforce should be for a clearly justifiable purpose. Personal or other sensitive data such as the content of emails, conversations or location tracking should not be collected without explicit consent.

9. Companies investing in algorithmic tools should also draw up a 'people plan' to invest in parallel in their workforce, mapping skill profiles and upskilling workers in areas that will become more important after algorithms are introduced, while helping workers reallocate their time or move on to new roles where their work is being automated.

10. Before adopting any algorithmic management tools, employers should first carefully reflect on why and whether they are actually needed at all. If the answer is simply 'because we can', the project should not go ahead. Algorithmic management tools should never be adopted simply because they are trendy or because competitors are doing so. Even where there is a genuine workplace problem to solve, in some cases the problem could be addressed through genuine human discussions with the workforce instead.
REFERENCES

iii Kofman, A. (2018, November 25). The Dangerous Junk Science of Vocal Risk Assessment. The Intercept. Retrieved from https://theintercept.com/2018/11/25/voice-risk-analysis-acglobal/

iv Harwell, D. (2019, November 6). A face-scanning algorithm increasingly decides whether you deserve the job. Washington Post.

v Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation)

vi Dastin, J. (2018, October 10). Amazon scraps secret AI recruiting tool that showed bias against women. Reuters. Retrieved from https://www.reuters.com/article/us-amazon-com-jobs-automation-insight/amazon-scraps-secret-ai-recruiting-tool-that-showed-bias-against-women-idUSKCN1MK08G

vii Bertrand, M., & Mullainathan, S. (2004). Are Emily and Greg More Employable than Lakisha and Jamal? A Field Experiment on Labor Market Discrimination. American Economic Review

viii O'Neil, C. (2016). Weapons of Math Destruction. Crown Books

ix IFL Science (2017). This Viral Video Of A Racist Soap Dispenser Reveals A Much, Much Bigger Problem. Retrieved from https://www.iflscience.

xv See for example the case of Bărbulescu v Romania: European Court of Human Rights (2016, January 12). Bărbulescu v Romania, no 61496/08.

xvi De Santis, F., & Neuman, K. (2017, October 4). Takeaways from the ECHR Employee Monitoring Decision. Retrieved from https://iapp.org/news/a/takeaways-from-the-echr-employee-monitoring-decision/

xvii Roose, K. (2019, June 23). A Machine May Not Take Your Job, but One Could Become Your Boss. New York Times. Retrieved from https://www.nytimes.com/2019/06/23/technology/artificial-intelligence-ai-workplace.html

xviii Marvin, R. (2019). The Best Employee Monitoring Software 2019. Retrieved from PC Magazine: https://uk.pcmag.com/cloud-services/92098/the-best-employee-monitoring-software

xix Dzieza, J. (2020, February 27). How Hard Will the Robots Make Us Work? The Verge.

xx Evans, W. (2019, November 25). Ruthless Quotas at Amazon Are Maiming Employees.

xxi Urwin, R., & Ellis, R. (2019, October 6). Ambulances for Amazon warehouse workers injured every other day. Sunday Times.

xxii Dzieza, J. (2020, February 27). How Hard Will the Robots Make Us Work? The Verge.

xxiii Fisher, A. (2019, July 14). An Algorithm May
www.uniglobalunion.org