

48210 Assignment 2 – Stakeholder Consultation

Group 6.5

Technology: Facial Recognition


Groupwork Declaration
Group member name | Document sections contributed to | Signature
Benjamin Hartmann | Office of the Australian Information Commissioner; Stakeholder Consultation Strategies; Referencing & Review | B.Hart
Eliza Tam | Minorities; Stakeholder Consultation Strategies; Editing & Review |
Peter Ortigas | General Public; Introduction of technology; Stakeholder Consultation Strategies |
Mutian Shangguan | Software Users; Stakeholder Consultation Strategies |
Alexander Robinson | Tech Companies; Short Summary of Technology Topic: Sustainability; Stakeholder Consultation Strategies; Document Formatting |


Table of Contents
Technology: Facial Recognition
Groupwork Declaration
Short Summary of Technology Topic
Stakeholder mapping
Key stakeholders
    Tech Company – Alexander Robinson
    Office of the Australian Information Commissioner – Benjamin Hartmann
    Minorities – Eliza Tam
    Software Users – Mutian Shangguan
    General Public – Peter Ortigas
Stakeholder Consultation Strategies
References


Short Summary of Technology Topic

Facial recognition technology verifies a person's identity using their face, typically through photos, videos, or cameras (Kaspersky, 2024). Major tech companies such as Meta (formerly Facebook), Apple, Microsoft, Amazon, and Alphabet (Google's parent company) have developed their own facial recognition systems. For instance, Apple uses it for security features in iPhones, while Amazon has used it to assist law enforcement, although with misidentification issues (Thales, 2021). Despite its various applications, facial recognition technology still requires further development to improve reliability and address complexities such as ethical issues, sustainability, and discrimination.

The Office of the Australian Information Commissioner and international data protection agencies expect social media and other platforms to safeguard against unlawful data collection (OAIC, 2023). Australians prioritise privacy, particularly concerning facial recognition and AI, and especially the need to protect children's privacy (OAIC, 2023). Although facial recognition on social media can support personal privacy and account security, its use by retailers and police without transparency raises concerns (Field, 2023). Public consultation, informed consent, and independent reviews of its use are essential to developing a peaceful and inclusive environment, aligning with UN SDG 16.

Facial recognition has grown markedly in capability and relevance over the past decade, offering new business opportunities to emerging and established tech companies. While widely accepted for phone security, it has been scrutinised for its use in security, surveillance, and police work due to concerns over bias and discrimination. These concerns stem from questions surrounding training data and algorithmic bias that may disadvantage certain groups within the general public (Peng, 2022). Despite this, acceptance of the technology is increasing, depending on the use case and the software user. One study found that people in the USA were more accepting of the technology being used to track citizens and of its use by private companies, but less trusting of police use of facial recognition (Ritchie et al., 2021). Sustainable development of facial recognition, aligned with United Nations Sustainable Development Goal 10 on reducing inequality, requires greater company transparency about how the technology is used, training data that represents all races equally, and the mitigation of biases that may be present in machine learning algorithms.

The complexity of this issue extends to its impact on the livelihoods of minorities in Australia's diverse society, challenging the idea of technology's neutrality. Research has revealed higher error rates when detecting darker-skinned faces (Buolamwini & Gebru, 2018). The technology also raises the question of how it will accommodate religious attire, such as hijabs and burkas, that covers areas of an individual's face. Similarly, how will it approach individuals with craniofacial differences? Due to the nature of disability, these differences can vary broadly between individuals and be difficult to account for. These issues intersect with human rights concerns highlighted by the Australian Human Rights Commission regarding the use of AI, engaging the rights to privacy, equality, and non-discrimination (Farthing et al., 2019). These considerations are vital amid the rise of predictive policing in Australia, aligning with UN SDG 16, "Peace, Justice, and Strong Institutions", and SDG 10, "Reduced Inequalities", because accuracy disparities can lead to biased outcomes.


Stakeholder mapping

Key stakeholders | Description
Office of the Australian Information Commissioner | An independent national regulatory body that protects privacy and freedom of information, promotes and safeguards the right of Australian citizens to access government-held information, and protects personal information.
General Public | Members of society who are routinely exposed to facial recognition technology.
Tech Companies | Companies using current or bleeding-edge technologies to differentiate or advance their own product lines or those of partnering companies.
Software Users | Individuals and groups using facial recognition technology for work.
Minorities | Individuals whose cultural, racial, or physical characteristics differ from those of the statistically predominant social groups in Australia.


[Stakeholder mapping diagram – key: Office of the Australian Information Commissioner, Tech Companies, Software Users, General Public, Minorities.]


Key stakeholders

Tech Company – Alexander Robinson


Tech Company’s Stake in Facial Recognition

Tech companies have a large stake in facial recognition, as it is a key technology that allows them to find new business opportunities, differentiate products, and improve current offerings. Small start-ups such as Clearview AI and larger, more established companies such as Apple are leveraging current and emerging iterations of facial recognition, from facial search engines for law enforcement (O'Sullivan et al., 2020) to Face ID for unlocking smartphones. Established companies such as Apple hold a large amount of power and influence as market leaders in innovation and trendsetting. This allows them to pioneer the technology and its use cases for mass adoption quickly and easily, as evidenced by the iPhone X selling 29 million units in the final quarter of 2017 after its November release (Kundu, 2018). By comparison, Clearview AI has much less power and influence but a far greater interest in the technology: it is developing a core use case of facial recognition, and its success as a business relies heavily on the adoption of the technology and the direction of the industry.


Tech Company Persona

Sofia Jacobs (Age 32)

Background:
- Tech start-up founder after graduating from software engineering at UTS.
- Married with a 9-month-old daughter.
- Has built a smart home into her house.
- Relies on automation in many aspects of her busy life.

Behaviours:
- Purchases a new phone annually to test apps and the latest technology.
- Refuses to post photos of her daughter on social media out of privacy fears, informed by her knowledge of the industry.
- Doesn't use smart-home features in front of her daughter.
- Reads blogs daily to maintain knowledge of bleeding-edge tech.

Needs:
- Knowledge that emerging technologies are ethically and socially designed to protect her daughter.
- To integrate facial recognition technologies into her company's products.

Pains:
- Facial recognition suppliers not being clear about training algorithms and ethical practices.
- Fear over her daughter's place in a world shaped by the emerging technologies she is contributing to.


Office of the Australian Information Commissioner – Benjamin Hartmann


OAIC Workers' Stake in Facial Recognition

The Office of the Australian Information Commissioner (OAIC) is a government agency established in 2010 that is responsible for strategic functions relating to information management in the Australian Government. It ensures the proper handling of personal information under the Privacy Act 1988 and protects the public's right to access documents under the Freedom of Information Act 1982 (OAIC, n.d.). In this role, the OAIC has a very high stake and considerable power in relation to facial recognition technology. Information captured by facial recognition falls under the category of biometric information (Kaspersky, 2024), which the Privacy Act 1988 protects at a higher level due to its sensitivity (Azzopardi & Greenslade, 2024). As the OAIC states:

"An organisation or agency may only scan your biometric information as a way to identify you or as part of an automated biometric verification system, if the law authorises or requires them to collect it or it's necessary to prevent a serious threat to the life, health or safety of any individual." (OAIC, n.d.)

As facial recognition technology develops and becomes more widely used in public spaces, the OAIC must adapt to ensure that privacy legislation remains upheld and relevant.


OAIC Persona

Rob Thomas (Married, Male, 37)

Background:
- Studied data science at university before getting a job with the OAIC after graduation.
- Very privacy conscious and won't give out personal information easily.
- Very tech-savvy.
- Married with two children at school.

Behaviours:
- Very aware of privacy laws and the risks associated with information leaks, having worked for the OAIC for most of his professional life.
- Disables cameras and tracking on all personal devices unless needed.
- Avoids self-serve checkouts at supermarkets since the introduction of cameras.

Needs and pains:
- Anxious about entering venues where facial recognition technology is in use.
- Concerned by the increase in smart devices listening to his conversations and serving targeted ads.
- Needs to feel that his personal information is secure and being used appropriately.
- Worried about facial recognition technology being used on his children.


Minorities – Eliza Tam


Minorities’ stake in the issue

Minorities, especially women and people of colour, have a large stake in the use of facial recognition technology, as the algorithms have been shown to work more accurately on lighter skin tones. The error rate difference between dark-skinned females and light-skinned males ranged between 20.8% and 34.4% (Buolamwini & Gebru, 2018). This learned algorithmic bias creates an issue of equity, as it means minorities are more likely to be disadvantaged by the technology and subjected to unfair treatment, and they are therefore rightfully concerned. This is especially true in the context of criminal justice, where the technology is currently being used by the NSW Police to "identify potential suspects of crime, unidentified deceased and missing persons" (Hendry, 2022). Concern about the negative implications arising from the technology's inaccuracies is further corroborated by the Australian Human Rights Commission, which writes, "Where these biometric technologies are used in high-stakes decision making, like policing, errors in identification can increase the risk of injustice and other human rights infringement" (Farthing et al., 2021). However, minorities have relatively little influence on the issue, as minority groups are often marginalised in society.


Minority Persona

Greta (23 years old)

Background:
- Indigenous Australian woman.
- Lives with her mum, grandma, and younger brother.
- 4th-year computer science student at UTS.
- Values her family and culture.
- Easily anxious.

Behaviours:
- Attends workshops and events held by the UTS Cybersecurity Society.
- Is wary of the police due to their increased presence at the train station.
- Strict follower of the rules.
- Engages in Indigenous cultural events and celebrations.

Needs and pains:
- Wants to learn more about the future of facial recognition technology and its uses.
- Needs more transparency over the use of AI technology by police.
- Feeling of distrust due to increased police surveillance.
- Needs to know how cultural practices, e.g. face painting, are impacted.
- Concerned about privacy and data security.


Software Users – Mutian Shangguan


Software Users' stake in the issue
Software users play a crucial role in the development and deployment of facial recognition technology. They
directly interact with this technology, relying on it for various purposes, which gives them significant power
and influence. Their feedback drives improvements in usability, accuracy, and privacy protection measures,
impacting market demand and influencing developers and policymakers. Additionally, users are directly
affected by issues such as privacy breaches and potential misuse, making them seek assurances regarding
transparency, security, and ethical standards. Their concerns and demands for accountability shape
regulatory frameworks and industry standards governing facial recognition technology's responsible use.


Software User Persona

Jeff (43 years old)

Background:
- Security guard at a shopping centre in Singapore.
- Earns a middle-to-lower-level wage.
- Sudden periods of high-intensity work leave him feeling tired.
- The development of technology has reduced his workload, but he faces the risk of layoffs.

Behaviours:
- Stays vigilant, calm, and responsible.
- Values safety within the venue and actively addresses safety issues.
- Often takes the lead in taking measures to prevent potential threats.

Needs and pains:
- Faces work pressure to ensure safety and identify suspects.
- Ageing makes it difficult to complete high-intensity work.
- Needs a stable income to support his family.
- Needs tools and technologies that can help him enhance monitoring capabilities and simplify security operations.


General Public – Peter Ortigas


General Public’s Stake

The general public has the greatest stake in facial recognition technology, as they are the primary audience of the many companies that make use of it. According to a survey conducted by Monash University (Andrejevic et al., 2020), 49% of those surveyed felt that the use of facial recognition technology in public spaces is an invasion of privacy, and more than half would prefer to be given a choice of consent. However, some respondents agreed that facial recognition does help with public safety. In general, those surveyed agreed that facial recognition could be helpful in security and policing; however, they showed concerns about whether the technology is accurate or biased against certain minorities. This survey shows the divisiveness of facial recognition technology: though it may be used against criminals, the data it gathers can also be used to invade the general public's privacy without their consent.


General Public persona

Alex (32 years old)

Background:
- Lives in the city and works at a startup company.
- Tech-savvy.
- Worried about general data privacy.
- Edits videos as a hobby.

Behaviours:
- Attends daily online progress meetings.
- Spends his nights editing videos of his social basketball games.
- Takes public transport to get to work.
- Uses an ID card to enter his company's building.

Needs and pains:
- Worried about a data leak and his information being used by tech companies.
- Concerned that someone can see him through his laptop camera.
- Needs to feel safe that his image is not being put in a database.


Stakeholder Consultation Strategies

Our stakeholder consultation strategies are based on Arnstein's Ladder of Citizen Participation. The plan for engaging each stakeholder is outlined below.

In partnership with the OAIC, we will involve Tech Companies in the development of the policy. This will shape how we engage Software Users, by giving us a greater understanding of the technology's uses and restrictions in relation to the Privacy Act 1988. Concerns surrounding consent, security, period of data storage, and ownership will also be discussed, influencing how we inform the General Public and Minorities.

Framework: Partnership
Stakeholder: Office of the Australian Information Commissioner (OAIC)
Strategy:
- Description: Collaborate with the OAIC to research and outline policy goals and their relevance in relation to the Privacy Act 1988.
- Relevance: The OAIC is responsible for administering the Privacy Act 1988, making it invested in data security and the use of facial recognition technology.
- Evolution: Work collaboratively to update the Privacy Act 1988 for modern technologies.

Framework: Informing
Stakeholder: General Public
Strategy:
- Description: Inform through feedback forums, websites, and seminars.
- Relevance: The general public's opinions are important as they are ultimately the targets of facial recognition use in public spaces. It is their biometric data that is being recorded; as such, they are concerned with how it is being used.
- Evolution: Conduct a post-implementation review to determine policy effectiveness.

Framework: Placation
Stakeholder: Tech Companies
Strategy:
- Description: Involve members of tech companies in policy writing as technical advisors throughout the development process.
- Relevance: Shows that they were part of the policy-writing procedure and ensures their awareness of the risks and restrictions, shaping technology development to align with the established framework.
- Evolution: Continue to involve them in policy changes as the technology develops.

Framework: Consultation
Stakeholder: Software Users
Strategy:
- Description: Consult about current facial recognition use and company policies, public notification methods, and entry conditions. Assess the technology's usefulness and intentions for future use.
- Relevance: As the interface between the public and the technology, software users are important for discovering its usage and impacts. They are the first point of contact for concerns surrounding consent and privacy.
- Evolution: Continue to consult as the policy develops.

Framework: Consultation
Stakeholder: Minorities
Strategy:
- Description: Host online forums that provide space for minorities to discuss concerns surrounding facial recognition. Gather these insights to inform policy decisions.
- Relevance: A deeper understanding of the technology can ease their fears of the unknown. Consultation will offer more diverse perspectives, fostering more inclusive and secure policies.
- Evolution: Continue to inform minorities about developments and build a support platform where people can ask questions or report concerns.


References

Adjabi, I., Ouahabi, A., Benzaoui, A., & Taleb-Ahmed, A. (2020). Past, present, and future of face
recognition: A review. Electronics, 9(8), 1188. https://doi.org/10.3390/electronics9081188

Andrejevic, M., Fordyce, R., Li, L., & Trott, V. (2020). Australian Attitudes to Facial Recognition: A
National Survey. Monash University.
https://www.monash.edu/__data/assets/pdf_file/0011/2211599/Facial-Recognition-
Whitepaper-Monash,-ASWG.pdf

Azzopardi, M., & Greenslade, G. (2024, 23 January). The Australian Government's response to the
Privacy Act review: the use of facial recognition technology. Clayton Utz.
https://www.claytonutz.com/insights/2024/january/the-australian-governments-response-
to-the-privacy-act-review-the-use-of-facial-recognition-technology

Buolamwini, J., & Gebru, T. (2018). Gender Shades: Intersectional Accuracy Disparities in
Commercial Gender Classification. Proceedings of the 1st Conference on Fairness,
Accountability and Transparency, Proceedings of Machine Learning Research 81:77-91.
https://proceedings.mlr.press/v81/buolamwini18a.html

Farthing, S., Howell, J., Lecchi, K., Paleologos, Z., Saintilan, P., & Santow, E. (2019). Human rights and
technology: discussion paper. Australian Human Rights Commission.
https://apo.org.au/sites/default/files/resource-files/2019-12/apo-nid271981.pdf

Farthing, S., Howell, J., Lecchi, K., Paleologos, Z., Saintilan, P., & Santow, E. (2021). Final Report:
Human Rights and Technology. Australian Human Rights Commission.
https://humanrights.gov.au/our-work/technology-and-human-rights/publications/final-
report-human-rights-and-technology

Field, S. (2023, 28 September). Facial recognition is everywhere – but Australia’s privacy laws are
‘falling way behind’. Forbes Australia. https://www.forbes.com.au/news/innovation/facial-
recognition-is-everywhere-but-australias-privacy-laws-are-falling-way-behind/

Hendry, J. (2022, 11 October). Facial recognition use “misunderstood”: NSW Police. InnovationAus.
https://www.innovationaus.com/facial-recognition-use-misunderstood-nsw-police/

Kaspersky. (2024, 14 March). What is facial recognition – definition and explanation.
https://www.kaspersky.com/resource-center/definitions/what-is-facial-
recognition

Kundu, K. (2018, 24 January). Apple iPhone X was the best-selling smartphone in Q4 2017 with 29
million units shipped. Beebom. https://beebom.com/iphone-x-bestselling-smartphone-q4-
2017-29-million/


O’Sullivan, D., Naik, R., & General, J. (2020, 10 February). Clearview AI’s founder defends
controversial facial recognition app. CNN Business.
https://edition.cnn.com/2020/02/10/tech/clearview-ai-ceo-hoan-ton-that/index.html

Office of the Australian Information Commissioner (OAIC). (2023, 8 August). Australian community
attitudes to privacy survey 2023. Australian Government. https://www.oaic.gov.au/engage-
with-us/research-and-training-resources/research/australian-community-attitudes-to-
privacy-survey/australian-community-attitudes-to-privacy-survey-2023

Office of the Australian Information Commissioner (OAIC). (2023, 24 August). Global expectations of
social media platforms and other sites to safeguard against unlawful data scraping.
Australian Government. https://www.oaic.gov.au/newsroom/global-expectations-of-social-
media-platforms-and-other-sites-to-safeguard-against-unlawful-data-scraping

Office of the Australian Information Commissioner (OAIC). (n.d.). Biometric scanning. Australian
Government. https://www.oaic.gov.au/privacy/your-privacy-rights/surveillance-and-
monitoring/biometric-scanning

Ouahabi, A., Adjabi, I., Ouahabi, A., & Taleb-Ahmed, A. (2020). The pros and cons of facial
recognition technology. IT Pro. https://www.itpro.com/security/privacy/356882/the-pros-
and-cons-of-facial-recognition-technology

Peng, Y. (2022). The role of ideological dimensions in shaping acceptance of facial recognition
technology and reactions to algorithm bias. Public Understanding of Science, 32(2), 190–
207. https://doi.org/10.1177/09636625221113131

Ritchie, K. L., Cartledge, C., Growns, B., Yan, A., Wang, Y., Guo, K., Kramer, R. S. S., Edmond, G.,
Martire, K. A., San Roque, M., & White, D. (2021). Public attitudes towards the use of
automatic facial recognition technology in criminal justice systems around the world. PloS
one, 16(10), e0258241. https://doi.org/10.1371/journal.pone.0258241

Thales. (2021, 24 June). Facial recognition: Top 7 trends (tech, vendors, use cases). Thales Group.
https://www.thalesgroup.com/en/markets/digital-identity-and-security/government/
biometrics/facial-recognition

