
Ethics and Privacy in Data Analytics

Table of Contents

Introduction

Task-1

a) Data privacy and Ethics impacts

b) Implementing a proposed solution based on Chelsea's decision

c) K-Anonymity Evaluation for dataset

Task-2

a) Ethics benefits of using big data

b) Ethics harms from loan denial for Josh and Hannah

c) Broader Societal harms from a widespread loan evaluation process

d) Mitigating Broader harms from the evaluation process

Conclusion

References

Introduction
In a period of rapid technological growth and widespread adoption of data-focused systems, lending institutions increasingly rely on artificial intelligence and complex algorithms to transform the loan application process. These technologies can reshape the lending landscape by offering unprecedented speed, efficiency, and impartiality in creditworthiness assessments (Al-Ababneh et al., 2022). This assessment centers on the ethics and privacy of data analytics, which play a pivotal role in decision-making processes and their outcomes. The shift to data-driven loan assessments raises complex ethical and societal questions that need careful consideration. At the heart of these dilemmas lie data privacy, accountability, transparency, and informed consent, each presenting distinct difficulties and obligations for the people and institutions concerned. This report explores data-driven loan assessments, illuminating the ethical dilemmas and societal consequences that accompany their broad use. It investigates both the advantages of these technologies, such as increased inclusivity and operational efficiency, and their possible drawbacks, such as privacy risks and the reinforcement of prejudice (Manresa et al., 2021). The financial industry must harness technological advances while upholding the fundamental principles of fairness, transparency, and privacy as it continues to explore this new terrain of data-driven lending. Alongside suggestions for reducing possible risks and promoting fair lending practices, this report sheds light on the complex moral and societal problems that underlie the use of data-driven loan assessments.

Task-1
a) Data privacy and Ethics impacts
Based on the Australian Computer Society's Code of Professional Conduct and Ethics and the Australian Privacy Act, the CTO and IT Manager's decision in the case study to go with a less secure solution has significant ramifications for data privacy and ethics.
Australian Privacy Act (data privacy)
The Australian Privacy Act places significant weight on data security, requiring organizations to take reasonable steps to protect personal data. By selecting a less secure solution while aware of the sensitivity of the data at stake, the CTO and IT Manager risk violating the law. If their choice results in a data breach, they may also breach the Act's mandatory data breach notification obligations, which can lead to regulatory fines and legal repercussions (Paltiel et al., 2023). Data privacy laws such as the Privacy Act likewise treat informed consent and openness about data-handling practices as crucial. The Act encourages organizations to collect only the personal data needed to fulfill their purposes (Chua et al., 2017). Without a more secure approach to handling sensitive data, the CTO and IT Manager may end up collecting and retaining more information than their legitimate business objectives justify. Additionally, the Privacy Act grants individuals the right to access and correct their data. If a security breach results from the less secure alternative, individuals may lose faith in the organization's ability to safeguard their data and let them exercise their rights, potentially violating their right to privacy.
ACS Code of Professional Conduct and Ethics (ethical considerations)
According to the ACS Code of Professional Conduct and Ethics, the public interest takes
precedence over the interests of a person or organization. The choice made by the CTO
and IT Manager to put financial concerns ahead of security, particularly when handling
sensitive data, might be seen as a violation of this ethical rule (Bowern et al., 2007). The
Code places a strong emphasis on the necessity of upholding competence and acting with
diligence when carrying out professional responsibilities. If the CTO and IT Manager ignore Chelsea's professional advice on the need for a more secure solution and minimize the security flaws that have been found, they may be perceived as failing in their professional duties. Similarly, the ACS Code stresses the importance of upholding confidentiality and respecting privacy, and it expects members to act with honesty, integrity, and sincerity.

b) Implementing a proposed solution based on Chelsea's decision


Chelsea should take a methodical approach that combines her commitment to offering informed advice with a willingness to understand and address the client's concerns. With the client selecting a less secure option, Chelsea faces a difficult decision about whether to accept or push back on the chosen solution. Resolving this conflict requires weighing several factors, communication strategies, and ethical considerations.
o Communication and documentation
Chelsea should keep lines of communication open and be transparent with the IT Manager and CTO. She should record every conversation, including their decision, her professional advice, and the security concerns she has raised. In the event of future disputes or security incidents, written records of these conversations will be essential (Adegbonmire, 2016).
o Risk assessment and mitigation
Perform a thorough risk assessment outlining the weaknesses and possible repercussions of the solution the client has selected. This evaluation should emphasize the security flaws and the financial, legal, and reputational risks that might result from this choice.
o Cost-benefit analysis
To demonstrate the value of the proposed enhanced security system, Chelsea should carry out a cost-benefit analysis. It should show that, by protecting sensitive data and averting possible security breaches, the long-term benefits of a secure system outweigh its higher initial cost.
o Awareness and education
Chelsea's responsibilities include educating the CTO and IT Manager about the value of data security, the legal and regulatory standards that apply, and the repercussions of their choices. Providing training or bringing in specialists to discuss the security ramifications could help bridge the knowledge gap (Omisore & Abiodun, 2014).
o Legal and regulatory compliance
Make certain that the client is well informed of the legal and regulatory obligations for data security, particularly those under the Australian Privacy Act. Emphasize the possible legal repercussions of non-compliance, which might include penalties and lawsuits.
o Alternative solution

Chelsea should explore alternative options that strike a balance between security and affordability while preserving the most important elements of the sophisticated security system. This might mean a compromise in which key security measures are prioritized while costs are managed efficiently. Offering a range of solutions demonstrates adaptability and a commitment to meeting the client's objectives.
o External advice seeking
Chelsea may consider consulting outside specialists or industry authorities if the client remains unconvinced of the security dangers and the need for a more secure solution (Guo, 2011). Independent perspectives might validate her concerns and give her proposals more weight.
o Professional Ethics
Chelsea must follow her organization's rules and the sector's codes of conduct. This means acting first and foremost in the client's and the public's best interests, with honesty and transparency. If she believes the less secure option presents a serious ethical problem, she may need to consider whether she can complete the job with a clear conscience.
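The cost-benefit analysis recommended above can be illustrated with a minimal sketch. The figures and the `expected_cost` helper are hypothetical assumptions for illustration only, not values from the case study:

```python
# A minimal, illustrative cost-benefit comparison of the two options.
# All figures are assumed, not taken from the case study.
def expected_cost(upfront_cost, breach_probability, breach_cost):
    """Upfront spend plus the expected loss from a data breach."""
    return upfront_cost + breach_probability * breach_cost

# Assumed inputs: the cheaper system is far more likely to be breached.
secure = expected_cost(upfront_cost=120_000, breach_probability=0.02,
                       breach_cost=2_000_000)
cheap = expected_cost(upfront_cost=40_000, breach_probability=0.20,
                      breach_cost=2_000_000)

# Under these assumptions the secure option is cheaper in expectation.
print(f"secure: {secure:,.0f}  cheap: {cheap:,.0f}")
```

Even a rough comparison of this kind gives Chelsea a concrete way to show that a higher upfront cost can be the cheaper choice once expected breach losses are counted.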

c) K-Anonymity Evaluation for dataset


A dataset satisfies the k-anonymity criterion for a given value of k when every individual shares their combination of quasi-identifiers with at least k records, making it difficult for an individual's sensitive attribute (in this example, income) to be readily re-identified. By guaranteeing that every combination of quasi-identifiers appears in the dataset at least k times, k-anonymity seeks to safeguard privacy by making it harder to re-identify particular persons. Importantly, the dataset is k-anonymous only if the smallest group of records with matching quasi-identifiers has at least k members.
Now let's assess the dataset for various values of k (Ayala-Rivera et al., 2014):
1-Anonymity: This requires only that each combination of quasi-identifiers appears at least once, a condition that any dataset trivially satisfies. Since every person in the dataset belongs to a group sharing their quasi-identifiers, the dataset is 1-anonymous.

2-Anonymity: For this to hold, every combination of quasi-identifiers must be shared by at least two individuals. Examining the whole dataset, we find that the records fall into groups with identical quasi-identifiers: group 1 (20-25, 308*, male), group 2 (30-35, 308*, female), group 3 (40-45, 318*, male), and group 4 (30-35, 318*, female), each containing at least two records. Thus, the dataset is 2-anonymous.
3-Anonymity: For this to hold, every combination of quasi-identifiers must be shared by at least three individuals. As the groups identified in the 2-anonymity evaluation each contain at least three records, the dataset is 3-anonymous.
4-Anonymity: For this to hold, every combination must be shared by at least four individuals. Every group in the dataset meets this criterion, so the dataset is 4-anonymous.
5-Anonymity: For this to hold, every combination must be shared by at least five individuals. Every group contains at least five matching records, so the dataset satisfies this requirement and is 5-anonymous.
6-Anonymity: For this to hold, every combination must be shared by at least six individuals. Every group contains at least six matching records, so the dataset also satisfies 6-anonymity.
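The group-counting procedure used in the evaluation above can be automated. The sketch below uses illustrative stand-in records rather than the assignment dataset (which is not reproduced here), and the `max_k_anonymity` helper name is our own:

```python
# Determine the largest k for which a dataset is k-anonymous:
# the size of the smallest equivalence class of quasi-identifiers.
from collections import Counter

def max_k_anonymity(records, quasi_identifiers):
    """Count records per quasi-identifier combination; the smallest
    group size is the maximum k the dataset supports."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return min(groups.values())

# Hypothetical records using the generalized quasi-identifiers
# (age range, masked postcode, sex) mentioned in the text.
records = [
    {"age": "20-25", "postcode": "308*", "sex": "male"},
    {"age": "20-25", "postcode": "308*", "sex": "male"},
    {"age": "30-35", "postcode": "308*", "sex": "female"},
    {"age": "30-35", "postcode": "308*", "sex": "female"},
]

k = max_k_anonymity(records, ["age", "postcode", "sex"])
print(k)  # each combination appears twice, so this sample is 2-anonymous
```

Because k-anonymity is determined by the smallest group, adding a single record with a unique quasi-identifier combination would drop the result back to 1.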

Task-2
a) Ethics benefits of using big data
Using big data-driven technologies to assess loan applications can have several noteworthy ethical advantages (Chang et al., 2020):
o Enhanced objectivity
Systems powered by big data can reduce the impact of human biases that might unintentionally influence loan decisions. By relying on data-driven algorithms, the lending process can be made more objective, ensuring that loan approvals are determined by consistent standards rather than arbitrary judgments.
o Improved efficiency

By processing loan applications more quickly, these technologies improve the efficiency of the lending process. Both borrowers and lenders benefit, as waiting periods are shortened and administrative expenses are reduced. In an emergency, prompt loan approval can be crucial for rapid access to funds (Asamani & Majumdar, 2023).
o Financial inclusion
Big data analytics can identify creditworthy borrowers who lack conventional credit histories. This can promote equal access to credit and financial inclusion by extending credit opportunities to underbanked or marginalized people who can repay loans but do not fit the traditional credit profile.
o Risk management
Big data can improve risk assessment by taking a wider range of factors into account. This produces a more precise risk profile, which may lower default rates and reduce the need for lenders to raise interest rates to compensate for losses.
o Customized loan products
Data-driven systems can tailor loan packages to each borrower's financial situation. Matching borrowers with products that suit their needs and capacity can lower the likelihood of over-indebtedness and promote more ethical lending (Hara et al., 2019).
o Transparency and accountability
Transparency may be improved by using these technologies to record the
variables influencing loan decisions. Lenders are held more accountable when
borrowers can comprehend the reasons behind their approval or denial.

b) Ethics harms from loan denial for Josh and Hannah


The refusal of Josh and Hannah's loan might have several detrimental ethical effects that
impact their important life interests. As per the case study, the three ethical harms from
the denial for Josh and Hannah include (Adams, 2023):
o Emotional and psychological distress

Having the loan application denied may cause emotional and psychological distress. It can be discouraging and lead to stress, anxiety, and feelings of inadequacy. These impacts extend beyond their professional goals and may affect their general well-being. When the decision-making process is opaque, they may doubt the system's fairness and their own deservingness, which can make these feelings worse.
o Dignity and equal opportunities
The loan rejection may also harm their sense of dignity and equal opportunity. In a just society, everyone should be able to pursue their entrepreneurial goals regardless of their background or personal traits. If the refusal is based on erroneous data or unfair algorithms, implying that their worth is determined by factors beyond their control, such as assumptions about their race or health, it damages their sense of dignity. Such decisions can cause feelings of helplessness and frustration and sustain structural inequities (Grincevičienė et al., 2019).
o Financial setback and deferred entrepreneurial dream
The most immediate harm is financial. Their long-held ambition of owning and running a fashion business is postponed or threatened (Henry, 2019). This setback affects their financial security, career goals, and overall satisfaction with life. Without the loan they cannot start the business, which might put them under financial strain or prevent them from investing in their idea.

c) Broader Societal harms from a widespread loan evaluation process


As this case shows, the widespread adoption of opaque, data-driven loan appraisal procedures may harm society as a whole, affecting not just individual borrowers such as Josh and Hannah but the broader fabric of the economy and society. Some of these harms are outlined below:
o Economic inequality
By refusing loans to creditworthy borrowers, such as would-be business owners without established credit histories or from underrepresented groups, such a system might worsen economic inequality. This may further widen the wealth gap between those who have access to fair financing and those who do not (Rawhouser et al., 2017).
o Reduced entrepreneurship and innovation
Refusing financing to people with innovative and feasible business concepts suppresses entrepreneurship and innovation. Startups and small firms drive technological innovation, job creation, and economic growth, so limiting access to finance can impede the growth of new companies and society's overall advancement.
o Privacy Erosion
Extensive collection of personal data for loan appraisal can violate people's privacy and encourage widespread data exploitation. Beyond affecting borrowers' personal lives, it feeds a culture of surveillance and data monetization that can erode public confidence in technology and financial institutions.
o Discrimination and Social Injustice
Reliance on unorthodox data sources and opaque algorithms may sustain discrimination and social injustice. Prejudicial lending decisions based on assumptions about borrowers' race, health, or lifestyle choices contradict the principles of justice and equal opportunity.
o Loss of accountability
A decline in confidence in financial institutions may result from these systems'
lack of accountability and transparency. Borrowers' faith in the justice of the
financial system may be damaged if they are unable to comprehend or contest
loan decisions. This might result in a decline in the number of people using
conventional banking and lending channels (D’Anselmi et al., 2017).

d) Mitigating Broader harms from the evaluation process
Several policies and best practices can mitigate or prevent the negative effects of widespread adoption of opaque, data-driven loan appraisal procedures. Some of them are outlined below, along with their justification and challenges:
o Transparency and Algorithm Accountability
Justification: Financial firms should be required to disclose the methods and data sources used in their decision-making, addressing the lack of openness in loan appraisal procedures. This would promote fairness and trust by letting borrowers know why a loan was granted or declined. Requiring organizations to remain accountable for their algorithms also ensures they are regularly audited for fairness and accuracy (Lee et al., 2019).
Challenges: Protecting trade secrets and proprietary information can be difficult, as businesses might not want their algorithms made public; a balance must be struck between intellectual property rights and openness. Furthermore, adopting and enforcing transparency laws may demand significant resources from regulatory bodies, and proficient auditing of algorithms requires specialized expertise.
o Fair lending compliance and oversight
Justification: Discriminatory lending practices can be prevented through strict enforcement of existing fair lending legislation and the implementation of oversight procedures. Holding financial institutions responsible for complying with anti-discrimination laws ensures borrowers receive fair and equitable treatment regardless of their background or characteristics.
Challenges: Regulatory bodies must have the resources and expertise to carry out effective audits and investigations. Fair lending regulations must also be continuously updated to keep pace with changing lending practices, since some lenders may try to avoid them or exploit gaps in the current legislation.
o Ethical data usage and informed consent

Justification: Strict rules governing the collection and use of personal information must be put in place. To protect their privacy, borrowers must give informed consent before any data is used. Institutions should be prohibited from using data not directly related to the loan review process, including inferences about personal characteristics that might introduce bias.
Challenges: The difficulty lies in formulating and enforcing rules on informed consent and adequate data protection. Ensuring that borrowers understand the full ramifications of sharing their data may require clear and comprehensible disclosures (Hildt & Laas, 2022).
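As a sketch of one routine an algorithm audit of the kind proposed above might run, the following compares approval rates across groups, a demographic-parity check. The decision records and function names are hypothetical; a real audit would use richer metrics (such as equalized odds) and statistical tests:

```python
# A minimal demographic-parity check on loan decisions.
# Each decision is a (group, approved) pair; the data below is invented.
def approval_rates(decisions):
    """Return the approval rate for each group."""
    totals, approved = {}, {}
    for group, ok in decisions:
        totals[group] = totals.get(group, 0) + 1
        approved[group] = approved.get(group, 0) + (1 if ok else 0)
    return {g: approved[g] / totals[g] for g in totals}

def parity_gap(rates):
    """Largest difference in approval rate between any two groups."""
    return max(rates.values()) - min(rates.values())

decisions = [("A", True), ("A", True), ("A", False),
             ("B", True), ("B", False), ("B", False)]
rates = approval_rates(decisions)
# Group A is approved twice as often as group B in this invented sample,
# a gap an auditor would flag for further investigation.
print(rates, parity_gap(rates))
```

Publishing such audit results regularly is one concrete way to operationalize the accountability requirement described above.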

Conclusion
The report concludes that the banking sector gains certain advantages when data-driven technologies are included in loan assessments, including increased operational effectiveness and inclusiveness, but these benefits do not come without moral and ethical dilemmas. Because algorithmic decision-making can be biased and discriminatory, there are concerns about the perpetuation of historical inequities. Likewise, insufficient data protection may result in data breaches and grave privacy violations. Using transparent algorithms that expose their decision-making processes makes it possible to identify and address biases, and accountability measures should hold financial institutions and IT companies responsible for their actions and their impact on individuals. Achieving an equitable and inclusive economic system requires balancing innovation with moral norms, upholding the fundamental principles of accountability, privacy, and fairness while using technology as a tool for growth.

References
Adams, M. (2023). The importance of ethics in big data: Ensuring responsible use and trust in data-driven decision making. https://www.businesstechweekly.com/operational-efficiency/data-management/ethics-of-big-data/

Adegbonmire, J. (2016). Conflict situations and ways to resolve conflicts. https://doi.org/10.13140/RG.2.1.3617.8002

Al-Ababneh, H., Borisova, V., Zakharzhevska, A., Tkachenko, P., & Andrusiak, N. (2022). Performance of artificial intelligence technologies in banking institutions. WSEAS Transactions on Business and Economics, 20, 307–317. https://doi.org/10.37394/23207.2023.20.29

Asamani, A., & Majumdar, J. (2023). A study on the augmentation of digital lending and the factors propelling its meteoric rise. YMER Digital, 22, 836. https://doi.org/10.37896/YMER22.10/63

Ayala-Rivera, V., McDonagh, P., Cerqueus, T., & Murphy, L. (2014). A systematic comparison and evaluation of k-anonymization algorithms for practitioners. Transactions on Data Privacy, 7, 337–370.

Bowern, M., Burmeister, O., Gotterbarn, D., & Weckert, J. (2007). ICT integrity: Bringing the ACS code of ethics up to date. Australasian Journal of Information Systems, 13(2). https://doi.org/10.3127/ajis.v13i2.50

Chang, V., Xiao, L., Xu, Q., & Arami, M. (2020). A review paper on the application of big data by banking institutions and related ethical issues and responses. Proceedings of the 2nd International Conference on Finance, Economics, Management and IT Business (FEMIB 2020), 115–121. https://doi.org/10.5220/0009427701150121

Chua, H. N., Herbland, A., Wong, S. F., & Chang, L. Y. (2017). Compliance to personal data protection principles: A study of how organizations frame privacy policy notices. Telematics and Informatics, 34. https://doi.org/10.1016/j.tele.2017.01.008

D'Anselmi, P., Chymis, A., & Di Bitetto, M. (2017). The consequences of non-accountability (pp. 75–96). https://doi.org/10.1007/978-3-319-32591-0_7

Grincevičienė, V., Bareviciute, J., Asakaviciute, V., & Targamadze, V. (2019). Equal opportunities and dignity as values in the perspective of I. Kant's deontological ethics: The case of inclusive education. Filosofija. Sociologija, 30. https://doi.org/10.6001/fil-soc.v30i1.3919

Guo, B. (2011). The scope of external information-seeking under uncertainty: An individual-level study. International Journal of Information Management, 31, 137–148. https://doi.org/10.1016/j.ijinfomgt.2010.08.005

Hara, T., Sakao, T., & Fukushima, R. (2019). Customization of product, service, and product/service system: What and how to design. https://doi.org/10.1299/mer.18-00184

Henry, P. (2019). Challenging excessive fashion consumption by fostering skill-based fashion education. Journal of International Education and Practice, 2. https://doi.org/10.30564/jiep.v2i1.403

Hildt, E., & Laas, K. (2022). Informed consent in digital data management (pp. 55–81). https://doi.org/10.1007/978-3-030-86201-5_4

Lee, N. T., Resnick, P., & Barton, G. (2019). Algorithmic bias detection and mitigation: Best practices and policies to reduce consumer harms. https://www.brookings.edu/articles/algorithmic-bias-detection-and-mitigation-best-practices-and-policies-to-reduce-consumer-harms/

Manresa, A., Bikfalvi, A., & Simon, A. (2021). Investigating the impact of new technologies and organizational practices on operational performance: Evidence from Spanish manufacturing companies. Central European Journal of Operations Research, 29. https://doi.org/10.1007/s10100-020-00692-8

Omisore, B., & Abiodun, A. (2014). Organizational conflicts: Causes, effects and remedies. International Journal of Academic Research in Economics and Management Sciences, 3. https://doi.org/10.6007/IJAREMS/v3-i6/1351

Paltiel, M., Taylor, M., & Newson, A. (2023). Protection of genomic data and the Australian Privacy Act: When are genomic data 'personal information'? International Data Privacy Law, 13(1), 47–62. https://doi.org/10.1093/idpl/ipad002

Rawhouser, H., Cummings, M., & Newbert, S. L. (2017). Social impact measurement: Current approaches and future directions for social entrepreneurship research. Entrepreneurship Theory and Practice, 43(1), 82–115. https://doi.org/10.1177/1042258717727718
