

CAPSTONE PROJECT
“Building Fairness into Job Recommendation AI Systems (JRS): A Design Thinking Approach”

REPORT SUBMITTED TO
DR. VENKAT REDDY
WOXSEN UNIVERSITY
SUBMISSION DATE: 28/04/2023
TERM- 3

DESIGN THINKING
“Building Fairness into Job Recommendation AI
Systems (JRS): A Design Thinking Approach”
Team -2 (A2, B2)

Sl.No Student Name Student Id


Team- A2
1 Bharat Sethia 22WU0202054
2 Tezovalli Bodhimisetty 22WU0202099
3 Shivani Prabhu 22WU0202113
4 Gauri Singh 22WU0202060
5 Deepthi Reddy 22WU0202068
6 H B Shradha 22WU0202100
Team-B2
1 Mugil Vijey 22WU0202189
2 Bishal Poddar 22WU0202129
3 Shreyansh Mathur 22WU0202147
4 Nirmalya Kundu 22WU0202131
5 Atrij Jadon 22WU0202125
6 Akashdeep Singh 22WU0202138

Building Fairness into Job Recommendation AI
Systems (JRS): A Design Thinking Approach (2023)

Tezovalli Bodhimisetty, Bishal Poddar, Atrij Jadon, Nirmalya Kundu, Shivani Prabhu, Mugil Vijey,
Shreyansh Mathur, Gauri Singh, Bharat Sethia, H B Shradha, Akashdeep Singh, and Deepthi Reddy
(Woxsen University)

Abstract - A job recommendation system is a software program that suggests job openings to job seekers based on their qualifications, experience, and preferences. By offering individualized and pertinent job recommendations, these systems aim to improve the job-search experience for both job seekers and businesses. A major concern is that recommendation systems may be biased and discriminatory because artificial intelligence operates on pre-established norms. In this work, a fair job recommendation system is developed using design thinking as the methodology. We created a working prototype utilizing the blind-fairness approach, which removes sensitive information that might lead to bias. Fairness can be put into practice by altering the system's algorithms. The approach has its own limitations: it cannot remove every possible bias. Further research can be carried out on additional modifications to the algorithm and the job recommendation system to achieve complete fairness.

Keywords: job recommendation system, artificial intelligence, bias, design thinking, algorithms, fairness.

I. INTRODUCTION

The applicant pool is extremely large for human resource (HR) departments. Artificial intelligence (AI)-based algorithms are now a common tool for a growing number of organizations to process candidate data. The workload of the staff members involved in recruiting can be lessened by the deployment of AI-based algorithms. For instance, an AI can assess applicant data and filter resumes to lessen the volume of information that needs to be reviewed manually. In the following, we define AI-based algorithms as "the cutting edge of computational advancements that refers to human intelligence in order to address ever-more-difficult design thinking problems."

Second, since some applications are rejected by the system, the employment judgements made by algorithms directly impact the applicants. Therefore, both human recruiters and candidates call for an AI-based algorithm that treats every candidate fairly. The former are concerned with appearing fair, whilst the latter are worried about being treated fairly. In this context, a tool is deemed fair if it reliably predicts performance without being affected by candidates' membership in a minority group. Although fairness may not initially come to mind when considering the problems with contemporary AI-based algorithms, because it is widely believed that they are more objective than their human counterparts, recent studies revealed the opposite: they found disparities between men and women in the algorithms' targeting of job advertisements and gender bias in their search algorithms.

Cognitive prejudices like the home bias, resemblance bias, or stereotypes that lead recruiters to favor certain applicants over others are frequently the basis of unfairness. Because AIs rely on data that retains the preconceptions and biases of prior judgements, they are inherently unfair. An AI largely picks up knowledge from observations, but if those observations are tainted with prejudices, the AI adopts that viewpoint. Biases in data frequently relate to characteristics like sex, ethnicity, or age, and they hurt specific groups or people if an AI learns to favor them. It has been stated that it is impossible to create generic fairness standards for AI-based algorithms because they frequently require human interaction and are incompatible with one another. Therefore, we think it is even more crucial to look for upstream and downstream methods that can change an AI-based algorithm's input or output, and to provide guidance for the people using these algorithms.
We set out to discover the most prevalent, applicable, and fruitful strategies to improve fairness in AI, motivated by the previously mentioned issues of unfairness in recruiting procedures brought on by AI-based algorithms. Previous studies mostly concentrated on a metalevel of fairness in AI. An overview of the technological options for implementing fair AI-based recruiting algorithms does not yet exist, but one is urgently required to increase hiring fairness. Therefore, we pose the following research question: Which strategies for enhancing fairness in AI-based algorithms exist in the literature about hiring and recruiting?

We conducted a thorough literature study with an emphasis on features in AI classification and ranking to answer this question. By responding to this query, we offer a summary of appropriate actions concerning the investigated problem that can help academics and professionals create intelligent apps for HR. Additionally, our work fits within the sociotechnical perspective on algorithmic fairness. Our research focuses on the technical sub-system that is thought to be the "origin of discrimination" and that typically only produces conceptual outcomes. Therefore, our contributions include (a) a study plan for additional research to adequately address the issue and (b) tactics for professionals who create and produce AIs for hiring purposes.

A. Background and Motivation

It has become more difficult for users to digest massive amounts of data and locate desired information, products, and labor because of the Internet's enormous expansion and quick development. By prioritizing the transmission of information and presenting each user with a different list of new items tailored to her interests and preferences, the personalized recommender system, first presented in the 1990s, is an effective tool for reducing the problem of information overload. Online websites and e-commerce services have effectively implemented recommender systems on a large scale. For instance, a client may come across a page on Amazon called "Customers Who Bought This Item Also Bought," which lists the items she is probably going to find interesting.

Additionally, because so many people search and apply for jobs online, internet job boards can analyze the behaviours and actions of both job seekers and employers, which collectively support the growth of job recommendation systems. Applying the idea of personalized recommendation to the recruiting space, job recommender systems make better suggestions for matches between job seekers looking for opportunities and recruiters looking for applicants online, and now suggest positions to the workers that utilize their platforms. Algorithms provide these personalized recommendations based on a variety of factors, including the worker's traits, past behaviors, and compatibility with the job's needs. While job recommendation algorithms can assist employees and businesses in finding better matches more quickly, they have also raised serious issues about fairness because, even if designers had no intention of discriminating, the suggested positions could perpetuate gender and other stereotypes. For instance, gender may be linked to job kinds and work styles in content-based recommendation algorithms, which results in gender segregation in job suggestions.

Additionally, item-based collaborative filtering algorithms, and algorithms that take into account the previous behaviors of recruiting agents, can establish and maintain historical gender inequalities in the suggestions that workers receive based on job seekers' application behaviors. In order to investigate the algorithm's "black box" of features and determine whether algorithms lead to harmful discrimination by using fictitious correspondence on online platforms, a new research approach known as an algorithm audit was recently proposed. This approach is used in this paper to determine whether, to what extent, and how job board algorithms systematically treat male and female job seekers differently. More specifically, we made otherwise similar worker profiles for men and women on the four most popular job sites in China, and we watched to see which jobs were suggested to those profiles. Then, we constructed pairs of identical resumes that were qualified for the positions, except for the applicant's gender. We created two versions of each profile pair, a "young" version and an "older" version, where the older applicants have 10 more years of work experience than the younger applicants, because Chinese employers' gender preferences appear to interact strongly with the worker's age. Our fictitious workers then applied for the top positions on the algorithms' recommendation lists so we could observe how the algorithms altered their rankings in response to the workers' application behaviors. We compared the job suggestions received by male and female applicants after repeating this application process up to three times (responding to each new batch of recommendations). We discover that identical male and female applicants do not always receive the same job recommendations: on average, 12.3 out of every 100 job recommendations our candidates received were shown only to the male or only to the female applicant.
Younger employees, those in gender-neutral job categories, and those in middle- and high-level positions were more likely to receive recommendations that were particular to their gender. Importantly, once the fictional individuals began submitting job applications, the gender disparities in recommendations grew even greater: in the first round of applications, there is an 8.9% overall difference between male and female applicants; after three rounds of applications, 19.2% of recommendations are gender specific. Although job boards don't always show the same jobs to our identical male and female workers, it is especially worrying if more than the computational randomness of the system is to blame for the gender gap in job recommendations. To address this issue, we compare the caliber of exclusively-male and exclusively-female employment in two stages.

First, we concentrate on the characteristics of the jobs that are specific to each gender. We discover that, on average, jobs shown only to men (seen by men rather than women) posted wages that were 2,616 RMB higher than jobs recommended only to women, which equates to 1.8% of the average wage of our fictitious workers. Jobs suggested only to men require 0.17 more years of working experience than exclusively-to-female jobs (7.5% of the average requested working experience in recommended positions), while the gender disparity in requested education is close to nil in jobs recommended to male and female candidates. Second, we take words from job descriptions that reflect six qualities of a job (skills; work timing and location; benefits; company; other qualifications and personality; and age and appearance) and estimate the relationship between a word's appearance in the job description and the gender of the candidates to whom the job is shown. We discover that administrative duties and literacy tasks are more frequently cited in jobs shown exclusively to women, whereas influencing qualities like leadership and decision-making are more frequently indicated in jobs shown only to men. By contrast, compared to males with the same attributes, female applicants are advised to apply for more jobs with flexible working hours and regular breaks, whereas male applicants see more positions that require night work and overtime. Only-to-female jobs prioritize family-related benefits like maternity leave and marriage leave, whereas only-to-male jobs prioritize performance incentives like bonuses and company shares. Except for the fact that orientation training is more common in female-only positions and male-only jobs are more likely to be in publicly traded corporations, company-related words do not differ significantly across male-only and female-only occupations.

Gender-based disparities in job recommendations are also reflected in the other requirements listed in the job descriptions. Words associated with a feminine personality, such as patient and careful, are frequently used in positions that are recommended to women. These jobs also tend to include more descriptions of the physique, temperament, and facial attributes of ideal employees. Men are advised to choose jobs that favor self-driven, entrepreneurial workers who can function under strain. Additionally, the gendered words summarized in earlier work in linguistics, political science, psychology, and labor economics are consistent with the male and female words in recommended jobs. We ran two surveys, on Amazon MTurk and on Chinese workers, to get data on the gendered connotations of words. We discovered that occupations seen by female candidates tend to use more feminine words, while those recommended to men tend to use more masculine phrases. Finally, we try to pinpoint the precise mechanisms causing gender bias in employment suggestions. Because words referring to gender-related personality traits (e.g., patient for women, works under pressure for men) and gender stereotypes in the workplace (e.g., women are good at literacy skills, men have leadership) occur differently in gender-specific recommendations, content-based recommendations that link gender with jobs' features must play a role.

Additionally, it appears that hiring representatives' actions also lead to biased job recommendations. The paired male and female applicants receive more divergent job advertisements in their recommendations when more hiring agents examine their profiles, demonstrating that human bias may be preserved in, and interact with, recommender systems. Finally, by comparing the occupations proposed before and after workers apply for jobs, we discover that item-based collaborative filtering, which suggests jobs based on workers' application histories, may reinforce and amplify the gender bias in the system.
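The gender-specific shares reported above (for example, 12.3 out of 100 recommendations) can be computed by comparing the recommendation lists served to each matched male/female profile pair. The sketch below is a minimal illustration of that bookkeeping in Python, with hypothetical job IDs; it is not the audit code used in the study.

```python
# Minimal sketch of the audit bookkeeping: for each matched male/female profile
# pair, count how many of the recommended jobs were shown to only one gender.
# The data here is hypothetical; a real audit would collect these lists from
# the job boards being studied.

def gender_specific_share(male_recs: set, female_recs: set) -> float:
    """Fraction of all recommended jobs shown to only one of the paired profiles."""
    all_jobs = male_recs | female_recs
    only_one_gender = male_recs ^ female_recs  # symmetric difference
    return len(only_one_gender) / len(all_jobs) if all_jobs else 0.0

# Example with made-up job IDs for one profile pair.
male_recs = {"j01", "j02", "j03", "j04", "j05"}
female_recs = {"j01", "j02", "j06", "j07", "j05"}
print(f"gender-specific share: {gender_specific_share(male_recs, female_recs):.1%}")
```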
B) Research problem and objectives

The problem is bias in Job Recommendation AI Systems (JRS).

Objective: To design a Job Recommendation AI System (JRS) that eliminates biases and provides equal opportunities to all job applicants regardless
of their race, gender, background, or any other factors.

C) Overview of the paper structure

 Literature Review
   Definition and importance of fairness in JRS
   Types of bias in JRS
   Existing approaches to address bias in JRS
 JRS System Architecture
   Overview of a typical JRS system architecture
   Identification of potential sources of bias in the JRS system
 Methodology
   Design thinking process for developing a fair JRS
   Data collection and analysis
   System design and implementation
 Results and Analysis
   Overview of the fair JRS prototype
   Evaluation of the fair JRS prototype
   Comparison of the fair JRS with existing systems
 Discussion
   Interpretation of the results
   Implications of the fair JRS for processes
   Limitations and future research directions
 Conclusion
   Summary of the research findings
   Contributions and practical implications of the study
   Recommendations for future development of JRS

II. LITERATURE REVIEW

A) Definition of fairness in Job Recommendation Systems

When discussing fairness in a job recommendation system, the term "fairness" refers to ensuring that the suggestions provided by the system do not unfairly benefit or disadvantage any particular set of job seekers based purely on their demographics or other attributes. A just and fair method for suggesting candidates for a position has to treat all applicants fairly and give every one of them an equal opportunity to be considered for the vacancy. The system should not recommend for or against job applicants based on specific attributes such as the candidate's age, race, gender, education level, or work experience. Rather, it should treat all candidates in an analogous way irrespective of these criteria.

The job recommendation system ought to be simple, transparent, honest, open, and capable of being scrutinized by auditors. As a result, candidates can understand the primary reasons why they were or were not shortlisted for a job position. Fairness in the job recommendation system will also allow the recognition and mitigation of the biases, flaws, and prejudices that might be present in the recruitment process. A fair job recommendation system must contribute to the promotion of diversity, equity, and inclusion in recruiting. This can be achieved by considering candidates for a role based on their competencies, credentials, and experience.

B) Importance of fairness in Job Recommendation Systems

According to Al-Otaibi and Ykhlef (2012), the study of fairness in job recommendation is crucial because it can be of use to both job seekers and employers. The system's responsibility is to make job suggestions to potential applicants. Making recommendations to job seekers is fundamentally distinct from making recommendations to customers for conventional commodities such as films or music. When solving classic recommendation problems, it is widespread practice to assume that the number of items available is limitless, and the expectation among suppliers is that their products will be recommended to as many customers as possible in order to boost conversions and revenues. In the case of job recommendations, however, the system should refrain from suggesting a specific job posting to an excessive number of job seekers.

From the perspective of the organisation seeking to fill it, a job advertisement indicates the intention to fill one or more open positions. With an excessive number of candidates for a job opening, the workload associated with analysing resumes and conducting interviews may end up being greater than expected. When looking for work, a person can typically accept only one of the available job offers, and there may be fewer opportunities for job seekers to be recruited if there are
an excessive number of people applying for the same job because of the increased competition. Additionally, modern recommender systems are primarily based on collaborative filtering. As a result, they suffer from issues related to popularity bias (Abdollahpouri, Burke, and Mobasher 2017, 2019). This means that popular items will get more exposure than those that have received less interaction. In the case of job recommendations, job providers might decide to leave the recommendation platform if they consistently gain minimal visibility and receive an insufficient number of applications before their posts expire. This would make it even more difficult for job seekers to find satisfying job opportunities. Therefore, it is also vital to guarantee that job posts receive a good amount of exposure in order to promote the long-term development of job recommendation systems.

As these systems have the capability to reinforce and perpetuate prejudices that already exist in the recruiting process, it is imperative that job recommendation systems be fair. Unfair consequences, including discrimination against particular groups of people such as minorities and women, can result from job recommendation systems that were not created with fairness in mind from the start of the design process.

The significance of fairness in job recommendation systems can be appreciated from several viewpoints. Candidates should not be discriminated against by the organization on the basis of their ethnicity, faith, or gender. Engaging in such behavior at any stage of the hiring process is unethical, and in many countries it is unlawful to discriminate against job applicants based on the characteristics listed above.

Companies and job seekers alike stand to gain from fairness in job recommendation systems. Employers can benefit from a candidate pool made up of a diverse set of people who bring with them a wide range of viewpoints and ideas. Fair job recommendation systems can also help job seekers from underrepresented groups discover opportunities that they otherwise may not have been able to access, which can be a sizeable gain for job seekers from those groups who are actively looking for work. It is possible to construct fair job recommendation systems by ensuring that they are designed to eliminate biases and to offer the same range of opportunities to all applicants from diverse backgrounds. To ensure fairness in a job recommendation system, it is important to consider factors such as ability, training, and experience. Audits and evaluations should be performed regularly to make sure that the job recommendation system is not biased and can be further improved. Job recommendation systems must be fair so that organisations select candidates for vacant positions based on their skills and abilities, rather than on variables beyond the candidates' control.

C) Types of bias in Job Recommendation Systems

i. Algorithmic bias: Job recommendation systems can exhibit algorithmic bias if the system's recommendations are weighted more heavily in favour of particular groups of individuals or occupations. This may occur for a number of reasons, including skewed training data, skewed user feedback, or the creators' own subjective choices when they developed the system.

For example, if the training data of the job recommendation system is biased towards certain industries or professions, the system will suggest those professions more often, even if different jobs are more appropriate for the applicant's interests and abilities. This happens even though the applicant has a combination of interests and capabilities that might be better utilised in a different functional area. In the same way, if the algorithm receives candidate feedback that is biased towards certain professions or classes of vocations, this can also affect the recommendations.

It is important to employ a training data set that is diverse and representative in order to remove algorithmic bias in job recommender systems. To detect and reduce biases and to update the algorithms, it is important to conduct regular audits of the job recommendation system. To reduce possible subjective biases, it is essential to include diverse stakeholders and developers in the job recommendation system's design and testing stages from the beginning. Additionally, candidates have to be properly informed about the operation of the job recommendation system and must be given a chance to offer feedback for the improvement of the system's accuracy and the mitigation of bias.

ii. Sampling bias: A sampling bias can emerge in job recommender systems if the records used to train the system are not representative of the entire population or user base. This can bring about inaccurate outcomes for job applicants. The system will not be able to
capture the entire range of abilities, interests, and preferences of the user community, resulting in incorrect or prejudiced recommendations. The job recommendation system will not be capable of suggesting jobs accurately for users from different geographical areas or demographic groups if it is trained on data from a single geography or demographic group.

Because of this, the platform may provide recommendations that are geared toward certain occupations or industries and are not relevant to all users. It is vital that, when training a job recommendation system, a sample of data that is both diverse and representative is used; this helps to deal with sampling bias in the system. This purpose can be served by drawing on a variety of sources, including online job boards, user surveys, and other resources. A job recommendation system that is updated periodically with clean data is important to ensure that it offers accurate results and remains relevant over the years. In order to decide whether the recommendations are erroneous, it is vital to examine the job recommender system's performance on diverse subgroups of job candidates.

To ensure that the system accurately captures the skills, interests, and preferences of the user community, user testing and feedback sessions can be conducted with diverse groups of users.

iii. Label bias: This occurs when the labels or categories used to classify job listings are wrong or prejudiced, which in turn leads to biased recommendations being sent to users. This can happen when the approach used to categorise jobs is outdated or discriminatory, or when the labels do not properly mirror the real nature of the roles involved.

To give an example, if the job recommendation system is based on job titles or categories that are biased toward certain genders or racial groups, this may bring about biased suggestions. Another instance would be if the system relies on job names or categories that are prejudiced toward particular ethnic groups. In a similar vein, if the job recommendation system is out of date and does not appropriately reflect the current state of the job market, this may also result in recommendations that are erroneous or prejudiced.

It is crucial to employ a varied and representative set of labels or categories to classify job advertisements in order to overcome the issue of label bias that can occur inside job recommendation algorithms. Utilizing standardized job classification schemes that are based on objective criteria and are routinely updated to reflect shifts in the labor market is one method for accomplishing this purpose. It is likewise essential to use labels that do not contain any terminology that could be construed as discriminatory or biased, and that accurately reflect the actual responsibilities of the position. Performing regular audits of the job recommendation system and making suitable changes to the labels or categories, if needed, is essential to remove mistakes or biases in the recommendations. To ensure that the system correctly reflects the needs and preferences of the user population, it is important to assess user feedback and usage records and to engage with a variety of groups.

iv. Feedback bias: When the system gets feedback from users that is biased or unrepresentative, the result is recommendations that are defective or prejudiced. This may arise when particular groups of candidates are more willing to offer feedback than others, or when the feedback provided is not indicative of the job candidates' actual preferences or interests. Both situations can produce the same result.

For example, if a certain demographic group or industry is much more likely to provide feedback to the job recommendation system than others, this could produce recommendations that are more favourable toward that demographic group or industry. In a comparable vein, if the input provided by applicants does not reflect the preferences or interests that they really have, this will also result in recommendations that are wrong or prejudiced. It is crucial to collect feedback from a wide and representative set of applicants as a way to combat the feedback bias that can arise in job recommendation systems. Additionally, it is critical to make sure that the feedback accurately reflects the user's true preferences and interests. This can involve collecting input from several sources, such as user surveys, interviews, and usage data, and then applying statistical methods to assess the feedback and locate any biases.

It is also essential to emphasise to applicants the importance of providing feedback that is both accurate and representative, and it is equally vital to make giving feedback convenient and readily available.
In addition, it is important to examine the overall performance of the job recommendation system on a regular basis and make any necessary changes to the algorithms in order to correct any inaccuracies or biases that may be present in the suggestions.

v. Implicit bias: Implicit bias is the unintentional incorporation of bias into a recommendation system due to societal or cultural prejudices and stereotypes. This form of bias can emerge when a recommendation system is used. If the algorithm assumes that men are better suited to certain kinds of jobs than women, then its recommendations may favour men. It is feasible to lessen the effects of implicit bias in job recommendation systems by reviewing and auditing their algorithms on a regular basis; this prevents algorithms from perpetuating stereotypes or excluding certain groups. It is vital to use inclusive and diverse data sets when training algorithms, as well as to include a wide group of stakeholders in the design and testing process. Additionally, to lessen the effects of implicit bias, the decision-making process of the job recommendation system needs to be as transparent as possible.
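Several of the biases listed above can be surfaced with a simple parity check that compares how often each job category is recommended to different demographic groups. The snippet below is a hedged illustration on a hypothetical recommendation log, not the evaluation pipeline of this project.

```python
# Hypothetical recommendation log: (candidate_gender, recommended_job_category).
# A large gap between groups for the same category hints at algorithmic,
# sampling, or implicit bias worth investigating further.
from collections import Counter

log = [
    ("female", "admin"), ("female", "admin"), ("female", "engineering"),
    ("male", "engineering"), ("male", "engineering"), ("male", "admin"),
]

def recommendation_rates(log, group):
    """Share of each job category among the recommendations shown to one group."""
    cats = [cat for g, cat in log if g == group]
    if not cats:
        return {}
    counts = Counter(cats)
    return {cat: n / len(cats) for cat, n in counts.items()}

for group in ("female", "male"):
    print(group, recommendation_rates(log, group))
```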
D) Existing approaches to address bias in Job Recommendation Systems

Because job recommendation systems can amplify the stereotypes and biases already present in the data on which they are trained, they have the capacity to contribute to the prolongation of bias in the job market. Various techniques have been contemplated, developed, and put into practice as feasible solutions to prejudice in job recommendation systems. The strategies are stated below:

i. Recommendation with an emphasis on diversity: This method places primary emphasis on broadening the participation of underrepresented groups in the job opportunities that are suggested. It is based on the idea that increasing the variety of professions recommended to people can reduce the risk of bias being perpetuated. One method for putting this strategy into action is to incorporate diversity into the recommendation algorithm in a more explicit manner.

ii. De-biasing algorithms: This includes techniques that try to remove bias from the data that is used to train the recommendation system, also referred to as "de-biasing strategies." Adjusting the weight given to features that are known to be associated with prejudice is one way to accomplish this (see the sketch after this list). For instance, if the system has a preference for male applicants, the algorithm can be modified to give less weight to gender as a feature.

iii. Fairness constraints: This strategy involves including fairness constraints in the recommendation system. These constraints may be defined in several ways, including making sure that the recommendations are spread fairly across the different groups, or ensuring that the job recommendation system does not give priority to one of the groups over the others.

iv. Human in the loop: Human beings are brought into the process of creating recommendations in this method, which is known as the "human in the loop" approach. The primary goal of this approach is to search for bias and to resolve any biases or mistakes that may have been introduced. This objective may be fulfilled either by asking applicants for their input on the recommendations or by employing a set of human reviewers.

v. Transparent and explainable algorithms: The purpose of this technique is to make the recommendation system more open to scrutiny by providing an explanation of the reason behind every recommendation that is generated. This can assist applicants in comprehending why they are being recommended for particular jobs, and it can also be useful in recognising and reducing biases that are already present in the system.

These methods are not mutually exclusive; they may be combined to offer solutions that are reliable and efficient in eliminating bias in job recommendation systems.
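As a concrete illustration of strategies ii and iii (and of the blind-fairness idea behind our prototype), the sketch below removes sensitive attributes before a ranking model is trained and then measures a simple group-parity gap on the output. The column names, example data, and threshold choice are illustrative assumptions, not the project's actual implementation.

```python
# Sketch of blind fairness plus a simple fairness check, using pandas.
# Assumed columns ("gender", "age", "ethnicity", "skills_score", ...) are
# illustrative; adapt them to the real candidate schema.
import pandas as pd

SENSITIVE = ["gender", "age", "ethnicity"]

def blind_features(candidates: pd.DataFrame) -> pd.DataFrame:
    """Remove sensitive attributes so the ranking model never sees them."""
    return candidates.drop(columns=[c for c in SENSITIVE if c in candidates.columns])

def parity_gap(recommended: pd.DataFrame, group_col: str = "gender") -> float:
    """Fairness constraint: difference in recommendation rates between groups."""
    rates = recommended.groupby(group_col)["recommended"].mean()
    return float(rates.max() - rates.min())

candidates = pd.DataFrame({
    "gender": ["F", "M", "F", "M"],
    "age": [24, 35, 29, 41],
    "ethnicity": ["a", "b", "a", "b"],
    "skills_score": [0.9, 0.7, 0.6, 0.8],
    "recommended": [1, 1, 0, 1],   # output of some upstream ranking step
})

print(blind_features(candidates).columns.tolist())  # sensitive columns removed
print("parity gap:", parity_gap(candidates))        # flag if above a chosen threshold
```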
III. JRS System Architecture

A. Overview of a typical JRS system architecture

Since the early days of the internet's commercialization in the late '80s, many have wondered how it could be used to better match potential employees with open positions.
Vega presented an employment matching system that could "be consulted by Minitel, using telephone number 3615 and selecting the LM/EMPLOI service" before the advent of the internet. In other words, users of a computer terminal called Minitel could send text messages over the phone line in the form of search queries or their digital resume. The service would employ a preset job taxonomy to compare keywords from the query/resume to a database of relevant job openings and provide a list of matches.

Although more than 30 years have passed since this initial contribution, the use of a fixed job taxonomy to extract information from a resume, such as "branch of industry" (industry) and "qualification" (skill), using "(a) dictionary specialized in the universe of employment", seems vaguely like LinkedIn's query processing method, which can be queried using your mobile phone. Naturally, this oversimplifies the situation: Vega's 200 concurrent Minitel connections could not support the world's over 750 million LinkedIn users. Despite significant advancements over the previous three decades, the need to assist people in finding employment is still a top priority.

In this work, we survey research published on JRS (job recommendation systems) between the years 2011 and 2021. We look at the many approaches used in these systems from multiple angles, including temporal, ethical, and reciprocal. The large class of hybrid recommender systems utilized in this application domain is investigated, and the impact of data availability on method selection and validation is given special attention.

Since the reciprocal and temporal nature of JRS is rarely mentioned in the literature, our findings imply that a more application-oriented perspective would be beneficial. Even when authors do think about fairness in the context of job recommender systems, they typically conclude that merely removing discriminatory elements from the data is adequate. However, there has been significant scholarly focus on fairness, especially in candidate search engines, which has resulted in the introduction of several criteria and techniques that we think might be applied in the job recommender area as well. According to the literature on recommendation systems, deep language models have also seen increased use in the field of employment recommendations. Only one study has looked at the question of how effectively JRS results generalize across datasets, and it found that error metrics may vary considerably among datasets.

The outline of this paper is as follows: it gives a quick introduction to some of the datasets and terminology used throughout, as well as an explanation of the process through which relevant material was gathered and selected; it then discusses our findings and organizes the JRS literature into the well-known taxonomy of recommender systems, while also dividing the sizable class of hybrid contributions into subclasses; and it covers other topics such as the impact of data science competitions, JRS validation, JRSs with a temporal and/or reciprocal perspective, ethical considerations in JRS, and contributions discussing job search and recommendation on LinkedIn.

Despite the size of Freire and de Castro's contribution, we find fault with the way they categorize the various forms of employment recommendation software. These are typically categorized as either Content-based (CBR), Collaborative (CF), Knowledge-based (KB), or Hybrid (HRS) in the relevant literature. Given the prevalence of hybrid approaches, however, we will further differentiate between monolithic and ensemble hybrid recommender systems and subdivide the class of ensemble hybrids based on a proposed taxonomy, covering the different types of ensemble hybrids as well as the interpretation of monolithic and ensemble hybrids.

In the context of JRS, content-based recommender systems (CBRs) are models that rely solely on a semantic similarity measure between the user profile and the set of available openings to create a recommendation. In other words, the semantic similarity between the profile and the job postings is utilized as a surrogate to predict how relevant each opening is to the job seeker. In CBRs, one generates, in an unsupervised fashion, vector representations of the open position and the user profile, whose dimensions may or may not be readily interpretable. Bag of Words (BoW) with TF-IDF weighting is popular among authors; however, Latent Dirichlet Allocation and the newer word2vec are also employed. Curiously, CBR contributions have stayed at around the same level over the past decade, yet they also weren't among the best in the RecSys competitions in 2016 and 2017.
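A content-based matcher of the kind described above can be prototyped in a few lines: represent the candidate profile and the vacancies as TF-IDF bag-of-words vectors and rank openings by cosine similarity. The sketch below uses made-up texts and assumes scikit-learn is available; production systems would work on parsed resumes and full postings.

```python
# Minimal content-based recommendation sketch: TF-IDF vectors + cosine similarity.
# Texts are made up; a real system would use parsed resumes and job postings.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

profile = "python developer with machine learning and data analysis experience"
jobs = [
    "data analyst familiar with python and statistics",
    "machine learning engineer, python, recommender systems",
    "office administrator with scheduling experience",
]

vectorizer = TfidfVectorizer()
matrix = vectorizer.fit_transform([profile] + jobs)       # row 0 = profile
scores = cosine_similarity(matrix[0], matrix[1:]).ravel()  # profile vs. each job

# Rank job postings by semantic similarity to the profile.
for idx in scores.argsort()[::-1]:
    print(f"{scores[idx]:.2f}  {jobs[idx]}")
```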
The fact that "job seekers and recruiters do not always speak the same language" is a significant barrier to the widespread adoption of content-based JRS, as stated by Schmitt et al. Applicants and employers may, therefore, refer to the same occupations, knowledge, and abilities in different ways. As a result, a recruiter and a job candidate may each describe the same entity using a distinct vector representation. Unfortunately, this disparity is frequently ignored in content-focused JRS.

Collaborative filtering (CF) is a recommendation method in which ratings of various products are collected from various users and then used to make further recommendations. This rating matrix is typically populated with click behavior in the e-recruitment context, as it is in the e-commerce setting. However, if such interaction data is lacking, the rating matrix can also be filled in using the sequence of past job occupations. In the latter scenario, jobs must be organized into distinct groups before they can be used in a ratings matrix. To keep things simple, we'll just call everything in the rating matrix a "rating," even if it's not always a numerical value.
Hybrid recommender systems integrate elements from multiple models into a single algorithm. In this case, we hope to divide these approaches into more manageable subsets. To accomplish this, we separate hybrid recommender systems into those with a monolithic architecture and those with an ensemble design. The term "monolithic design" describes hybrid recommender systems in which it is not viable to isolate a single component, modify the algorithm as needed, and then construct suggestions based on that component alone. Ensemble approaches, by contrast, do permit decomposing the hybrid recommender system into at least two parts that can be (individually) used in the recommending process. That is to say, the ensemble may include models that are, taken individually, rather simplistic.

Instead of defining "knowledge-based recommender systems" as those whose overarching goal is to "give me recommendations based on my explicit specifications of the kind of content (attributes) I want," we stick with the definition provided by Freire and de Castro: "[recommender systems] which rely on deep knowledge about the product domain (job) to figure out the best items (job vacancies) to recommend to a user" (or "candidate"). Jobs and applicants are matched in such job recommender systems by first having their profiles mapped to a job ontology (the knowledge base).

To create job recommendations, it is usual practice to first map into the ontology space, where the overlap can be calculated using measures like the Jaccard index. Although such ontologies may take time to build, they have been put to good use in the real world by companies like LinkedIn and Textkernel. Improved suggestions can also be achieved by encouraging users to fill out their profiles by suggesting relevant skills from the ontology. The gap between candidates' and employers' job definitions can be bridged by using a common ontology to map job descriptions and candidate profiles.
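Once both profiles and vacancies are mapped to ontology skills, the knowledge-based match described above often reduces to a set-overlap computation. A minimal Jaccard-index sketch with hypothetical skill sets:

```python
# Jaccard overlap between a candidate's ontology skills and a vacancy's skills.
# Skill labels are hypothetical; in practice both sides are mapped to a shared
# job/skill ontology (the knowledge base) before the overlap is computed.

def jaccard(candidate_skills: set, job_skills: set) -> float:
    union = candidate_skills | job_skills
    return len(candidate_skills & job_skills) / len(union) if union else 0.0

candidate = {"python", "sql", "recommender systems"}
vacancies = {
    "data engineer": {"python", "sql", "spark"},
    "hr assistant": {"scheduling", "communication"},
}

for title, skills in vacancies.items():
    print(f"{title}: {jaccard(candidate, skills):.2f}")
```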

 Components of a Job Recommendation System:

A typical job recommendation system includes the following:
Fig. Components of a JRS: data ingestion, data storage, data processing, recommendation engine, user interface, feedback mechanism, and integration.

 Data Ingestion:

Data ingestion is the process of collecting, acquiring, and importing data into a job recommendation system. In this process, data is extracted from numerous sources and transformed into a format that the system can readily process. The accuracy and efficiency of the job recommendation system depend heavily on the quality of the data ingested. Job postings, resumes, job seeker and employer profiles, and user activity logs are typical sources of imported data.

The figure depicts the process of data integration. This entails gathering information from sources such as job boards and employer websites, or from the profiles of job seekers and employers. Extraction of the pertinent data from the data sources comes after data collection; to do this, the necessary data fields must be extracted using web scraping tools, APIs, or data extraction tools. The extracted data may contain errors, inconsistencies, or missing values, so it is filtered and processed to make sure it is accurate, consistent, and in a format that the recommendation system can easily accept. This stage is very important because the accuracy and efficiency of the recommendation system are directly related to the quality of the data. Data cleaning procedures include deleting duplicate records, filling in missing values, and fixing errors. The cleaned data must then be mapped to a common data model, which entails locating common fields among the several data sources and mapping them to a common structure. The mapped data may then be loaded into a data warehouse or data lake, or ingested directly into the recommendation engine. After that, the data is checked to make sure it is accurate, complete, and consistent; data quality concerns are found and fixed using procedures including data profiling, data auditing, and data verification.
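The cleaning and mapping steps described above might look roughly like the following pandas sketch. The column names and normalisation rules are assumptions chosen for illustration, not the project's actual ingestion pipeline.

```python
# Sketch of the ingestion cleaning step: drop duplicates, fill missing values,
# and normalise fields into a common data model. Column names are hypothetical.
import pandas as pd

raw = pd.DataFrame({
    "job_title": ["Data Analyst", "data analyst", None, "HR Manager"],
    "location": ["Hyderabad", "Hyderabad", "Mumbai", None],
    "salary": [50000, 50000, None, 70000],
})

clean = (
    raw.assign(job_title=raw["job_title"].str.strip().str.lower())
       .drop_duplicates()                                    # remove duplicate records
       .fillna({"job_title": "unknown", "location": "unknown",
                "salary": raw["salary"].median()})           # fill missing values
)
print(clean)
```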
 Data Storage:

Data is often kept in a database or data warehouse and used by the job recommendation system. The data that the system utilizes to generate recommendations may include details about job seekers, job advertisements, businesses, and other pertinent data points. Usually, the data is kept in a structured manner that makes it simple to query and analyse. Data about job seekers, for instance, can include particulars like their location, education, skills, and professional background. Information regarding job advertisements may contain the position's title, description, location, pay scale, and necessary qualifications. Before being placed in the database, the data is frequently cleansed and pre-processed. This can involve activities like deleting duplicate records, rearranging data in different formats, and adding in missing data points. The job recommendation system can employ a variety of algorithms and approaches to analyze the data after it has been saved to produce recommendations. For instance, the system might provide personalized recommendations based on a job seeker's abilities and preferences by using machine learning algorithms to find trends in the data. The databases used for storing data in a job recommendation system include relational databases (SQL databases), NoSQL databases, graph databases, and document-oriented databases.

 Data Processing:

Data is handled in a job recommendation system through a series of phases that include cleansing, transformation, and analysis. The purpose of data processing is to draw conclusions and patterns from the data that may be applied to provide job seekers with precise and pertinent employment suggestions. Typical steps in the data processing pipeline for a job recommendation system are data gathering, data cleaning, data transformation, data analysis, job recommendation generation, and performance evaluation. Data gathering collects information about job seekers, job posts, and businesses from a variety of sources, including job boards, social media, and online business communities. Since the data that is obtained is frequently unclean, erratic, and incomplete, we need to pre-process and clean it: duplicate data must be eliminated, missing data must be handled, and errors must be fixed. It may be necessary to transform the data into a format that is simpler to analyze after cleaning it; for instance, this can entail standardizing job descriptions or transforming location information into a standardized format. After the data has been cleaned and processed, the system can begin to examine the information using a variety of algorithms and methods, including clustering, classification, and regression.
Finding patterns and insights that can be utilized to generate job suggestions is the aim of data analysis. The job recommendation system can then produce job recommendations for job seekers based on the insights and patterns discovered in the data. These suggestions may consider a job seeker's abilities, experience, geography, and other factors. The recommendation system's performance is assessed in the last phase using measures like recall, precision, and F1-score. This feedback can then be used by the system to hone and enhance its recommendation algorithms.
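The evaluation measures mentioned above (precision, recall, and F1-score) can be computed per job seeker by comparing the top-k recommended jobs with the jobs the seeker actually found relevant, for example the ones they applied to. A small self-contained sketch with hypothetical job IDs:

```python
# Precision, recall and F1 for one job seeker's recommendation list.
# recommended: the system's top-k jobs; relevant: jobs the seeker applied to.

def precision_recall_f1(recommended: list, relevant: set):
    hits = len(set(recommended) & relevant)
    precision = hits / len(recommended) if recommended else 0.0
    recall = hits / len(relevant) if relevant else 0.0
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    return precision, recall, f1

recommended = ["j1", "j4", "j7", "j9"]       # hypothetical top-4 recommendations
relevant = {"j4", "j9", "j12"}               # jobs the seeker actually applied to
p, r, f1 = precision_recall_f1(recommended, relevant)
print(f"precision={p:.2f} recall={r:.2f} f1={f1:.2f}")
```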
 Recommendation Engine:

An essential part of a job recommendation system that aids job seekers in locating suitable employment prospects is the recommendation engine. A recommendation engine's goal is to use historical data to find trends and anticipate which upcoming job ads will be of interest to job seekers. In essence, a recommendation engine analyses the data of job seekers, including their background, training, and skill sets, and pairs them with openings that most closely match their profile. Several steps need to be followed to construct a strong recommendation engine for a job recommendation system. Data must first be gathered and stored in a way that makes it simple to access and analyze; job seeker profiles, job advertisements, job applications, and other pertinent information should all be included in this data. The data must then be processed and analyzed using a variety of algorithms and techniques. Collaborative filtering is one of the methods that job recommendation systems utilize most frequently. To find trends in the data, collaborative filtering examines job seeker profiles and job ads. Based on these patterns, the approach suggests suitable job posts to job seekers and helps identify similar job seekers and job postings.
Fig. Data mining techniques in a job recommendation system.

Content-based filtering is another frequent method used in recommendation engines. This approach entails looking for trends and similarities in the content of job advertisements and applicant profiles. Based on the information in job seekers' profiles, this technique assists in identifying job ads that are likely to be of interest to them. Recommendation engines also employ machine learning algorithms in addition to collaborative and content-based filtering. Inferring from past data, machine learning algorithms assist in forecasting future job ads that are likely to be of interest to job seekers; these algorithms are trained on data to anticipate the preferences of job seekers based on their previous actions. A job recommendation system must, in general, have a recommendation engine. Using information about a job seeker's experience, education, and skill set, it helps match them with appropriate job ads. An accurate forecast of future job posts that are likely to be of interest to job seekers can be made by a recommendation engine by analyzing past data and utilizing a variety of algorithms and methodologies.

 User Interface:

A job recommendation system's user interface is essential, since it serves as the main interface through which people communicate with the system. A user interface that is well designed can improve user experience, raise engagement, and eventually result in better suggestions. A dashboard or homepage serves as the user interface for a job recommendation system and presents relevant job posts based on the user's preferences and profile. Filters for region, job type, and industry are just a few examples of how users can further narrow their search parameters.
The user interface may offer users information on the hiring process, corporate culture, and other pertinent aspects of the job posting, in addition to job recommendations. A "thumbs up" or "thumbs down" rating system and a comments area are two examples of user interface features that enable people to comment on recommendations; this feedback can help make future recommendations more accurate. The demands and preferences of the user should guide the design of the user interface to make sure it is simple to understand and use. To learn how users engage with the system and pinpoint areas for improvement, user research methods like surveys or interviews may be used.

 FEEDBACK MECHANISM:

Based on a job seeker's abilities, experience, preferences, and other pertinent criteria, a job recommendation system is created to offer personalized job recommendations. Any recommendation system must have a mechanism for feedback, since it allows the system to develop and refine its recommendations over time. Feedback can take many different forms in the context of a job recommendation system. Explicit feedback is one type, in which job seekers express their exact opinions regarding the referrals they receive. For instance, job seekers can comment on the job description itself or assess the relevancy of a recommended position. They can also say whether they have applied for a position that was recommended. This feedback can be used to improve the job seeker's profile and the suggestions made by the system. Implicit feedback is another type of feedback that is gleaned from job seekers' online behavior. This can entail keeping track of a job seeker's clicks, time spent on a job ad, and applications. The information provided by this kind of feedback can be used by the recommendation system to improve future recommendations and reveal the job seeker's interests.

A job recommendation system's feedback mechanism often consists of a feedback loop, in which the system iteratively improves its suggestions in response to user feedback, which in turn generates better feedback. This feedback loop lets the system continuously learn from and adjust to the tastes and demands of the job seeker. Making sure the feedback is both accurate and representative is one of the challenges in creating a feedback mechanism in a job recommendation system; action must be taken to minimize any biases that may infiltrate the feedback process to ensure that the system delivers fair and impartial recommendations. By leveraging both explicit and implicit information, the system may more fully understand the needs and preferences of the job seeker, leading to better job matching and more successful job placement.
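One common way to drive the feedback loop described above is to fold explicit ratings and implicit signals (clicks, dwell time, applications) into a single preference score per seeker-job pair. The weights below are illustrative assumptions, not values used by this project.

```python
# Sketch: combine explicit and implicit feedback into one preference score
# that the recommendation engine can treat as a "rating". Weights are arbitrary
# illustrative choices and would need tuning and bias checks in practice.

def preference_score(explicit_rating=None, clicked=False,
                     dwell_seconds=0.0, applied=False) -> float:
    score = 0.0
    if explicit_rating is not None:          # e.g. thumbs up = 1, thumbs down = -1
        score += 2.0 * explicit_rating
    if clicked:
        score += 0.5
    score += min(dwell_seconds / 60.0, 1.0)  # cap the dwell-time contribution
    if applied:
        score += 3.0
    return score

print(preference_score(explicit_rating=1, clicked=True, dwell_seconds=90, applied=False))
```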
INTEGRATION:

Integration, which facilitates seamless data sharing and communication across the different components of a job recommendation system, is an essential element of the system. To guarantee that the job recommendation system runs smoothly and provides accurate and pertinent job recommendations to job seekers, it is crucial to integrate the various systems and services involved. Integration of diverse data sources is one important aspect: a job recommendation system typically uses information from numerous sources, such as resumes, application histories, job ads, and seeker profiles, and integrating these sources enables the system to obtain all the information required to produce accurate recommendations. The incorporation of the multiple algorithms and machine learning models employed in the system is another crucial aspect of integration. To produce useful recommendations, these algorithms and models must analyze the information provided by job seekers and by employers in job advertisements, and integrating them allows them to cooperate easily and deliver precise employment recommendations.

The user interface is also integrated: job seekers receive recommendations through the interface and can submit applications from it, and they can navigate and interact with the system easily because the interface is integrated with the other system elements, such as the algorithms and data sources. Finally, the external systems and services that the job recommendation system interacts with, such as job boards, applicant tracking systems, and career websites, must also be integrated. Integration with these external systems guarantees that candidate data and applications can be reliably processed and tracked, and that job recommendations can be shared effortlessly between platforms.
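As a small illustration of data-source integration, the sketch below joins hypothetical candidate, application-history and job-posting tables into one view for the recommender; the file and column names are assumed for the example and are not prescribed by the report.

# Minimal sketch of data-source integration under assumed CSV schemas.
import pandas as pd

profiles = pd.read_csv('seeker_profiles.csv')          # skills, experience, preferences
applications = pd.read_csv('application_history.csv')  # seeker_id, job_id, outcome
jobs = pd.read_csv('job_postings.csv')                  # job_id, title, requirements

# Bring the sources together so the recommender sees one consistent view.
history = applications.merge(profiles, on='seeker_id', how='left') \
                      .merge(jobs, on='job_id', how='left')
print(history.columns.tolist())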
B) Identification of potential sources of bias in the JRS system

1. Incomplete or inaccurate data: Biased or insufficient data used to train the system can result in inaccurate recommendations. A system may be less effective at recommending women or people of color for open positions if it is trained on data that comprises solely men or people from a particular region. Using a sizable and varied data collection in the system's training can lessen the likelihood of this happening, and it is also important to analyze the data periodically to ensure it still represents the intended audience accurately.

2. Poorly calibrated matching weights: The weight given to the numerous factors in a matching algorithm, such as a candidate's education or work experience, can be adjusted. If the algorithm is set up to give more weight to characteristics that are not necessarily predictive of job success, or if it is not constructed to consider the individual variation between candidates, this can lead to inaccurate findings. To reduce this danger, we need algorithms that can prioritize crucial criteria while also accounting for each applicant's specific set of abilities and experiences. In addition, algorithms should be reviewed on a regular basis to ensure that they are being used fairly.

3. Non-inclusive user interface: The interface that job seekers interact with can introduce bias into the recommendation process if it is not created with universal usability in mind. Candidates who do not understand English, or who do not have access to a particular type of technology, may be given less favorable suggestions because of the interface's language or technical requirements. To lessen this danger, interfaces should be made usable by people of varying linguistic and technological abilities, and they should be evaluated with a wide range of end users to further guarantee universal usability.

4. Inaccurate or incomplete job postings: The accuracy and thoroughness of the job postings used to find qualified candidates also matter. Recommendations may be skewed if the job descriptions are inaccurate or lack essential information, and this can amount to discrimination, for example against women. To mitigate this risk, employers must ensure that their advertised positions are fully and accurately represented to prospective employees and that job postings give detailed and fair descriptions of what is expected of applicants. In addition, job descriptions need to be revised on a regular basis to ensure objectivity.

5. Over-reliance on proxies: Using information that can stand in for legally protected traits — for instance, a person's zip code or level of education — could lead to discriminatory suggestions based on race or socioeconomic status. To reduce the possibility of discrimination, it is vital to avoid using any data that could be linked to legally protected traits in the selection process, and to assess the fairness of using specific data points frequently.

6. Unreliable third-party data: Using data obtained from untrusted sources raises the possibility of producing inaccurate or misleading suggestions. Candidates who do not use social media, or who do not fit specific personality profiles, may be disadvantaged if the system uses social media data to judge their personalities. To reduce this danger, make sure that no slant is introduced into the recommendation process by any unreliable third-party data source, and examine third-party data sources frequently to ensure impartiality.

7. Lack of transparency: Candidates may be less inclined to trust or use the system if they do not understand the process by which recommendations are generated. In addition, it can be harder to identify and correct causes of bias if there is a lack of transparency in the recommendation process. This threat can be reduced by making the criteria for generating recommendations public and simple to understand.

8. Biased recruiter feedback: Hiring managers may skew the recommendations if the feedback they provide is biased or shaped by their own personal prejudices. For instance, if a hiring manager has a personal bias against people with a particular educational background, they may give less favorable feedback on candidates who share that history. Training hiring managers to recognize and overcome their own prejudices can help reduce this danger.

9. Lack of diversity in the development team: A lack of diversity on the team tasked with building and developing the system increases the likelihood that vital details will be overlooked.

10. Lack of ongoing monitoring: Biases in the system may go undetected and continue to affect its recommendations if it is not constantly checked and assessed for fairness and efficacy. Regular testing and evaluation of the system, to discover any biases and adapt the system accordingly, is essential for reducing this risk.
11. Overly restrictive application requirements: If the hiring process is unfair or fails to consider qualified applicants, the resulting recommendations may be skewed. If a certain credential is required to apply for a position, for instance, qualified applicants who lack that credential but have otherwise equal experience or skills may be overlooked. A fair and open application process that considers all competent applicants can help reduce this danger.

12. Lack of human oversight: The system may perpetuate existing biases or miss new ones if humans are not involved in the decision-making process at all stages. Human oversight of the recommendation process, to identify and rectify any biases, is crucial for reducing this risk.

13. Skewed or unrepresentative training data: If the recommendation system is trained with skewed data, the resulting recommendations will be inaccurate. The system's ability to recommend a wide variety of applicants may be hindered, for instance, if the training data is biased towards a particular demographic or leaves out specific types of jobs. To reduce this possibility, it is crucial to use training data that accurately reflects the population of potential employees.

14. Overemphasis on narrow criteria: Biased suggestions might result from the system's overemphasis on narrow criteria, such as a required degree or qualification, for a given position. For example, if a job ad requires a specific degree, the algorithm may ignore applicants who do not hold that degree but do have the necessary abilities and experience. This risk can be reduced by basing recommendations on several characteristics, such as knowledge, experience, and potential.

IV. METHODOLOGY

A) Design thinking process for developing a fair JRS

The methodology of this study is a design thinking approach. Design thinking is a method of problem-solving that centers the process around the user: to develop creative solutions that satisfy user needs, it entails iterative cycles of brainstorming, prototyping, and testing. A typical design thinking approach has seven steps: empathize, define, ideate, prototype, test, implement and monitor.

i. EMPATHIZE

The first step of the design thinking approach is empathy. Design thinking is a human-centric approach in which problem solving keeps the user as the main focus, and empathy plays a crucial role in solving the problem from the user's perspective. By empathising with users, designers can comprehend their issues and emotions; this knowledge enables designers to develop solutions that not only address the issue at hand but also enhance the user experience. Empathy is therefore a key component in creating successful and user-friendly designs.

In this study, we need to empathize with the user, a job applicant using a job recommendation system such as LinkedIn, Monster, CareerBuilder or ZipRecruiter. The aim is to understand the user's emotions, which are represented in four categories (think & feel, hear & see, says, does) and two outcomes (pain and gain). These emotions are represented in the form of a map using the Miro empathy map template.
Fig. Empathy Map: Empathizing with the user of a job recommendation system.

MAP ANALYSIS

Think and feel:
• The user is frustrated that he couldn't find the right job; the jobs recommended by the AI software are irrelevant or out of context.
• The user feels that he needs to improve or upgrade his skills to find the right job.
• People come in excited about the tailored, specific jobs they require.
• Users are scared about the safety of the data they upload while applying for these job recommendations.

Hear and see:
• There are not too many choices among the jobs recommended by the AI algorithms.
• The AI algorithms give similar job recommendations.
• The job recommendations from the AI are not always feasible, and recruiters don't respond to the applications sent in response to them.

Says:
• Since the options are irrelevant, there is confusion in choosing the right job.
• Due to irrelevant job recommendations, the search process is tiring.

Does:
• Sends CVs directly to HRs and companies.
• Stops searching for jobs online.

Pain:
• Unable to get the desired job because of this.
• There are too many fake recommendations.

Gain:
• It is a platform with many opportunities.
• It helps in finding a job that matches my skills.
• It increases networking.

By empathizing with the user of a job recommendation system, I can say that even though the user finds the platform useful and sees its advantages, he is not satisfied with the outcomes of the recommendation system.
ii. DEFINE

After empathizing with the user, we understand the problem that needs to be solved. In this stage we define the problem statement, identify the key stakeholders and define the user groups. According to the literature review, there are several biases in job recommendation systems, and users are affected by those biases. The define stage is represented using a mind map built with a Miro template, in which the data from the research study is laid out in four categories: types of bias, effects of bias, causes of bias and key stakeholders. This mind map is used to understand the problem, the people who are affected by it, the causes of the problem and how it impacts users.

By analysing the mind map, five biases are found most often in JRS, namely gender bias, algorithmic bias, demographic bias, occupational bias and availability bias. The causes of these biases are many, but the major factors are shown in the mind map: pre-determined algorithms, past hiring patterns and stereotyping are the main causes of bias in an AI system. This is because the algorithms are set by the given inputs and a pre-determined set of rules, and past hiring patterns also have an impact because the data is stored and the system is set to perform according to historical data. The observed effects of bias in an AI recommendation system are irrelevant job suggestions, fake job recommendations and a lack of jobs with matching requirements.

Fig. Mind map of bias in the job recommendation system.
iii. IDEATE

Ideation is one of the steps of the design thinking process. Brainstorming is part of it: a large group of consumers and experts in the field is brought together for a brainstorming exercise. Because bias should not be present in an AI-based JRS, idea-generation methods such as the six thinking hats and SCAMPER (Substitute, Combine, Adapt, Modify, Put to another use, Eliminate, Reverse) are used. The next element is the choice of concepts, in which the concepts that have been developed are ranked according to their survival, attractiveness, and practicality. Lastly, opinions about the biases in job recommendation systems (JRS) need to be gathered from all the interested parties, especially the consumers, and the models need to be changed as a result.

From the affinity map, the problem has been identified as bias in the job recommendation AI system (JRS), with "Building Fairness into Job Recommendation AI Systems (JRS): A Design Thinking Approach" as the source and reference material.

Many alternative approaches were derived, such as creating a fair and equitable algorithm, incorporating transparency into the job recommendation system, fostering user trust through communication, introducing human review and oversight, utilizing open-source projects, and investing in research and development, which can also include fairness metrics; lastly, user feedback can be used.

The third step of the affinity diagram is grouping, in which the different alternative approaches are combined. In the first group, three alternatives were combined: create a fair and equitable algorithm, incorporate transparency into the JRS, and use fairness metrics. In the second group, introducing human review and oversight, user feedback, and fostering user trust through communication were combined. Lastly, utilizing open-source projects and investing in research and development were combined as one group of alternative approaches.

The last step of the affinity diagram is idea generation. Fairness is one group, under which alternatives such as transparency, fairness metrics, and algorithm-based changes fall; user feedback is another idea, under which human review and improvement through communication fall; and the central question is the problem of bias in AI. After analysis, utilizing open-source projects and investing in research and development emerged as a further idea for addressing the main question of bias in AI systems (JRS). After all this grouping, fairness and user feedback are the alternatives that can be prioritized.

Fig. Overview of description.

Fig. Gather – affinity diagram.

Fig. Group – affinity diagram.

Fig. Define – affinity diagram.

Fig. Idea napkin.
After brainstorming different ideas and solutions, the Idea Napkin is used to select the one solution that has a high probability of solving the problem. The solution we chose is changing the algorithms by implementing a blind-fairness approach. The map above consists of the problem statement and the what, how, where and when of the solution, along with its benefits. The target group is job seekers.
iv. PROTOTYPE

The fourth step in the design thinking approach is prototyping. It follows the ideation step, in which several approaches that could remove bias from the recommendation system were identified. Out of those approaches, the one we felt would best fit the goal of creating a fair recommendation system is the blind-fairness approach, and it is carried forward to the prototype stage. In this stage a systematic representation of the prototype is made. The tool used for prototyping is the Miro prototype template, which consists of nine steps, given as follows:

a. Tension: Tension is the first step in prototyping; it states the problem and the need to solve it. The problem (tension) is bias in the job recommendation system: there is a need to develop a fair recommendation system so that users get their desired outcomes.

b. Prototype: A prototype is proposed based on our analysis. The approach we prototyped is the blind-fairness approach, a method in which the user's personal information, such as name, age, gender and ethnicity, is eliminated from resumes when the data is stored and when recommendations are made, in order to reduce bias. By eliminating these factors that are responsible for bias, fairness can be integrated into the job recommendation system.

c. Actions: This step lists the actions to be taken to start prototyping. The steps in developing a blind-fairness approach are data collection, data pre-processing, feature engineering, model building, fairness constraints, bias mitigation and deployment (a minimal code sketch of these steps is given after this list):

• Data collection: Collect data on past job postings, resumes, and hiring decisions.
• Data pre-processing: The collected data is pre-processed by removing irrelevant information and then converting text into numerical data. Any information in this numerical data that could lead to bias is removed.
• Feature engineering: Identify the relevant features, such as work experience, education, and skills, and extract them from the pre-processed data.
• Model building: Build a machine learning model, such as a decision tree, random forest, or neural network, that can predict which job candidates are most suitable for a given job based on their relevant features.
• Fairness constraints: Introduce fairness constraints during model training to ensure that the model does not discriminate against certain groups based on their protected attributes; for example, statistical notions such as group fairness, individual fairness, or equalized odds can be used.
• Bias mitigation: Analyse the model's output for any biases or discrimination against certain groups and mitigate them by introducing additional fairness constraints; where such constraints are not available, features of the model that are correlated with protected attributes can be removed.
• Deployment: Deploy the job recommendation system to recommend the most suitable job candidates to recruiters without disclosing their personal information.

d. Initiating (how can you start): First, the data sets containing information about past job postings, resumes and hiring decisions are collected; this is followed by the other steps mentioned under Actions.
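The following minimal Python sketch illustrates how the data collection, blinding, feature engineering and model building steps above could fit together. The file name, column names and model choice are assumptions made for the example, not the prototype's actual implementation.

# Illustrative sketch of the blind-fairness prototype steps; the CSV file and
# column names are hypothetical.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

applications = pd.read_csv('past_applications.csv')   # assumed historical data

# Blind the data: drop attributes that identify a protected group.
blinded = applications.drop(columns=['name', 'age', 'gender', 'ethnicity'])

# Feature engineering: keep job-related, numeric signals only.
features = blinded[['years_experience', 'education_years', 'skill_match_score']]
labels = blinded['was_hired']                          # assumed 0/1 outcome

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.2, random_state=42)

model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)
print('hold-out accuracy:', accuracy_score(y_test, model.predict(X_test)))

In practice, the fairness-constraint and bias-mitigation steps would be layered on top of such a baseline, for example by auditing the model's predictions per group before deployment.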
e. Ranking prototype: Here a rating is given on a five-point scale and highlighted in yellow. The constraints rated are gender bias, occupational bias and demographic bias. With the help of this prototype, gender bias and demographic bias can be removed to a maximum extent (completely, if there are no errors in development), whereas occupational bias is rated 2/5, because removing the occupational database during the approach would make the system inefficient at showing job recommendations.

f. Contribution: Here we describe how the prototype contributes to solving the problem. The blind-fairness prototype mitigates bias at the data level, so that the removed factors can no longer create bias, and it can help users receive recommendations for their desired jobs.

g. Success: After implementing the blind-fairness approach, success is achieved when users get recommendations without any discrimination based on gender, age, ethnicity or any other factor causing bias; then we can say the prototype is a success.

h. Scaling impact: This is used to scale the impact of the proposed prototype according to the user, which can be done after testing the prototype. If the prototype is successful, an expected impact rating of 4/5 can be given.

i. Hypothesis: In the hypothesis we write our idea statement and situation. The idea is implementing fairness in the job recommendation system; the situation is deploying the blind-fairness approach with small groups of job applicants who are unable to get good job recommendations due to bias.
Fig. Prototype for a fair job recommendation system.

v. TEST

Testing is done on the prototype that has been developed. In this stage users are allowed to try the prototype, and based on their experience with the developed recommendation system, user feedback and suggestions are collected. The solution is validated and, based on the results, any needed adjustments and improvements are made, followed by implementation.

B) DATA COLLECTION AND ANALYSIS

Data collection is done through focus-group interviews with open-ended questions, to understand respondents' experience while using a job recommendation system such as LinkedIn. The respondents were asked how well they know the platform, how frequently they use it, what their expectations are, what problems they face, how they rate the platform, and so on. Their responses gave an understanding of their needs and expectations and of the problems that need to be addressed. This information will be useful in improving the platform and making it more user-friendly, and it will also help in identifying the areas that need improvement and prioritizing them accordingly.

Fig. Job-seeker journey map.

Fig. Job-seeker journey map (experience).
ANALYSIS

Understanding the issue and its primary causes allows the user data to be analysed. A fishbone diagram is used to depict these causes and the resulting effect (the problem). The components of a fishbone diagram are the problem, the main causes of the problem, and the factors contributing to each cause. Fishbone diagrams are also known as Ishikawa diagrams and are commonly used in quality control and process improvement initiatives; they provide a visual representation of the factors that contribute to a problem, making it easier to identify the root cause and develop solutions.

Here the effect is that users are unable to find their desired jobs, and the main causes of this effect are biases. There are other causes as well, but the major ones are shown in the fishbone diagram: gender bias, demographic bias, occupational bias, algorithmic bias, behavioural bias and availability bias.

In the fishbone diagram, the rectangular boxes represent the main causes and the square boxes represent the factors contributing to them. Occupational bias is one main cause, with limited data and narrowing job descriptions as contributing factors; past hiring patterns is a factor it shares with availability bias. Availability bias is another main cause, whose contributing factors are overlooking candidates with different but equally valuable qualifications and, again, past hiring patterns. Demographic bias is a further main cause, with stereotyping and a lack of diversity in development teams as sub-causes; historical data bias and pre-existing beliefs are factors shared between demographic bias and behavioural bias, and anchoring to certain qualifications is a sub-cause of behavioural bias. The use of keywords that signal gender in the algorithms is a sub-cause of gender bias, biased statistics are shared between gender bias and algorithmic bias, and pre-determined rules are a sub-cause of algorithmic bias. All of these causes and sub-causes ultimately lead to the problem: users are unable to find a desired job because of bias.

Fig. Fishbone diagram.
In our study, based on the focus-group discussions, we collected data on users' experience with the most commonly used job recommendation system (LinkedIn). Based on the inputs received, a journey map of a job seeker was made.

Customer Journey Map – LinkedIn

This map represents the user experience while using the job recommendation platform and covers only the stages related to job recommendations. The stages of the journey are awareness, consideration, networking, job search, retention and advocacy. For every stage, four parameters are recorded: customer activities, customer goals, touch points and experience.

Awareness: This is the first stage, in which the person gets to know about the platform from a friend, a colleague or an online advertisement. The user's main goal is to learn how to use the platform.

Consideration: In this stage the user starts updating his profile by adding a profile picture, personal details, educational background, skills, certifications, projects and so on. The user tries his best to make his profile look appealing to recruiters.

Networking: The user starts networking by adding friends, colleagues, professors and mutual connections. He shares posts on the platform and tries to stay active and grow his network. The main goal in this stage is to increase connections and improve reach so that the profile becomes visible.

Job search: During the job search the user starts networking outside his circle and tries to connect with recruiters and hiring managers. Some users find relevant jobs as they wanted, while others do not. The main goal of the user is to find a job that meets his requirements.

Retention: If the job seeker finds the jobs he wanted, he continues to use the app. If he does not get a deserved job recommendation even though he has the required skills and qualifications, he searches other platforms for a job that matches his requirements.

Advocacy: Based on their experience, users refer the platform to others. Some share their experiences on social media, while others give ratings. These reviews, ratings and shared experiences can be positive or negative.
C) SYSTEM DESIGN AND IMPLEMENTATION

The suggested prototype, the blind-fairness approach, can be implemented in job recommendation systems to address bias. The system design keeps the same architecture as a typical job recommendation system, but the algorithms are changed: the data storage and data cleaning procedures are adjusted so that sensitive data is removed, and the updated algorithms incorporate machine learning techniques to enhance the accuracy of job recommendations and ensure that the system is unbiased. This preserves the overall user experience and increases the chances of successful job recommendations.

Implementing a blind-fairness approach in a job recommendation system involves considering factors such as the job seeker's qualifications, skills, experience and preferences, as well as the job requirements, while ensuring that there is no bias or discrimination in the recommendations. The steps to implement blind fairness are as follows.

STEP 1: Collect data on job seekers' qualifications, skills, experience, preferences and past job applications, together with the job requirements, qualifications and skills specified by employers.

STEP 2: The collected data is cleaned and pre-processed to remove any biases, inconsistencies, or errors that may affect the recommendation process.

STEP 3: This is followed by removing personal data, so that sensitive attributes such as race, gender, age, and religion are not used in the recommendation process. Care should be taken that the inputs do not contain these attributes when they are used in the recommendation process.

STEP 4: An algorithm is designed that takes into account the job seeker's qualifications, skills, experience, and preferences, as well as the job requirements, to provide recommendations that match the job seeker's profile. A blind-fairness approach, however, forbids the algorithm from using any sensitive traits or other elements that could result in bias. Programming languages such as Python are used to modify the algorithm.

Sample code to implement the blind-fairness approach using Python is given below:

import pandas as pd

# Load data
data = pd.read_csv('data.csv')

# Remove sensitive features
data = data.drop(['race', 'gender'], axis=1)

# Split data into groups based on non-sensitive features
group1 = data[data['age'] < 25]
group2 = data[(data['age'] >= 25) & (data['age'] < 45)]
group3 = data[data['age'] >= 45]

# Train recommendation algorithm for each group
# ...

# Adjust weights of features to ensure fairness
# ...

# Evaluate performance using metrics such as accuracy and diversity
# ...

# Implement algorithm in recommendation system
# ...

STEP 5: Implementing the blind-fairness approach in the system requires adjusting the algorithms so that they remove the sensitive features that are causing bias. To adjust the algorithms, programming languages such as Python and R are used; the code should be written so that it loads the data and removes the sensitive content from the stored data.

STEP 6: After the sensitive data has been removed, the model is evaluated using performance metrics such as accuracy, precision and F1 score, and the code is then implemented in the recommendation system.

V. RESULT AND ANALYSIS

A) OVERVIEW OF THE FAIR JRS PROTOTYPE:

The blind-fairness technique for job recommendation systems is an approach that aims to ensure that the suggestions made by the system are impartial and fair to all users, regardless of personal characteristics such as age, gender, race, or religion.
The approach involves several steps: removing demographic data, using impartial requirements, testing for bias, mitigating bias, and ongoing tracking.

Fig. Steps of the blind-fairness approach (removing demographic data, using impartial requirements, testing for bias, mitigating bias, ongoing tracking).

• Removing demographic data: The first step is to remove any personal information that might reveal the individual's demographic group. This includes records such as name, gender, age, and location.

• Using impartial requirements: The job recommendation system needs to use unbiased standards to assess a candidate's suitability for a job. This means avoiding any criteria that could discriminate against certain groups, such as educational qualifications that are accessible only to particular groups.

• Testing for bias: The system needs to be tested to make certain that it does not discriminate against any particular demographic group. This can be done by analysing the suggestions made by the system and identifying any patterns of bias.

• Mitigating bias: If bias is identified, steps must be taken to mitigate it. This could include changing the algorithms used by the system or adjusting the standards used to assess applicants.

• Ongoing tracking: The recommendation system must be continually monitored to ensure that it remains trustworthy and unbiased. This may involve regularly testing the system and adjusting it as necessary.

Overall, the blind-fairness approach for job recommendation systems aims to ensure that all users have equal access to job opportunities and that the system does not discriminate on the basis of any personal characteristics. By following this approach, job recommendation systems can help promote fairness and equality in the workplace.

B) EVALUATION OF THE FAIR JRS PROTOTYPE:

Job recommendation systems (JRS) are widely used in recruitment and hiring processes to assist employers in identifying suitable candidates for open positions. However, there is a growing concern that these systems may be subject to bias, which could bring about unfair hiring practices. To address this issue, researchers and developers have proposed numerous techniques to assess and mitigate bias in JRS. One such method is the blind-fairness method, which aims to remove the unfairness from job recommendation systems.
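As an illustration of how bias in JRS output could be assessed, the sketch below computes recommendation rates per demographic group and applies a simple four-fifths-style screen; the data, column names and threshold are assumptions for the example, not the evaluation actually performed in this study.

# Minimal sketch of a demographic-parity style check on recommendation output.
# Assumes a log with one row per candidate, a 0/1 'recommended' flag and a
# 'group' column kept aside for auditing only; values here are made up.
import pandas as pd

log = pd.DataFrame({
    'group':       ['A', 'A', 'A', 'B', 'B', 'B', 'B'],
    'recommended': [1,    1,   0,   1,   0,   0,   1],
})

rates = log.groupby('group')['recommended'].mean()
print(rates)

# Flag for review if any group's rate falls below 80% of the highest group's rate.
ratio = rates.min() / rates.max()
print('selection-rate ratio:', round(ratio, 2), '-> review' if ratio < 0.8 else '-> ok')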
The blind-fairness technique is a framework that seeks to achieve fairness in JRS by treating all candidates in the same way, regardless of their demographic or personal traits. The approach is based on the observation that bias in JRS often results from the presence of sensitive attributes such as race, gender, or age in the candidate's profile, which can influence the system's suggestions. To remove the impact of these sensitive attributes, the blind-fairness technique removes them from the candidate's profile before feeding it into the JRS. This means that the system does not have access to any information about the candidate's demographic or personal characteristics and can only base its suggestions on objective qualifications and job-related criteria.

The blind-fairness technique consists of two main stages: pre-processing and evaluation. In the pre-processing stage, the candidate's profile is analysed to identify any sensitive attributes that could bias the JRS suggestions. These attributes are removed from the profile, and the remaining information is fed into the JRS. The objective of this stage is to make certain that the JRS considers only job-related criteria and qualifications, such as education, work experience, and skills, in making recommendations.

In the evaluation stage, the performance of the JRS is measured using metrics that reflect the system's capacity to provide fair suggestions. These include the overall accuracy of the system, as well as metrics that reflect its performance with respect to specific demographic groups. For instance, the system can be evaluated on its ability to provide suggestions that are equally accurate for candidates of different races or genders.

To enforce the blind-fairness approach, developers can use several techniques, including data masking, feature selection, and algorithmic changes. Data masking involves removing sensitive attributes from the candidate's profile, either by deleting them or by replacing them with non-sensitive values. Feature selection involves selecting only the job-related features that are most relevant to the position being filled and excluding any features that could introduce bias. Algorithmic changes involve modifying the JRS algorithm itself to ensure that it does not discriminate against candidates on the basis of their sensitive attributes.

The blind-fairness technique has several advantages over other strategies for evaluating and mitigating bias in JRS. One advantage is that it is a relatively easy and simple approach that does not require complex algorithms or vast quantities of data. Another is that it is a transparent technique that can be easily understood and applied by developers and users alike. Finally, the blind-fairness technique can help to build trust and confidence in JRS among applicants and employers, by making sure that the system is fair and unbiased.

In conclusion, the blind-fairness technique is a powerful method for eliminating bias in job recommendation systems. By removing sensitive attributes from the candidate's profile, the technique ensures that the JRS considers only job-related criteria and qualifications in making recommendations, which can help to remove bias and promote fairness. Moreover, by evaluating the system's performance with metrics that reflect its ability to provide fair suggestions, the method can help to build trust and confidence in JRS among applicants and employers alike.

C) COMPARISON OF THE FAIR JRS WITH EXISTING SYSTEMS

In the current era of automation and AI, when such systems are rapidly being adopted in many HR processes, the creation of fair job recommendation systems is an important concern. The aim here is to examine systems whose algorithms are intended to be free from bias and discrimination based on factors like colour, gender, age, and other personal traits. Several fair recommendation system prototypes already exist and can be evaluated to determine their efficacy.

One such prototype is the AI Recruiter developed by Pymetrics, a company that employs AI and neuroscience to help organisations make better recruiting decisions. The AI Recruiter uses gamified assessments to evaluate candidates' cognitive, social, and emotional attributes and match them to relevant job positions. The system is intended to be impartial and inclusive because it excludes from the hiring process any information that may give rise to prejudice, such as name, gender, and race. Instead, it places an emphasis on methodically measured abilities and traits, matching applicants to the best-fitting jobs according to their individual strengths and weaknesses.

Another example is the job-matching algorithm developed by Textio, a company that uses AI to help organisations improve their job descriptions and appeal to a wide range of applicants.
The tool scans job descriptions for gender bias and suggests adjustments to make them more inclusive and appealing to a much broader range of applicants. Additionally, it monitors the language used in job descriptions to anticipate which phrases and terms would attract or deter candidates. The system's goal is to reduce recruiting prejudice by drawing in a larger number of candidates and increasing the probability of matching them with suitable jobs.

The unbiased resume screening tool developed by Talvista, a company that provides software solutions for bias-free employment, is a third prototype. The tool uses AI to read resumes and remove any personally identifiable information, such as name, address, and photograph, that could lead to prejudice. Additionally, it matches people with suitable positions by evaluating resumes strictly against job-relevant requirements. The tool aims to decrease unconscious bias in recruiting by excluding any identifying information that could lead to prejudice and by focusing on job-relevant skills and experience.

Examining these prototypes, we find that each has strengths and weaknesses. The AI Recruiter by Pymetrics does a good job of connecting applicants with relevant job positions based on their specific abilities and shortcomings. However, it relies on gamified tests, which may not be suitable for all applicants, and there is still a risk of bias in the choice of the cognitive, social, and emotional traits that are measured.

By making job descriptions more inclusive and attractive to a much broader range of people, Textio's job-matching algorithm is effective at drawing in a larger applicant pool. However, it cannot overcome ingrained prejudices within the recruiting procedure itself, and therefore leaves the issue of bias in applicant selection unaddressed.

By removing any personal information that could cause prejudice, Talvista's resume screening tool is particularly useful in decreasing unconscious bias while reviewing applications. However, it does not deal with biases that may be present in the job description or the selection procedure, so it may not succeed in identifying the best-suited people for job positions.

In conclusion, although each of these prototypes succeeds in reducing prejudice in some part of the recruiting process, none fully addresses the problem of bias in recommendation systems. Combining the advantages of these prototypes and addressing their shortcomings is critical for the development of a more comprehensive and effective solution. This may require creating a system that uses gamified tests to rank job-relevant abilities and traits while also removing personal information that could be used as a basis for discrimination. Additionally, it may involve examining job descriptions for prejudice and recommending changes that make them more inclusive and attractive to a larger range of candidates. Combining these strategies would move us closer to an open, inclusive, and unbiased job recommendation system.

VI. DISCUSSION

A) INTERPRETATION OF THE RESULTS

It has been demonstrated in numerous other recommendation systems that the strategy of hiding the user's private information works. LinkedIn, for example, employs a blind-recruiting strategy in its employment and recruitment process. With this method, the hiring process becomes more impartial because recruiters cannot favour any one candidate over another and pay attention only to each applicant's skills and qualifications; yet the strategy is rarely used in job recommendation systems. According to our findings, the users' experience may be enhanced if a blind-fairness strategy is used: job seekers would receive recommendations for their desired employment based on their qualifications and experience, and the employment recommendation system would be approached objectively. By implementing a job recommendation system that takes an unbiased approach, the recruiting process can be made more inclusive and candidates can be judged entirely on their qualifications, which may result in a more inclusive and varied workforce.

B) IMPLICATIONS OF THE FAIR JRS FOR PROCESSES

As applied to a job recommendation system, a blind-fairness technique seeks to remove prejudices caused by factors such as race, ethnic background, or any other protected traits.
Such a strategy emphasizes considering each applicant in the same way, regardless of their individual traits, and making sure that the selection procedure depends solely on factors relevant to the position.

Although a blind-fairness strategy seeks to lessen prejudice, there are still several considerations to keep in mind:

1. Possible trade-offs: Effectiveness and fairness may be in tension under the blind-fairness method. If details such as schooling, previous employment, or skills linked to the position are unavailable, the job suggestion process could become less successful at finding the most suitable applicant for the position in question.

2. Integrity of data: The fairness of the employment suggestion algorithm depends on the accuracy of the information used. Regardless of the method employed, the algorithm's suggestions could be incorrect if the information used is inaccurate, skewed, or discriminatory.

3. Transparency: A blind-fairness method can make the hiring procedure more transparent, making it simpler to spot and correct any possible prejudices. Applicants may be reassured that they are being treated properly during the hiring process, which can also increase applicant confidence.

4. System complexity: The complexity of the job suggestion system may rise because of a blind-fairness method, which makes it more difficult to create, roll out, and administer.

In a job recommendation system, a blind-fairness technique can help decrease bias and boost openness. To prevent unintended repercussions, it is crucial to strike a balance between fairness and effectiveness and to make certain that the process is built on reliable information.

C) LIMITATIONS AND FUTURE RESEARCH DIRECTIONS

In a job recommendation system, the main role of the blind-fairness approach is to remove favoritism or any other type of bias based on caste, age, gender or any other such factor. The technique aims to treat everyone equally, leaving personal characteristics aside, so that the system relies entirely on profile and job-related information. But the approach also has some limitations:

1. Lack of information: Important information, such as education, qualifications, work experience, or other skills, may not be considered even though it is very important. This can reduce the accuracy of the outcome or give less relevant results.

2. Undesired consequences: The approach may lead to undesired results, such as producing new biases. For example, if it ignores information about educational background or past work experience, it may inadvertently disadvantage candidates from underprivileged backgrounds who never had equal access to education or work opportunities.

3. Insufficient data: In some scenarios the available data may be insufficient or biased, which makes the approach difficult to apply. Insufficient data can lead to less efficient results and reduced capability.

4. Complex system: Practicing this approach in a job recommendation system may make the system more difficult to use, and it can increase cost, the amount of time required, and the maintenance the system needs.

5. Convenience: The approach can negatively affect the user experience, as candidates might feel that they are not being evaluated fairly, and the recommendations might not be relevant.

Overall, this approach has limitations that must be kept in mind. The blind-fairness strategy can prove helpful in creating an equal platform by eliminating discrimination and bringing fairness, but it is a substantial task to strike that balance and to ensure that the entire process rests on high-quality data so that unseen effects are avoided.

VII. CONCLUSION

A. SUMMARY OF THE RESEARCH FINDINGS

In this essay, we have looked at the job recommender system (JRS) literature from a variety of angles. These include the impact of data science competitions, the impact of data availability on the selection of methods and validation, and ethical considerations in job recommendation systems.
To better understand how these hybrid recommender systems differ, we also branched out the broad class of hybrid recommender systems. Previous assessments of job recommender systems have not covered this multi-perspective view or the new taxonomy of hybrid job recommender systems. Although early JRS contributions already highlighted application-oriented challenges, most of the literature still does not take these into consideration. However, the contributions that do adopt various viewpoints on the JRS problem demonstrate that differing viewpoints can be quite advantageous. These advantages can include enhanced model performance (temporal perspective), improved candidate distribution across a collection of similar openings (reciprocal perspective), or guaranteed algorithm fairness (ethical perspective). The focus right now is on how to generate job suggestions using the large quantity of textual data from both candidate profiles and open positions, where deep representations have recently shown encouraging results. This focus, though, could give the impression that it is the only relevant viewpoint, and such a singular viewpoint has the potential to be quite detrimental, particularly in terms of fairness. Even though we are not aware of algorithm audits on job recommender systems, a review of Indeed, CareerBuilder, and Monster's candidate search engines did reveal significant findings of both individual and group gender unfairness. Algorithms and metrics that can be used to gauge and guarantee algorithm fairness are, however, made available by the increased scientific attention to this issue. As a result, there is an opportunity for researchers to investigate how they may be applied to the domain of job recommender systems.

B. CONTRIBUTIONS AND PRACTICAL IMPLICATIONS OF THE STUDY

A fair job recommendation system plays a major role in the job market, since algorithms shape how job seekers and open positions are brought together. What we look for in a job recommendation system is fairness and a lack of bias, giving equal opportunity to all job seekers without regard to age, gender, or other traits. A substantial number of studies have been done on its contribution and feasible implementation.

The biggest contribution of these studies is that they have improved the strategies and reduced the biases. These biases can arise for many reasons, such as the language used in the job description. For example, a description that uses words like "aggressive" or "competitive" may deter female applicants who interpret these phrases as signalling a masculine or uncomfortable environment. Similarly, a system may favor people based on their location, age, and gender instead of their qualifications.

To tackle this problem, many alternative techniques have been introduced, among them using gender-neutral language in job descriptions, removing location information from profiles, and using machine learning algorithms to identify and remove bias within the recommendation system. These measures can help to reduce biases and improve the quality of the job recommendation system.

An additional contribution of the studies on this system is the identification of the benefits of using it. Fairness in the recommendation system can help to reduce discrimination in the job market and to increase inclusion in it.

Moreover, fairness in job recommendation systems can also help by presenting a larger and more diverse pool of applicants to select from. This can lead to better hiring decisions and improved organizational performance. By using a fair job recommendation system, employers can also improve their reputation and brand image by demonstrating their commitment to diversity and inclusivity.

C. RECOMMENDATIONS FOR FUTURE DEVELOPMENT OF JRS

Some of the most common suggestions from the research on fair job recommendation systems concern the development and implementation of fair systems in the real world. Many organisations have already started using fair recommendation processes to improve their hiring. For example, after its earlier approach received a lot of criticism, Amazon changed its recruitment process in 2018 to remove gender bias by setting new rules for its job recommendations.
To eliminate discrimination and increase diversity in the workplace, other companies, such as Pymetrics and GapJumpers, have developed job recommendation frameworks that leverage machine learning techniques.

The adoption of an equitable framework for employment suggestions does nevertheless face some difficulties. One challenge is the need for extensive and varied datasets to train the machine learning techniques: without a large dataset, the system will not be able to identify and counteract bias in its recommendations. A further obligation is to continuously monitor and evaluate the system to make sure it maintains its objectivity and fairness; even with ongoing observation and evaluation, discrimination may gradually return to the system over time.

In closing, studies have shown that fair job recommendation systems have substantially boosted inclusion in the workplace and decreased discrimination in the labor market. These studies have helped to enhance the fairness and inclusiveness of job recommendation frameworks by creating methods to minimize prejudice, highlighting the advantages of fair frameworks, and documenting the beneficial consequences of their use. Nevertheless, there are still issues with putting these processes into practice. To address these issues and ensure that fair job recommendation systems keep improving, continued study and development is required.
VIII. REFERENCES
[11] Bias in machine learning software: why? how?
[1] Alder, G. S., & Gilbert, J. (2006). Achieving what to do? Proceedings of the 29th ACM Joint
Ethics and Fairness in Hiring: Going Beyond the Meeting on European Software Engineering
Law. Journal of Business Ethics 2006 68:4, 68(4), Conference and Symposium on the Foundations of
449–464. Software Engineering, 429–440.
[2] Jack Bandy. Problematic machine behavior: A [12] Burke, I., Burke, R., & Kuljanin, G. (2021).
systematic literature review of algorithm audits. Fair candidate ranking with spatial partitioning:
Proceedings of the ACM on Human-Computer Lessons from the SIOP ML competition.
Interaction, 5(CSCW1):1–34, 2021. Proceedings of the Workshop on Recommender
Systems for Human Resources.
[3] Zeynep Batmaz, Ali Yurekli, Alper Bilge, and
Cihan Kaleli. A review on deep learning for [13] Le Chen, Ruijun Ma, Anikó Hannák, and
recommender systems: challenges and remedies. Christo Wilson. Investigating the impact of gender
Artificial Intelligence Review, 52(1):1–37, 2019. on rank in resume search engines. In Proceedings
of the 2018 chi conference on human factors in
computing systems, pages 1–14, 2018.
37

[14] Dolata, M., Feuerriegel, S., & Schwabe, G. (2022). A sociotechnical view of algorithmic fairness. Information Systems Journal, 32(4), 754–818.

[15] Consul, N., Strax, R., DeBenedectis, C. M., & Kagetsu, N. J. (2021).

[16] Fernández-Martínez, C., & Fernández, A. (2020). AI and recruiting software: Ethical and legal implications. Paladyn, Journal of Behavioral Robotics, 11(1), 199–216.

[17] Goretzko, D., & Israel, L. S. F. (2022). Pitfalls of Machine Learning-Based Personnel Selection: Fairness, Transparency, and Data Quality.

[18] Yi-Chi Chou and Han-Yen Yu. Based on the application of AI technology in resume analysis and job recommendation. In 2020 IEEE International Conference on Computational Electromagnetics (ICCEM), pages 291–296. IEEE, 2020.

[19] Corné de Ruijt and Sandjai Bhulai. A comparison of machine-learned survival models for predicting tenure from unstructured resumes. 2021.

[20] Habous, A., & El Habib, N. (2021). Combining Word Embeddings and Deep Neural Networks for Job Offers and Resumes Classification in IT Recruitment Domain. International Journal of Advanced Computer Science and Applications, 12(7), 651–658.

[21] ELISE. Website ELISE Job Matching Search and Match Platform, 2021. Retrieved from: https://www.wcc-group.com/employment/products/elise-job-matching-search-and-match-platform/, accessed April 26, 2021.

[22] Mauricio Noris Freire and Leandro Nunes de Castro. e-Recruitment recommender systems: a systematic review. Knowledge and Information Systems, pages 1–20, 2020.

[23] Sahin Cem Geyik, Stuart Ambler, and Krishnaram Kenthapadi. Fairness-aware ranking in search & recommendation systems with application to LinkedIn talent search. In Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pages 2221–2231, 2019.

[24] Hardy, J. H., Tey, K. S., Cyrus-Lai, W., Martell, R. F., Olstad, A., & Uhlmann, E. L. (2022). Bias in Context: Small Biases in Hiring Evaluations Have Big Consequences. Journal of Management, 48(3), 657–692.

[25] Akshay Gugnani and Hemant Misra. Implicit skills extraction using document embedding and its use in job recommendation. In Proceedings of the AAAI Conference on Artificial Intelligence, volume 34, pages 13286–13293, 2020.

[26] Ramezanzadehmoghadam, M., Chi, H., Jones, E. L., & Chi, Z. (2021). Inherent Discriminability of BERT Towards Racial Minority Associated Data. In Computational Science and Its Applications – ICCSA 2021. Springer International Publishing.

[27] Hofeditz, L., Mirbabaie, M., Luther, A., Mauth, R., & Rentemeister, I. (2022). Ethics Guidelines for Using AI-based Algorithms in Recruiting: Learnings from a Systematic Literature Review. HICSS, Maui, Hawaii.

[28] Cheng Guo, Hongyu Lu, Shaoyun Shi, Bin Hao, Bin Liu, Min Zhang, Yiqun Liu, and Shaoping Ma. 2021.

[29] Boell, S. K., & Cecez-Kecmanovic, D. (2015). On being 'Systematic' in Literature Reviews in IS. Journal of Information Technology, 30(2), 161–173.

[30] Francisco Gutiérrez, Sven Charleer, Robin De Croon, Nyi Nyi Htun, Gerd Goetschalckx, and Katrien Verbert. Explaining and exploring job recommendations: a user-driven approach for interacting with knowledge-based job recommender systems. In Proceedings of the 13th ACM Conference on Recommender Systems, pages 60–68, 2019.

[31] Dietmar Jannach, Gabriel de Souza P. Moreira, and Even Oldridge. Why are deep learning models not consistently winning recommender systems competitions yet? A position paper. In RecSys Challenge '20: Proceedings of the Recommender Systems Challenge 2020, pages 44–49, 2020.

[32] Kilbertus, N., Ball, P. J., Kusner, M. J., Weller, A., & Silva, R. (2020). The Sensitivity of Counterfactual Fairness to Unmeasured Confounding. In R. P. Adams & V. Gogate (Eds.), Proceedings of The 35th Uncertainty in Artificial Intelligence Conference (Vol. 115, pp. 616–626). PMLR.
[33] Schoeffer, J., Kuehl, N., & Valera, I. (2021). A Ranking Approach to Fair Classification. ACM SIGCAS Conference on Computing and Sustainable Societies (COMPASS), 115–125.

[34] Kappen, M., & Naber, M. (2021). Scientific Reports, 11(1).

[35] Miao Jiang, Yi Fang, Huangming Xie, Jike Chong, and Meng Meng. User click prediction for personalized job recommendation. World Wide Web, 22(1):325–345, 2019.

[36] Kaggle. CareerBuilder Job Recommendation Challenge, 2012. Retrieved from: https://www.kaggle.com/c/job-recommendation, accessed February 15, 2021.

[37] Tanja Koch, Charlene Gerber, and Jeremias J. De Klerk. The impact of social media on recruitment: Are you LinkedIn? SA Journal of Human Resource Management, 16(1), 2018.

[38] Lakkaraju, H., Kleinberg, J., Leskovec, J., Ludwig, J., & Mullainathan, S. (2017). The Selective Labels Problem. Proceedings of the 23rd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 275–284.

[39] Marks, P. (2022). Algorithmic hiring needs a human face. Communications of the ACM, 65(3), 17–19.

[40] Leavy, S. (2018). Gender bias in artificial intelligence. Proceedings of the 1st International Workshop on Gender Equality in Software Engineering, 14–16.

[41] Mayer, A.-S., Strich, F., & Fiedler, M. (2020). Unintended Consequences of Introducing AI Systems for Decision Making. MIS Quarterly Executive, 19(4), 6.

[42] Emanuel Lacic, Markus Reiter-Haas, Tomislav Duricic, Valentin Slawicek, and Elisabeth Lex. Should we embed? A study on the online performance of utilizing embeddings for real-time job recommendations. In Proceedings of the 13th ACM Conference on Recommender Systems, pages 496–500, 2019.

[43] Sven Laumer, Fabian Gubler, Christian Maier, and Tim Weitzel. Job seekers' acceptance of job recommender systems: Results of an empirical study. In Proceedings of the 51st Hawaii International Conference on System Sciences, pages 3914–3923, 2018.

[44] Mehrabi, N., Morstatter, F., Saxena, N., Lerman, K., & Galstyan, A. (2021). A Survey on Bias and Fairness in Machine Learning. ACM Computing Surveys, 54(6), 1–35.

[45] Mehrotra, A., & Celis, L. E. (2021). Mitigating bias in set selection with noisy protected attributes. FAccT 2021, 237–248.

[46] Ran Le, Wenpeng Hu, Yang Song, Tao Zhang, Dongyan Zhao, and Rui Yan. Towards effective and interpretable person-job fitting. In Proceedings of the 28th ACM International Conference on Information and Knowledge Management, pages 1883–1892, 2019.

[47] vom Brocke, J., Simons, A., Riemer, K., Niehaves, B., Plattfaut, R., & Cleven, A. (2015). Standing on the shoulders of giants: Challenges and recommendations of literature search in information systems research. Communications of the Association for Information Systems, 37, 205–224.

[48] Yeon-Chang Lee, Jiwon Hong, and Sang-Wook Kim. Job recommendation in AskStory: experiences, methods, and evaluation. In Proceedings of the 31st Annual ACM Symposium on Applied Computing, pages 780–786, 2016.

[49] Mirbabaie, M., Brendel, A. B., & Hofeditz, L. (2022). Ethics and AI in Information Systems Research. Communications of the Association for Information Systems, 50, 123–150.

[50] Yujin Lee, Yeon-Chang Lee, Jiwon Hong, and Sang-Wook Kim. Exploiting job transition patterns for effective job recommendation. In 2017 IEEE International Conference on Systems, Man, and Cybernetics (SMC), pages 2414–2419. IEEE, 2017.

[51] Mirbabaie, M., Hofeditz, L., Frick, N. R. J., & Stieglitz, S. (2021). Artificial intelligence in hospitals: providing a status quo of ethical considerations in academia to guide future research. AI & SOCIETY, published online.

[52] Vasily Leksin and Andrey Ostapets. Job recommendation based on factorization machine and topic modelling. In RecSys Challenge '16: Proceedings of the Recommender Systems Challenge, pages 1–4, 2016.
[53] Nadeem, A., Marjanovic, O., & Abedin, B. (2021). Gender Bias in AI: Implications for Managerial Practices. LNCS 12896, 259–270.

[54] Jianxun Lian, Fuzheng Zhang, Min Hou, Hongwei Wang, Xing Xie, and Guangzhong Sun. Practical lessons for job recommendations in the cold-start scenario. In RecSys Challenge '17: Proceedings of the Recommender Systems Challenge 2017, pages 1–6, 2017.

[55] LinkedIn. About LinkedIn, 2021. Retrieved from: https://about.linkedin.com, accessed March 23, 2021.

[56] Raghavan, M., Barocas, S., Kleinberg, J., & Levy, K. (2019). Mitigating Bias in Algorithmic Employment Screening: Evaluating Claims and Practices. SSRN Electronic Journal.

[57] Kuan Liu, Xing Shi, Anoop Kumar, Linhong Zhu, and Prem Natarajan. Temporal learning and sequence modeling for a job recommender system. In RecSys Challenge '16: Proceedings of the Recommender Systems Challenge, pages 1–4, 2016.

[58] Pessach, D., & Shmueli, E. (2022). A Review on Fairness in Machine Learning. ACM Computing Surveys, 55(3), 1–44.

[59] Suresh, H., & Guttag, J. (2021). A Framework for Understanding Sources of Harm throughout the Machine Learning Life Cycle. Equity and Access in Algorithms, Mechanisms, and Optimization, 1–9.

[60] Malte Ludewig, Michael Jugovac, and Dietmar Jannach. A light-weight approach to recipient determination when recommending new items. In RecSys Challenge '17: Proceedings of the Recommender Systems Challenge 2017, pages 1–6, 2017.

[61] Teodorescu, M. H. M., Morse, L., Awwad, Y., & Kane, G. C. (2021). Failures of fairness in automation require a deeper understanding of human–ML augmentation. MIS Quarterly: Management Information Systems, 45(3), 1483–1499.

[62] Teodorescu, M., Morse, L., Awwad, Y., & Kane, G. (2021). Failures of Fairness in Automation Require a Deeper Understanding of Human–ML Augmentation. MIS Quarterly, 45(3), 1483–1500.

[63] Yong Luo, Huaizheng Zhang, Yonggang Wen, and Xinwen Zhang. ResumeGAN: An optimized deep representation learning framework for talent-job fit via adversarial learning. In Proceedings of the 28th ACM International Conference on Information and Knowledge Management, pages 1101–1110, 2019.

[64] Wolgast, S., Bäckström, M., & Björklund, F. (2017). Tools for fairness: Increased structure in the selection process reduces discrimination. PLOS ONE, 12(12), e0189512.

[65] Sonu K. Mishra and Manoj Reddy. A bottom-up approach to job recommendation system. In RecSys Challenge '16: Proceedings of the Recommender Systems Challenge, pages 1–4, 2016.

[66] Saket Maheshwary and Hemant Misra. Matching resumes to jobs via deep siamese network. In Companion Proceedings of The Web Conference 2018, pages 87–88, 2018.

[67] Jorge Martinez-Gil, Alejandra Lorena Paoletti, and Mario Pichler. A novel approach for learning how to automatically match job offers and candidate profiles. Information Systems Frontiers, pages 1–10, 2019.

[68] Zhang, N., & Zhang, N. (2018). Understanding the Roles of Challenge Security Demands, Psychological Resources in Information Security Policy Noncompliance. PACIS 2018.

[69] Motebang Daniel Mpela and Tranos Zuva. A mobile proximity job employment recommender system. In 2020 International Conference on Artificial Intelligence, Big Data, Computing and Data Communication Systems (icABCD), pages 1–6. IEEE, 2020.

[70] Langer, M., König, C. J., & Papathanasiou, M. (2019). Highly automated job interviews: Acceptance under the influence of stakes. International Journal of Selection and Assessment, 27(3), 217–234. https://doi.org/10.1111/ijsa.12246

[71] Amber Nigam, Aakash Roy, Hartaran Singh, and Harsimran Waila. Job recommendation through progression of job selection. In 2019 IEEE 6th International Conference on Cloud Computing and Intelligence Systems (CCIS), pages 212–216. IEEE, 2019.

[72] Saman Shishehchi and Seyed Yashar Banihashem. JRDP: a job recommender system based on ontology for disabled people. International Journal of Technology and Human Interaction (IJTHI), 15(1):85–99, 2019.
[73] Textkernel. Website Textkernel, 2021. Retrieved from: https://www.textkernel.com, accessed March 15, 2021.

[74] Abeer Zaroor, Mohammed Maree, and Muath Sabha. A hybrid approach to conceptual classification and ranking of resumes and their corresponding job posts. In International Conference on Intelligent Decision Technologies, pages 107–119. Springer, 2017.

[75] Acikgoz, Y., Davison, K. H., Compagnone, M., & Laske, M. (2020). Justice perceptions of artificial intelligence in selection. International Journal of Selection and Assessment, 28(4), 399–416. https://doi.org/10.1111/ijsa.12306

[76] Chenrui Zhang and Xueqi Cheng. An ensemble method for job recommender systems. In RecSys Challenge '16: Proceedings of the Recommender Systems Challenge, pages 1–4, 2016.

[77] Blacksmith, N., Willford, J. C., & Behrend, T. S. (2016). Technology in the employment interview: A meta-analysis and future research agenda. Personnel Assessment and Decisions, 2(1), 12–20. https://doi.org/10.25035/pad.2016.002

[78] Chamorro-Premuzic, T., & Ahmetoglu, G. (2016). The pros and cons of robot managers. Harvard Business Review, 2016(12). https://hbr.org/2016/12/the-pros-and-cons-of-robot-managers.

[79] Alice Zheng and Amanda Casari. Feature Engineering for Machine Learning: Principles and Techniques for Data Scientists. O'Reilly Media, Inc., 2018.

[80] Derous, E., & Ryan, A. M. (2018). When your resume is (not) turning you down: Modelling ethnic bias in resume screening.

[81] Stone, D. L., Deadrick, D. L., Lukaszewski, K. M., & Johnson, R. (2015). The influence of technology on the future of human resource management. Human Resource Management Review, 25(2), 216–231. https://doi.org/10.1016/j.hrmr.2015.01.002

[82] Person-job fit: Adapting the right talent for the right job with joint representation learning. ACM Transactions on Management Information Systems (TMIS), 9(3):1–17, 2018.

[83] Ferràs-Hernández, X. (2018). The future of management in a world of electronic brains. Journal of Management Inquiry, 27(2), 260–263. https://doi.org/10.1177/1056492617724973

[84] Dávid Zibriczky. A combination of simple models by forward predictor selection for job recommendation. In RecSys Challenge '16: Proceedings of the Recommender Systems Challenge, pages 1–4, 2016.

[85] Vedapradha, R., Hariharan, R., & Shivakami, R. (2019). Artificial intelligence: A technological prototype in recruitment. Journal of Service Science and Management, 12(03), 382–390. 10.4236/jssm.2019.123026

[90] Herath, H. M. K. K. M. B., & Mittal, M. (2022). Adoption of artificial intelligence in smart cities: A comprehensive review. International Journal of Information Management Data Insights, 2(1), Article 100076. 10.1016/j.jjimei.2022.100076

[91] Koch, J., Plattfaut, R., & Kregel, I. (2021). Looking for talent in times of crisis – The impact of the Covid-19 pandemic on public sector job openings. International Journal of Information Management Data Insights, 1(2), Article 100014. 10.1016/j.jjimei.2021.100014

[92] Phan, T. T., Pham, V. Q., Nguyen, H. D., Huynh, A. T., Tran, D. A., & Pham, V. T. (2021). Ontology-based resume searching system for job applicants in information technology.

[93] Ponnaboyina, R., Makala, R., & Venkateswara Reddy, E. (2022). Smart recruitment system using deep learning with natural language processing. In Intelligent Systems and Sustainable Computing (pp. 647–655). Springer.

[94] Singh, A., & Shaurya, A. (2021). Impact of artificial intelligence on HR practices in the UAE. Humanities and Social Sciences Communications, 8(312). 10.1057/s41599-021-00995-4

Sinha, A. K., Akhtar, A. K., & Kumar,

[95] Zeebaree, S. R. M., Shukur, H. M., & Hussan, B. K. (2019). Human resource management systems for enterprise organizations: A review. Periodicals of Engineering and Natural Sciences, 7(2), 660–669.
[96] Suen, H.-Y., Chen, M. Y.-C., & Lu, S.-H. (2019). Does the use of synchrony and artificial intelligence in video interviews affect interview ratings and applicant attitudes?

[97] Schmidt, J. A., Chapman, D. S., & Jones, D. A. (2014). Does emphasizing different types of person–environment fit in online job ads influence application behavior and applicant quality? Evidence from a field experiment. Journal of Business and Psychology, 30(2), 267–282.

[98] What do consistency and personableness in the interview signal to applicants? Investigating indirect effects on organizational attractiveness through symbolic organizational attributes. Journal of Business and Psychology, 34(5), 671–684. https://doi.org/10.1007/s10869-018-9600-7

[99] Howardson, G. N., & Behrend, T. S. (2014). Using the Internet to recruit employees: Comparing the effects of usability expectations and objective technological characteristics on Internet recruitment outcomes.

[100] Measuring procedural justice in police-citizen encounters. Justice Quarterly, 32(5), 845–871. https://doi.org/10.1080/07418825.2013.845677

[101] Shalaby, Walid, AlAila, BahaaEddin, & Korayem, Mohammed. (2017). Help me find a job: graph-based approach for job recommendation at scale. 2017 IEEE International Conference on Big Data. IEEE, 2017.

[102] Elshawi, R., Wahab, A., Barnawi, A., et al. (2021). DLBench: A comprehensive experimental evaluation of deep learning frameworks. Cluster Computing. https://doi.org/10.1007/s10586-021-03240-4.

[103] Jan, B., Farman, H., Khan, M., Imran, M., Islam, I. U., Ahmad, A., Ali, S., & Jeon, G. (2019). Deep learning in big data analytics: A comparative study. Comput. Electr. Eng., 75, pp. 275–287. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S0045790617315835.

[104] Yao Lu, Sandy El Helou, and Denis Gillet. A recommender system for job seeking and recruiting website. In Proceedings of the 22nd International Conference on World Wide Web, pages 963–966, 2013.

[105] Emmanuel Malherbe, Mamadou Diaby, Mario Cataldi, Emmanuel Viennet, and Marie-Aude Aufaure. Field selection for job categorization and recommendation to social network users. In 2014 IEEE/ACM International Conference on Advances in Social Networks Analysis and Mining (ASONAM 2014), pages 588–595. IEEE, 2014.

Marc Poch, Núria Bel Rafecas, Sergio Espeja, and Felipe Navio. Ranking job offers for candidates: learning hidden knowledge from big data. In Proceedings of the Ninth International Conference on Language Resources and Evaluation (LREC-2014). ACL (Association for Computational Linguistics), 2014.

IX. APPENDICES

A) ADDITIONAL DETAILS ON THE METHODOLOGY, DATA, AND SYSTEM DESIGN.

A job recommendation system is an intelligent system that uses data analysis and machine learning algorithms to suggest relevant job opportunities to job seekers based on their skills, experience, and preferences. The system collects data from various sources, including resumes, job descriptions, and candidate profiles, and uses this information to match candidates with job openings. This appendix describes the methodology, data, and system design of such a system in more detail.

Methodology:

The methodology of a job recommendation system typically involves the following steps:

1. Data Collection: The system collects data from various sources, including job boards, social media platforms, and company websites.
2. Data Cleaning: The collected data are cleaned to remove duplicates, irrelevant records, and errors.
3. Data Integration: The data from the different sources are merged and converted into a standard format.
4. Data Analysis: The system applies statistical techniques such as clustering and classification to identify patterns and relationships in the data.
5. Machine Learning: The system uses machine learning algorithms such as collaborative filtering and content-based filtering to match job seekers with job opportunities (a minimal content-based matching sketch is given after this list).
6. Model Evaluation: The performance of the system is evaluated using metrics such as accuracy, precision, and recall.
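To make the matching step concrete, the sketch below shows one way the content-based filtering mentioned in step 5 could be implemented: candidate skill text and job descriptions are encoded as TF-IDF vectors and jobs are ranked by cosine similarity. This is a minimal illustration under stated assumptions rather than the exact prototype built for this project; the function name recommend_jobs, the sample records, and the use of scikit-learn are assumptions made only for the example.

# Minimal content-based matching sketch (illustrative, not the project's exact prototype).
# Assumes scikit-learn is available: pip install scikit-learn
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity


def recommend_jobs(candidate_text, job_descriptions, top_k=3):
    """Rank job descriptions by TF-IDF cosine similarity to a candidate's skill text."""
    vectorizer = TfidfVectorizer(stop_words="english")
    # Fit on jobs and the candidate together so both share one vocabulary.
    matrix = vectorizer.fit_transform(job_descriptions + [candidate_text])
    job_vectors, candidate_vector = matrix[:-1], matrix[-1]
    scores = cosine_similarity(candidate_vector, job_vectors).ravel()
    ranked = sorted(enumerate(scores), key=lambda pair: pair[1], reverse=True)
    return ranked[:top_k]  # list of (job index, similarity score)


if __name__ == "__main__":
    jobs = [
        "Data analyst with SQL, Python and dashboarding experience",   # hypothetical postings
        "Front-end developer skilled in JavaScript and React",
        "HR executive for recruitment and onboarding",
    ]
    candidate = "Python, SQL, statistics and data visualisation"        # hypothetical profile
    for idx, score in recommend_jobs(candidate, jobs):
        print(f"Job {idx}: similarity={score:.2f}")

In a deployed system the job vectors would normally be pre-computed and cached so that a new candidate can be scored without re-fitting the vectorizer for every request.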
Data:

The data used in a job recommendation system typically include the following:

1. Candidate Profiles: information on each candidate's skills, experience, education, and location.
2. Job Descriptions: data on the advertised roles, including job titles, job requirements, and job responsibilities.
3. User Behavior: records of user interactions such as clicks, likes, and searches (the collaborative-filtering sketch after this list builds on exactly this kind of interaction data).
4. Company Profiles: information on the hiring organisations, including company size, industry, and location.

The device design of a pastime recommendation


device usually consists of the subsequent additives:

1.Data Collection: The device collects statistics


from several sources in conjunction with process
boards, social media systems, and organisation
internet sites.

2.Data Storage: The amassed information is saved


in a database or statistics warehouse.

3.Data Cleaning and Integration: The facts are


wiped clean and included into a preferred format.

4.Data Analysis: The device uses records


assessment techniques including clustering and
class to perceive patterns and relationships inside
the information.
