
The rapid shifts in the technological environment continue to disrupt the traditional
functioning of businesses, including accounting and auditing processes and practices. As in
other sectors, the fields of accountancy and auditing have been witnessing rapid
incursions of emerging technologies that have changed the expectations concerning
performance, efficiency and reliability. The growing digitalisation of business functions calls
for accountants and auditors to adapt to the use of such technologies in their professions as
well. The growing trend of automation in accounting and auditing functions calls for a
paradigm change in the training, role and responsibility of future accountant professionals
(Zhang et al., 2018). Zhang et al. (2017) reported that traditional accounting and auditing
practices have been steadily losing their relevance owing to the “complex structures of
ownership of wealth, virtual assets, and rapidly moving transactions in the information
economy”. With the growing complexity of economic transactions, it has been argued that
manual accountancy and auditing roles would have to be supported by disruptive digital
technologies. The need for cognitive software tools to avoid furnishing misstatements
concerning complex business entities has been a growing area of research. Rodrigues et al.
(2023) pointed out that AI enables auditors to identify continuous patterns, which is not so
straightforward to accomplish using traditional auditing approaches. Rodrigues et al. (2023)
also noted that when using AI, auditors can use the whole population for substantive tests,
whereas in conventional processes only relatively small samples could be used.

This implies that using AI for auditing can limit bias and provide a more robust risk
management approach. The need for observing the accounting numbers of organisations in
real-time, which ensures correct valuation, requires the application of groundbreaking
information technologies like AI and blockchain (Zhang et al., 2020). The
International Financial Reporting Standards state that “the usefulness of financial information
is enhanced if it is comparable, verifiable, timely, and understandable” (Zhang et al., 2017).
Traditional accounting and auditing practices have often been blamed for wilful bias or
manipulation of accounting standards (Sherif & Mohsin, 2021). The gaps in conventional
financial reporting standards have been held responsible for facilitating bias. AI could reduce
the risk of such bias in accountancy and auditing functions. There have been major arguments
as to why emerging technologies like AI should be used to address the deficiencies of
conventional accounting standards and procedures. Sherif and Mohsin (2021) argued that AI
could improve professional competency, integrity and due care in accounting functions.
Emerging technologies like Cloud, automation, AI, big data, IoT, blockchain and others have
redefined how business functions are performed. With the persistent rise in computational
power, businesses can take a more scalable approach to adopting these emerging technologies.
These emerging technologies have been observed to drastically alter financial reporting
practices. Indeed, such changes have picked up momentum in the past few years,
after the Covid-19 pandemic compelled businesses to digitalise their operations and corporate
functions. With the growing use of these emerging technologies in business functions, there
is an urgency for accountants and auditors to keep up with the rapidly shifting technological
space. Industry experts are predicting that with the growing pace of AI, “fully automated
accounting” is a future that is not too distant (Davis, 2023). Ng et al. (2021) defined AI as
intelligent machines that can execute cognitive functions by learning from their environment.
In other words, AI implies intelligent machines and algorithms that can mimic human logical
reasoning and experiential learning. The significant development in machine learning and
neural networks has given rise to powerful AI algorithms that can perform routine
accountancy and auditing tasks efficiently. The advancement in generative and
conversational AI has opened a new frontier in the domain of accounting and auditing.
Generative AI, in particular, has been playing a pivotal role in accelerating the digital
transformation of accounting, auditing and finance roles. Sætra (2023) defined generative AI
as “machine learning solutions trained on massive amounts of data in order to produce output
based on user prompts”. Strickland (2023) reported that a recent KPMG survey found that
more than 50% of the companies with annual revenue of $1 billion or more have deployed
generative AI for financial reporting and auditing purposes. It can be argued that as
organisations become more familiar with deploying AI-based foundational models to carry
out accounting and auditing functions, the rate of AI adoption is likely to rise massively in
the future. The key stakeholders of businesses' accounting and auditing functions, such as
investors and regulators, would significantly influence the adoption of these emerging
technologies, including AI. The demand for insights into business
performance based on ‘real-time data’ is one of the significant drivers for the incorporation of
AI in accountancy and finance functions. On the one hand, AI use in accountancy and
auditing functions is expected to give the corresponding personnel more leverage in strategic
planning; on the other, it has been argued to increase the efficiency of the processes
concerned. The use cases of AI in accountancy include invoice
processing, predictive analysis, fraud detection, forecasting and budgeting, ensuring
regulatory compliance and so on. Similarly, AI brings in robust use cases for auditing
functions as well.
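One of these use cases can be illustrated with a deliberately simple sketch: flagging anomalous invoice amounts with a robust statistical rule. Production AI systems use far richer models; the function name, the sample data and the 3.5 cut-off (a common heuristic for modified z-scores) are illustrative assumptions rather than details taken from any cited study.

```python
# A minimal, assumed sketch of AI-assisted fraud detection: flag invoices
# whose modified z-score (based on the median absolute deviation, which is
# robust to the very outliers being hunted) exceeds a threshold.
from statistics import median

def flag_outliers(amounts, threshold=3.5):
    """Return indices of amounts whose modified z-score exceeds threshold."""
    med = median(amounts)
    mad = median(abs(a - med) for a in amounts)
    if mad == 0:
        # All values (essentially) identical: nothing to flag.
        return []
    return [i for i, a in enumerate(amounts)
            if 0.6745 * abs(a - med) / mad > threshold]

invoices = [120.0, 135.5, 128.0, 119.9, 131.2, 4500.0, 125.4, 122.8]
print(flag_outliers(invoices))  # [5] -- the 4500.0 invoice stands out
```

The median-based score is used here instead of a plain mean/standard-deviation z-score because a single extreme invoice inflates the standard deviation enough to mask itself; real audit analytics face the same masking problem at scale.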

Brazel (2022) reported that audit firms were adopting more and more AI use cases to
increase their audit quality. KPMG, one of the Big Four audit firms, has partnered with IBM
to accelerate the use of AI in auditing services. All the Big Four auditing firms have
incorporated AI, signifying that the future trajectory of auditing, accounting and financial
services would be signposted by the evolution of AI technology. EY, another of the Big Four
firms, has been using AI to analyse financial documents and hopes to achieve complete
automation in its auditing services in the near future (Faggella, 2019). As the complexity of
business functions evolves with smart technologies and digitalisation, there is a greater need
to adopt emerging technologies like AI into accounting and auditing practices. PwC, another
of the Big Four auditing firms, has successfully embraced AI for auditing functions.
PwC has developed GL.ai which can utilise the “global knowledge and experience” of the
organisation and replicate the performance of expert auditors (PwC, 2017). Notably, with
successive applications the power of AI audit and accountancy tools is likely to increase,
leading to adoption at greater scale by many small and medium-sized firms.

Krájnik and Demeter (2021) noted that emerging technologies of Society 5.0 have been
redefining the roles of accounting and auditing in supporting strategic business decision-
making, which is a key driver for the incorporation of AI into such business functions. Figure
1.1 shows the drivers of AI adoption in finance and accounting functions.
Figure 1.1: The Drivers of AI in Accountancy and Auditing
(Source: Goh et al., 2019)

Maulidya et al. (2023) pointed out the rise of digital accountancy and cautioned that if
working professionals do not upskill accordingly, they face the risk of being supplanted.
The organisational leaders have also been playing frontline roles in adopting AI and other
emerging technologies into accounting and auditing practices (Clere, 2023). Lehner et al.
(2022), on the other hand, cautioned that AI cannot and should not be taken as a panacea for
all accounting and auditing problems, and that the role of human expertise is never redundant. In
other words, to reap the value of AI in accounting, there has to be a synergistic collaboration
between AI tools and the expert judgments of accounting professionals. Baranidharan et al.
(2023) argued that AI is the future of accounting and auditing roles as this technology can
enable professionals “to concentrate on more strategic decision-making activities by
automating back-end procedures". However, the successful assimilation of AI into
accountancy and auditing would require the expert judgment of the professionals regarding
“controls design and understanding data biases” (Baranidharan et al., 2023). To develop such
a symbiotic relationship, the perspectives of accountants and auditors would have to be
understood, which would in turn facilitate the use cases of AI tools in such functions.
The AI-driven accounting market stood at USD 1.56 billion in 2024 and is projected to reach
USD 6.62 billion by 2029 (Mordor Intelligence, 2023). The implications of AI use for
accounting and auditing purposes are huge. The automation of accounting tasks might create
redundancies as well as opportunities for ushering in a paradigm shift in the job role of
accountants (Griffin, 2019). Despite the growing trend of AI use in accountancy practices,
Casino Paderanga et al. (2023) point out that the profession is still in the early stages of this
transition. This entails a need to understand professionals' perceptions of the espoused
benefits of such transitions. Needless to say, transitioning towards AI and other
smart finance management technologies is beset by many challenges. Cramarenco et al.
(2023) mentioned factors like job loss, job displacement, employability, upskilling
requirements and others that would play prominent roles in the adoption of AI tools in the
global market. There has been a considerable amount of debate as to the ethical aspects of AI
use in accountancy and auditing tasks. The debate concerning whether accountants and
auditors need to understand AI logic has not reached a definitive conclusion. All the Big Four
audit firms have been noted for continued investments in AI capabilities, assisting both
accounting routine procedures and auditing (Munoko et al., 2020). Nevertheless, there are
several fundamental questions that point out the ethical dilemmas of using AI for such
practices. Dorland (2023) reported that the majority of accounting firms were somewhat
sceptical as to the merits of using AI. Concerns range from technology unfamiliarity to
unreliability of the results (Dorland, 2023). Data security and accountability were two other
prominent concerns (Hu et al., 2020). The assumption that AI expert systems will always be
right under every constraint has not been conclusively proven (Munoko et al., 2020). AI
systems, as Munoko et al. (2020) pointed out, could deviate from realistic constraints,
leading to major ethical as well as legal ramifications. Goodman and Trehu (2023) noted that
“calls for audits to expose and mitigate harms related to algorithmic decision systems are
proliferating, and audit provisions are coming into force”. Owing to the inherent difficulties
in understanding the algorithmic models used for accountancy and auditing practices, their
scrutiny and acceptability could often be challenging within or without the boundaries of
organisations (Goodman & Trehu, 2023). Timea Fülöp et al. (2023) pointed to reputational
risk as another major barrier to the adoption of AI in accountancy and auditing roles.
The findings of the SEM indicated that there is no statistically significant evidence that peer
influence impacted the AI use intention among accountants and auditors. The existing
literature, however, largely asserts, with some exceptions, that peer or social
influence has profound impacts on the intention to use new information systems. Hossain et
al. (2019) in their study reported that social influence is a major factor in information system
use. Wang et al. (2013) similarly reported that peer or social influence was a major factor
concerning the intention of using new technology. Yueh et al. (2016) also noted that peer
influence at work is a significant predictor of new information system use. Nonetheless, there
are some studies that affirmed that peer influence was not a significant factor in the use of
new technologies at work. Shibl et al. (2013), for instance, reported that social influence was
not a significant factor in the adoption of new information systems at workplaces. Queiroz
and Fosso Wamba (2019) also found that peer influence may or may not, depending on
demographic variables, be a significant predictor of the intention to use novel information
systems at workplaces. In the current context, it was found that peer influence did not have a
significant positive impact on the AI use intentions of the accountants and auditors. Andrews et
al. (2021) also reported that peer influence was not a significant factor in AI use intentions.
The current finding indicates that peer influence or peer pressure is not yet a significant
predictor of the intention to use AI among accountants and auditors. This finding has several
implications. The lack of significance of peer influence in the context of using AI among
accountants and auditors indicates that AI-based expert systems have not yet gained the
overall credibility to make their general adoption part of the organisational system and
culture. In this context, it can be said that the lukewarm buy-in to AI systems from
organisational superiors might have limited their influence in encouraging the
workforce to adopt them. Similarly, it also becomes clear that the influence or
recommendations of the clients of auditors and accountants in the Big Four companies and
other tier-two firms are not yet mature enough to make the integration of AI systems a
measure of firm competitiveness. Therefore, at workplaces, AI-based expert systems for
auditing and accounting are still perceived more as a source of uncertainty and threat than
as an opportunity.
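In PLS-SEM studies of this kind, the significance of a path is typically judged with bootstrap confidence intervals for the path coefficient. The sketch below illustrates the idea on synthetic data, not the study's dataset: a near-zero path (a stand-in for peer influence → AI use intention) is re-estimated over resamples, and if the 95% interval spans zero the path is deemed non-significant. All names and numeric values here are illustrative assumptions.

```python
import random

random.seed(7)

# Synthetic stand-ins for "peer influence" (x) and "AI use intention" (y)
# with an essentially null relationship, mimicking a non-significant path.
n = 300
x = [random.gauss(0, 1) for _ in range(n)]
y = [0.02 * xi + random.gauss(0, 1) for xi in x]

def path_coefficient(xs, ys):
    """Simple-regression slope: a stand-in for a single SEM path estimate."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    sxy = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    sxx = sum((a - mx) ** 2 for a in xs)
    return sxy / sxx

# Re-estimate the path on 1,000 bootstrap resamples of the cases.
betas = []
for _ in range(1000):
    idx = [random.randrange(n) for _ in range(n)]
    betas.append(path_coefficient([x[i] for i in idx], [y[i] for i in idx]))
betas.sort()

lo, hi = betas[25], betas[974]  # approximate 95% percentile interval
print(round(lo, 3), round(hi, 3))  # an interval spanning 0 -> path non-significant
```

Real PLS-SEM software performs this resampling over the full structural model rather than a single regression, but the decision rule, checking whether the bootstrap interval for β contains zero, is the same.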

The data analysis shows that peer influence has a significant positive impact on the AI use
attitude of accountants and auditors. This implies that peer influence can help with the
acceptance of AI use cases in accounting and auditing procedures. It was previously found
that peer influence did not significantly impact the intention to use AI among the accountants
and auditors. However, as peer influence was found to have a positive impact on AI use
attitude, it can be stated that peer or social influence cannot be eliminated as a factor that
improves AI acceptance among accountants and auditors. The findings so far indicate that
peer influence increases the awareness of the benefits of using AI for accounting and auditing
functions. Therefore, despite not being directly related to the AI use intention, peer influence
is a significant factor in shaping the attitude towards this technology. Ying et al. (2019) found
that peer attitude can have a significant impact on the professional skepticism of auditors.
Therefore, if AI expert systems are not perceived to be useful among peers, the same could
lead to the development of negative outlooks toward such technology. On the other hand, if
such expert systems are perceived to be useful by the peers of accountants and auditors the
acceptance of the same is enhanced. Although AI applications in auditing and accounting have
become notable, there is a perception of a lack of technology maturity (ICAEW Insights,
2023). The opinion that AI cannot be a substitute for the skills of expert accountants and
auditors could play a role in the development of an attitude of cautious optimism. The case
for AI application in accounting and auditing has induced a sense of opportunism among
certain sections of working professionals, however, there still are uncertainties concerning
dependability, ethics and other factors. As per the result, the path coefficient (β) between peer
influence and AI use attitude was 0.096, which indicates that a one-unit change in peer
influence is associated with a 0.096-unit change in attitude. Therefore, it can be seen that despite being a
significant predictor, peer influence does not have a huge impact on the attitude toward AI
use. Peer influence is a significant factor in shaping the attitude toward new technology.
However, the current study found that peer influence, despite being significant, is not a major
determiner of attitude. This implies that the acceptance of AI among auditors and accountants
is not yet mature enough to generate a high level of advocacy. This could be a significant
factor in the adoption of AI expert systems by accountants and auditors. Accounting and
audit service firms must take steps to promote the role of AI in the future of these
professions and to develop a general level of awareness of how this tool can be used and what
its limitations are. In the absence of such efforts, the acceptance of AI among the concerned
professionals would remain asymmetric.
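The practical meaning of a small-but-significant coefficient like the 0.096 reported above can be made concrete with a little arithmetic. Assuming, as is conventional in SEM reporting, that the coefficient is standardised, the sketch below shows the predicted shift in attitude for a given shift in peer influence; the helper function and the two-standard-deviation scenario are purely illustrative.

```python
# Illustrative only: interpreting a standardised path coefficient.
# The beta value is taken from the reported results; everything else is assumed.
BETA_PEER_TO_ATTITUDE = 0.096  # peer influence -> AI use attitude

def predicted_shift(beta, delta_sd):
    """Predicted change in the outcome, in standard deviations, for a
    delta_sd standard-deviation change in the predictor (other paths fixed)."""
    return beta * delta_sd

# Even a large two-SD swing in peer influence moves attitude by under
# a fifth of a standard deviation: significant, but modest in size.
print(predicted_shift(BETA_PEER_TO_ATTITUDE, 2.0))  # 0.192
```

This is why the text can describe the path as statistically significant yet not a major determiner of attitude: the direction of the effect is reliable, but its magnitude is small.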

The results of the structural model show that there was no significant positive impact of
technical knowledge and competence on the AI use intention of accountants and auditors.
This finding is in contradiction with the evidence in the literature. Faccia et al. (2023)
asserted that developing technical knowledge and competence through training is important
to help accountants cope with novel information systems. Munoko et al. (2020) on the other
hand pointed out that technical competence was a key factor for the ethical use of disruptive
technologies like AI. Elshamly and Gameel (2023) noted that prior experience of use
supports the adoption of AI tools. The current finding thus sits in contradiction with the
relevant literature. In light of the current hypothesis, it can be said that despite having the necessary
background and knowledge in using AI for accounting and auditing functions, the intention to
use the same is not significant. This shows the paradox where professionals despite having
the necessary knowledge and competence are not motivated to use AI. This implies that
technological self-efficacy is not the only resource needed for developing the technology
usage intention in organisational contexts. It might imply that there is a knowledge gap in the
deployment of AI in accounting and auditing functions, which is not solely technical. Uren
and Edwards (2023) observed that organisational readiness in adopting disruptive
technologies like AI is not just based on technology self-reliance of the users but there is a
need for achieving synergy between people, process and technology. The current result
indicates that although professionals are aware of AI applications, they lack an understanding
of how this technology can improve their performance, which might be a limiting factor for
the intention to use it. Uren and Edwards (2023) noted that socio-technical aspects, of
which people, process and technology are key components, play a significant role in the
adoption of technology. Uren and Edwards (2023) identified the following themes of socio-
technical construct: AI’s impact on employees, the suitable design of organisational processes
to support AI adoption, and improving employee skills in using AI in line with business
needs. Therefore, since the current finding shows that relevant AI skills did not have any
impact on the intention of auditors and accountants to use them, it might point to the
presence of other contextual socio-technical factors that can explain the current observation.
Roberts et al. (2021) in their study of the psychology of technology adoption noted that
technology self-reliance in itself cannot guarantee technology adoption. Roberts et al. (2021)
included the following cognitive factors that influence the adoption of technology in
organisations: perceived risk from previous experience of using technology, and the degree of
technological competence of the key decision makers. In this context, Roberts et al. (2021)
found that using disruptive technology solutions for B2B clients is dependent upon their
technical competence. In other words, as a result of a lack of technical knowledge and
competence the perception of risks could be unduly high, and the same would lead to the
rejection of the proposed technological solutions. Therefore, in the context of the
current finding, it can be said that the technological competence of accountants and auditors
concerning AI might not ultimately translate into a decision to use it unless the
corresponding socio-technical factors and client acceptability levels are optimal.

This hypothesis proposed that technical knowledge and competence have a significant
positive impact on the AI use attitude of accountants and auditors. The corresponding path
analysis results show that despite the relationship between these two constructs being
significant, the path coefficient was negative (β = -0.101). Therefore, as per the current
finding, technical knowledge and competence had a negative impact on the AI use attitude of
the concerned research population. However, it is also notable that the size of the coefficient
was not large. The evidence in the literature however points toward a positive impact of
technical knowledge and competence on technology use attitude. Qasim and Kharbat (2019)
for instance noted that the development of required technological competence is a precursor
of developing a positive attitude towards technology use. However, there are a few studies
whose findings can be used to support the current result. Stein et al. (2024) found that attitude
towards AI is not based on the specific capabilities of the tool but on the constituent
principles. This is to say that even the advantages of using AI for accounting and auditing
functions could fail to generate a positive attitude towards the same if the underlying
principles of the tool are not concordant with the expected business case and ethical
principles. Schemmer et al. (2022) pointed out that AI-based decision-making has not always
been correct in retrospect, which is a fact that contributes negatively to the perception of the
technology. Kapania et al. (2022) used the concept of techno-optimism to explain the positive
attitude towards AI. As the current results indicate that technological knowledge and
competence had a negative impact on the AI use attitude of accountants and auditors, the
same implies that such knowledge does not lead to techno-optimism in general. As technical
knowledge and competence regarding AI use in accounting and auditing led to a negative
attitude toward the use of the same, it can be argued that there are unanswered questions
relating to AI’s applicability for such purposes. It is a significant finding that competence in
AI use led to negative attitudes toward the same among accountants and auditors. This
indicates that AI-based expert systems for accounting and auditing are not yet mature enough
to be deemed useful. The possibility of bias in AI expert systems could be one explanation
for the fact that despite having the necessary knowledge and resources, accountants and
auditors are not too keen on using the same extensively. Varsha (2023) pointed out that
algorithmic biases of AI use in the insurance sector could lead to improper functionalities and
resulting stakeholder dissatisfaction. Jędrzejka (2019) observed that skills and competencies
in operating AI-based intelligent accounting information systems would be fruitless if the
underpinning processes were not appropriately designed. Deloitte (2022) noted that AI
knowledge and skills of auditors should be used to challenge and govern intelligent decision-
making systems. Therefore, the presumption that prior knowledge or skills concerning AI
would automatically develop a positive mindset toward the same among auditors and
accountants is not entirely correct. Auditing and accounting functions are sensitive areas of
organisations where bias in decision-making could reflect poorly on strategic management. It
is therefore expected that accountants and auditors would not take the authority of AI for
granted but rather use their domain expertise and technical understanding of the tool to
question its reliability. So, if the relevant technical knowledge and competence of the auditors
and accountants negatively influence the attitude toward AI, this gives credence to the
perception that there might be explainability issues with the underlying code, or that the tools
offer limited advantages across the vast application areas of accounting and auditing.

Table 4.25 shows that there is a significant positive impact of facilitating conditions on the
intentions to use AI. The path coefficient (β = 0.223) was moderately strong, indicating that
facilitating conditions are a significant predictor of the AI use intentions among accountants
and auditors. This finding corroborates the results of many similar studies where the role of
facilitating conditions in the intention to use technology has been empirically studied. For
example, Venkatesh et al. (2012) in their study confirmed that facilitating conditions were a
major predictor of technology use. Hossain et al. (2017) also reported that facilitating
conditions were a key factor in technology use intention. Emhmed et al. (2021) similarly
found that facilitating conditions bear significantly on the intention of using new information
systems. Shaw and Sergueeva (2019) defined facilitating conditions as “the conceptualized
knowledge, resources, and opportunities that are required to perform a specific behaviour”. It
can be said that auditing and accountancy firms that have a well-designed digital
transformation strategy are likelier to create favourable facilitating conditions that support the
AI use intentions among auditors. KPMG (2019) noted that organizations that have been
successful in adopting AI have developed a systematic approach toward it. The
article noted that in such an approach, focus was placed on internal collaboration, organisation-
wide process data strategy and holistic governance practices. Moreover, the development of
centres of excellence for running internal R&D activities was also found to be an important
catalyst that facilitated the integration of AI with operational processes. In other words, the
facilitating conditions to support AI adoption for accounting and auditing roles require the
development of an end-to-end AI design and implementation strategy. Facilitating the
integration of AI into standard accounting and auditing practices requires achieving an
equilibrium between the level of AI maturity needed and the corresponding workforce skills. Without
the development of necessary technology infrastructure, resources and knowledge
management systems AI cannot be expected to be meaningfully integrated with accounting
and auditing processes. Reducing the complexity of AI systems, technology competence of
the internal resources, top management support and competitive pressure are some of the
notable facilitating factors for the adoption of AI by accountants and auditors. In this context,
it must be mentioned that organizations would have to achieve a balance between risk
appetite and innovation to mitigate bias as well as misplaced over-optimism. To have a
deeper understanding of what is meant by facilitating conditions in the context of AI
adoption, PwC (2022) noted the following drivers of AI maturity.

• Sustainable AI data architecture integrating internal and external data sources
• Enterprise-wide AI implementation strategy with explainable process-level use cases
to drive better understanding and criticism of the tool
• Robust AI governance and control plan ensuring ethics, fairness and adaptability of
the tool
• Readily available AI use cases for relevant processes
• Independent centres of excellence
• Reskilling and upskilling programs for the workforce
• Combining the right talent management strategy and AI maturity models to attain
the first-mover advantage
• Centralised AI model integration

Therefore, developing the right technological infrastructure and coordinating people,
policies and processes are critical for achieving a higher AI adoption rate among
accountants and auditors. It needs to be noted that during the implementation stage,
accountants and auditors. It needs to be noted that during the implementation stage,
accountants and auditors would face certain skill deficiencies; however, organisations must
overcome such barriers by incentivising upskilling programs. Similarly, the presence of AI
literacy among the organizational decision makers and key stakeholders is necessary to avoid
resistance to such technology implementation in accounting and auditing processes. As
previously noted, if client organisations of audit and accounting firms display resistance
toward AI use as a result of a lack of AI literacy, the adoption of such tools would become
pointless from a strategic perspective. Therefore, developing collaborations with key
stakeholders is key to ensuring AI adoption by accounting and consultancy firms. The
strategic business case of AI use in auditing and accounting functions should be clearly
established for both the clients and service providers, and the absence of either one could
create unfavourable circumstances for AI adoption.

The SEM results show that facilitating conditions had a significant positive impact on AI use
attitudes among the accountants and auditors. The corresponding path coefficient
(β = 0.223) shows that facilitating conditions had a significant bearing on the attitude toward AI use.
This finding is in line with the literature, where a great number of studies have corroborated
the similar effect of facilitating conditions concerning the attitude toward novel technology.
The facilitating conditions for AI adoption in accounting and auditing functions, as discussed
in the previous section, are defined by the appropriate mix of people, process, policy and
technology. As a disruptive technology with massive implications for traditional employment
and work processes in the realm of accounting and auditing, AI has attracted as much
criticism as praise. Such socio-technical factors could potentially impact the attitude toward
AI. There are major issues with the explainability of AI systems in accounting and auditing.
This phenomenon has often been described as the black-box effect, whereby users can barely
interpret how AI systems arrive at their outputs. The complexity and lack of interpretability
of AI systems can give rise to negative attitudes among accountants and auditors, a barrier
which needs to be
countered by developing propitious or facilitating conditions within audit, accountancy and
financial consultancy firms. Lee (2022) mentions that several studies found causal links
between algorithmic decision-making and bias. The attitude towards AI use is therefore
defined not only by its supposed advantages but also by the underlying uncertainties.
Unfamiliarity with AI tools can lead accountants and auditors to underestimate the efficacy
of such tools and can foster cognitive bias. To bridge the gap that unfamiliarity and
uncertainty create between the perceived usefulness of AI tools and the usage intention and
attitude of accountants and auditors, there is a need for systematic design,
deployment, governance and adaptation of AI systems. In other words, successful AI
adoption in accountancy and audit firms requires the synergistic coexistence of human and
artificial intelligence. The development of a symbiotic relationship between AI and human
expertise in the context of accountancy and auditing is one of the critical aspects in
generating the feel-good factor. Therefore, ensuring positive attitudes toward AI use in
accounting and auditing functions mandates the development of supportive organisational
facilitating conditions. O’Shaughnessy et al. (2022) argued that the inherent complexity of AI systems makes them resistant to human scrutiny, which heightens their governance requirements. An
end-to-end governance framework is an essential part of the facilitating conditions to support
the adoption of AI in accounting and audit firms. It can be argued that a holistic governance
framework can enhance the critical understanding of AI systems in accountancy and auditing,
which would ensure a balanced perception of such disruptive technologies. This would not
only diminish the black box type of AI application in accounting and auditing functions but
also ensure greater alignment between technology use and strategic process objectives. The
large-scale adoption of AI into accounting and auditing roles in the near future would likely come through a mix of company policy and voluntary usage by the personnel concerned. A positive attitude toward this technology depends largely on the notion that AI is an augmentation of, not a replacement for, accountants and auditors. This assertion points to the significance of the technological competence of accountancy and audit firms and their human resources, the development of which depends explicitly upon organisational facilitating conditions.
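The black box effect discussed above can be partially countered with model-agnostic interpretability techniques. The following is a minimal sketch of permutation importance applied to a hypothetical anomaly-scoring function; the feature names and weights are invented purely for illustration and do not represent any real audit tool.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "black box": an opaque anomaly-scoring function over
# three transaction features (amount, vendor risk, posting delay).
def black_box_score(X):
    return 3.0 * X[:, 0] + 0.5 * X[:, 1] + 0.0 * X[:, 2]

X = rng.normal(size=(500, 3))
baseline = black_box_score(X)

# Permutation importance: shuffle one feature at a time and measure
# how much the model's output changes; a larger change means the
# feature has more influence on the opaque model's decisions.
importance = []
for j in range(X.shape[1]):
    Xp = X.copy()
    Xp[:, j] = rng.permutation(Xp[:, j])
    importance.append(float(np.mean((black_box_score(Xp) - baseline) ** 2)))

print([round(v, 2) for v in importance])
```

Techniques of this kind support the holistic governance frameworks discussed above, as they give auditors a first-order view of which inputs drive an otherwise uninterpretable system.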

Table 4.25 shows that there was no significant negative impact of personal well-being
concerns on the AI usage intentions of accountants and auditors. This finding contradicts many similar studies, which showed that disruptive technology in the workplace had a negative impact on subjective well-being. Beaudry and Pinsonneault (2005) reported a significant negative impact of smart technologies on the well-being perceptions of employees. Beaudry and
Pinsonneault (2005) also reported that emotional stability is a major factor in the acceptance
of technologies in workplaces. The current literature on the subject of the perception of AI
among the workforce points to the presence of fear, anxiety and considerable uncertainty. These perceptions can lead to psychological coping mechanisms such as technology avoidance. It has been
argued by Lane et al. (2023) that employers typically deploy AI to reduce cost and improve
efficiency while employee benefits are not primary considerations. It has been argued that AI
would take over relatively simple and repetitive tasks and free up more time for accountants
and auditors to carry out complex strategic decision-making analyses. On one hand, this can
be viewed as an improvement of the job role of accountants and auditors, but, on the other,
the same can also imply that there would be intense competition among these professionals
and redundancy fears among the relatively less skilful personnel. Thus, the incorporation of
AI can be linked with technology-induced stress among employees who are at a relative
disadvantage in adopting such skills. Therefore, it is logical to argue that when AI, which is
often labelled as inexplicable to personnel with no relevant knowledge, is disrupting the
traditional accounting and auditing functions there might be notable resistance and personal
well-being concerns which would diminish the usage intention of the technology. However,
the current study found that this was not the case, indicating that well-being concerns do not modulate AI use intention among accountants and auditors. The descriptive statistics of the indicators show that no majority of respondents denied the existence of such personal well-being concerns. However, as these concerns did not negatively impact AI use intentions, it can fairly be argued that the skill gap is acknowledged but does not deter adoption. This
signifies that any resistance to AI adoption among accountants and auditors requires
knowledge-based intervention that would enhance familiarity with the technology. By the same token, accounting and audit firms need to support their workforce with adequate assistance to assuage fears of redundancy and low self-esteem.

The results from Table 4.25 show that there is a significant negative impact of personal well-
being concerns (β = -0.122, p < 0.05) on the AI use attitude of accountants and auditors. This
result agrees with the conclusions of many past studies. Wickramasinghe (2009) reported that
the lack of job control and autonomy in ICT use leads to negative attitudes toward technology
use. The lack of autonomy and job control has been linked to the work stress of employees.
In the context of accountants and auditors, it can be said that if an AI system has the black box issues depicted earlier, this can create a perception of lost job control and autonomy, which can cause work stress. The size of the effect (β = -0.122) implies that personal well-being concerns moderately impact the attitude toward AI use. Tarafdar
and Saunders (2022) similarly noted that ICT use can lead to exhaustion and burnout among
employees. Therefore, it becomes clear that if AI negatively impacts the perception of job
autonomy and control, the same can lead to technostress and eventually negative attitudes
toward such disruptive technologies. Rubery and Grimshaw (2001) studied the impact of ICT
on the job quality aspects of employees and pointed out the mixed evidence in the literature
as to the impact of technology on the quality of employment. The existing literature shows
the credibility of both optimistic and pessimistic viewpoints concerning ICT use. Rubery and
Grimshaw (2001) noted the following dimensions of job quality: security of employment, job autonomy and skills. The pessimistic side of the argument makes the following claims.

• ICTs diminish job opportunities in the labour market through automation
• ICTs diminish the relevance of traditional internal career ladders within organisations
• ICTs degrade the relevance of employment regulation requirements
• ICTs can negatively impact the pay bargaining power of employees
• There is a negative impact of ICTs on work-life balance
• Lack of skills can lead to work intensification as a result of ICT use
• ICT use can lead to deskilling of the labour force
• ICTs create dead-end jobs

On the other hand, the optimistic views reported by Rubery and Grimshaw (2001) are the following.

• ICTs create new work opportunities
• ICTs create flexible careers
• ICTs make traditional employment protection requirements redundant
• ICT adoption leads to higher pay and self-employment opportunities
• ICTs tend to reduce job effort
• ICTs help with skill enhancement and better employment opportunities

The aforementioned assertions can help explain the result that personal well-being concerns negatively impacted the AI use attitude among accountants and auditors.
With the incorporation of AI in accountancy and auditing functions, it can be said that job
opportunity prospects might be differently impacted based on the background of the
employees. Some accountants might subscribe to the pessimistic views as described above
while some could relate to the optimistic angle. Based on the current hypothesis result, it can be stated that as personal well-being concerns increased, the attitude toward AI use deteriorated. In other words, AI-induced anxiety, feelings of redundancy, loss
of control over job functions and poor self-image could negatively impact the attitude of the
accountants and auditors toward AI use. The contention that AI would significantly impact
the employment status of certain groups of professionals had a basis in factual evidence. The
prospect of decreasing job opportunities as a result of AI inclusion in accounting and audit is
a concern that negatively impacts the attitude of a certain section of accountants and auditors.
The argument that ICT inclusion in the workplace can change the dynamics of career
progression could also be a significant factor concerning AI use in accountancy and auditing.
This is to say that if AI had the potential to alter the career progression dynamics at
workplaces, the same would be looked upon with mistrust and negativity. Since the
incorporation of AI into different sectors, the redundancy of jobs has been a real problem to
grapple with. There are no legal constraints preventing companies from simply laying off employees, without accountability, after integrating AI into business processes. This
makes the technology particularly favourable to employers as opposed to employees. This
factor might lead to the development of a negative attitude toward AI use in accountancy and
auditing roles. If organisations do not account for this redundancy fear among accountants
and auditors, the corresponding change management process to incorporate AI into
operations could face a huge roadblock. The human resource managers of accountancy and
audit firms must acknowledge that in the absence of employment regulations concerning the
use of AI, it is their priority to address the career growth-related issues that accountants and
auditors could face as a result of such changes. The argument that with the incorporation of
ICTs employees might lose the right to competitive remuneration is also a major factor for
deliberation. In the domain of accountancy and auditing, the incorporation of AI expert
systems might lead to the lowering of pay scales for entry-level positions. As it has been feared that the scope for career advancement under such technological changes would not be distributed symmetrically across the workforce, feelings of angst, powerlessness and inferiority among a section of accountants and auditors are likely. As the majority
of practising accountants and auditors would have to work through the challenges of
upskilling, it can be fairly argued that this situation might lead to work intensification.
Moreover, if this work intensification is not incentivised by organisations, there would be
greater resistance from the staff. Organisations also need to ensure that the incorporation of
AI does not lead to dead-end jobs for a certain section of accountants and auditors.
Therefore, without addressing personal well-being concerns, the negative attitude toward AI
use in accounting and auditing cannot be ameliorated.
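As a point of reference for interpreting the reported coefficient (β = -0.122), the sketch below illustrates on simulated data what a standardized path coefficient measures; the variable names and effect size are assumptions for illustration only, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000

# Simulated survey scores (purely illustrative, not the study's data):
# well-being concern is built to slightly depress AI use attitude.
concern = rng.normal(size=n)
attitude = -0.122 * concern + rng.normal(scale=1.0, size=n)

def standardize(v):
    return (v - v.mean()) / v.std()

zc, za = standardize(concern), standardize(attitude)

# With both variables z-scored, the OLS slope is the standardized
# path coefficient beta; for a single predictor it equals corr(x, y).
beta = np.polyfit(zc, za, 1)[0]
print(round(beta, 3))
```

A standardized β of about -0.12 therefore means that a one-standard-deviation rise in well-being concerns is associated with roughly a 0.12-standard-deviation drop in attitude, which is consistent with describing the effect as moderate.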

H5a was supported (β = 0.199, p < 0.05), indicating that performance expectancy had a
significant positive impact on the AI use intention among the accountants and auditors. This
result corroborates findings made in similar studies in the IS literature. For instance,
Venkatesh et al. (2012) reported that performance expectancy is one of the major drivers of
technology use intention at workplaces. Dwivedi et al. (2019) similarly reported that
performance expectancy is a significant factor in AI adoption. Wang et al. (2013) also
corroborate the current finding by stating that performance expectancy was one of the major
antecedents of technology use intention at workplaces. Lin et al. (2019) also reported along
similar lines, mentioning that performance expectancy was a key factor for technology use
intention among working professionals. The current result implies that the utility of AI tools for accountants and auditors is one of the key determinants of the compatibility between these professionals and the technology. The fact that applying AI to accounting tasks helps improve efficiency and client value is a significant driver of the positive intention to use it. This explains
why both large and small accountancy firms are showing increasing interest in using AI at a scale that fits their strategy. It has been argued that AI use in accounting can solve the problems of efficiency bottlenecks and of resources being spent on low-value activities. Accountants who are keen on improving the value addition of their services would therefore be more hospitable toward the adoption of AI. Thus, performance expectancy can be regarded as one of the key drivers urging growth-oriented accountants to use AI in their job functions. On the other hand, including AI in audit
functions promises to bring new levels of efficiency and accuracy. As the requirements of
risk assessment in the contemporary business environment become more dependent on data,
auditing quality becomes inherently linked with disruptive digital technologies. As the
auditing data takes on a more complex nature with the growing digitalisation of business
functions, AI becomes a key tool for auditors. As in the case of accountants, it has been
similarly argued that AI use would help auditors to focus more on high-value addition
activities. Unlike big auditing and accountancy firms, smaller organisations do not have
access to the resources to develop in-house AI systems to help with their workflow, and this
could be a big factor in the performance expectancy standards of these tools. However, it has
been pointed out that customised auditing and accounting solutions are not always necessary and that “off-the-shelf software solutions” can substitute for them (AICPA, 2020).
Therefore, it can be understood that both bigger and smaller firms can integrate AI at scale.
Figure 5.1 and Figure 5.2 show the impacts of AI-enabled functions on financial reporting
practices and the areas where performance improvements are highly expected.
Figure 5.1: The Impacts of AI Functions on Financial Services (Source: KPMG, 2023)

Figure 5.2: The Benefits of AI for Financial Reporting (Source: KPMG, 2023)

Aside from efficiency and accuracy gains, AI use can help accountants and auditors in
anomaly detection, which is one of the top use cases of such technologies in the financial
service sector (KPMG, 2023). Business stakeholders are likely to expect accountants and auditors to adopt AI to improve the quality of their roles. Such performance expectancy standards therefore remain one of the key drivers of the AI use intention of accountants and auditors.
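The significance behind results such as H5a (β = 0.199, p < 0.05) is typically assessed in PLS-SEM via bootstrap resampling of the path coefficients. The sketch below illustrates the idea on simulated data; the variables and the built-in effect size are hypothetical, not the study's.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1000

# Illustrative data with a built-in positive effect of performance
# expectancy on use intention (hypothetical, not the study's data).
pe = rng.normal(size=n)
intention = 0.2 * pe + rng.normal(scale=1.0, size=n)

def path_coefficient(x, y):
    zx = (x - x.mean()) / x.std()
    zy = (y - y.mean()) / y.std()
    return np.polyfit(zx, zy, 1)[0]

# Bootstrap resampling, as commonly used in PLS-SEM, to judge whether
# a path coefficient differs significantly from zero: if the 95%
# percentile interval excludes zero, the path is deemed significant.
boots = []
for _ in range(1000):
    idx = rng.integers(0, n, size=n)
    boots.append(path_coefficient(pe[idx], intention[idx]))

lo, hi = np.percentile(boots, [2.5, 97.5])
print(round(lo, 3), round(hi, 3))
```

Because the resampled interval excludes zero here, this simulated path would be judged significant, which mirrors the logic behind accepting H5a.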

Referring to Table 4.25, it can be seen that H5b was supported, implying a positive impact of performance expectancy on the AI use attitude of accountants and auditors. The corresponding path coefficient (β = 0.338) indicates a strong impact of performance expectancy on the AI usage attitude. This finding agrees
with several key studies as indicated in the literature review section. Balakrishnan and
Dwivedi (2021) found in their study that performance expectancy had a positive impact on
technology use attitude. Dwivedi et al. (2017) similarly reported the significance of
performance expectancy for a positive attitude toward novel technologies. Venkatesh et al.
(2011) also corroborated this result, indicating performance expectancy as one of the major factors shaping individual attitudes toward technology adoption. In the current research context, it can be said that the usefulness of AI in helping accountants and auditors in
strategic decision-making, enhancing productivity and augmenting the quality of work
performance was the key reason for a positive attitude toward the same. The fields of
financial, accounting, tax, audit and assurance services have been strongly impacted by
disruptive technologies like machine learning, big data, blockchain, robotic process
automation, generative AI, natural language processing and others. In light of the current
result, it can be suggested that contemporary accountants and auditors are well aware of the value-addition aspects of AI and are willing to adopt this technology, or are at least convinced that doing so is essential for their future careers. It has been widely recognised
that the use of AI is going to streamline the overall process of business data collection,
modelling and analysis, and practising accountants and auditors are aware of the implications
of such developments for their profession. This makes them willing to adopt this technology
and develop a positive relationship with it. It has been anticipated that in the future the role of accountants and auditors will be indistinguishable from that of data scientists. The limitations of traditional accounting and auditing can be addressed by using
AI, which is another significant reason for the growing prominence of this technology in these functions. It has previously been discussed that traditional accounting and auditing practices have often been accused of being vulnerable to wilful bias and the manipulation of accounting standards (Sherif & Mohsin, 2021). Zhang et al. (2017) argued that accountants and auditors have had to come to terms with the digitalisation of business processes and have realised that such traditional practices do not fit well with today’s standards of financial reporting accuracy and timeliness. This is perhaps one of the
biggest motivations for these professionals to adopt a positive attitude toward AI. In the early stages, when AI was just making inroads into the accounting, financial, audit and assurance sectors, it was largely believed that the technology was only useful for automating routine, low-impact processes. However, AI has grown by leaps and bounds since those early days and can now be used to solve more complex problems. Jin et al. (2022)
observed that Deloitte is one of the Big Four firms to use AI for complex auditing problem-
solving. Contemporary accountants and auditors are aware of such promising functions of AI
in complex decision-making areas. The acceptance of H5b suggests a consensus that AI-driven decision-making can be considered reliable, which is a critical reason for the positive attitude toward its use. This indicates that AI is not just useful for handling voluminous accounting and auditing work but can also be useful in strategic decision-making. There is a broader consensus that AI is compatible with
human skills in accounting and auditing and the former could be used to augment the latter.
This shows that future-generation accountants and auditors would likely use AI as a strategic
tool.
