Electronic copy available at: http://ssrn.com/abstract=794669

The Impact of Internal Control Quality on Audit Delay in the SOX Era





Michael Ettredge*
University of Kansas

Chan Li
University of Kansas

Lili Sun
Rutgers University - Newark





January 2006





*Corresponding author.
1300 Sunnyside Ave
Lawrence, Kansas 66045-7585
(785) 864-7537
mettredge@ku.edu


The Impact of Internal Control Quality on Audit Delay in the SOX Era


SUMMARY: This study analyzes the impact of internal control quality on audit delay following
the implementation of the 2002 Sarbanes-Oxley Act (SOX). Unlike prior studies that utilize
survey samples, or that employ a proxy for internal control quality such as earnings restatements,
our study employs external auditor assessments of internal control quality that are publicly
disclosed in firms’ SEC 10-K filings as required by SOX Section 404. This makes our study
results both timely and reliable (i.e. not subject to small sample bias or weak proxies). Consistent
with our expectation, we find that the presence of material weakness in internal control over
financial reporting (ICOFR) is associated with longer delays. The types of material weakness
also matter. Compared to specific material weakness, general material weakness is associated
with longer delays. Additional analyses indicate that companies with control deficiencies in
personnel, process and procedure, segregation of duties, and closing process experience longer
delays.

This study also documents a significant increase in audit delay associated with the fulfillment of
the SOX Section 404 ICOFR assessment requirement. This suggests that Section 404 compliance
has made it more difficult for firms to comply with the SEC’s desire to shorten Form 10-K filing
deadlines. Our finding thus supports and helps explain the SEC’s decisions in 2004 and 2005 to
defer scheduled reductions in Form 10-K filing deadlines (from 75 days to 60 days) for large,
accelerated filers.


Keywords: Audit Delay, Internal Control, SOX 404, Filing Deadline.
The Impact of Internal Control Quality on Audit Delay in the SOX Era

INTRODUCTION
Due to recent high-profile accounting scandals, regulators and investors have
become more concerned with the timeliness of financial reporting. Section 409 of the
Sarbanes-Oxley Act (SOX) authorizes the SEC to compel reporting firms to rapidly
disclose to the public any information about material changes in their financial conditions
or operations. As a result, the SEC phased-in accelerated deadlines for filing Form 10-Ks
(from 90 days to 75 days, and then to 60 days) over a three year period starting in 2003
(SEC 2002). However, in August 2004 and again in December 2005, the SEC postponed
the implementation of the accelerated 60-day filing deadline for Form 10-K, due to various
obstacles that make it difficult for reporting firms to meet the new dates. Among these,
one challenge filing firms must overcome is the SOX Section 404 requirement of internal
control over financial reporting (ICOFR) assessment by both management and external
auditors. This provision has itself been subject to SEC-mandated delays in implementation.
Although it has been widely reported in the business press that the SOX and its Section 404
ICOFR assessment cause auditing and filing delays, there has been no rigorous study that
disentangles the impact upon audit delay of Section 404 itself from the impact of the SOX
in general and other delay factors. One goal of this study is to fill this void.
The importance of research on the causes of audit delay has been well recognized.
To restore the confidence of investors in capital markets requires reliable and timely
accounting information. It is well documented that late earnings announcements are more
often associated with lower abnormal returns (Givoly and Palmon 1982; Chambers and
Penman 1984; Kross and Schroeder 1984), and a higher degree of information asymmetry
(Hakansson 1977, Bamber et al. 1993). Audit delay affects the timeliness of both the
annual earnings release and the annual statement filing date. The business press reports that
investors tend to punish firms that both disclose material weakness in ICOFR and that file
their 10-Ks late (Leone 2005). Regulators need to understand the determinants of audit
delay before they can legislate effectively to reduce it (Leventis et al. 2005).
Not only does this study examine the impact upon audit delays of SOX Section 404,
it also investigates the effect of the quality of internal control. Internal control quality is an
important determinant of audit delays that is largely unstudied in prior literature. Due to the
lack of data, prior research uses either survey data (Ashton et al. 1987), or a proxy such as
earnings restatements (Kinney and McDaniel 1993), to analyze the impact of internal
control quality upon audit delays. The SOX Section 404 requirement of ICOFR assessment
now provides publicly available measures of internal control quality. This enables us to
perform analyses using a large number of sample firms, a recent time period, and direct,
external measures of ICOFR, an important type of internal control.
This study’s sample is derived from all firms that filed Section 404 reports from
January 2005 to June 2005 and that are covered by the Audit Analytics Database.
Consistent with prior literature (e.g., Ettredge et al. 2000; Leventis et al. 2005), this study
measures audit delay as the length of time from a company’s fiscal year-end to the date its
external auditor signs the audit report. To examine the impact of the SOX Section 404 upon
audit delay, we perform OLS regressions to compare the sample firms’ audit delays in
fiscal year 2004, the first fiscal year for which Section 404 ICOFR disclosure is required,
with their audit delays in fiscal year 2003. To analyze the impact of internal control quality
upon audit delay, regression analyses are conducted using data from 2004 to compare the
delay difference between the sub-sample of firms with effective ICOFR and the sub-sample
with material weakness. Motivated by Ge and McVay (2005) and Public Company
Accounting Oversight Board (PCAOB) standard No. 2, we also perform regressions to
examine the differential impact upon audit delay of different types of control problems.
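The audit delay measure described above is a simple calendar-day difference between two dates. A minimal sketch follows; the dates are hypothetical, not drawn from the sample:

```python
from datetime import date

def audit_delay(fiscal_year_end: date, report_date: date) -> int:
    # Audit delay: calendar days from the fiscal year-end to the date
    # the external auditor signs the audit report.
    return (report_date - fiscal_year_end).days

# Hypothetical December 31 filer whose auditor signs on March 11.
delay = audit_delay(date(2004, 12, 31), date(2005, 3, 11))  # 70 days
```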
Our results indicate a significant increase in audit delay in 2004 likely due to the
implementation of the SOX Section 404 requirements in 2004. This suggests that the
Section 404 procedures create difficulties for firms trying to comply with the SEC’s
shortened 10-K filing deadlines. Our findings thus explain and support the SEC’s decisions
in 2004 and 2005 to defer for one year (most recently to December 15, 2006) the 60-day
Form 10-K filing deadline for large, accelerated filers (i.e. public float of $700 million or
more). We find that the presence of material weakness in ICOFR is associated with longer
audit delay, and that the length of delays varies with the type of material weakness. In
particular, “general” (firm-wide) material weakness is associated with longer delays than
“specific” material weakness (i.e. specific to particular accounts or transactions). In
addition, we categorize the types of material weakness based upon control systems
components identified by the Committee of Sponsoring Organizations' (COSO)
framework. We find that material weaknesses related to “personnel”, “process and
procedure”, “segregation of duties”, and “closing process” are associated with longer
delays.
The rest of the paper is organized into four sections. The second section comprises
study background and hypotheses development. The third section discusses research
method and data selection. The fourth section presents results, and the last section presents
conclusions, limitations, and future research.

BACKGROUND AND HYPOTHESES
Background on SOX Section 404
After Enron and the WorldCom scandals, Congress passed the landmark SOX Act
in 2002 to restore investor confidence. One of the most significant provisions of the SOX is
Section 404, which requires management to assess and report on the company’s ICOFR,
and requires external auditors to attest and report on the assessments made by the
managers, as well as providing their own reports on the ICOFR. The main purpose of the
SOX Section 404 is to satisfy the need of investors to have confidence not only in the
financial reports issued by a company, but also in the underlying processes and controls
that are an integral part of producing those reports.
Shortly after the passage of SOX, the SEC established the SOX 404 implementation
dates (SEC 2003) that apply to companies other than registered investment companies. A
company that is an "accelerated filer" (public float of $75 million or more) must begin to
comply with the requirement for ICOFR assessments in its first fiscal year ending on or
after June 15, 2004. A company that is not an accelerated filer (public float less than $75
million) must begin to comply with the annual assessments for its first fiscal year ending on
or after April 15, 2005. Subsequently in February 2004, in March 2005, and again in
September 2005, the SEC extended compliance and reporting deadlines, particularly for
smaller companies. The most recent updated SEC rule required that accelerated (non-
accelerated) filers begin Section 404 report compliance in the first fiscal year ending on or
after November 15, 2004 (July 15, 2007).
Observers argue that requiring small companies to comply with Section 404 is very
costly and that the cost may outweigh the benefit. Under a recent rule (SEC, 2005 “Exemptive
Order”), accelerated filers that have a fiscal year ending between and including Nov. 15,
2004 and Feb. 28, 2005, and that had public equity floats of less than $700 million at the
end of their second fiscal quarter in 2004, have been granted 45 additional days to file their
complete 404 reports. As mentioned earlier, the SEC has three times delayed the 404
compliance date for smaller filers. In December 2005, the SEC's Advisory Committee on
Smaller Public Companies recommended to the full SEC that smaller businesses (market
capitalization below $700 million) no longer be required to comply at all with the Section
404 provisions.
Hypotheses Development
Our first research question is whether the ICOFR disclosure requirements under
SOX Section 404 increase audit delays, and if so, by how much? The answer to this
question will provide information about the amount of additional time needed to ‘audit’ the
internal controls. We expect that ICOFR assessments increase audit delays for the
following reasons.
Adding a new reporting requirement for external auditors should increase the time it
takes to complete the audit, especially in the first year of implementation. The provision of
an ICOFR assessment is by no means an easy task. Under SOX Section 404, public
companies need to design, document, and analyze their ICOFR. In effect, they must create
elaborate internal control procedural manuals and update them whenever processes change
(Calabro 2004). Before reporting on the ICOFR, both management and external auditors
must test the controls for design and operating effectiveness. The PCAOB (Standard No. 2)
requires auditors to conduct inquiries, observations, inspections of relevant documents, and
specific evaluations of important control stages. The PCAOB also limits the extent to
which external auditors can rely on the work of others, even though internal auditors may
already have tested the processes. Moreover, because the external auditors are required to
test any controls that have significant impact on companies’ financial statements, they
must be vigilant for weaknesses that may appear in a variety of processes, ranging from
how journal entries are consolidated and adjusted, to what information technology controls
are implemented to protect the company’s information systems (Calabro 2004). The
extended audit work should lead to audit delays (Knechel and Payne 2001).
Anecdotal evidence from the business press supports this scenario. Press accounts
state that 404 requirements increased the time companies took to file their fiscal year 2004
reports, and that the number of companies missing the filing deadline for annual reports
therefore jumped in 2005 (Richardson 2005; Hadi 2005). Based on the above reasoning,
our first hypothesis, in alternative form, is:
H1: Companies implementing the new internal control over financial reporting
(ICOFR) requirements experience increased audit delays, ceteris paribus.
In testing H1 we desire a benchmark for what might constitute a significant delay. We
employ fifteen calendar days as the benchmark for the following reasons. First, several
recent SEC reporting requirements are intended to shorten Form 10-K filing deadlines by
fifteen calendar days (i.e. from 90 to 75 days, and from 75 days to 60 days). Second, the
SEC typically grants a fifteen calendar day extension to those firms which file Form 12b-
25 to notify the SEC of their inability to file the Form 10-K on time. Clearly the SEC
considers fifteen days to be a significant or material number for filing and reporting
purposes.
Our second research question is whether the quality of ICOFR influences audit
delays. In general, a weak internal control potentially allows accounting errors to occur and
to go undetected. Auditors therefore need to extend their scope of work and perform
additional substantive tests to compensate for the control weakness (Doss 2004; Leech
2004). The extended audit effort due to control weakness should lead to longer audit delay.
Thus we expect that firms whose auditors assert ICOFR problems will exhibit delays in
excess of those encountered by other firms that implement 404 requirements. The PCAOB
(Standard No. 2) designates three types of ICOFR problems: a control deficiency, a
significant deficiency, or a material weakness. Public companies are only required to
disclose material weaknesses in their 404 reports.[1] Thus the SOX 404 reports only
allow researchers to determine whether or not material weakness exists, and if so its
nature.
Therefore, our initial measurement of ICOFR quality is the presence or absence of material
weakness. To test our expectation that weaker internal control is associated with longer
delay, the second alternative hypothesis is:
H2: Companies with material weaknesses in their ICOFR experience longer audit
delays than companies with effective ICOFR, ceteris paribus.
Our third main research question is whether different types of material weakness in
ICOFR have differential impacts upon audit delays. This research question is motivated by
a recent study by Ge and McVay (2005). Using a sample of 261 companies that disclosed
material weakness in ICOFR in their SEC filings after SOX, Ge and McVay (2005) provide
initial evidence on the common types of material weakness firms disclose after the
implementation of Section 404. They observe several types of material weakness, from
account-specific weaknesses (such as those specific to revenue recognition or current
accruals) to general weaknesses (such as those affecting personnel training, technology
issues, or control environment). Since the impact of each type of material weakness upon
the financial reporting system can differ, the extended audit work needed to compensate for
each type of material weakness may vary as well.

[1] We measure material weakness as an adverse auditor opinion on a client’s ICOFR. An
auditor is required to issue an adverse opinion on the ICOFR if one or more material
weaknesses exist in the company’s ICOFR (PCAOB Standard No. 2).
We develop two sets of material weakness categorizations. Our first classification
scheme is based upon the PCAOB’s Standard No. 2 and Moody’s recommendations (Doss
2004). These sources classify ICOFR material weakness into company-level control issues
and specific control issues. Company-level controls refer to controls that “might have a
pervasive effect on the achievement of many overall objectives of the control criteria”
(PCAOB Standard No. 2, para. 52, p. 163). We refer to control problems at the company
level as “general” material weaknesses, which include situations such as an ineffective
control environment, an ineffective audit committee, an inadequate internal audit or risk
assessment function, and an ineffective financial reporting process (PCAOB Standard No.
2). Specific controls are those that are “designed to achieve specific objectives of the
control criteria” (PCAOB Standard No. 2, para. 50, p. 163). We refer to controls at a
specific objective level as “specific” material weaknesses. These relate to control problems
over transaction-level processes or specific account balances, such as inventory, accounts
receivable, and legal proceedings (Doss 2004).
When a specific material weakness is identified, auditors can effectively audit
around it by performing additional substantive procedures. In contrast, a general material
weakness is more serious (Doss 2004). It impacts the financial reporting process so
pervasively that the scope of audit work must be expanded and audit effort must be
increased for each auditing objective (PCAOB Standard No. 2). As a result, the audit delay
should be more substantial in the presence of a general weakness. The above arguments lead to
our third alternative hypothesis:
H3: Companies with general material weaknesses in the ICOFR experience longer
audit delays than companies with specific material weaknesses, ceteris paribus.
Our second classification scheme for material weakness is based upon the specific
elements of internal control systems defined by the Committee of Sponsoring
Organizations' (COSO) framework. Under this scheme, eight types of material weakness
are defined: ‘Personnel’, ‘Process and Procedure’, ‘Documentation’, ‘Segregation of
Duties’, ‘Information Systems Process’, ‘Risk Assessment/Control Design’, ‘Closing
Process’, and ‘Control Environment’. Since we are not aware of any theory or empirical
evidence that indicates material weakness due to one type of COSO component might
cause longer delay than others, we do not specify hypotheses based upon the COSO
classification scheme, and results are presented in a section on additional analyses.

RESEARCH METHOD AND DATA SELECTION
Research Method
We use the following model to test H1:
AUDELAY = a + b1 YEAR + b2 SIZE + b3 FININD + b4 HIGHTECH + b5 ROA
        + b6 LEVERAGE + b7 GOCERN + b8 EXT + b9 SEGNUM + b10 LOSS
        + b11 RESTATE + b12 AFEE + b13 AOPIN + b14 AUDCHG
        + SUM(i=1 to 13) YEAR x CONTROL_VAR_i                          (1)

AUDELAY      Equals the number of calendar days from the fiscal year end
             to the date of the auditor’s report.

YEAR         Equals 1 for fiscal year 2004; 0 for 2003 (used to test H1).

SUM(i=1 to 13) YEAR x CONTROL_VAR_i
             Represents the 13 interaction terms between YEAR and the
             13 control variables.

Model (1) is estimated using data for both 2003 (pre-SOX 404) and 2004 (post-SOX 404).
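Because every control variable enters model (1) both alone and interacted with YEAR, each control is allowed a separate slope in each year. A minimal sketch of one design-matrix row; the variable names and values are illustrative, not the paper's data:

```python
def design_row(year, controls):
    # Model (1) regressors for one firm-year: intercept, YEAR dummy,
    # the control variables, and the YEAR x control interaction terms.
    return [1.0, float(year)] + list(controls) + [year * c for c in controls]

# Hypothetical 2004 observation with three of the thirteen controls
# (say SIZE, ROA, LOSS) shown for brevity.
row_2004 = design_row(1, [6.2, 0.04, 1.0])
# For a 2003 observation (YEAR = 0) every interaction term is zero, so
# the control coefficients describe the 2003 baseline and the
# interaction coefficients capture how each slope shifts in 2004.
row_2003 = design_row(0, [6.2, 0.04, 1.0])
```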
The following regression model is used to test H2 and H3:
AUDELAY = b0 + b1 MWIC (or GMWIC/SMWIC) + b2 SIZE + b3 FININD + b4 HIGHTECH
        + b5 ROA + b6 LEVERAGE + b7 GOCERN + b8 EXT + b9 SEGNUM
        + b10 LOSS + b11 RESTATE + b12 AFEE + b13 AOPIN + b14 AUDCHG   (2)

MWIC                Identifies whether a company has a material weakness in its
(used to test H2)   ICOFR (1 = material weakness; 0 = otherwise).

GMWIC               Identifies whether a company has a general material weakness
(used to test H3)   in its ICOFR (1 = general material weakness; 0 = otherwise).

SMWIC               Identifies whether a company has a specific material weakness
(used to test H3)   in its ICOFR (1 = specific material weakness; 0 = otherwise).

Model (2) is estimated using only 2004 data (post-SOX 404).
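With effective-ICOFR firms as the omitted group, the coefficient b1 on MWIC in model (2) has a simple univariate analogue: the gap in mean audit delay between the two sub-samples before controls are added. A sketch with hypothetical delays:

```python
from statistics import mean

def delay_gap(delays_mw, delays_eff):
    # Difference in mean audit delay (days) between firms reporting a
    # material weakness and firms reporting effective ICOFR -- the
    # univariate analogue of b1 before the control variables are added.
    return mean(delays_mw) - mean(delays_eff)

# Hypothetical sub-samples of audit delays, in days.
gap = delay_gap([85, 89, 87], [66, 68, 67])  # 20 days
```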
The variables in the above two models are defined in table 1. Based upon prior
research, we control for the effects of some other factors that likely affect audit delay:
client size, client industry, client financial condition, client extraordinary items, client
business complexity, client net losses, restatements of financial reports, audit fee, auditor’s
opinion on the financial statements, and auditor change (Ashton et al. 1989; Newton and
Ashton 1989; Bamber et al. 1993; Kinney and McDaniel 1993; Schwartz and Soo 1996;
Jaggi and Tsui 1999; Ettredge et al. 2000; Cullinan 2003; Leventis et al. 2005).
Data Selection
Our initial sample consists of 3,098 companies that filed 404 reports from January
2005 to June 2005, as provided by the Audit Analytics Database. We then require the sample
companies to have the necessary financial statement variables available from Compustat.
This procedure yields a final sample of 2,476 firm-observations for fiscal year 2003, and
2,391 firm-observations for fiscal year 2004, including 2,051 with effective ICOFR and
340 with material weakness. Note that the numbers of firms included in the Year 2003
sample and the Year 2004 sample are nearly equal.[2] The slight difference in sample size
(80 firms) is due to missing data. 96% of our sample firms have a fiscal year ending
between December 1 and March 31. Therefore controlling for non-busy season engagement
is not needed. The companies’ ICOFR status (effective vs. material weakness) is obtained
from the Audit Analytics Database. For firms having material weakness in 2004 we read
their auditors’ 404 reports and manually assign the material weaknesses to appropriate
categories (i.e. specific or general, and COSO category). We provide more details on the
categorization of material weakness in the next section.

[2] When testing model (1) we investigate whether results are sensitive to pooling data
for 2003 and 2004.

EMPIRICAL RESULTS

Descriptive Statistics

Table 2 reports descriptive statistics. Panel A shows the audit delay in 2004 is
significantly longer than that in 2003. The mean delay in 2003 is 50 days, while the mean
delay in 2004 is 70 days. Although the 70-day average delay for 2004 is still within the
10-K filing compliance period of 75 days allowed at that time, it is longer than the average
delay for 2003 by about 20 days. This supports H1, and exceeds the 15 day threshold that
we view as clearly significant. Within the 2004 sample, the mean audit delay for companies
with material weakness in their ICOFR (87 days) is also significantly longer than that for
companies with effective ICOFR (67 days). This supports H2. In Panel B, three sets of data
are reported: for the full sample of 2,391 companies that reported ICOFR status in 2004,
for the sub-sample of 340 companies that have material weakness in their ICOFR, and for
the sub-sample of 2,051 companies that have effective ICOFR. Univariate analysis
indicates that the companies with material weakness in their ICOFR are significantly
smaller, are less likely to be in financial industries, are more likely to be in high-tech
industries, have lower returns on assets, report more losses, have more restatements, pay
higher audit fees (scaled by assets), are more likely to change auditors, and are more likely
to receive modified auditor opinions other than going concern opinions.
Panel C in table 2 also reports three sets of data: for the full sample of 340
companies that have material weakness in their ICOFR for fiscal year 2004, for the sub-
sample of 126 companies that have general material weakness, and for the sub-sample of
214 companies that have specific material weakness. To determine the types of material
weakness, one of the authors and an accounting doctoral student independently categorized
material weaknesses into general or specific. The percentage agreement between the two
coders was 95%. At the end of the coding process, the two coders met to reconcile
differences and arrived at a consensus in instances where the original classification had
been in disagreement. In the Appendix, we provide examples of material weaknesses under
each type. Univariate analysis suggests that companies with general material weakness are
significantly smaller, are more likely to be in high-tech industries, have lower return on
assets, report more losses, are less likely to restate financial reports, and are less likely to
receive modified opinions other than for going concern.
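The 95% inter-coder agreement reported above is a simple matching rate between the two coders' category assignments. A sketch with hypothetical codings, not the actual 340-firm sample:

```python
def percent_agreement(codes_a, codes_b):
    # Share of items that two independent coders assigned to the same
    # category ('G' = general, 'S' = specific material weakness).
    assert len(codes_a) == len(codes_b)
    matches = sum(a == b for a, b in zip(codes_a, codes_b))
    return matches / len(codes_a)

# Twenty hypothetical material weaknesses; the coders disagree on one.
coder_1 = ['G'] * 7 + ['S'] * 13
coder_2 = ['G'] * 6 + ['S'] * 14
rate = percent_agreement(coder_1, coder_2)  # 0.95
```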
Table 3 reports correlations among variables. Although several independent
variables have correlations above 0.35, the highest variance inflation factor (VIF) in our
regressions is 2.07, suggesting multicollinearity is unlikely to be problematic.
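For reference, with two predictors the variance inflation factor reduces to 1/(1 − r²), where r is their pairwise Pearson correlation. A self-contained sketch with made-up data:

```python
def vif_two_predictors(x1, x2):
    # VIF for either of two predictors: 1 / (1 - r^2), where r is the
    # Pearson correlation between them. A common rule of thumb flags
    # VIF above 10 as problematic; the paper's maximum is 2.07.
    n = len(x1)
    m1, m2 = sum(x1) / n, sum(x2) / n
    cov = sum((a - m1) * (b - m2) for a, b in zip(x1, x2))
    v1 = sum((a - m1) ** 2 for a in x1)
    v2 = sum((b - m2) ** 2 for b in x2)
    r_squared = cov * cov / (v1 * v2)
    return 1.0 / (1.0 - r_squared)

# Moderately correlated hypothetical regressors (r^2 = 0.64).
vif = vif_two_predictors([1, 2, 3, 4, 5], [2, 1, 4, 3, 5])  # ~2.78
```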
Regression Results
To test whether and how audit delay differs between the pre-404 and post-404
periods, we conduct OLS regression tests on the pooled sample of 4,867 firm-year observations based on
equation (1). We regress audit delay against a dummy variable representing the YEAR (1 =
fiscal year 2004, 0 =fiscal year 2003), control variables, and interactions between each
control variable and YEAR. Model 1(a) of Table 4 reports the results. The coefficient of
YEAR (35.4) is positive and is highly significant (p =0.000), suggesting the
implementation of 404 requirements is associated with increased audit delay of about 35
days for fiscal year 2004. This supports H1. Table 4 also reports regression results for the
year 2004 sample [model 1(b)] and the year 2003 sample [model 1(c)] respectively. The
difference between the two models’ intercepts is approximately 35 (112.9-77.7). This
difference is highly significant (p =0.000, not tabulated), which again provides support for
H1.[3] These results suggest that the SOX Section 404 ICOFR assessment has generated a
significant increase in audit delay, which will make it more difficult for reporting firms to
meet the SEC’s shortened filing deadline target of 60 days for 10-Ks. Our results also
support the SEC’s 2005 decision to defer for one year the effective date of 60-day filing
deadlines for large accelerated filers to December 2006.

[3] Estimating model (1) for each year separately eliminates the possibility that p-values
are overstated due to pooling over years of observations that are not fully independent.
Among control variables, consistent with prior literature and our expectations,
regression results suggest that companies with longer delays are smaller, have higher
leverage, report more losses, are more likely to restate financial reports, and are more likely
to receive modified auditor opinions. Unlike Leventis et al. (2005), this study finds that
change of auditors is associated with longer delay. When a company changes its auditor,
the new auditor will usually take some time to thoroughly understand the company’s
business and to communicate with the predecessor auditor. Auditors likely view initial
engagements as inherently riskier, and devote extra care to the audit.
We also find that higher (scaled) audit fees are associated with longer delays. Audit
fees should reflect the level of complexity of the audit. While some earlier studies (Ashton
et al. 1989; Newton and Ashton 1989; Bamber et al. 1993) find that financial companies
have shorter delays, this study documents that financial companies had longer delays in
2003, although this difference was not significant in 2004. The longer delay for financial
institutions in 2003 could be due to the changing environment for auditing financial
instruments. Over the last decade, financial instruments increased in complexity, and are no
longer easier to audit than non-financial assets. Contrary to our expectation that
firms in high-technology industries should have longer delays because of higher litigation
risk (Kasznik and Lev 1995; Bonner et al. 1998), our results suggest that firms in high-
technology industries have shorter delays. One possible explanation for this could be that
high-technology firms have more sophisticated accounting information systems which
allow them to accomplish new reporting tasks faster. Finally we note that coefficients for
several of the YEAR interaction terms in model (1a) are significant, indicating shifts in
coefficients from 2003 to 2004. This is not surprising given the magnitude of the task that
SOX 404 requirements apparently imposed on companies and their auditors.
Table 5 reports the results for testing the relationship between audit delay and
internal control quality. Based upon equation (2), OLS regression tests are performed using
the fiscal year 2004 sample for which ICOFR status is reported. Model (2a) compares the
audit delay between firms reporting effective ICOFR and firms reporting any type of
material weakness. Results indicate that companies with material weakness in their ICOFR
experience significantly longer delays (about 16 days, p =0.000). These findings
empirically support our expectation stated in H2 that weaker internal control is associated
with longer audit delay.
Model (2b) analyzes the difference in audit delay between firms with general
material weakness versus specific material weakness. Again, OLS regression tests are
performed using the year 2004 sample, and based upon equation (2), with the indicator
variable for material weakness being replaced with two indicators for general and specific
material weakness. In model (2b), the coefficient for the indicator variable for general
material weakness is positive and significant (p =0.020), while the coefficient for the
indicator variable for specific material weakness is insignificant. The difference between
the two coefficients is significant (p =0.000, not tabulated). These results suggest that
companies with general material weakness in their ICOFR experience longer audit delays
than companies with specific material weakness, which supports H3.
To summarize, analyses presented in Table 5 indicate that the presence of material
weakness in a company’s ICOFR is associated with longer audit delay, especially when the
material weakness reflects company-level control issues. This result probably reflects
extended audit work that must be done to compensate for inferior internal control (ICOFR)
quality. As the control problem becomes more serious, the amount of extra audit work is
increased accordingly.
Additional Analysis
Types of Material Weakness based upon COSO Framework
Next we classify material weakness using a different classification scheme, i.e., the
Committee of Sponsoring Organizations' (COSO) framework. Then we conduct an
additional analysis on the relationship between audit delay and types of material weakness.
Under the COSO framework, material weakness is classified based upon the specific
components/elements of an internal control system. We therefore categorize internal
control material weakness into eight major types specified by COSO: ‘Personnel’
(PERSONNEL), ‘Process and Procedure’ (PROCESS), ‘Documentation’ (DOCUMENT),
‘Segregation of Duties’ (SEGREGATE), ‘Information Systems Process’ (ISPROCESS),
‘Risk Assessment/Control Design’ (RISKASSESS), ‘Closing Process’ (CLOSING), and
‘Control Environment’ (CONTRENV). Again, one of the authors and an accounting
doctoral student independently assigned material weaknesses observed to the various
COSO categories. The percentage agreement for the COSO-based categorization was 89%.
At the end of the coding process, the two coders met to reconcile
differences. The Appendix provides examples of material weaknesses categorized as each
type. Figure 1 presents the number of material weaknesses under each COSO-based
category. This figure is based upon the 340 firms that reported material weakness in their
Section 404 reports filed from January 2005 to June 2005. The total number of COSO-
based deficiencies disclosed is 649. On average, each firm has 1.9 COSO-based deficiency
types. The most frequent type of material weakness is ‘Process and Procedure’. The next
most common types are ‘Personnel’, ‘Closing Process’, and ‘Segregation of Duties’.
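The tally behind Figure 1 (649 deficiency types across 340 firms, about 1.9 per firm) is a per-firm count of disclosed COSO categories. A sketch with hypothetical firm-level disclosures:

```python
from collections import Counter

# Each inner list holds the COSO-based material-weakness types one
# firm's auditor disclosed (hypothetical data, not the actual sample).
firms = [
    ['PROCESS', 'PERSONNEL'],
    ['PROCESS', 'CLOSING'],
    ['PROCESS', 'SEGREGATE', 'PERSONNEL'],
    ['CLOSING'],
]

by_type = Counter(t for types in firms for t in types)
total_deficiencies = sum(by_type.values())       # 8 deficiency types in all
mean_per_firm = total_deficiencies / len(firms)  # 2.0 types per firm
most_frequent = by_type.most_common(1)[0][0]     # 'PROCESS'
```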
Although not identical to Ge and McVay (2005), our two sets of material weakness
categorizations (i.e., general versus specific, and COSO-based), when combined, capture a
breakdown similar to Ge and McVay’s classification scheme. For instance,
the ‘Account Specific’ and ‘Revenue Recognition’ types identified by Ge and McVay are
designated as ‘specific material weaknesses’ in this study. Their ‘Senior Management’ type
is categorized as a ‘general material weakness’. The types designated as ‘Personnel’,
‘Segregation of Duties’, ‘Information Systems Process’, ‘Risk Assessment/Control
Design’, ‘Closing Process’, and ‘Control Environment’ defined under our COSO-based
categorization are also included in Ge and McVay’s scheme.
To examine the association between the COSO-based types of material weakness
and audit delay, we replace the indicator variable for material weakness in equation (2)
with indicator variables representing the COSO-based types of material weakness. The regression is
again estimated using the 2,391 firms that filed 404 reports for fiscal year 2004. Results
presented in table 6 suggest that companies with internal control problems in personnel,
process and procedure, segregation of duties, and closing process have longer delays.
Interestingly, these four types are also the most frequent types reported, as shown in figure
1. These findings therefore indicate that ‘Personnel’, ‘Process and Procedure’, ‘Segregation
of Duties’, and ‘Closing Process’ not only are the most pervasive types of internal control
deficiency, but also are among the more serious types, requiring more audit work.⁴

⁴ In addition, the least frequent types might lack explanatory power because the
dichotomous variables representing those types lack variability (i.e., they are coded
predominantly as ‘zero’, with few coded as ‘one’).
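The estimation just described, audit delay regressed on indicator variables for weakness types plus controls, can be sketched as follows. This is a minimal illustration on simulated data (all variable values and coefficients are invented), not the paper’s actual sample or full specification.

```python
# Minimal OLS sketch: audit delay on weakness-type dummies plus a control.
# Variable names echo the paper; the data here are simulated, not the study's.
import numpy as np

rng = np.random.default_rng(0)
n = 500
personnel = rng.integers(0, 2, n)  # 1 = 'Personnel' weakness disclosed
closing = rng.integers(0, 2, n)    # 1 = 'Closing Process' weakness disclosed
size = rng.normal(20, 2, n)        # log total assets (control variable)

# Simulated delays: weaknesses lengthen delay, firm size shortens it.
audelay = 100 + 7 * personnel + 11 * closing - 2 * size + rng.normal(0, 5, n)

# Design matrix with an intercept column; least-squares estimates.
X = np.column_stack([np.ones(n), personnel, closing, size])
beta, *_ = np.linalg.lstsq(X, audelay, rcond=None)
# Positive coefficients on the dummies indicate longer delays for firms
# disclosing those weakness types, holding the control constant.
```

With real data, one would of course report t-statistics and p-values for each dummy, as in table 6.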
Is Delay Due to SOX in its Entirety, or Specifically to Section 404?
It is important to establish that the increase in audit delay documented in this study is
due to Section 404 itself, not to SOX in its entirety. Our research design addresses this
concern: we compare the sample firms’ audit delays in fiscal
year 2004, the first fiscal year for which Section 404 ICOFR disclosure is required, with
their audit delays in fiscal year 2003. To further address this issue, we re-estimate equation
(1) for a sample of small companies (public float smaller than $75 million) and for foreign
registrants that are not required to file 404 reports in our sample period. As publicly
traded firms, however, they are subject to other provisions of SOX. Our regression results
in table 7 show a marginally significant decrease in delay for fiscal year 2004 compared to
2003.⁵ Therefore, the increased delays documented elsewhere in this study are likely due to
the implementation of Section 404, not to SOX in general.
Exclusion of Exempted Filers
In our sample, there are 145 small accelerated filers (public equity floats of less
than $700 million) that used the 45-day extension granted by the SEC to file SOX 404
reports.⁶ As a sensitivity analysis, we rerun the regressions excluding those companies.
Our main results still hold (not tabulated).
Voluntary Disclosure of Material Weakness
Using information available at the Compliance Week web site, we identify 78
sample firms that voluntarily disclosed, before December 31, 2004, that they had material
weaknesses in ICOFR. It is interesting to examine whether voluntary disclosure of
material weakness is a signal of early awareness that allows for better planning, and
therefore decreases audit delay. The average audit delay for these 78 voluntary disclosers is
89 days (not tabulated), which does not differ significantly from the average audit delay (86
days) for the other sample firms with material weakness. Therefore, our results do not
suggest that voluntary, early disclosure is associated with reduced audit delays.

⁵ See the estimated coefficient for YEAR in model (1a) of table 7, and compare the
intercepts of models (1b) and (1c).
⁶ Those companies filed Form 10-K/A to amend their Section 404 reports.
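The comparison above is a simple two-sample test of mean delays. A minimal sketch, using Welch’s t-statistic on hypothetical delay values (not the study’s data):

```python
# Welch's two-sample t-statistic for a difference in mean audit delays.
# The delay values below are hypothetical illustrations.
import math

def welch_t(x, y):
    """Welch's t-statistic for the difference in means of two samples."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    vx = sum((v - mx) ** 2 for v in x) / (len(x) - 1)  # sample variances
    vy = sum((v - my) ** 2 for v in y) / (len(y) - 1)
    return (mx - my) / math.sqrt(vx / len(x) + vy / len(y))

# Hypothetical audit delays in days:
voluntary = [89, 95, 83, 92, 86]  # mean 89
others = [86, 92, 80, 89, 83]     # mean 86
t = welch_t(voluntary, others)
```

A |t| as small as this toy example produces would not reject equality of the two means, which is the pattern the study reports for its 78 voluntary disclosers.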

DISCUSSION AND CONCLUSIONS
This study has two major purposes. First, it examines the impact of SOX Section
404 ICOFR assessments on audit delay. We document a statistically significant and
material (in magnitude) increase in audit delay associated with the implementation of SOX
Section 404 reporting requirements, after controlling for other delay factors. This suggests
that SOX Section 404 compliance has added another layer of difficulty in achieving the
SEC’s desired shortened 10-K filing deadlines. Our results thus support the SEC’s decision
to defer for one year (to December 15, 2006) the 60-day Form 10-K filing deadline for
large accelerated filers.
Second, we analyze the impact of internal control quality (ICOFR) on audit delays
in the post-SOX era. Unlike prior studies that obtain internal control quality data from
relatively small surveys, or that use a quality proxy such as earnings restatements, this
study employs publicly disclosed data in firms’ SEC filings, which now are available from
firms’ Section 404 internal control assessments. This provides a relevant, external metric
for a large number of firms. Consistent with our expectations, we find that the presence of
material weakness in ICOFR is associated with longer delays. The types of material
weakness also matter. Compared to specific material weakness, general material weakness
is associated with longer delays. These findings indicate that control problems, especially
company-level control issues such as an ineffective control environment or an ineffective
audit committee, are compensated for by extended audit work that significantly lengthens
audit delay. In addition, we also attempt to classify control deficiencies based upon the
specific elements of the internal control systems provided by the COSO framework. We
find that four types of COSO-based control deficiencies are associated with significantly
longer audit delays, i.e., ‘Personnel’, ‘Process and Procedure’, ‘Segregation of Duties’, and
‘Closing Process’.
Results of this study increase our understanding of the impact of SOX Section 404
requirements, and the impact of internal control quality upon audit delays. The study has a
major limitation: to ensure its timeliness, we examine only one year of data
after SOX 404 implementation. Future research can analyze whether and how the impact of
Section 404 upon audit delay changes over time as more data become available. For
example, is SOX 404 implementation subject to a learning curve such that delays will
decrease in the future?
References
Ashton, R. H., P. R. Graul, and J. D. Newton. 1989. Audit delay and the timeliness of
corporate reporting. Contemporary Accounting Research 5 (2): 657-673.
Ashton, R. H., J. J. Willingham, and R. K. Elliott. 1987. An empirical analysis of audit
delay. Journal of Accounting Research 25 (2): 275-292.
Bamber, E. M., L. S. Bamber, and M. P. Schoderbek. 1993. Audit structure and other
determinants of audit report lag: An empirical analysis. Auditing: A Journal of
Practice & Theory 12 (1): 1-23.
Bonner, S. E., Z. Palmrose and S. M. Young. 1998. Fraud type and auditor litigation: an
analysis of SEC accounting and auditing enforcement releases. The Accounting
Review 73 (4): 503-532.
Calabro, L. 2004. Looking under the hood. CFO Magazine. May 1.
Chambers, A. E., and S. H. Penman. 1984. Timeliness of reporting and the stock price
reaction to earnings announcements. Journal of Accounting Research 22 (1): 21-47.
Cullinan, C. P. 2003. Competing size theories and audit lag: evidence from mutual fund
audits. Journal of American Academy of Business 3 (1/2): 183-193.
Doss, M. 2004. Section 404 reports on internal control: impact on ratings will depend on
nature of material weakness reported (October). Moody’s Special Comment. Moody’s
Investors Service, Inc.
Ettredge, M., D. Simon, D. B. Smith, and M. Stone. 2000. Would switching to timely
reviews delay quarterly and annual earnings releases? Review of Quantitative
Finance & Accounting 14 (2): 111-130.
Ge, W., and S. McVay. 2005. The disclosure of material weakness in internal control after
the Sarbanes-Oxley Act. Accounting Horizons 19 (3): 137-158.
Givoly, D., and D. Palmon. 1982. Timeliness of annual earnings announcements: some
empirical evidence. The Accounting Review 57 (3): 486-508.
Hadi, M. 2005. More companies are filing late. Wall Street Journal, April 11: C3.
Hakansson, N. H. 1977. Interim disclosure and public forecasts - an economic analysis and
a framework for choice. The Accounting Review 52 (2): 396-416.
Jaggi, B., and J. Tsui. 1999. Determinants of audit report lag: Further evidence from
Hong Kong. Accounting and Business Research 30 (1): 17-28.
Kasznik, R., and B. Lev. 1995. To warn or not to warn: Management disclosures in the
face of an earnings surprise. The Accounting Review 70 (1): 113-134.
Kinney, W. R., Jr., and L. S. McDaniel. 1993. Audit delay for firms correcting quarterly
earnings. Auditing: A Journal of Practice & Theory 12 (2): 135-142.
Knechel, W. R., and J. L. Payne. 2001. Additional evidence on audit report lag. Auditing: A
Journal of Practice & Theory 20 (1): 137-146.
Kross, W., and D. A. Schroeder. 1984. An empirical investigation of the effect of
quarterly earnings announcement timing on stock returns. Journal of Accounting
Research 22 (1): 153-176.
Leech, T. 2004. Moody’s questions ability to audit around deficiencies. Compliance Week,
December 7.
Leone, M. 2005. How markets punish material weaknesses. CFO.com, July 21.
Leventis, S., P. Weetman, and C. Caramanis. 2005. Determinants of audit report lag:
Some evidence from the Athens Stock Exchange. International Journal of Auditing 9:
45-58.
Newton, J. D., and R. H. Ashton. 1989. The association between audit technology and
audit delay. Auditing: A Journal of Practice & Theory 8 (Supplement): 22-37.
Public Company Accounting Oversight Board (PCAOB). 2004. Auditing Standard No. 2:
An audit of internal control over financial reporting performed in conjunction with an
audit of financial statements.

Richardson, K. 2005. Big board proposes crackdown on late filers. Wall Street Journal,
February 11: C3.

Schwartz, K. B., and B. S. Soo. 1996. The association between auditor changes and
reporting lags. Contemporary Accounting Research 13 (1): 357-370.

U.S. Securities and Exchange Commission (SEC). 2002. Acceleration of periodic report
filing dates and disclosure concerning website access to reports. September 5.

U.S. Securities and Exchange Commission (SEC). 2003. Management’s reports on internal
control over financial reporting and certification of disclosure in Exchange Act periodic
reports. June 5.

U.S. Securities and Exchange Commission (SEC). 2005. Exemptive order on
management’s report on internal control over financial reporting and related auditor
report. Frequently Asked Questions. January 21.




Table 1
Definition of Variables and Expected Signs

Variable Expected Definition
Sign

AUDELAY Number of calendar days from fiscal year-end to date of
the auditor’s report.
MWIC + Whether a client has material weaknesses in the ICOFR
(1 =material weaknesses; 0 =no material weakness).
GMWIC + Whether a client has general material weaknesses
(1 =general material weakness; 0 =otherwise).

SMWIC + Whether a client has specific material weaknesses
(1 =specific material weakness; 0 =otherwise).

SIZE - The size of the client, measured by the natural logarithm
of total assets.
FININD - The client’s industry. 1 =financial industry; 0 =otherwise.
HIGHTECH + The client’s industry. 1 =high-tech industry, 0 =otherwise.
ROA - Net earnings divided by total assets.
LEVERAGE + Total debt divided by total assets.
GOCERN + Whether the client receives a going concern opinion (1 =
going concern opinion; 0 =otherwise).

EXT + Whether the client reports extraordinary items for the
current year. 1 =client reports an extraordinary item; 0 =
otherwise.
SEGNUM + Number of a client’s reportable segments.
LOSS + Whether the client reports negative earnings for the current
year. (1 =reports negative earnings; 0 =otherwise).
RESTATE + Whether the client restated its financial reports in the current
year (1 =restated in the current year; 0 =otherwise).
AFEE + Scaled total audit fee for the current year. It is measured as the
total audit fee divided by total assets.
AOPIN + Auditor’s opinion on the financial statements. (1=modified
opinions other than going concern; 0 =otherwise).
AUDCHG + Whether a client changed auditor during the current year (1=
client changed auditor, 0 =otherwise).
Table 2
Descriptive Statistics

Panel A: Descriptive Statistics for Audit Delay (mean in
days)
2004 2004
Material Effective
2004 2003 t-stat p-value Weakness ICOFR t-stat p-value
N = 2,391 2,476 340 2,051
Variable:
AUDELAY 70.17 50.04 40.082 0.000 *** 86.64 67.44 21.910 0.000 ***

Panel B: Descriptive Statistics for Independent Variables by Internal Control Quality
Material Weakness Effective ICOFR Total sample
N = 340 2,051 2,391
Variables: Mean Std. dev. Mean Std. dev. Diff Stat.
1
p-value Mean Std. dev.
MWIC 14%
SIZE 19.960 1.679 20.672 1.882 -6.552 0.000 *** 20.570 1.871
FININD 9% 17% 12.515 0.000 *** 16%
HIGHTECH 26% 22% 3.346 0.067 * 23%
ROA -0.041 0.250 0.026 0.383 -3.086 0.002 *** 0.016 0.368
LEVERAGE 0.227 0.262 0.227 0.229 -0.030 0.976 0.227 0.234
GOCERN 1% 1% 0.405 0.525 1%
EXT 3% 4% 0.687 0.407 4%
SEGNUM 2.391 1.744 2.505 1.799 -1.090 0.276 2.489 1.791
LOSS 40% 21% 59.229 0.000 *** 24%
RESTATE 54% 11% 377.837 0.000 *** 17%
AFEE 0.005 0.006 0.003 0.006 5.437 0.000 *** 0.003 0.006
AOPIN 44% 34% 16.609 0.000 *** 35%
AUDCHG 16% 6% 50.799 0.000 *** 7%

(continued)
Table 2
(continued)

Panel C: Descriptive Statistics for Independent Variables by Types of Internal Control Weakness

General Weakness Specific Weakness Total sample
N = 126 214 340
Variables: Mean Std. dev. Mean Std. dev. Diff Stat.
1
p-value Mean Std. dev.
GMWIC 37%
SIZE 19.674 1.657 20.129 1.673 -2.430 0.016 ** 19.960 1.679
FININD 10% 8% 0.348 0.555 9%
HIGHTECH 32% 23% 2.862 0.091 * 26%
ROA -0.076 0.243 -0.020 0.251 -1.989 0.047 ** -0.041 0.250
LEVERAGE 0.226 0.245 0.227 0.273 -0.031 0.976 0.227 0.262
GOCERN 2% 0% 2.498 0.114 1%
EXT 5% 2% 1.490 0.222 3%
SEGNUM 2.472 1.794 2.343 1.716 0.658 0.511 2.391 1.744
LOSS 48% 36% 4.842 0.028 ** 40%
RESTATE 45% 60% 6.792 0.009 *** 54%
AFEE 0.005 0.005 0.005 0.007 0.471 0.638 0.005 0.006
AOPIN 33% 51% 10.407 0.005 *** 44%
AUDCHG 17% 15% 0.243 0.622 16%

See Table 1 for variable definitions
1
Chi-square or t-statistic as appropriate.
*** p-values are significant at .01 level, ** p-values are significant at .05 level, * p-values are significant at .10 level.





Table 3 Pearson Correlation Matrix among Variables

MWIC SIZE FININD HIGHTECH ROA LEVERAGE GOCERN
DELAY 0.409 *** -0.262 *** -0.059 *** 0.015 -0.031 0.021 0.064 ***
MWIC 1.000 -0.133 *** -0.072 *** 0.037 * -0.063 *** -0.001 0.013
SIZE 1.000 0.265 *** -0.300 *** 0.065 *** 0.311 *** -0.075 ***
FININD 1.000 -0.238 *** 0.120 *** 0.124 *** -0.040 ***
HIGHTECH 1.000 -0.176 *** -0.214 *** 0.025 *
ROA 1.000 -0.049 *** -0.134 ***
LEVERAGE 1.000 0.016
GOCERN 1.000


EXT SEGNUM LOSS RESTATE AFEE AOPIN AUDCHG
DELAY -0.036 * -0.085 *** 0.174 *** 0.182 *** 0.212 *** 0.021 0.078 ***
MWIC -0.017 -0.022 0.157 *** 0.398 *** 0.111 *** 0.079 *** 0.146 ***
SIZE 0.228 *** 0.431 *** -0.331 *** 0.027 * -0.433 *** 0.283 *** -0.084 ***
FININD -0.015 0.089 *** -0.190 *** -0.032 ** -0.097 *** -0.014 -0.008
HIGHTECH -0.122 *** -0.212 *** 0.307 *** -0.021 0.172 *** -0.145 *** 0.003
ROA -0.007 0.041 *** -0.351 *** -0.013 -0.090 *** 0.094 *** -0.019
LEVERAGE 0.084 *** 0.093 *** 0.029 ** 0.047 *** -0.169 *** 0.138 *** 0.005
GOCERN 0.015 -0.021 0.133 *** 0.006 0.200 *** -0.124 *** 0.001
EXT 1.000 0.126 *** -0.003 -0.034 ** -0.096 *** 0.234 *** -0.006
SEGNUM 1.000 -0.159 *** 0.025 * -0.144 *** 0.196 *** -0.024
LOSS 1.000 0.037 *** 0.216 *** -0.071 *** 0.028 **
RESTATE 1.000 0.057 *** 0.041 *** 0.089 ***
AFEE 1.000 -0.136 *** 0.031 **
AOPIN 1.000 -0.087 ***
AUDCHG 1.000

*** p-values are significant at .01 level, ** p-values are significant at .05 level, * p-values are significant at .10 level.
Table 4 Regression Models of Audit Delay Difference between 2003 and 2004 for a Sample of Firms that Filed Section 404 Reports
between January 2005 and June 2005

Model 1a (Test of H1) Model 1b (Year =2003) Model 1c (Year =2004)
Variables: Estimate t-value p-value Estimate t-value p-value Estimate t-value p-value
Intercept 77.652 15.404 0.000 *** 77.652 14.400 0.000 *** 112.890 23.988 0.000 ***
YEAR (1 = 2004, 0 = 2003) 35.391 4.936 0.000 ***
SIZE -1.840 -7.253 0.000 *** -1.840 -6.780 0.000 *** -2.319 -9.866 0.000 ***
SIZE * YEAR -0.487 -1.356 0.175
FININD 3.415 3.500 0.000 *** 3.415 3.272 0.001 *** 0.162 0.176 0.860
FININD * YEAR -3.187 -2.285 0.022 **
HIGHTECH -2.802 -3.132 0.002 *** -2.802 -2.928 0.003 *** -2.498 -2.992 0.003 ***
HIGHTECH *YEAR 0.338 0.265 0.791
ROA 1.073 1.196 0.232 1.073 1.118 0.264 0.705 0.751 0.452
ROA * YEAR -0.367 -0.271 0.787
LEVERAGE 13.444 8.573 0.000 *** 13.444 8.014 0.000 *** 6.080 4.163 0.000 ***
LEVERAGE*YEAR -7.346 -3.297 0.001 ***
GOCERN 10.209 3.308 0.001 *** 10.209 3.092 0.002 *** 2.683 0.758 0.448
GOCERN * YEAR -6.519 -1.328 0.184
EXT 2.235 2.204 0.028 ** 2.235 2.060 0.039 ** 0.114 0.069 0.945
EXT * YEAR -2.132 -1.041 0.298
SEGNUM 0.959 4.587 0.000 *** 0.959 4.288 0.000 *** 0.156 0.799 0.424
SEGNUM * YEAR -0.794 -2.670 0.008 ***
LOSS 3.115 3.613 0.000 *** 3.115 3.378 0.001 *** 3.045 3.475 0.001 ***
LOSS * YEAR -0.021 -0.016 0.987
RESTATE 9.863 4.451 0.000 *** 9.863 4.161 0.000 *** 7.011 8.311 0.000 ***
RESTATE * YEAR -2.950 -1.231 0.219
AFEE 664.047 3.863 0.000 *** 664.047 3.612 0.000 *** 270.807 4.632 0.000 ***
AFEE * YEAR -401.308 -2.190 0.029 **
AOPIN 3.144 4.249 0.000 *** 3.144 3.972 0.000 *** 2.259 3.166 0.002 ***
AOPIN * YEAR -0.862 -0.806 0.421
AUDCHG 4.459 2.594 0.010 ** 4.459 2.425 0.015 ** 2.186 1.765 0.078 *
AUDCHG * YEAR -2.513 -1.152 0.250

N = 4867 2476 2391
F-statistic 89.787 0.000 *** 20.388 0.000 *** 29.356 0.000 ***
Adj. R² 0.330 0.092 0.134
Table 5
Regression Models of Relationship between Audit Delay and Internal Control Quality

Test of H2, Model(2a) Test of H3, Model (2b)

Effective ICOFR vs. Material Weakness General vs. Specific Material Weakness

Variables: Estimate t-value p-value Estimate t-value p-value
Intercept 105.161 23.582 0.000 *** 92.615 6.011 0.000 ***
MWIC 16.314 17.264 0.000 *** N.A. N.A.
GMWIC N.A. N.A. 34.358 2.330 0.020 **
SMWIC N.A. N.A. 15.027 1.023 0.306
SIZE -1.961 -8.811 0.000 *** -2.060 -9.081 0.000 ***
FININD 0.336 0.388 0.698 0.064 0.073 0.942
HIGHTECH -2.417 -3.071 0.002 *** -2.658 -3.310 0.001 ***
ROA 1.041 1.177 0.239 0.879 0.974 0.330
LEVERAGE 5.955 4.325 0.000 *** 5.718 4.068 0.000 ***
GOCERN 3.910 1.172 0.241 2.060 0.605 0.545
EXT 0.317 0.205 0.838 -0.461 -0.292 0.771
SEGNUM 0.077 0.420 0.674 0.023 0.123 0.902
LOSS 1.891 2.282 0.023 ** 2.220 2.628 0.009 ***
RESTATE 1.390 1.617 0.106 5.142 6.254 0.000 ***
AFEE 232.229 4.210 0.000 *** 267.600 4.759 0.000 ***
AOPIN 1.499 2.223 0.026 ** 2.277 3.317 0.001 ***
AUDCHG -0.132 -0.112 0.910 1.058 0.886 0.375

N = 2,391 N = 2,391
F-statistic 51.956 0.000 *** F-statistic 40.493 0.000 ***
Adj. R² 0.230 Adj. R² 0.199
Table 6 Regression Model of Relationship between Audit Delay and
COSO-based Categorizations of Internal Control Weakness



Variables: Estimate t-value p-value
Intercept 106.927 23.972 0.000 ***
SIZE -2.030 -9.112 0.000 ***
FININD 0.146 0.168 0.866
HIGHTECH -2.711 -3.438 0.001 ***
ROA 1.025 1.156 0.248
LEVERAGE 6.217 4.502 0.000 ***
GOCERN 3.032 0.906 0.365
EXT -0.337 -0.217 0.828
SEGNUM 0.026 0.141 0.888
LOSS 1.735 2.090 0.037 **
RESTATE 2.727 3.183 0.001 ***
AFEE 223.445 4.017 0.000 ***
AOPIN 1.819 2.696 0.007 ***
AUDCHG -0.034 -0.029 0.977
PERSONNEL 6.768 3.966 0.000 ***
PROCESS 8.801 7.350 0.000 ***
DOCUMENT 0.069 0.029 0.977
SEGREGATE 7.057 3.638 0.000 ***
ISPROCESS -1.243 -0.395 0.693
RISKASSESS 2.196 0.344 0.731
CLOSING 10.799 5.780 0.000 ***
CONTRENV -1.578 -0.519 0.604

N = 2,391
F-statistic 34.798 0.000 ***
Adj. R² 0.229







Table 7 Regression Models of Audit Delay Difference between 2003 and 2004 for a Sample of Firms That Are Not Required to
File Section 404 Reports between January 2005 and June 2005
Model (1a) Year =2003 Model (1b) Year =2004 Model (1c)
Variables: Estimate t-value p-value Estimate t-value p-value Estimate t-value p-value
Intercept 51.909 6.993 0.000 *** 51.909 6.381 0.000 *** 32.726 4.560 0.000 ***
YEAR (1 = 2004, 0 = 2003) -19.184 -1.756 0.079 *
SIZE 0.464 1.114 0.266 0.464 1.016 0.310 1.572 3.958 0.000 ***
SIZE * YEAR 1.108 1.821 0.069 *
FININD 4.149 1.549 0.121 4.149 1.414 0.158 1.505 0.641 0.522
FININD * YEAR -2.644 -0.706 0.480
HIGHTECH -4.384 -2.219 0.027 ** -4.384 -2.025 0.043 ** -1.516 -0.848 0.397
HIGHTECH * YEAR 2.868 1.021 0.307
ROA 0.245 0.654 0.513 0.245 0.597 0.551 0.685 0.325 0.745
ROA * YEAR 0.439 0.184 0.854
LEVERAGE 9.324 3.139 0.002 *** 9.324 2.864 0.004 *** 7.845 2.762 0.006 ***
LEVERAGE * YEAR -1.480 -0.341 0.733
GOCERN 18.824 5.388 0.000 *** 18.824 4.916 0.000 *** 6.566 2.297 0.022 **
GOCERN * YEAR -12.258 -2.590 0.010 **
EXT 2.060 0.540 0.590 2.060 0.492 0.623 -4.039 -0.613 0.540
EXT * YEAR -6.099 -0.736 0.462
SEGNUM -0.189 -0.320 0.749 -0.189 -0.292 0.770 0.652 1.253 0.210
SEGNUM * YEAR 0.841 1.015 0.310
LOSS 6.820 3.880 0.000 *** 6.820 3.540 0.000 *** 3.602 2.115 0.035 **
LOSS * YEAR -3.217 -1.242 0.214
RESTATE 8.635 2.188 0.029 ** 8.635 1.996 0.046 ** 12.236 5.313 0.000 ***
RESTATE * YEAR 3.601 0.765 0.445
AFEE 0.455 0.885 0.376 0.455 0.807 0.420 47.671 1.572 0.116
AFEE * YEAR 47.216 1.394 0.163
AOPIN 5.472 3.152 0.002 *** 5.472 2.876 0.004 *** 5.941 3.161 0.002 ***
AOPIN * YEAR 0.470 0.172 0.863
AUDCHG 1.803 0.668 0.504 1.803 0.610 0.542 6.171 2.828 0.005 ***
AUDCHG * YEAR 4.369 1.202 0.230

N = 2611 1295 1316
F-statistic 7.724 0.000 *** 6.314 0.000 *** 10.224 0.000 ***
Adj. R² 0.065 0.051 0.084
Figure 1
Number of Material Weaknesses in Internal Control by Type under the
COSO Framework
[Bar chart omitted: frequency of material weaknesses by COSO type (Personnel, Process and Procedure, Documentation, Segregation of Duties, Information Systems Process, Risk Assessment/Control Design, Closing Process, Control Environment). X-axis: Types of Material Weakness; Y-axis: Frequency, 0 to 250.]
This figure is based upon 340 firms that reported material weakness in their ICOFR through
Section 404 reports from January 2005 to June 2005. The total number of COSO-based
deficiencies disclosed is 649. On average, each firm has 1.9 COSO-based deficiency types.





















Appendix
Material Weakness Classifications and Examples

Internal control weakness data are collected from the Audit Analytics database. The internal
control problems are classified using two schemes. First, we classify material weaknesses into
general or specific problems. General material weaknesses refer to internal control problems
that could affect the credibility of financial reporting at the company level. Specific material
weaknesses refer to internal control problems at the transaction level or in specific accounts.

Examples of General Weakness
- Inadequate financial reporting process and policies at the company-level
- Ineffective organization and accountability structure within the accounting function
- Ineffective controls within the control environment, including management override and
ineffective audit committee, board, and internal audit functions
- Ineffective risk assessment and fraud detection
- Insufficient personnel with appropriate qualifications and training in key accounting roles
- Lack of adequate communication of employees' duties and control responsibilities

Examples of Specific Weakness
- Ineffective controls related to certain accounts or issues, such as revenue recognition,
lease accounting, etc.
- Insufficient personnel with appropriate qualifications and training in non-routine and
complex transactions such as derivative accounting, pension plans, etc.
- Insufficient documentation, process and policies related to certain accounts
- Lack of segregation of duties in certain accounts or transactions
- Information system deficiency related to certain accounts
- Weakness of reconciliation related to certain accounts


Second, we classify material weaknesses based upon the specific elements of internal control
systems under the COSO framework. Eight types of material weaknesses are identified:
Personnel, Process and Procedure, Documentation, Segregation of Duties, Information System
Process, Design of Control/Risk Assessment, Closing Process and Control Environment.


Examples of Personnel Issues
- Insufficient personnel resources with appropriate qualifications and training in
accounting, finance or information systems
- Lack of adequate personnel to effectively perform supervision and review
- Lack of permanent employees in key financial reporting positions
- Lack of a formal program for training members of the Company’s finance and accounting
group

Examples of Process and Policy Issues
- Lack of appropriate process or policy over financial reporting or certain accounts
- Failure to comply with GAAP, including SFAS and other FASB pronouncements
- Insufficient controls over the selection and monitoring of appropriate methods or
assumptions

Examples of Documentation Issues
- Lack of documentation of the application of U.S. GAAP to transactions
- Lack of documentation of policies and procedures
- Lack of appropriate documentation to support journal entries
- Insufficient documentation with respect to the review of non-standard journal entries
- Inadequate documentation surrounding standard operating procedures for certain key
aspects of information technology environment

Examples of Segregation of Duties Issues
- Lack of segregation of duties in internal control procedure
- Deficiency in segregation of duties associated with personnel having access to computer
accounting or financial reporting record

Examples of Information System Process Issues
- Lack of effective information systems required to support operations and reporting
requirements
- Lack of information systems access and security controls to initiate, authorize, and record
transactions
- Insufficient control over information technology back-up, recovery and firewall
protections

Examples of Design of Control/Risk Assessment Issues
- Deficiency in the design and implementation of internal control over financial reporting
- Inadequate controls to monitor the results of operations and other control activities
- No consistent risk assessment process
- Lack of adequate mechanisms for anticipating and identifying financial reporting risks

Examples of Closing Process/Reconciliation Issues
- Ineffective controls over the period-end financial reporting process including the
procedures used for calculating significant estimates and performing consolidation entries
- Lack of effective controls over quarterly and annual financial statement close processes
- Failure to timely reconcile account balances
- Inadequate preparation and review of reconciliations

Examples of Control Environment Issues
- Senior management did not set an appropriate tone at the top that was conducive to an
effective control environment
- Weaknesses in the control environment which challenge the effectiveness of senior
management's communications regarding the importance of internal controls
- Ineffective control to prevent certain members of management from overriding certain
controls and effecting certain transactions and accounting entries
- Lack of internal audit review of subsidiary operations
- Ineffective oversight by the Audit Committee of financial reporting process and internal
control over financial reporting