Open Government Directive
Framework for the Quality of Federal Spending Information
May 14, 2010
TABLE OF CONTENTS

Background
Section 1: Implementation of the Data Quality Framework
  1.a. Governance Structure
    1.a.1 Scope of Federal Spending
    1.a.2 Internal Control Governance Structure
    1.a.3 Leveraging the OMB A-123 Program
  1.b. Risk Assessment
    1.b.1 Assessment of Inherent Risks in Sub-Processes
    1.b.2 Apply Criteria for Assessing Inherent Risk
    1.b.3 Set Priorities for Testing Based on Risk Assessment
    1.b.4 Assess Control Risk
  1.c. General Governing Principles and Control Activities
    1.c.1. Governing Principles
    1.c.2. Control Activities
  1.d. Communications
  1.e. Monitoring
Section 2: USASpending.gov Data
  2.a. Grants and Cooperative Agreements
    2.a.1 Compile and Review
    2.a.2 Monitor
  2.b. Contracts
    2.b.1 Compile and Review
    2.b.2 Monitor
Section 3: Recipient Reporting
APPENDIX A: Description of Risk Factors
APPENDIX D: Acronyms
TABLE OF FIGURES

Figure 1: NSF's OMB A-123 Organizational Structure
Figure 2: Risk Assessment Chart
Figure 3: NSF's Annual/Quarterly Financial Statements Process
Figure 4: OMB A-123 Deliverables and Responsibilities
Figure 5: Flowchart Legend
Figure 6: Transparency Act Monthly
Figure 7: Performance Metrics - Grants and Cooperative Agreements
Figure 8: Performance Metrics - Contracts
Figure 9: ARRA Recipient Reporting Quarterly Schedule
Figure 10: Data Quality Review and Timeframe
Figure 11: Review Process for Days 22-29
Background

The National Science Foundation's (NSF) senior management has the responsibility for transparency and open government. The Foundation is taking specific actions to implement the principles of transparency, participation, and collaboration set forth in the President's Memorandum on Transparency and Open Government dated January 21, 2009 and the Open Government Directive (M-10-06) dated December 8, 2009. On February 8, 2010, the framework for the quality of Federal spending information for the Open Government Directive was issued. Agencies are accountable for the quality of Federal spending information that is publicly disseminated through such public venues as USASpending.gov and other similar websites, and are working to improve the quality and integrity of that information.

The three principles, transparency, participation, and collaboration, form the cornerstone of an open government. Transparency promotes accountability by providing the public with information about what the Government is doing. Participation allows members of the public to contribute ideas and expertise so that their government can make policies with the benefit of information that is widely dispersed in society. Collaboration improves the effectiveness of Government by encouraging partnerships and cooperation within the Federal Government, across levels of government, and between the Government and private institutions.

The Foundation has placed an internal control environment, as defined in OMB Circular A-123, Management's Responsibility for Internal Control, over the preparation and dissemination of financial data. This includes implementing an organizational structure, policies, processes, and systems in order to achieve the following three objectives: (1) effectiveness and efficiency of the operations producing and disseminating financial information, (2) reliability of the financial information reported, and (3) compliance with applicable laws and regulations.

Section 1: Implementation of the Data Quality Framework
1.a. Governance Structure
1.a.1 Scope of Federal Spending
• USASpending.gov
• Annual audited financial statements: published in the Annual Financial Report located at http://www.nsf.gov/pubs/2010/nsf10001/index.jsp
• Grants.gov
• Federal Procurement Data System - Next Generation (FPDS-NG)
1.a.2 Internal Control Governance Structure
The National Science Foundation established and maintains effective internal control and financial management systems that meet the objectives of the Federal Managers' Financial Integrity Act of 1982 (FMFIA) and the revised OMB Circular A-123, Management's Responsibility for Internal Control (A-123), which implements FMFIA. The FMFIA encompasses accounting and administrative controls, which include program, operational, and administrative areas as well as accounting and financial management.
The Foundation conducts an annual assessment and report on the effectiveness of internal controls over financial reporting, and ongoing evaluations and reports on the adequacy of systems of internal accounting and administrative control in accordance with OMB A-123, to reasonably ensure that "(i) obligations and costs are in compliance with applicable law; (ii) funds, property, and other assets are safeguarded against waste, loss, unauthorized use or misappropriation; and (iii) revenues and expenditures applicable to agency operations are properly recorded and accounted for to permit the preparation of accounts and reliable financial and statistical reports and to maintain accountability over the assets." The Foundation minimizes all known and potential risks affecting NSF's mission, sustains a clean audit opinion, and leverages the A-123 structure to efficiently respond to new and emerging compliance requirements.
1.a.3 Leveraging the OMB A-123 Program

OMB Circular A-123 prescribes that "The proper stewardship of Federal resources is an essential responsibility of agency managers and staff. Federal employees must ensure that Federal programs operate and Federal resources are used efficiently and effectively to achieve desired objectives. Programs must operate and resources must be used consistent with agency missions, in compliance with laws and regulations, and with minimal potential for waste, fraud, and mismanagement."

Roles and Responsibilities

Senior management support for the A-123 assessment process is critical to its success. The Accountability and Performance Integration Council (APIC), NSF's Senior Assessment Team, provides core leadership and makes key decisions to direct the A-123 process. The A-123 Program Manager is the day-to-day manager of the program and works closely with the business process owners, the Internal Control Working Group (ICWG), and APIC to implement the A-123 process. In each phase, the different parties involved have an assigned role to complete various tasks. NSF's A-123 organizational structure, as depicted in the chart below, is based on guidance found in OMB Circular A-123 and the CFO Council's "Implementation Guide for OMB Circular A-123."
Figure 1: NSF's OMB A-123 Organizational Structure
Director of the National Science Foundation

The Director of NSF is responsible for the establishment and maintenance of management control systems within NSF in accordance with Federal laws, regulations, and standards. This responsibility is driven by the Director's annual requirement to report to the President and Congress on whether the agency can provide reasonable assurance that its controls are functioning as intended and achieving their intended objectives, and whether agency financial systems are in conformance with the government-wide Joint Financial Management Improvement Program (JFMIP) and other information technology (IT) related requirements. The Director's review and concurrence with this assertion is included in the Foundation's Annual Financial Report.

Senior Management Roundtable (SMaRT)
SMaRT is NSF's Senior Management Council. Its accountability and oversight responsibilities encompass a large range of issues across the agency. SMaRT membership includes all Assistant Directors and Office Heads and is chaired by the NSF Deputy Director, who also serves as the Chief Operating Officer (COO). SMaRT is responsible for assessing and monitoring deficiencies in internal control resulting from the overall FMFIA assessment process. SMaRT advises the Director of NSF on the level of reasonable assurance regarding the Foundation's compliance with FMFIA, OMB Circular A-123, Appendix A, and other major laws and regulations. SMaRT also provides the Director with the status of corrections to any existing or new material weaknesses that may need to be reported to the President and the Congress through annual reporting.

Accountability and Performance Integration Council (APIC)

APIC serves as the agency's Senior Assessment Team. Its responsibilities include providing oversight to the documentation, monitoring, and reporting of internal control. APIC's responsibility for internal control integrates the assessment of program performance and operations as it relates to the Foundation's strategic goals. APIC provides guidance and focuses on the assessment of Internal Control over Financial Reporting (ICOFR), as well as compliance controls as designed in accordance with laws/regulations, and the economy and efficiency of agency operations. APIC is chaired by the Chief Financial Officer (CFO) and includes four Assistant Directors/Office Heads, the Chief Human Capital Officer, the Chief Information Officer, and the General Counsel. The CFO is responsible for providing executive secretariat support to the COO for coordination and analysis of the annual organizational assessment of internal controls. The CFO provides the findings from the agency-wide review to the COO and SMaRT for consideration.

Internal Control Working Group (ICWG)

The ICWG is appointed by APIC to provide guidance and develop the strategy for the overall FMFIA assessment and effectiveness of ICOFR reviews. The ICWG is responsible for reviewing and making recommendations to APIC regarding the determination of the scope and approach for the FMFIA and A-123, Appendix A assessments. The ICWG participates in the assessment (as related to their knowledge/expertise of the respective business process being assessed), analyzes the results, and advises APIC on the results. The ICWG consists of management including staff at the level of Deputy Director, Branch Chief, and Senior Advisor. Many members of the ICWG are also business process owners who perform a significant role in each phase of the A-123, Appendix A implementation process for the assessment of the effectiveness of ICOFR.

OMB A-123 Program Manager (Executive Secretary of APIC)

The Internal Controls Program Manager, appointed by the CFO, serves as the Executive Secretary of APIC and facilitates all ICWG meetings. The Internal Controls Program Manager ensures that the annual organizational review for FMFIA, FFMIA, and OMB Circular A-123, Appendix A is completed for timely consideration by APIC. The Internal Controls Program Manager facilitates the ICWG meetings and oversees the implementation of the NSF internal control program.
In addition, as the Executive Secretary of APIC, the Internal Controls Program Manager is responsible for reporting the results of the internal control reviews to APIC and provides supporting documentation for the Assurance Statement in the Annual Financial Report.

Internal Control Quality Assurance Team

The A-123 Team includes the Internal Control Quality Assurance Program Manager, Business Process Owners, and Contractors. NSF Management relies on the A-123 Team for documenting the individual business processes and sub-processes. These processes are first discussed and recorded in a process narrative format.

Business Process Owners

The A-123 Team relies on the business process owners for verifying the validity of control documentation as discussed in the process narrative, control matrix, and process flow diagrams; identifying testable key controls; providing the documentation necessary for testing the business process, application, and general computer controls; evaluating the testing results; and implementing the corrective actions.
1.b. Risk Assessment

NSF plans to leverage the existing OMB Circular A-123 Implementation guidance to identify high risk areas for Federal spending. The OMB A-123 Implementation guidance outlines the different types of risks for management to consider as follows:

• Inherent risk - the susceptibility of an assertion to misstatement, assuming there are no related specific control activities. Inherent risk factors include the nature of the agency's programs, transactions, and accounts, and whether the agency had significant audit findings
• Control risk - the risk that misstatements are not prevented or detected by the agency's internal control (assessed separately for each significant financial statement assertion in each significant process or accounting application). Refer to the Financial Reporting Assertions section below for the financial statement assertions
• Combined risk - the likelihood that a material misstatement would occur (inherent risk) and not be prevented or detected on a timely basis by the agency's internal control (control risk)
• Fraud risk - the risk that there may be fraudulent financial reporting or misappropriation of assets that causes a material misstatement of the financial statements
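The interaction of these risk types can be sketched in code. The following is an illustrative model only, not part of the NSF framework: the three-point `Level` scale and the combination rule (combined risk is high only when both inherent and control risk are high) are assumptions chosen to mirror the definition of combined risk above.

```python
from enum import IntEnum

class Level(IntEnum):
    """Illustrative 3-point risk scale (assumed, not from the framework)."""
    LOW = 1
    MEDIUM = 2
    HIGH = 3

def combined_risk(inherent: Level, control: Level) -> Level:
    """Combined risk: the likelihood that a material misstatement occurs
    (inherent risk) AND is not prevented or detected on a timely basis
    (control risk). Modeled here as the lower of the two levels, since
    both conditions must hold for an error to reach published data."""
    return Level(min(inherent, control))

# A sub-process with high inherent risk but strong controls (low control
# risk) yields a low combined risk under this simple model.
print(combined_risk(Level.HIGH, Level.LOW).name)   # LOW
print(combined_risk(Level.HIGH, Level.HIGH).name)  # HIGH
```

The `min` rule is one reasonable encoding of "would occur and not be caught"; an actual assessment would weigh the two dimensions with the judgment described in the sections that follow.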
The next step taken to improve the quality of Federal spending information is to identify and assess the inherent risk within existing processes and systems that an error could occur and not be detected, which could result in misstating or misrepresenting Federal spending information that is publicly disseminated. In this step, each process and system is assessed against ten inherent risk factors (Volume of Transactions, Complexity of Process, Level of Manual Intervention, Fraud Risk, Management Override, Non-Routine Transaction/Estimation, Managed by a Third Party, History of Audit Issues, Changes in Laws and Regulations, and Human Capital Management), and the potential level of risk is measured for each risk factor. Refer to Appendix A for a description of each risk factor.

The inherent risk assessment requires significant judgment and is performed by process owners and staff who are integrally involved with the processes/sub-processes and have a working knowledge of the associated risks. Similar to the selection of significant reporting requirements, qualitative and quantitative factors are considered in the risk assessment process. APIC and the ICWG, with support from the A-123 Team, determine the criteria to be applied in assessing the inherent risk of each process/sub-process and ensure that the agreed methodology is consistently applied and sufficiently documented.

The inherent risk assessment begins at the sub-process level by considering each of the ten risk factors. When assessing risk, process owners consider the level of inherent risk within the sub-process regardless of any internal controls that may be in place to mitigate the risks. The assessment of inherent risk is explained in greater detail in section 2.7.1 of this manual. The results of the inherent risk assessment provide a quantitative basis for the ICWG and APIC to determine the extent of internal control testing required for each sub-process. Based on input from the ICWG, additional subjective or qualitative criteria are applied to confirm the scope of sub-processes to be tested in the current year. These qualitative criteria may include the stability of the process and the previous year's test results.
An assessment of control risk takes place following the inherent risk assessment to ascertain whether the key internal controls are effectively designed to meet the intended objectives and address the key data elements that involve the greatest risk of data quality problems, as well as those elements of particular interest to the public. The assessment of control risk reveals which controls require remediation and which controls are adequately designed to allow for testing. The control risk and inherent risk are combined to ascertain the overall risk and likelihood of a misstatement or misrepresentation of Federal spending information.
1.b.1 Assessment of Inherent Risks in Sub-Processes
Inherent risk is the susceptibility of a relevant assertion to a misstatement that could be material, either individually or when aggregated with other misstatements, assuming that there are no related controls. Similar to documentation, once a baseline of inherent risk is established, the process owners are able to update the rationale to support the risk scores.

Inherent risk is assessed at the sub-process level and assigned to the relevant process owner (Division or Branch). Most business processes are shared by more than one Division. For example, each process owner who plays a role in the Pre-Award sub-process performs a risk rating based on the specific duties, procedures, and systems used to carry out the sub-process. For the Pre-Award sub-process, an assessment of inherent risk would be performed by DGA and DACS. For shared sub-processes where there is significant overlap in the functions performed by multiple Divisions, the A-123 Team consolidates the descriptions of risk and the risk scores under one sub-process.

Similar to the A-123 documentation (including narratives, control matrices, and flowcharts), the process owner uses the baseline documentation of inherent risk and updates it every year. The A-123 Team reviews the updated risk information and works with the process owner to determine whether the risk score should change. When a new process/sub-process is introduced, the A-123 Team assists the process owner with compiling the explanation of risks using the interview questions in the Risk Factor Criteria document. The process owners may also use the interview questions to guide them in updating the baseline descriptions of inherent risk.

As the process owners assess the inherent risks, they take into consideration the procedures, resources (staff), and systems used from the beginning to the end of the process/sub-process and focus on the key transactions performed. The process owner takes into consideration the risk of errors, fraud, or system failures occurring in the absence of the controls in place.
This is not an easy task because the inclination is often to rely on the controls that are in place and assume that the risk is low. The process owner is encouraged to look beyond the controls and consider the complexity of the transaction, the possibility of error, and the circumstances under which an error may not be prevented or detected. For instance, if the sub-process requires significant manual intervention and requires several elaborate steps involving multiple Divisions, the sub-process may be considered highly complex. The Inherent Risk Assessment Tool described in the next section is designed to assist the process owner in assessing the overall inherent risk for a sub-process by applying the different risk factors, and provides a template for recording the risk score and describing the nature of the risk.
1.b.2 Apply Criteria for Assessing Inherent Risk

For each of the ten risk factors, the matrix contains a definition of the inherent risk, an example, the criteria for selecting a 'high', 'medium', or 'low' risk, and a set of questions to assist the process owner with writing a description of the risk. The A-123 Team works closely with each business process owner to apply the criteria consistently. Still, assessing inherent risk can be a subjective process and compels the process owner to be open-minded about identifying the risk of an error occurring despite the controls that are in place. An important note to keep in mind during the risk assessment is that assigning a 'high' score for a particular risk factor does not indicate that the process is ineffective or judge the soundness of the process. Instead, the score indicates to the process owner, and management in general, that strong controls are needed to prevent and detect errors. Additionally, testing may be necessary to gain confidence in the key data elements that are published.
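A minimal sketch of how a ten-factor rating might be tallied into a total score follows. The factor names are taken from the framework; everything else is an assumption for illustration: the numeric mapping (low=1, medium=2, high=3) and the simple summed total are not specified by the framework, and the actual Inherent Risk Assessment Tool is a template, not this code.

```python
# The ten inherent risk factors named in the framework.
RISK_FACTORS = [
    "Volume of Transactions", "Complexity of Process",
    "Level of Manual Intervention", "Fraud Risk", "Management Override",
    "Non-Routine Transaction/Estimation", "Managed by a Third Party",
    "History of Audit Issues", "Changes in Laws and Regulations",
    "Human Capital Management",
]

# Assumed numeric mapping for the 'high'/'medium'/'low' ratings.
SCORES = {"low": 1, "medium": 2, "high": 3}

def total_inherent_risk(ratings: dict) -> int:
    """Sum the per-factor ratings into a total score for one sub-process.
    All ten factors must be rated before a total can be computed."""
    missing = set(RISK_FACTORS) - set(ratings)
    if missing:
        raise ValueError(f"unrated factors: {sorted(missing)}")
    return sum(SCORES[ratings[f]] for f in RISK_FACTORS)

# Example: a mostly low-risk sub-process with two elevated factors.
ratings = {f: "low" for f in RISK_FACTORS}
ratings["Fraud Risk"] = "high"
ratings["Complexity of Process"] = "medium"
print(total_inherent_risk(ratings))  # 8 lows + 1 medium + 1 high = 13
```

Requiring every factor to be rated before computing a total mirrors the framework's insistence that each sub-process be assessed against all ten factors.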
1.b.3 Set Priorities for Testing Based on Risk Assessment

The summary of factors considered for cycle testing key internal controls covers the sub-processes and key controls recommended for testing in the current and next two fiscal years, based on additional criteria applied by the A-123 Team and information gathered from the process owners. Key internal controls associated with a sub-process identified as having: 1) a low inherent risk, 2) minor change in the procedures during the past year, and 3) operations that were tested and found effective, may be placed on a three-year testing cycle. However, these factors are reevaluated each year. Sub-processes that were identified as having an overall high level of inherent risk, and/or those recognized by management as an area that requires special attention, require testing in the current fiscal year. Unless the sub-process becomes recognized as having a lower inherent risk, testing is likely to be required annually. Sub-processes that were identified as control mechanisms designed to mitigate risk are also highlighted in the schedule.

For sub-processes with a lower overall inherent risk score, the following additional criteria are applied to determine if testing is needed in the current fiscal year:
• The stability of the sub-process during the period under review
• The results of the prior year's testing of individual key controls in A-123
• The sub-process was recognized by management as an area that requires special attention
The A-123 Team assesses the spread of the overall risk scores to identify the most frequent total risk scores among the sub-processes. When the most frequent scores fall within 1-2 points of the median score of 16, the Internal Control Working Group is responsible for applying additional qualitative criteria to determine if any additional sub-processes are tested in the current fiscal year. Through experience and past management/review activities, the ICWG discusses and selects sub-processes for testing. The application of qualitative criteria is at the discretion of the ICWG. The sub-process may be recognized by management as a highly complex process that requires special attention, and/or the high score of a particular risk factor is given special consideration (e.g., Fraud Risk, History of Audit Issues, Complexity of Process, etc.).

The following chart shows the relationship between the significance and likelihood of an error resulting in a misstatement or misrepresentation of Federal spending information. Sub-processes that are more material and have a greater probability to result in a misstatement or misrepresentation (assuming no controls are in place) have a high level of inherent risk. In contrast, when the effect of an error is inconsequential and remote, the significance and likelihood result in a low inherent risk. Many of the sub-processes fall close to the median risk score.
Figure 2: Risk Assessment Chart. Measuring inherent risk: assess and plot the significance (Inconsequential to More than Inconsequential) and likelihood of financial misstatement for each process/sub-process.
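The screening step described above — identifying the most frequent total scores and comparing them to the median — might look like the sketch below. Only the within-1-2-points-of-the-median rule comes from the text; the sub-process names, the example scores, and the function shape are hypothetical.

```python
from statistics import median, multimode

def needs_qualitative_review(scores: dict) -> list:
    """Flag sub-processes whose total risk score is one of the most
    frequent (modal) scores falling within 2 points of the median;
    these go to the ICWG for additional qualitative criteria."""
    values = list(scores.values())
    med = median(values)
    modes = set(multimode(values))            # most frequent total scores
    near_median = {m for m in modes if abs(m - med) <= 2}
    return sorted(n for n, s in scores.items() if s in near_median)

# Hypothetical total scores for six sub-processes (illustrative only;
# the text cites an actual median score of 16).
scores = {"Pre-Award": 15, "Post-Award": 17, "Payroll": 15,
          "Travel": 22, "Accruals": 15, "Property": 12}
print(needs_qualitative_review(scores))
```

Here the modal score (15) sits at the median, so the three sub-processes scoring 15 would be referred to the ICWG for the discretionary review described above.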
1.b.4 Assess Control Risk

As described in the Financial Audit Manual (FAM), Section 260.02, control risk is a function of the effectiveness of the design and operation of internal control in achieving the entity's objectives relevant to preparation of the entity's financial statements. Five components that are critical to the structure of internal control are: control environment; risk assessment; information and communication; monitoring; and control activities. During the updating of documentation and the inherent risk assessment, the A-123 Team and process owners worked to identify conditions that significantly increase inherent, fraud, and control risk. Based on identified control environment, risk assessment, communication, monitoring, and/or control activity weaknesses, the A-123 Team applied this knowledge to any identified control risks to arrive at a conclusion about the effectiveness of the specific control activities outlined in the control matrix.
1.c. General Governing Principles and Control Activities

1.c.1. Governing Principles

NSF will continue to leverage the Treadway Commission's Committee of Sponsoring Organizations (COSO) and the Government Accountability Office's guidance on governing principles over quality and integrity of data as part of the evaluation and testing methodology of its A-123 Program. In response to revisions to Office of Management and Budget (OMB) Circular A-123 Appendix A, Internal Control over Financial Reporting, the National Science Foundation implemented a compliant internal control program. Key components of the program's framework are based on utilizing the five standards of internal control in the Federal government. The standards provide the overall framework for establishing and maintaining internal controls, identifying and addressing major performance and management challenges, and identifying and addressing areas of greatest risk for fraud, waste, abuse, and mismanagement. The Government Accountability Office's five standards of internal control are:
• Control Environment
• Risk Assessment
• Control Activities
• Information and Communication
• Monitoring
The standards are used as the basis against which internal controls are evaluated. Consistent with the framework, the Foundation's senior program management has determined that the following financial reporting assertions are leveraged as guiding principles for the Foundation's annual internal control and financial data integrity assessment activities:

• Presentation and Disclosure
  o Financial data or information is presented in a logical and coherent format with all relevant disclosures included
  o Financial data or information is exhibited in its proper context to ensure clarity and accuracy
  o Financial data or information is cross-referenced to other datasets containing the same data to ensure accuracy
• Existence or Occurrence
  o Financial activities or events are fully reported
  o Support documentation is adequate and readily available
  o Support documentation is consistent with government-wide guidance
• Rights or Obligations
  o System and Process Owners are identified and accountable for the quality of the financial information presented and related internal controls
• Completeness or Accuracy
  o All required financial information is presented
  o Financial data presented is complete
• Valuation or Allocation
  o Financial data/information is properly valuated and verified for consistency within the agency
  o Financial data and information is verified for reliability with external resources
Currently, the Foundation leverages these governing principles as part of its internal control program framework to ensure the accuracy of the financial data generated by the Foundation's financial management systems.
1.c.2. Control Activities

1.b.6.1 Objectives of Assessment Process

Internal control over financial reporting (ICOFR) is one of the three primary control objectives of the Federal Managers' Financial Integrity Act. OMB Circular A-123 Appendix A requires the National Science Foundation to conduct annual assessments of ICOFR and report on the effectiveness of ICOFR and any material weaknesses identified. Federal managers must ensure that internal control is an integral part of the entire cycle of planning, budgeting, executing, accounting, and reporting, rather than an ancillary activity performed outside this cycle of key functions.

Management's Approach (Top-Down Approach)
In accordance with the OMB Circular A-123 Appendix A guidance, the Foundation follows a "top-down" or "NSF-wide" approach for implementing Appendix A of the Circular. The top-down approach to planning, documenting, and testing internal control over financial reporting begins with identifying the significant NSF-wide financial reports and working down to the underlying key processes, controls, and supporting documentation. The Foundation's APIC and ICWG considered which internal and external financial reports should be included in the A-123 process, the materiality threshold to be applied in selecting report line items, and the significant financial accounts, major transaction cycles, and key systems that should be included in the A-123 Appendix A scope. The ICWG then recommended the A-123 scope to the APIC Senior Assessment Team and initiated the annual process.

Documenting the Assessment Process
Management documents its assessment methodology and how key decisions were made during the planning step of the assessment. This documentation may be in the form of paper, electronic files, or other media.
The ICWG documented the top-down assessment plan and methodology, including the establishment of governing bodies, key decisions of the ICWG, and communications with the program areas. In a similar manner, each program area documents its establishment of a governance structure and related activities and key decisions.

Documenting the Control Evaluation Process
The evaluation of ICOFR involves an assessment of entity-level controls and process-level controls. In this document, business processes affecting the financial statements are referred to as major transaction cycles.

Major Transaction Cycles

Based on an analysis of the financial reports included in the Foundation's A-123 scope and considering materiality, the ICWG identifies and recommends to the APIC the major transaction cycles that should be evaluated. The ICWG also identifies program areas that play a significant role in each of these cycles.

1.b.6.2 Integration and Coordination with Other Control Activities
The Foundation is subject to numerous legislative and regulatory requirements that promote and support an effective internal control structure. Management coordinates and integrates the ICOFR assessment with these reviews, FMFIA, and other existing internal reviews to leverage the benefit of work already being performed and avoid duplication of effort. Examples of existing control-related activities include those listed below.

• Federal Managers' Financial Integrity Act of 1982 (FMFIA)
• Federal Financial Management Improvement Act of 1996 (FFMIA)
• Chief Financial Officers Act of 1990, as amended (CFO Act)
• Improper Payments Information Act of 2002 (IPIA)
• Single Audit Act, as amended
• Inspector General Act of 1978 (IG Act)
• Federal Information Security Management Act of 2002 (FISMA)
• Information Technology Management Reform Act of 1996 (Clinger-Cohen Act)
• Section 831 of the Defense Authorization Act of 2002 (Recovery Auditing)
• NSF Enterprise Architecture Documentation
• Financial Accounting System (FAS) Documentation
Other Internal Reviews

APIC integrates its assessment under Appendix A with other related assessments. This coordination includes:
• Determining the universe of assessments that may impact control objectives related to financial reporting. This can be accomplished by coordinating with the appropriate offices
• Identifying the agency offices and officials responsible for the assessments
• Meeting with officials to identify the objectives of assessments and determine whether there is any overlap with the Appendix A assessment objectives
• Ensuring that all control objectives are met in an efficient manner
It is important for APIC to coordinate the assessment under Appendix A with the agency personnel in charge of leading other assessments related to financial reporting.
The Annual Financial Statement Audit
The CFO Act requires agencies to both establish and assess internal control related to financial reporting through the preparation of financial statements by management and the audit of these statements by an independent auditor (OIG or an independent public accounting firm).
The methodology and documentation utilized by management in conducting its assessment of internal control over financial reporting is similar to that used in conducting the financial statements audit. For example, DMB Bulletin 07-04, Audit Requirement for Federal Financial Statements, requires auditors to obtain an understanding of the components of internal control (See AU § 319.). Assess the level of control risk relevant to the assertions embodied in the classes of transactions, account balances, and disclosure components of the fina ncia I statements. Such controls include relevant IT general and application controls and controls relating to intra-entity and intra-governmental transactions and balances. For those internal controls that have been properly designed and placed in operation, the auditor performs sufficient tests to support a low level of assessed control risk. Those internal controls that have not been properly designed or placed in operation and those internal controls that are found to be ineffective will be reported in accordance with Section 7 (Audit Report) of this Bulletin. Agency management works with the independent auditor to create efficiencies in the financial statement audit and Appendix A assessment processes. Agency management: • Seek the perspective of the DIG or an independent auditor to determine whether management's identification of significant accounts, major classes of transactions, and relevant assertions are consistent with those identified by the financial statement auditor. Differences may exist between management's assessment due to factors such as materiality. • and the auditor's
• Facilitate the exchange of information (i.e., sharing of documentation), where possible, between management and the auditors relating to their collective understanding of internal control over financial reporting. This exchange enables both parties to gain a more comprehensive understanding of the financial reporting processes and to identify key controls.
• Coordinate the timing of control testing and determine the level of reliance the financial statement auditor plans to place on the results of management's testing of key controls. (Note: Management cannot substitute the auditor's documentation or testing of key controls for its own assessment under Appendix A.)
• Compare the results of management's Appendix A assessment of ICOFR with the financial statement audit report on internal control (i.e., reportable conditions and material weaknesses) and investigate the reasons for any reporting differences.

The auditor's understanding of the information system relevant to financial reporting includes:
• The related accounting records, whether electronic or manual, supporting information, and specific accounts in the entity's financial reports involved in initiating, recording, processing, and reporting the entity's transactions
• How the entity's information system captures other events and conditions that are significant to the financial reports
• The financial reporting process used to prepare the entity's financial reports, including significant accounting estimates and disclosures
Use of SAS 70 Reports

A Statement on Auditing Standards No. 70: Service Organizations (SAS 70) report is a report issued by an independent public accountant in accordance with standards promulgated by the American Institute of Certified Public Accountants (AICPA) on the internal controls of a servicing organization (e.g., a third party providing payroll processing services). AICPA SAS 70 defines the professional standards used by a service organization's auditor to assess the internal controls at a service organization. A SAS 70 Type II report is required of all Federal entities that service other Federal entities, per OMB Memorandum M-04-11, Service Organization Audits. SAS 70 Type II provides the following information:
• A description of the service organization's control objectives, the key controls in place to achieve those objectives, and a list of user organization controls contemplated in the design of the service organization's controls
• An auditor's opinion on whether the service organization's description of its controls fairly presents the relevant aspects of the service organization's controls as of a specific date, whether the controls were suitably designed to achieve the specified control objectives, and whether the controls that were tested were operating with sufficient effectiveness to provide reasonable, but not absolute, assurance that the control objectives were achieved during the period specified

An unqualified opinion on the SAS 70 Type II provides evidence that controls are operating effectively and that no additional testing of the process at the service provider is required. If the opinion is qualified, additional procedures are required. Compensating controls at either or both the service provider and the user organization must be evaluated and tested. The user organization is responsible for evaluating and testing the user controls (listed in the SAS 70) contemplated in the design of the service organization's controls.
Although a service organization may provide assurances to the entities it serves relative to the financial data provided, the customer (user organization) is ultimately responsible for internal control over its financial reporting. Consider whether the results of other procedures the auditor performed indicate there have been changes in controls at the service organization not identified by management.

OMB cautions agencies on the use of SAS 70 reports and relying solely on this information to assess the internal controls over financial reporting. For instance, a Type II SAS 70 report is typically not sufficient on its own to satisfy the requirements under the Federal Information Security Management Act (FISMA). An agency may still leverage SAS 70 reports during the A-123, Appendix A assessment if the process owner that oversees the outsourced service considers the scope of the SAS 70 report in the context of the overall internal control assessment when determining the nature and type of other assessment activities needed outside of the SAS 70 process. There could be activities performed by the cross-servicing agency that are significant to the Foundation but not to the overall cross-servicing agency's program. Another issue to consider may be problems with the interface with the cross-servicing agency's systems that are unique to the Foundation. During testing, the A-123 Team also confirms that the period of review for the SAS 70 reports collected falls within the same fiscal year as the A-123, Appendix A assessment.

Plan and Scope the Evaluation
The A-123, Appendix A implementation process commences each year with the planning and scoping phase. In this phase, management identifies and selects the financial reports, accounts, and business processes to be included in the A-123, Appendix A assessment. The output of the planning and scoping phase includes an outline of the business processes and sub processes to be documented, tested, and reported as support for the annual assurance statement.
Determining the Scope of Financial Reports
Planning begins with identifying which internal and external financial reports or information should be considered for inclusion in the A-123 scope. The National Science Foundation's quarterly and annual financial statements and notes are automatically included for consideration. Any other significant financial information, not derived from the financial statements, that is used internally for decision-making or shared externally (e.g., via web site) should also be considered for inclusion in the scope. A-123 allows agencies to exercise discretion in selecting which financial information to include in the scope of the assessment. The Foundation uses the following financial statements to determine the scope of the A-123, Appendix A assessment.
NSF's Annual/Quarterly Financial Statements:
• Statement of Net Cost
• Statement of Changes in Net Position
• Statement of Budgetary Resources
• Statement of Financing
• Notes to the Financial Statements

Figure 3: NSF's Annual/Quarterly Financial Statements.
From the Financial Audit Manual

The Financial Audit Manual (FAM) Section 230.09 recommends: "For capital-intensive entities, total assets may be an appropriate materiality base. For expenditure-intensive entities, total expenses may be an appropriate materiality base." Since the Foundation's business focus is to make grants to universities and colleges to promote the field of science and engineering, the agency is considered an expenditure-driven entity. For this reason, the Statement of Net Cost (SNC) is recommended as the financial statement that most appropriately reflects the nature of operations.
FAM Section 230.10 also provides guidance that significant intra-governmental balances (such as funds with the U.S. Treasury, U.S. Treasury securities, and inter-entity balances) should be considered separately in determining materiality because the nature of related-party balances poses different risks.
Documentation of the Business Processes and Sub Processes
The purpose of documenting the business process is to describe the flow of transactions from inception to final disposition in the financial statements. This documentation was developed from the perspective of the process owner and depicts the flow of transactions, the key decision points, the databases that support the process, the interfaces with other processes or sub processes, and the key internal control activities that achieve the financial statement assertions. This documentation identifies the parts of the process that are susceptible to potential error or misstatement and outlines the control objectives in place to mitigate the risk of a misstatement. Documentation forms the basis of support for management's assertion that controls over financial reporting are operating effectively. It is important that the staff most knowledgeable about the process play an integral role in developing the documentation, namely the process owners and the members of the teams that execute or oversee the transactions taking place within the process. The A-123 Team works closely with the business process owners to assist in developing and updating the business process documentation. The standard A-123, Appendix A documentation for each business process is comprised of:
• Process Narratives (may also be referred to as cycle memos or process memoranda)
• Flowcharts
• Control matrices
The checklist of deliverables covers: updating or creating the process narrative, flowcharts, and control matrix; coordinating documentation of shared processes; validating final documentation; and assessing the design of controls.

Figure 4: OMB A-123 Deliverables and Responsibilities.
The table above shows the overall responsibilities of the A-123 Team and the business process owners for A-123 related deliverables and activities. The process narrative clearly defines the control activities and key controls. Control activities are the specific policies, procedures, and activities that are established to manage or mitigate risks identified in the risk assessment process. Key controls are those controls designed to meet the control objectives and cover management's financial statement assertions. In other words, they are the controls that management relies upon to prevent and detect material errors and misstatements. Examples of controls that may be identified include:
• Top-level reviews of actual performance
• Reviews by management at the functional or activity level
• Management of human capital
• Controls over information processing
• Establishment and review of performance measures and indicators
• Physical controls over vulnerable assets
• Access restrictions to, and accountability for, resources and records
• Segregation of duties
• Appropriate documentation of transactions and internal control
In identifying key controls, the process owner considered the presence of multiple controls within the same transaction cycle. Some of these controls may serve as compensating controls to a key control. Thus, when the key control fails, the compensating control, which may not be as robust or may not be intended to satisfy the same financial assertions, may be considered as a backup control to mitigate the risk of a potential financial misstatement. It is useful to have this information during the testing phase in the event a key control is not operating effectively. Or, if a suitable remediation is not achieved in time for re-testing and reporting, a compensating control may be used as a proxy. Typically, a single compensating control within a major transaction cycle would not be considered sufficient. Conversely, there may be transaction cycles that have more than one control to detect the same problem.
The process owner documents controls with sufficient detail to enable someone unfamiliar with the process to understand and evaluate the control design and later create a test plan. A lack of documentation limits the ability of management to properly communicate the control processes throughout the organization and properly monitor internal control.

Highlight Controls

Since the process narrative serves as the starting point for updating the documentation, as controls are described throughout the body of the document the process owner clearly indicates where a control is being described. Highlighting the "identified controls" in the narrative facilitates the updating of all other documentation, including flowcharts and control matrices. Once documentation is complete, the A-123 Team allows enough time to cross-reference the controls based on the assigned control numbers outlined in the control matrix. This exercise assists the A-123 Team and the process owners with gaining confidence that the key controls are documented in the matrix for testing.

Manage Shared Control Processes

Transactions that flow from one division to another are recognized as having shared process ownership. Documentation of shared processes requires a significant level of coordination. The process descriptions reflect the key functions performed by each Division that is involved in the process. For example, the program office staff initiates and approves a purchase requisition, which is later reviewed by the Division of Acquisition and Cooperative Support (DACS) to obligate funds and award the purchase order to a vendor. Another challenge in documenting shared processes was the actual collection and entry of input from multiple sources on the same document.
The Divisions may find it easier to complete this task by:
• Dividing up the sections pertaining to the work of the given Division
• Staggering the review and update of the document to allow each Division to work using the same document
• Retaining edits using "Track Changes" in MS Word. Later, the Divisions can meet jointly to agree on and consolidate changes.
Applying the above steps helps prevent problems with document version control and eliminates the labor-intensive effort of merging documents with overlapping changes. When there is a conflicting opinion on how to present a process, the A-123 Project Manager strongly encourages an in-person meeting to resolve differences and reach closure. Each party involved in the shared process comes to an agreement to validate the document.

Development of the Control Matrix

The control matrix is a standard form of documenting the individual control activities targeted for testing. Matrices are beneficial for:
• Providing a rigorous framework, helping to ensure that all relevant controls are adequately documented
• Providing a structured mechanism for identifying key controls
• Documenting the test plan and results
The control activity description and reference number link to the flowchart and/or narrative for easy cross-reference. The control matrix reflects the same sub processes outlined in the other components of documentation. Documentation of control activities, at a minimum, provides answers to the following seven questions:
1. What is the control objective?
2. What is the risk being controlled?
3. What is the control activity?
4. Why is the activity performed?
5. Who (or what system) performs the control activity?
6. When (how often) is the activity performed?
7. What mechanism is used to perform the activity (reports and systems)?
The control matrix illustrates other key attributes that enable management to assess the adequacy of the controls and develop test plans. These attributes typically address the nature of the control (manual or automated), whether the control is preventive or detective, the frequency of the control, the financial statement assertion being addressed, and the information processing objective being addressed.
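The seven questions and the additional attributes above can be represented as a single record per control. The sketch below is purely illustrative; the field names and sample values are assumptions, not NSF's actual control matrix schema.

```python
from dataclasses import dataclass

@dataclass
class ControlMatrixEntry:
    """One row of a control matrix; field names are illustrative only."""
    control_objective: str  # 1. What is the control objective?
    risk: str               # 2. What is the risk being controlled?
    control_activity: str   # 3. What is the control activity?
    purpose: str            # 4. Why is the activity performed?
    performed_by: str       # 5. Who (or what system) performs the activity?
    frequency: str          # 6. When (how often) is the activity performed?
    mechanism: str          # 7. What mechanism is used (reports and systems)?
    # Additional attributes used to assess adequacy and develop test plans
    nature: str             # "manual" or "automated"
    control_type: str       # "preventive" or "detective"
    assertion: str          # financial statement assertion addressed

# Hypothetical example entry for an automated payment control
entry = ControlMatrixEntry(
    control_objective="Payments do not exceed obligated amounts",
    risk="Overpayment against an obligation",
    control_activity="System blocks invoices exceeding the obligation balance",
    purpose="Prevent improper payments",
    performed_by="Financial system (automated)",
    frequency="recurring",
    mechanism="Payment processing module",
    nature="automated",
    control_type="preventive",
    assertion="Existence",
)
print(entry.control_type)  # preventive
```

Capturing each row in a structured form like this makes it straightforward to filter controls by attribute (e.g., all automated preventive controls) when building test plans.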
Map the Financial Statement Assertions to Control Activities
The Financial Audit Manual Section 2.3.5 defines the financial assertions as "management representations that are embodied in financial statement components." The financial statement assertions provide a framework of support for management's statement of reasonable assurance on the effectiveness of controls over financial reporting. A control that is relevant to more than one or all of the financial assertions may have more bearing on the risk of a financial misstatement. This is a significant characteristic of a key control. For each significant account and disclosure, management identifies and documents relevant financial statement assertions, as well as testing the controls that apply to those assertions. The table below provides examples of typical control objectives identified for each financial statement assertion:
Automated controls are controls that operate through and within IT systems and applications. They are typically programmed into applications and function automatically, so they work with a very high degree of consistency. Automated controls facilitate the prevention and detection of errors and are generally more reliable and efficient than manual controls. Automated controls are more reliable if they are well designed and operate within a sound control environment (see IT General Controls below). An example of an automated control is "FAS blocks payment on an invoice that exceeds the amount of the obligation." Because manual controls require human intervention, they are flexible but subject to human error. An example of a manual control is "The Contracts Officer looks up the vendor name on the Excluded Parties List website and database to confirm that the vendor is not debarred prior to making an award." In cases where systems (e.g., web-based support) are used to facilitate manual controls, the exercise of the controls and the completion of the transaction processing still entail human intervention and a greater susceptibility of the transaction to error.
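The automated control quoted above ("FAS blocks payment on an invoice that exceeds the amount of the obligation") can be sketched as a simple programmed check. The function name and logic below are hypothetical illustrations, not the actual FAS implementation.

```python
def approve_invoice(invoice_amount: float, unliquidated_obligation: float) -> bool:
    """Illustrative automated preventive control (hypothetical, not actual FAS
    logic): allow payment only when the invoice does not exceed the remaining
    unliquidated obligation balance."""
    return invoice_amount <= unliquidated_obligation

# An invoice within the obligation balance passes; one exceeding it is blocked.
print(approve_invoice(4_000.00, 5_000.00))  # True  (payment proceeds)
print(approve_invoice(6_000.00, 5_000.00))  # False (payment blocked for review)
```

Because the check is programmed into the application, it fires on every transaction with complete consistency — the defining property of an automated control described above.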
Identify Preventative vs. Detective Controls
Preventative controls are designed to inhibit or stave off errors from occurring (e.g., policies and procedures, segregation of duties, authorization levels/approvals, and terms and conditions). Detective controls are designed to catch an error or identify an exception after it has occurred (e.g., exception reports, reconciliations, and inventory reviews). Preventative controls are preferred, and the preponderance of key internal controls is preventative in nature.
Frequency of the Controls
It is important for the process owner to accurately document the frequency of the control activity in the matrix. The frequency and the type of internal control help determine the appropriate sample size to select for testing in accordance with the Financial Audit Manual (FAM). For example, in the case where the financial statements are produced quarterly, the FAM would require two quarters to be tested. Accordingly, the A-123 Team would select documentation from two quarters to perform the internal control testing. The control activity may occur annually, semiannually, quarterly, monthly, weekly, daily, or on a recurring basis, which means the transaction takes place multiple times per day. The frequency of occurrence affects the determination of sample size.
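The relationship between control frequency and test sample size can be sketched as a simple lookup. Only the quarterly figure (two quarters) comes from the text above; the other sizes are illustrative placeholders, not the FAM's actual sampling tables.

```python
# Illustrative mapping of control frequency to test sample size.
# Only the quarterly value is taken from the text above; the remaining
# figures are placeholders, not the FAM's actual sampling guidance.
SAMPLE_SIZES = {
    "annually": 1,
    "semiannually": 1,
    "quarterly": 2,   # per the text: two quarters are selected for testing
    "monthly": 2,
    "weekly": 5,
    "daily": 20,
    "recurring": 25,  # control operates multiple times per day
}

def sample_size(frequency: str) -> int:
    """Return the number of instances to test for a control of the given frequency."""
    return SAMPLE_SIZES[frequency.lower()]

print(sample_size("quarterly"))  # 2
```

The point of the lookup is the monotonic pattern: the more often a control operates, the larger the sample needed to conclude it operated effectively throughout the period.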
Development of Flowcharts
Flowcharts provide a visual depiction of the sub process and the chronological order of steps taking place during the life cycle of the transaction. The cross-functional (horizontal) flowchart used as the standard format for A-123 documentation is created in MS Visio and copied into Excel to facilitate electronic access for users. Different flowchart shapes are used to distinguish the type of activity taking place in the workflow. The legend below explains the purpose of the various flowchart shapes.
• Key Control Activity
• Document or Report Name
• System or Database

Figure 5: Flowchart Legend.
The process owner reviews the flowcharts created in the prior year to determine if any updates are necessary. If there are updates to the process narrative and matrices, the flowcharts are modified to reflect those changes. The process owner initiates the changes by marking up the hardcopy of an existing flowchart. The A-123 Team takes the hardcopy and modifies the flowchart in Visio. To create new flowcharts, the A-123 Team can sit with the process owner to draw the flowchart or use the new information gathered in the narrative to create the flowchart.

Assess the Design of Controls

The A-123 Team works closely with the process owners to evaluate the adequacy of internal control design. Internal controls are designed to achieve intended objectives (e.g., financial statement assertions, mitigation of specific risks, operational efficiency). Control design considers the depth and breadth of the control activity. Depending on the nature of the control activity, its design achieves its intended purpose fully and consistently. For example, a manual reconciliation must be performed: (i) by knowledgeable staff, (ii) consistently throughout the period (e.g., fiscal year), (iii) in accordance with written policies, procedures, and other guidance as necessary, and (iv) with due regard for a proper separation of duties. Also, in the case of reconciliation, the disposition of irreconcilable items is reviewed and approved by a supervisor. In assessing control design, one considers the extent to which it appears to achieve its intended purpose (objectives), whether its design appears to be based upon a model control activity, and whether it appears to be operating effectively. Some key elements of a well-designed internal control procedure include, but are not limited to:
• Separation of duties
• Preventative or detective capability in finding errors
• Automation of financial transactions (i.e., the application calculates the unliquidated balance of the obligation after a cash drawdown)
• Limited access to transactions (i.e., application login, automated job profile for user authority)
When an internal control is found to have a flawed design, the process owner is responsible for remediation of the design deficiency. If the control design is inadequate, the agency will not achieve the desired assurance that the control is preventing or detecting a misstatement even if the control is operating as intended. Management remedies the design deficiencies. For example, when an internal control activity lacks a separation of duties, the process owner and the management team formulate and implement a solution that assures an adequate separation of staff roles and responsibilities such that the same person does not both initiate and approve a transaction. After implementing the redesigned control procedure, the process owner adjusts the A-123 documentation to accurately describe the new control procedure. Depending on the timing of the corrective action (the A-123 period is July 1 through June 30), there may not be sufficient time to correct the control deficiency and test and verify the effectiveness of the corrective action. Accordingly, although corrected, the deficiency may be reported as a weakness depending on its severity (material or significant).
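The separation-of-duties condition described above (the same person must not both initiate and approve a transaction) can be sketched as a design-assessment check over transaction records. The data layout below is hypothetical, for illustration only.

```python
def violates_segregation_of_duties(initiated_by: str, approved_by: str) -> bool:
    """Illustrative design check (hypothetical, not NSF's actual system logic):
    flag any transaction where the same person both initiates and approves."""
    return initiated_by == approved_by

# Hypothetical transaction log entries
transactions = [
    {"id": "T1", "initiated_by": "alice", "approved_by": "bob"},
    {"id": "T2", "initiated_by": "carol", "approved_by": "carol"},  # design deficiency
]

flagged = [t["id"] for t in transactions
           if violates_segregation_of_duties(t["initiated_by"], t["approved_by"])]
print(flagged)  # ['T2']
```

A check of this kind could be run against a period's transactions to surface candidates for the kind of redesign and remediation the text describes.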
Validation of Documentation
The process owners and the staff assigned the responsibility for developing and updating documentation verify that the documentation (the narratives, flowcharts, and matrices) is accurate and complete before the start of testing. The A-123 Team also reviews the documentation for discrepancies. The final validation of documentation is acknowledged in an email to the A-123 Program Manager stating that the documentation is accurate and requires no additional changes.
Information Technology Controls
For each business process, the A-123 Team leverages the existing documentation of application systems and systems processing environments. The evaluation of the control structure with respect to systems is incorporated in the assessment. The control structure includes processes such as computer operations and change management. It would be helpful to have those IT controls that relate directly to transaction cycles documented separately to aid in the evaluation. In addition, ICOFR are frequently embedded within software applications. These are reflected on the previously discussed control matrix. The Foundation's information technology is subject to the standard compliance guidelines of governance and auditing. All financial and mixed systems are subject to OMB Circulars A-11, A-127, and A-130. Additionally, all systems are subject to certification and accreditation (C&A) by the designated accrediting official in accordance with guidance from the Office of the Chief Information Officer. System security may be accredited for a period of no more than three years or until a major change has been made; the controls identified in the C&A process are reviewed on an annual basis. C&A documentation contains information regarding a system's IT controls and is an example of existing documentation used for this assessment. Documents such as the System Security Plan, Risk Assessment, Review of Security Controls, and System Test and Evaluation are also especially helpful. It is critical to assess technology-based (automated) controls and identify key controls in the IT system design. In addition, the processes used to comply with FFMIA and FISMA could serve as a foundation for documenting and evaluating IT controls. Computerized operations can introduce additional risk factors that are not present in manual systems. The assessment team considered each of the following factors and assesses the overall impact of computer processing on inherent risk. Additional risk factors may include:
• Uniform processing of transactions
• Automatic processing
• Increased potential for undetected misstatements
• Existence and completeness of the audit trail
• Nature of the hardware and software used
• Unusual or non-routine transactions
Assessing IT Risk

The general methodology that is used to assess computer-related controls involves evaluating:
• General IT controls at the entity-wide and installation levels
• General IT controls as they are applied to the application being examined
• Application controls, which are the controls over the input, processing, and output of data associated with individual applications
As part of assessing control risk, the A-123 Team works with the Division of Information Systems (DIS) to make a preliminary assessment of whether computer-related controls are likely to be effective. This assessment may be based on discussions with DIS personnel (program managers, system administrators, information resource managers, and system security managers). Preliminary assessments may also take the form of observations of computer-related operations or reviews of written policies and procedures. Controls that are not properly designed or would not be effective indicate weaknesses that may need to be reported. Based on the assessment of inherent and control risks, including the preliminary evaluation of computer-based controls, the A-123 Team identifies the general controls and assesses whether they are properly designed. If so, they are tested to determine if they are operating effectively.
There are six major categories of general controls that are considered:
1. Entity-wide controls: security program, planning, and management
2. Access controls
3. Application software development and change controls
4. System software
5. Segregation of duties
6. Service continuity
Application controls generally involve ensuring that:
• Data prepared for entry are complete, valid, and reliable
• Data are converted to an automated form and entered into the application completely, accurately, and on time
• Data are processed by the application completely and on time, and in accordance with established requirements
• Output is protected from unauthorized access or damage and distributed in accordance with prescribed policies
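The input-control objectives above (complete, valid data entered into the application) can be sketched as a simple validation pass. The field names and rules below are assumptions for illustration only, not an actual NSF application.

```python
# Illustrative application input control: verify that records prepared for
# entry are complete and valid before processing. Field names are hypothetical.
REQUIRED_FIELDS = ("award_id", "amount", "date")

def validate_record(record: dict) -> list:
    """Return a list of input-control exceptions for one record."""
    errors = [f"missing {field}" for field in REQUIRED_FIELDS if field not in record]
    if "amount" in record and record["amount"] <= 0:
        errors.append("amount must be positive")
    return errors

good = {"award_id": "A-001", "amount": 250.0, "date": "2010-05-14"}
bad = {"award_id": "A-002", "amount": -10.0}
print(validate_record(good))  # []
print(validate_record(bad))   # ['missing date', 'amount must be positive']
```

Rejecting or flagging records at entry, as sketched here, is what makes the downstream processing and output controls meaningful: only complete, valid data reach the application.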
Documentation of IT Controls is included in copies of written policies and procedures, written memoranda, flowcharts of system configurations and significant processes, etc. The documentation identifies the control objectives (i.e., financial statement assertions) and related control points designed to achieve those objectives. Documentation by OMB Circular A-123, Appendix A/FISCAM Category provides requirements for information systems controls documentation to assist managers with this effort. Questions regarding IT controls are addressed to the applicable financial system investment manager. NSF effectively addressed the recent increase in the volume of information made publicly available, i.e., ARRA recipient reporting as well as initial high quality data sets. NSF leveraged its existing policies and procedures and quickly adapted to the swiftness by which information is disseminated. NSF will continue to review, revise, and communicate these policies and procedures based on experience and on the type of high quality data sets and other data to be released in the future.

1.d. Communications

The agency's approach for soliciting input and feedback from the scientific community and the public has always been "early and often." To support this approach, NSF provides a variety of mechanisms to allow both public- as well as NSF-initiated contact for stakeholders to interact with the agency and provide feedback. Examples of proactive forums the Foundation is employing to engage the public and the academic community and solicit their input include:
• Open NSF webpage (http://opennsf.ideascale.com/) - NSF used the IdeaScale tool provided by the General Services Administration to solicit public input on ideas and suggestions for its Open Government Plan. Through the site, the public can share their ideas or vote on and discuss ideas provided by other constituents.
• American Customer Satisfaction Index - An online pop-up survey, used widely across both the government and private sector, which NSF employs to measure user satisfaction with NSF services. This survey offers quantitative data that NSF can use to benchmark itself to ensure that it is continuing to meet the community's information needs.
• Feedback email aliases - NSF has multiple email aliases that the public and research community can use to contact the Foundation with questions or to provide feedback on a variety of topics, including NSF policy (firstname.lastname@example.org), NSF services (firstname.lastname@example.org and email@example.com), and NSF's participation in Open Government (email@example.com), among others.
• Online feedback forms - Available on the NSF website and Research.gov, the public can use feedback forms to anonymously submit feedback to the Foundation. Feedback and inquiries received through online mechanisms are heavily monitored, and suggestions are compiled for review and consideration. NSF representatives respond directly to inquiries received through feedback aliases. Questions that appear frequently are incorporated into "Frequently Asked Questions" documents, which are posted online and distributed during outreach activities.
The Chief Financial Officer (CFO) is the lead on public dissemination of federal spending information. The Office of Legislative and Public Affairs (OLPA) communicates information about the activities, programs, research results and policies of the National Science Foundation. OLPA employs a wide variety of tools and techniques to engage the general public and selected audiences including Congress, the news media, state and local governments, other Federal agencies, and the research and education communities.

1.e. Monitoring

The Office of Budget, Finance and Award Management and the Office of Information and Resource Management are devoted to financial management, award and contract processing and monitoring, outreach, and other functions. There are several reviews conducted annually to monitor the quality of Federal spending information and enhance NSF's extensive post-award monitoring program by initiating reviews of expenditures. The following reviews help assure the accountability of taxpayer dollars:

The A-123 Program

NSF conducts an assessment of the effectiveness of internal control over financial reporting, which includes the safeguarding of assets and compliance with applicable laws and regulations. NSF provides an annual statement of assurance on the effectiveness of its management, administrative, and accounting controls (Section 2 of FMFIA) and the conformance of its financial management systems with Federal financial systems standards (Section 4 of FMFIA). Based on the results of this assessment for the period ending June 30, 2009, NSF provides reasonable assurance that internal control over financial reporting was operating effectively and that no material weaknesses were found in the design or operation of the internal controls. NSF's goal and performance measure is to attain reasonable assurance on its assessment of the effectiveness of internal control over financial reporting.

The Improper Payment Information Act (IPIA) Review

NSF conducts a review of expenditure data and grant payments in accordance with IPIA requirements. Samples were determined from the NSF database using a statistical sample determination algorithm developed with a random number generator that selected at random the specified number of Grant Award identifications and then randomly selected the quarter to be evaluated. Reviews included, but were not limited to, the following:
• Does the cost represent an expressly unallowable cost as cited in the Cost Principles, Grant Policy Manual, and award terms and conditions?
• Is this a duplicative payment?
Were the services or products provided? Were the costs incurred during the period of performance? Does the payment agree with the terms of sub-award agreement? Was there adequate documentation? NSF's goal and performance measure is to be well below the significant standard of improper Act of 2002 and OMB Guidance.
payments defined as a total of improper payments exceeding $10 million and 2.5% of the total outlays as outlined by the Improper Payments Information The Purchase Card Expenditure Review Division of Acquisition and Cooperative Support (DACS) performs an annual review of internal
controls over purchase card spending. The objective of the review is to report non-compliance with policies and procedures within the Purchase Card Program Manual, which is in compliance with requirements of OMB Circular A-123, Appendix B, Improving the Management that the transaction Government by Charge Card Programs. The review includes verification adequate documentation performance was supported
(order forms, credit card slips, etc.) and proof of delivery, and the
purchase activity has been recorded in the Financial Accounting System (FAS). NSF's goal and measure is to obtain a favorable opinion on the compliance review report. and Business Assistance Program (AMBAP) managing the
The AMBAP program is designed to provide reasonable assurance that institutions Federal awards. Through its advanced monitoring information
highest risk awards possess adequate policies, processes, and systems for properly managing activities, AMBAP collects and analyzes practices at awardee institutions practices. on the business systems and award administration
and promotes dialogue between NSF and its awardees about award administration With many institutions awards at the Institution. activities increases the likelihood that the awardee will effectively administer
receiving more than one award, the assurance resulting from these all NSF-issued The AMBAP program includes site visits and desk reviews.
Federal Financial Report (FFR) Transaction Testing
NSF performs a baseline monitoring
of financial transactions intended to identify financial reporting, or misunderstandings requirements. FFRtransactional of, or nontesting includes: expenditure for selected reporting measure is
reporting anomalies, inaccurate expenditure compliance with, Federal cash management (1) establishing an appropriate documentation expenditures
sample of the award universe; (2)obtaining
for selected awards from grantees;(3)
in accordance with cost principles; and (4) determining
errors as well as documenting
the results of the review. NSF's goal and performance
to obtain a favorable opinion on the review. NSF's overall goal and performance statements measure is to obtain an unqualified accounting firm. opinion in its financial
audit conducted by an Independent
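The IPIA sampling step described above (randomly selecting a specified number of Grant Award identifications, then randomly selecting the quarter to evaluate) can be sketched as follows. This is an illustrative sketch only; the function and parameter names are hypothetical, and NSF's actual sample-determination algorithm is not published in this framework.

```python
import random

def select_ipia_sample(award_ids, sample_size, seed=None):
    """Illustrative IPIA sampling: pick `sample_size` distinct award IDs
    at random, then randomly pick a fiscal quarter (1-4) to evaluate for
    each. Names are hypothetical, not NSF's actual algorithm."""
    rng = random.Random(seed)  # seedable, so a review can be reproduced
    selected = rng.sample(award_ids, sample_size)  # distinct awards
    return [(award, rng.choice([1, 2, 3, 4])) for award in selected]
```

Seeding the generator makes the selection repeatable, which would support audit of the sampling itself.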
Section 2: USASpending.gov Data

The subcategories of awards applicable to NSF are grants and contracts, with grants being the predominant type of award. In FY 2009, NSF grant awards were $8.16 billion and contract awards were $489 million; this includes NSF's regular appropriation and ARRA funding. This section documents how NSF's data quality plan and control processes are applied specifically to Federal spending information submitted to USAspending.gov.

2.a. Grants and Cooperative Agreements

2.a.1 Compile and Review

The process of submitting grants data to USAspending.gov and ensuring the data are accurate and complete involves organizations across the Office of Budget, Finance and Award Management (BFA) and the Office of Information and Resource Management (OIRM). Various components of NSF's grant management and financial management systems house the data created and maintained by multiple organizations. The following organizations are involved in FFATA reporting:

• BFA/Budget Division (BD) - Ensures the completeness of the Federal spending information by comparing the obligation control totals on the report to the Financial Accounting System (FAS).
• BFA/Division of Financial Management (DFM) - Designated by the CFO, along with the Division of Institution and Award Support (DIAS), as responsible for Transparency Act reporting for NSF. Responsible, along with DIAS, for submitting data to USAspending.gov.
• BFA/Division of Institution and Award Support (DIAS) - Designated by the CFO, along with DFM, as responsible for Transparency Act reporting for NSF. DIAS ensures completeness of the Federal spending information by comparing the obligation control totals on the report to the records in the Awards System. Responsible, along with DFM, for submitting data to USAspending.gov.
• OIRM/Division of Information Systems (DIS) - Responsible for executing the monthly data extract and converting the file to Excel to ease review by BFA divisions. Responsible for uploading data to USAspending.gov and resolving data transmission issues.

Process for Compiling and Reviewing Data: The diagram below outlines the monthly process to compile, review, certify, and submit Transparency Act data to OMB.
• Financial management system contractors, under guidance of DIS, run IT scripts to extract and compile data
• DIS converts file(s) to Excel
• DIS provides guidance and support as needed
• DIS sends data to BFA DFM and DIAS division points of contact (POCs)
• DFM and DIAS coordinate BFA review
• BFA divisions review data and work with DFM, DIAS, and DIS to resolve concerns
• BFA division POCs send sign-off to DFM and DIAS POCs
• DIS uploads the file to USAspending.gov
• DIS works with OMB to resolve data transmission issues
• DIS routes data or business issues to DFM and DIAS POCs for resolution

Figure 6: Transparency Act Monthly Process.
Reporting Timeline:

• By close of business on the first day of the month, DIS shall submit the TXT data file, the Excel file, and the error report to the DFM and DIAS POCs. This file includes all grant and cooperative agreement obligations for the prior month. Approximately 85% of NSF transactions are reported within 30 days.
• By close of business on the 4th day of the month, BFA divisions shall review and sign off on the data.
• By close of business on the 5th day of the month, DIS shall upload the file to USAspending.gov.

Close of business is 5pm EST. If any of the critical dates in the timeline fall on a weekend or holiday, then the activity shall be completed by the next business day.

Ensuring Consistency of Data with Other Sources: NSF has taken the following steps to ensure consistency with data reported through other venues:

• Developing an automated script to pull data from the NSF system of record for financial data (FAS) and the Awards System, the authoritative source for grants data such as the CFDA number. FAS is also the source of the obligation amount reported to Treasury and OMB via FACTS II/SF-133. When comparing FACTS II/SF-133 data to USAspending.gov, it is important to recognize that USAspending.gov is a subset of the information reported via FACTS II/SF-133. USAspending.gov requirements specify that obligations made using multiple TAS should report the predominant source. NSF awards are often made using multiple appropriations. As a result, it is not possible to reconcile the total by TAS reported to USAspending.gov with that reported via FACTS II/SF-133.
• Using the DUNS number to do a CCR lookup and populating grant recipient data, such as name and address, with the grantee-reported data in CCR.
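The deadline rule noted in the timeline above (an activity due on a weekend or holiday is completed by the next business day) amounts to rolling the date forward. A minimal sketch, with an illustrative holiday set:

```python
import datetime

def next_business_day(due, holidays=frozenset()):
    """Roll a due date forward while it falls on a weekend or a holiday.
    `holidays` is an illustrative set of datetime.date values; the actual
    Federal holiday calendar would be supplied by the caller."""
    while due.weekday() >= 5 or due in holidays:  # 5 = Saturday, 6 = Sunday
        due += datetime.timedelta(days=1)
    return due
```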
NSF has established procedures to ensure that the three performance metrics (timeliness, completeness, and accuracy) relating to transmitting Federal spending information to USAspending.gov are met. The Federal spending information is extracted from the system of record, FAS, and the Awards System. The information is transmitted to representatives in BD and DIAS, who then review it for completeness. Upon review and approval, the data file is uploaded to USAspending.gov by a representative in DIS. A confirmation e-mail is received from USAspending.gov by the business owners when the report has been transmitted successfully, and again when the data has been successfully loaded into USAspending.gov. The table below shows NSF's performance metrics for grants and cooperative agreement awards:
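The completeness review described above (confirming that the obligation control totals of the extracted file agree with the system of record) reduces to a totals comparison. A minimal sketch, with assumed field names, using Decimal rather than floats so currency sums are exact:

```python
from decimal import Decimal

def control_totals_match(extract_rows, system_total):
    """Sum the obligation amounts in the monthly extract and compare the
    result to the control total from the system of record (FAS).
    Record layout and field names are assumptions for illustration."""
    extract_total = sum(Decimal(str(r["obligation"])) for r in extract_rows)
    return extract_total == Decimal(str(system_total)), extract_total
```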
Process | Performance Metric
NSF transmits financial spending information to USAspending.gov on the 5th day following the end of the month. | 85% of transactions processed within 30 days.
NSF reviews the data file for completeness by confirming the control totals. | 100% score on USAspending.gov data quality scorecard.
NSF verifies the accuracy of financial expenditure data through internal and external audits. | Unqualified audit opinion.

Figure 7: Performance Metrics - Grants and Cooperative Agreements.
NSF is in the process of developing a corrective action plan to resolve identified deficiencies relating to consistency of data elements, such as congressional districts and zip codes, between NSF records and other authoritative sources, such as the Central Contractor Registration (CCR).

Reporting Deficiencies

NSF has identified the following deficiencies in its USAspending.gov reporting:

• NSF does not currently break out the portion of an award made with interagency funds and provide the TAS of the funding agency. In FY 2009, interagency funding accounted for 1.4% of total NSF grant obligations.
• The only data elements receiving less than 100% for completeness are those that are not applicable to awards to individuals, which NSF aggregates in order to ensure no release of Privacy Act information. This includes data elements such as the name of the recipient, DUNS ID, and award title.
• NSF reports the ZIP code associated with the recipient's CCR record and uses a commercial service to look up the associated congressional district information. As a result of the ARRA data quality reviews we conduct, we have identified instances where the congressional district cannot be located for a ZIP code reported in CCR.
• NSF reports the Place of Performance identified by the grantee at the time they submitted the proposal. We do not collect this information at any other time during the business process or require that this information be updated. The grantee may change the place of performance without notifying NSF.

2.b. Contracts

2.b.1 Compile and Review

The Automated Acquisition Management System (AAMS) is the automated acquisition tool used by the Division of Acquisition and Cooperative Support (DACS) Contracts Branch to generate procurement documents. When a requisition is received in DACS, it is assigned to a Contract Specialist by the Deputy Branch Chief, and the funding information contained in the requisition is entered into AAMS manually by the front office administrator. The funding document is then brought to the Contract Specialist, who uses AAMS to develop the contract that contains the funding information entered by the front office administrator. If, at any time during the process, the requisition is found to be incomplete or unclear, the DACS Contracts Branch Front Office Administrator, Contract Specialist, or the Contracting Officer (CO) contacts the Program Office Division Administrative Officer, indicating what information is missing or unclear, and resolves any issues.

AAMS interfaces with the Federal Procurement Data System - Next Generation (FPDS-NG), which collects, processes, and disseminates official statistical data on Federal contracting. Pertinent data entered into AAMS is transmitted to FPDS-NG and validated by the FPDS system. Prior to award, the FPDS document has not been finalized and is in a draft state. The contract and accompanying file documentation, which make up the award file, are forwarded to the CO by the Contract Specialist for review and approval. The CO reviews the award file, including the FPDS-NG report, for accuracy and completeness and confirms that the level of funds committed is sufficient to cover the obligation.

Depending on the dollar value of the action as stated in the NSF Contracting Manual, the DACS Contracts Branch Policy and Oversight Team may provide an additional review of the contract file, which includes reviewing and validating the obligation data and the data in the FPDS report. If the award file is correct and there are no higher-level reviews required in accordance with NSF's Contracting Manual, the contract is signed by the CO, forming the legal obligation of funds, and the CO clicks the "award" icon in AAMS, thereby awarding the action in AAMS and making the FPDS-NG report final in FPDS-NG. FPDS-NG interfaces with USAspending.gov to upload the Federal spending information. The obligation is then recorded by DACS personnel. Depending on the source of the funds, the obligation will be recorded electronically in the Awards System or manually in the FAS system.
All procurements are subject to review in accordance with Chapter 14, DACS Contracting Manual, dated May 2008. The Contracting Oversight Program of the NSF Contracting Manual requires various documents to be coordinated, reviewed, and approved by individuals at successive levels of authority. Individuals that may be involved in the review process include the Program Official, CO, OGC, Contracts Branch Chief, and the Senior Procurement Executive. The review and approval matrix, created and maintained by the policy team, details the types of documents and the associated review thresholds. These review thresholds are based on the estimated dollar amounts and risk of the proposed acquisition.

NSF fulfills the Federal Acquisition Regulation (FAR) requirement to certify the completeness and accuracy of contract data to the U.S. General Services Administration (GSA) and the Office of Federal Procurement Policy (OFPP) each year by January 5. NSF is currently transmitting all required Federal spending information data sets to the public. NSF has established procedures to ensure that the three performance metrics (timeliness, accuracy, and completeness) relating to transmitting Federal spending information to USAspending.gov are met. The table below shows NSF's performance metrics for contract awards:
Process | Performance Metric
NSF transmits financial spending information to FPDS-NG on a real-time basis after award in the AAMS system. FPDS-NG interfaces with USAspending.gov. | Transactions transmitted to FPDS-NG within 30 days.
Contracting Officers review the data file for completeness prior to approving awards in AAMS. | 100% score on USAspending.gov data quality scorecard.
NSF verifies the accuracy of FPDS-NG data through annual data quality validation and verification. The Policy and Oversight Team pulls a statistically valid sample and checks the data; the results of the data review are submitted to GSA and OFPP by January 5. The certification for FY 2009 data was submitted January 5, 2010, with an overall score of 98%. | 95% overall accuracy rate.

Figure 8: Performance Metrics - Contracts.
Section 3: Recipient Reporting

The National Science Foundation's approach is to review its grantees' quarterly recipient reports, as required under the American Recovery and Reinvestment Act of 2009 (ARRA). Each quarter, recipients that have received ARRA funding are required to submit reports on the progress and status of their grants via www.FederalReporting.gov. NSF is responsible for conducting a data quality review of the submissions and identifying material omissions or significant reporting issues that could mislead the public about the intent and scope of the award.

The Office of Management and Budget (OMB) issued memorandum M-09-21 in June 2009, titled "Implementing Guidance for the Reports on Use of Funds Pursuant to the American Recovery and Reinvestment Act of 2009". The memorandum issues guidance that establishes reporting timelines and defines roles and responsibilities for ARRA recipient reporting. The timeline specifies completion dates for recipients to submit their reports and for agencies to conduct their review, as follows:

• Days 1-10: Recipients and sub-recipients enter reports in FederalReporting.gov
• Days 11-21: Recipients and sub-recipients correct, finalize, and submit reports in FederalReporting.gov
• Days 22-29: Agencies review reports, identify issues, communicate issues with recipients to correct, and finalize review in FederalReporting.gov
• Day 30: FederalReporting.gov site is locked and reports are published to Recovery.gov
• Post Day 30: FederalReporting.gov will be reopened for a "Continuous Quality Assurance" period. OMB guidance on this continuous period for recipients and agencies has not been finalized.

The law and subsequent guidance issued by OMB set clear expectations for accountability and transparency from both Federal agencies and from recipients of Recovery Act funding. In response to this landmark legislation, NSF has developed policies, procedures, and tools for the awardee community. These documents provide up-to-date information regarding NSF's implementation of the Recovery Act and are available at www.nsf.gov/recovery/recipientreporting. NSF's overall framework for Recovery Act investments emphasizes that grants be awarded in a timely manner while maintaining a commitment to the use of established merit review processes and controls.

In accordance with NSF's ARRA recipient review process, NSF has developed the following hierarchy of documents that provide specific guidance for conducting the data quality reviews: Policy Guidance; Data Quality Plan; Data Quality Protocol and Business Logic; and Standard Operating Procedures.
Roles and Responsibilities
NSF originally established several Tiger Team committees to review and manage the implementation of the numerous requirements associated with ARRA. One of the Tiger Teams established was the Recipient Reporting Tiger Team. This committee involves key members and decision makers from DFM, the Division of Grants and Agreements (DGA), DIAS, and the Division of Information Systems (DIS). The Office of the Inspector General (OIG) also has participants that regularly attend the meetings. The Recipient Reporting Tiger Team meets on a weekly basis. The Tiger Team members are responsible for providing leadership to ensure the requirements for recipient reporting are properly implemented, data quality review processes are defined and successfully executed, and OMB-mandated deadlines are met.

Data Quality Review Plan Approach

NSF has implemented a multi-phase recipient reporting review process throughout each quarter comprising: (1) reviews for omissions (non-reported awards) and/or significant errors, (2) checks for compliance through data matches, (3) a sampling review of descriptive fields, and (4) a validation against the Federal Financial Report (FFR) submitted for the comparable quarter. Guidance and discussions with OMB have indicated that non-reported awards are considered non-compliant with reporting requirements; the only material omission that can exist is when a recipient does not report. NSF has implemented both an IT solution to automate the reviews and a Program Officer review of key reporting fields that cannot be automated, such as the project description, quarterly progress, and description of jobs created/retained fields. NSF has developed a protocol document which defines the business rules that will be utilized to conduct each phase of the review and to identify significant reporting issues, which will be communicated to the recipient to be resolved either during the official ARRA recipient reporting period or the following quarter, as defined by NSF. The protocol document also defines the significance categories for identified issues. The issue categories are Major-1, Major-2, and Minor, where reporting issues categorized as Major-1 are the most significant and require recipients to resolve the data issues prior to day 30 of the reporting cycle.
ARRA Recipient Reporting

[Figure: quarterly schedule - any award that includes ARRA funding requires the recipient to report on the central federal website; Stage 1: Automated Data Checks; Stage 2: Program Officer Reviews (31-90 days); Federal Financial Report review.]
Figure 9: ARRA Recipient Reporting Quarterly Schedule.
NSF developed its protocol and coordinated with OMB and other agencies to review the
business logic being implemented. An outline of each data quality review is provided below, with a timeframe of when the review is conducted. The remainder of this document describes in further detail the steps necessary to execute the reviews.
Recipients with ARRA awards valued at less than $25,000 and awards made to individuals are exempt from the quarterly ARRA recipient reporting requirement.
Data Quality Review | Timeframe (Days After End of Quarter) | Purpose
Omissions (non-reported) and Major-1 Review (Automated) | Conducted Days 22-29 | To review all reports submitted through www.FederalReporting.gov and identify material omissions or significant issues that could mislead the public. Issues during this review are required to be corrected by Day 29.
Major-2/Minor Review (Automated) | Conducted 2nd month (after day 30) | To review all reports submitted through www.FederalReporting.gov and identify reporting issues that will require correction during the next reporting cycle.
Program Officer Review (Manual) | Conducted 2nd month (after day 30) | To review a sampling of reports submitted through www.FederalReporting.gov and identify reporting issues that will require correction during the next reporting cycle.
Federal Financial Report (FFR) Review (Automated) | Conducted 3rd month (after day 60) | To review all reports submitted through www.FederalReporting.gov and identify reporting issues in comparison with the most up-to-date financial expenditure data. Corrections are required during the next reporting cycle.

Figure 10: Data Quality Review and Timeframe.
In addition to conducting the data quality reviews, NSF has also implemented a coordinated communications plan. E-mail communications are sent to recipients at defined stages during the reporting cycle, including the period prior to the reporting start date, during the reporting period, and after data quality reviews are conducted.

Reminder Communications Prior to the Reporting Period: Prior to the commencement of the reporting period, NSF sends reminder e-mails to the Sponsored Project Office (SPO) contacts for the ARRA award recipients who are required to report for the current reporting period. The purpose of the e-mail is to remind recipients of their obligation, as stipulated in the terms and conditions of the award, to submit a quarterly ARRA report. The e-mail also provides information about important reporting deadlines, resources, and contact information for questions.

Reminder Communications Prior to the Reporting Deadline: Prior to the completion of the reporting cycle, NSF identifies which awards do not have a report submitted in the FederalReporting.gov system, also referred to as an omission or non-reported award. A reminder e-mail is sent to each SPO contact identifying all of their omissions, with a reminder to submit a report prior to the deadline. NSF will send a second set of reminders to the contacts if any omissions continue to exist prior to the system closing for new report submissions.

Data Quality Review Communications: After conducting each data quality review stage and identifying issues per the data protocol business rules, NSF sends e-mail communications to the recipient SPO contacts. The e-mails are sent in two stages. The first stage communicates omissions or non-reporting; during this stage we also note significant errors on FederalReporting.gov. The second stage provides a consolidated communication with Major-2/Minor, Program Officer Review, and FFR Review issues. The purpose of the e-mail is to communicate specific reporting issues, to provide guidance to the recipient for correcting the issues, and to provide a timeline for required corrections. NSF is utilizing an automated e-mail tool for efficient e-mail distribution.

Additionally, NSF has created a centralized mailbox titled NSFARRAReviewer@nsf.gov. All e-mail communications distributed to recipients regarding ARRA recipient reporting, as well as subsequent questions and replies, are managed from this e-mail box. Each e-mail that is sent to an SPO contact is filed in the corresponding award eJacket folder as part of the official record.

Control Totals Definition

Each reporting quarter, NSF identifies a control total, or the total number of ARRA awards that require a report submitted via FederalReporting.gov. The control list includes all awards with ARRA funding, excluding awards to individuals and awards less than $25,000. Per OMB guidance, NSF provides the control total list to the Recovery Accountability and Transparency (RAT) Board. This list is also used to conduct the data quality reviews and identify reporting omissions.
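The control list definition above (all ARRA awards, excluding awards to individuals and awards under $25,000) can be expressed directly. The record layout and field names here are illustrative assumptions, not NSF's actual schema:

```python
def build_control_list(awards):
    """Return the award IDs requiring a FederalReporting.gov report this
    quarter: ARRA-funded, not made to an individual, and $25,000 or more.
    (Hypothetical record layout, for illustration only.)"""
    return [a["award_id"] for a in awards
            if a["arra_funded"] and not a["individual"] and a["amount"] >= 25000]
```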
Data Quality Reviews and Related Communications

Omissions and Major-1 Data Quality Review Process: NSF executes the "Major-1 Review" between days 22-29 of the reporting period. An automated data review is executed comparing FederalReporting.gov data with NSF records, per NSF's data protocol, concentrating on the following reporting fields: Award Number, Amount of Award, Total Federal Amount of ARRA Expenditure, Number of Jobs, Recipient Congressional District, and Recipient DUNS. The diagram below identifies the roles, timeline, and steps in the review process during days 22-29.
Figure 11: Review Process for Days 22 - 29.
data review is executed on Day 22 to compare the reported data for each report
against the expected NSF data. If a discrepancy is identified, an "exception" is created and entered into an exception report. All exceptions identified during this review are categorized as "Major-I" issues. The automated data review will generate the following three exception files:
1. Omissions Report: This file contains a list of awards for which NSF expected to receive a
report but did not receive a report through FederaIReporting.gov. 2. Major~l Exception Report: This file contains significant errors identified per the data
protocol. Each exception will identify relevant award information, the field causing the exception, the value reported by the recipient and the value expected by NSF. The file also includes a list of reports that contain no Major-I issues. 3. Other Exceptions Report: This file contains reports that could not be matched with NSF records and require additional manual review and analysis due to a special circumstance or are not required to be submitted under ARRA terms and conditions. Potential entries in this report could include: • • • • An award number was mistyped and an automatic match could not be found; A report was submitted for an award under $25,000 which does not require a report to be submitted; Multiple reports were submitted for the same award number; or An award number matched but neither the Catalog of Federal Domestic Assistance (CFDA) number or Funding Agency Code matched NSF's designators
After the exception reports are generated, the data review team examines the results and conducts any follow-up analysis that is required for items that are included on the Other Exceptions report. Once the analysis is complete, the team takes the following steps for reporting the review results identifying Major-l issues: • Enter a comment into FederalReporting.gov to change the reviewed status to "Reviewed with Comments" and include a message that corrections must be made prior to Day 30. • • Send a separate e-mail to recipients providing a list of exceptions with guidance that the corrections must be made prior to Day 30. If a report does not have any Major-I issues, the data review team will mark the report in FederalReporting.gov as "Reviewed with No Comments." NSF is committed to ensuring all reports are reviewed and are marked with either "Reviewed with Comments" or "Reviewed with No Comments." The data review team continues to monitor recipient updates in FederaIReporting.gov. If recipients correct any issues and fully resolve the exceptions identified for a given report, the agency marks the reports as "Reviewed with No Comments." • The same automated FederalReporting.gov data review will be executed again on Day 30 when is closed for reporting. The purpose of this data review is to
identify two categories of results:
A final list of recipient reports that were required, but did not appear on the NSF data extract through FederaIReporting.gov. The list must be submitted to OMB and the RAT Board using a template provided by OMB. during the Day 22 data quality
A final list of unresolved significant reporting issues identified
review must be submitted to OMB and the RAT Board using a template provided by OMB. Once the list of non-reports and any outstanding issues have been issued, the first data quality review is completed. Major-2/Minor Data Quality Review Process: NSF executes the second phase of the data
quality review, called the "Major-2/Minor Review," post day 30. An automated data review is executed, per NSF's data protocol, concentrating on reporting fields that NSF can validate against system data to ensure accurate reporting. The goal of the review during this phase is to encourage continued improvements to the data reported by recipients. Figure 2 identifies the roles, timeline, and steps in the review processes that are conducted during days 30-90.

An automated data review will be executed post day 30 to compare the reported data for each report against the expected NSF data. This review concentrates on approximately 30 reported data fields. If a discrepancy is identified in accordance with the protocol, an "exception" is created and entered into an exception report. All exceptions identified during this review are categorized as either "Major-2" or "Minor" issues, where Major-2 items are considered more significant than Minor items. The automated data review will generate one exception file:

Major-2/Minor Exception Report: This file contains field-level exceptions identified per the data protocol. Each exception will identify relevant award information, the field causing the exception, the value reported by the recipient, and the value expected by NSF. The file also includes a list of reports that contain no Major-2 or Minor issues.

A consolidated e-mail communication is sent to recipients combining all issues by award number for a given recipient with any issues found during the "Federal Financial Report Review" and the "Program Officer Review." The e-mail provides guidance and links to the OMB, RATB, and NSF recipient reporting websites to assist recipients.

Program Officer Data Quality Review Process: NSF executes the third phase of the data quality review, called the "Program Officer Review," after day 30. A statistical sampling methodology is utilized to randomly select reports that will be reviewed by the Program Officers. The Program Officers for the selected reports are responsible for reviewing key reporting fields that cannot be reviewed using the automated data review process, including: Project Title, Award Description, Quarterly Activities/Progress Description, and Description of Jobs Created. An automated IT solution is used to generate extract files containing the key reporting fields for the reports that are randomly selected. The extract file, along with a review template, is provided to the Program Officers to complete their assessment of the data reported. The Program Officers select specific responses for each field that is provided on the template. The data review team consolidates and reviews the results. At the end of the quarter, the awardees receive a consolidated e-mail of all lower-level issues, with items to improve in future quarterly reports. These issues are tracked in the tracking tool that is being developed.
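The random selection step for the Program Officer Review can be sketched as follows. This is a minimal illustration only: the function name and the use of a simple random sample are assumptions, since the document does not specify NSF's exact statistical sampling methodology.

```python
import random

def select_reports_for_review(report_ids, sample_size, seed=None):
    """Randomly select recipient reports for Program Officer review.

    A simple random sample without replacement; NSF's actual statistical
    methodology is not detailed in this document, so this is illustrative.
    """
    rng = random.Random(seed)
    # Guard against asking for more reports than exist in the quarter.
    return rng.sample(report_ids, min(sample_size, len(report_ids)))

# Hypothetical award identifiers for one reporting quarter.
selected = select_reports_for_review(
    ["AWD-001", "AWD-002", "AWD-003", "AWD-004"], 2, seed=42
)
```

Seeding the generator is optional, but it makes a quarter's sample reproducible if the selection ever needs to be re-run or audited.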
Federal Financial Report (FFR) Data Quality Review Process: NSF executes the fourth and final phase of the data quality review, called the "FFR Review," post day 60. An automated data review is executed, per NSF's data protocol, comparing the Total Federal Amount of ARRA Expenditure field to the FFR submission for that quarter. During this review, the reported amount on the ARRA Recipient Report is compared to the expenditure value reported on the FFR submitted to NSF. If a discrepancy is identified in accordance with our protocol, an "exception" is created and entered into an exception report. All exceptions identified during this review are categorized as "FFR," indicating the issue was found during the FFR Review. The automated data review will generate one exception file:

FFR Exception Report: This file contains field-level exceptions identified per the data protocol. Each exception will identify relevant award information, the field causing the exception, the value reported by the recipient, and the value in NSF's system. The file also contains a list of reports that contain no FFR issues.
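The automated field-level comparison that underlies the Major-2/Minor and FFR reviews can be illustrated with a short sketch. The field names, protocol structure, and function below are hypothetical stand-ins, not NSF's actual data protocol or implementation.

```python
def review_report(award_id, reported, expected, protocol):
    """Compare recipient-reported fields against expected NSF system values.

    `protocol` maps a field name to its exception severity (e.g. "Major-2"
    or "Minor"). Returns a list of exception records; an empty list means
    the report is clean for the fields covered by the protocol.
    """
    exceptions = []
    for field, severity in protocol.items():
        if reported.get(field) != expected.get(field):
            exceptions.append({
                "award_id": award_id,
                "field": field,
                "reported_value": reported.get(field),
                "expected_value": expected.get(field),
                "severity": severity,
            })
    return exceptions

# Illustrative protocol and values; these field names are assumptions.
protocol = {"award_amount": "Major-2", "awarding_agency_code": "Minor"}
excs = review_report(
    "AWD-001",
    {"award_amount": 500000, "awarding_agency_code": "4900"},  # as reported
    {"award_amount": 500000, "awarding_agency_code": "49"},    # NSF system
    protocol,
)
# excs holds a single "Minor" exception for awarding_agency_code
```

The same comparison shape fits the FFR phase: a one-field protocol comparing the reported ARRA expenditure amount against the FFR value for the quarter.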
Tracking Tool and Reporting
NSF has developed a database tool to track the results of the ARRA recipient reporting data quality reviews. The exceptions identified during each of the data quality reviews are loaded into the tracking tool each quarter. Maintaining quarterly results in this database will allow reports to be developed that will facilitate the identification of reporting trends and systematic issues. The reports will also assist NSF in providing award oversight and implementing its risk-based advanced monitoring activities. The following documents were used in planning and conducting the data quality reviews:

• NSF ARRA Policies
• ARRA 1512 Data Elements Review Protocol.xlsx
• NSF ARRA Recipient Reporting Webpage and Grantee Guidance and Tools
• OMB Memorandum M-09-10: Initial Implementing Guidance for the American Recovery and Reinvestment Act of 2009
• OMB Memorandum M-09-15: Updated Implementing Guidance for the American Recovery and Reinvestment Act of 2009
• OMB Memorandum M-09-21: Implementing Guidance for the Reports on Use of Funds Pursuant to the American Recovery and Reinvestment Act of 2009
• OMB Memorandum M-10-08: Updated Guidance on the ARRA - Data Quality, Non-Reporting Recipients, and Reporting of Job Estimates
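A minimal sketch of such a quarterly exception store, with one example trend query, is shown below. The schema and the severity labels are illustrative assumptions; NSF's actual tracking tool is not described at this level of detail in the document.

```python
import sqlite3

# Hypothetical, simplified schema for quarterly data-quality exceptions.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE exceptions (
        quarter      TEXT,
        award_id     TEXT,
        review_phase TEXT,
        field        TEXT,
        severity     TEXT
    )
""")
rows = [
    ("2010Q1", "AWD-001", "Major-2/Minor Review", "awarding_agency_code", "Minor"),
    ("2010Q1", "AWD-002", "FFR Review", "total_federal_arra_expenditure", "FFR"),
    ("2010Q2", "AWD-001", "Major-2/Minor Review", "awarding_agency_code", "Minor"),
]
conn.executemany("INSERT INTO exceptions VALUES (?, ?, ?, ?, ?)", rows)

# Trend report: awards where the same field was flagged in more than one
# quarter, a signal of a systematic (rather than one-off) reporting issue.
repeat = conn.execute("""
    SELECT award_id, field, COUNT(DISTINCT quarter) AS quarters
    FROM exceptions
    GROUP BY award_id, field
    HAVING quarters > 1
""").fetchall()
# repeat -> [('AWD-001', 'awarding_agency_code', 2)]
```

Keeping every quarter's exceptions in one table is what makes the trend and oversight reports described above a simple query rather than a manual cross-quarter reconciliation.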
In connection with the plans detailing information disseminated, as required by the Open Government Directive, the undersigned Acting Director, Office of Cyberinfrastructure, hereby certifies that the information contained in the attached plan materially represents the identity and other relevant information over the quality and integrity of Federal spending information.
Dr. Jose L. Munoz Acting Director, Office of Cyberinfrastructure
APPENDIX A: Description of Risk Factors
Risk Factor Criteria Descriptions
Risk Factor: Transaction Volume and Materiality
Description: The risk associated with the volume of transactions or activities within a process. A process that contains a high number of transactions may affect the primary mission of the organization. If a backlog or repetitive errors occur in performing the transactions and remain undetected, the financial reporting results may be misstated.
High: The volume of transactions processed through this subcycle is significant and closely relates to the organization's primary mission. A deficiency in the subcycle may result in a material misstatement in financial reporting or have a negative impact on external parties or customers.
Medium: The volume of transactions processed through this cycle is high, but not mission critical. A deficiency in the subcycles may result in a financial reporting misstatement.
Low: The volume of transactions processed through this cycle is low and not mission critical. A breakdown in the processing of the transactions is correctible over a short period of time without significant resources.
Assessment questions: What is the number of transactions performed for this subprocess in a given year? Are the transactions that support this subprocess closely related to the agency's mission critical work? If there was a deficiency or backlog in the subprocess, could it have a significant impact on financial reporting or external relationships?

Risk Factor: Complexity
Description: The risk associated with the number of organizational entities involved, system interfaces, processing steps, and degree of technical difficulty within the process.
High: The process involves many steps from beginning to end which require the involvement of multiple entities (divisions), interfacing with multiple systems, differentiation of steps based on varying criteria (monetary thresholds), and/or specialized knowledge/skills.
Medium: The process involves several steps from beginning to end which rely on one of the following: involvement of multiple entities (divisions), interfacing with multiple systems, differentiation of steps based on varying criteria (monetary thresholds), and/or specialized knowledge/skills.
Low: The process may involve few processing steps and/or minimal reliance on the following: multiple entities (divisions), interfacing with multiple systems, differentiation of steps based on varying criteria (monetary thresholds), and/or specialized knowledge/skills.
Assessment questions: How many organizational entities play a critical role in carrying out the subprocess? How many different systems are used? How many processing steps are there? Are the staff required to have special skills or knowledge to execute the subprocess?

Risk Factor: Degree of Automation
Description: The risk associated with the degree of automation or human intervention in a process. A process that relies on manual steps is more vulnerable to human error. Automation facilitates the prevention and detection of errors. An automated function is more reliable if it is well designed, supported by secured access, and separation of duties.
High: The process is manual and requires a significant level of manual intervention/contact. A failure in the manual steps would significantly impact the subprocess and would require significant effort to correct.
Medium: The process is partially manual and partially automated. A breakdown in the manual steps may require some increased effort but is not deemed significant.
Low: The process is significantly automated and operates with a low level of manual activity.
Assessment questions: Does the subprocess use mostly automated steps? Is there a significant amount of human intervention? Are there many system workarounds?
Risk Factor: Fraud Potential
Description: The risk associated with the susceptibility to, and incentives for, fraudulent activities within a process. Fraud risk is highly influenced by opportunity and incentives.
High: The subprocess includes transactions with a high level of liquid assets which can easily be converted to cash, or performance results that could be altered for personal gain or institutional recognition. Fraudulent activity could result in a financial misstatement, or a distortion or inflation of key performance data used by external and internal entities.
Medium: The process relates to transactions with some level of liquid assets or which can be converted to cash. The process is somewhat vulnerable to fraudulent activity that could lead to a financial misstatement.
Low: The process relates to transactions with a low level of liquid assets that are not easily converted to cash. The process has limited vulnerability to fraudulent activity.
Assessment questions: What incentives or opportunities exist in the subprocess that may enable personal financial gain or reward, or distort data used to report on the agency's performance? What procedures are in place to prevent or detect fraudulent activity?
Risk Factor: Management Override
Description: The risk associated with management using its authority in an excessive manner for the purpose of bypassing or overriding existing controls.
High: Overrides in this cycle are common. Management override decisions are not consistently documented and lack an audit trail. The override takes place at the senior management level due to the dollar threshold or sensitivity of the process.
Medium: Overrides in this cycle are common. Management decisions are partially documented with an audit trail. The override takes place at the mid-level management level and may be reviewed by upper management.
Low: There is limited potential for management override, and decisions are clearly documented with an adequate audit trail. Any overrides take place at the supervisor or team lead level and are reviewed by multiple layers of management in more senior positions.
Assessment questions: Are there opportunities for management to override controls? How frequently does management override existing controls? Is there an audit trail of management override decisions? Are there written policies and procedures that define when a management override is appropriate? Is there a procedure that has been implemented to monitor management overrides performed throughout the year?
Risk Factor: Non-Routine Transactions and Judgment
Description: The risk associated with transactions which require steps that circumvent or go outside of normal procedures, and transactions that require judgment (subjectivity) and/or significant manual input.
High: The process requires significant judgmental estimations and/or multiple non-routine transactions. The process requires a high level of judgment to be exercised, and an error may result in a significant financial misstatement.
Medium: The process requires some judgmental estimations and/or non-routine transactions. The process requires the processor to exercise judgment in conducting this activity, and an error may result in a financial misstatement.
Low: The process requires minimal or no judgmental estimations and is very routine in nature. The process requires the processor to exercise minimal judgment in conducting this activity, and an error would likely not result in a material misstatement.
Assessment questions: Are there transactions in the subprocess that require a significant use of judgment or rely on estimates? Are there steps in the subprocess that are considered a workaround for normal procedure? Is the criteria for performing the estimate applied consistently? How frequently do non-routine transactions occur? Is there a procedure in place to monitor the non-routine activity and ensure that it is performed consistently?

Risk Factor: Third-Party Reliance
Description: The risk associated with having a significant business process managed by a third party.
High: The subprocess is mostly managed by a third party. Management relies on the information provided by the contractor and has little or no direct oversight of the vendor's work. Management obtains the majority, if not all, of its assurance related to this process through SAS 70 reviews or cost-incurred audits.
Medium: The subprocess is partially managed by a third party. Management relies on the information provided by the contractor but also has some direct knowledge and oversight of the processing activities. Management obtains assurance related to this process through SAS 70 reviews or cost-incurred audits as well as direct contact in the processing environment.
Low: The subprocess is managed primarily in house with minimal interaction with third parties. Management provides direct oversight of these processing activities. Management obtains assurance related to this process through direct contact in the processing environment.
Assessment questions: Does the third party vendor perform a significant role in the subprocess? What sources of information does management use to confirm that the third party's procedures and operations are in compliance with agency-specific and federal guidelines (e.g., SAS 70 reports, internal or external reviews)? When deficiencies are found in the third party's operations, what steps are performed by management to ensure that corrective action is taken?
Risk Factor: History of Audit Findings
Description: The risk associated with having a history of material weaknesses, reportable conditions, and/or significant deficiencies that remain unresolved from year to year.
High: External and/or internal review processes have identified numerous and/or uncorrected weaknesses in the internal controls over this process. These weaknesses are significant in nature, and evidence does not exist indicating that the weaknesses will be resolved in a timely manner, due to the complexity of the error or a lack of sufficient management attention.
Medium: External and/or internal review processes have identified numerous and/or uncorrected weaknesses in the internal controls over this process. Evidence exists indicating that the weaknesses are being addressed by management and appear able to be resolved in a timely manner, but due to their nature the weaknesses require ongoing management attention.
Low: Minimal audit issues have been identified in internal and external reviews. The nature of any weaknesses indicates that they do not represent a significant financial impact to the entity.
Assessment questions: Did auditors or internal management report any material weaknesses, reportable conditions, or areas of noncompliance for the agency in the last 2 years? Has the OIG or other internal review function identified significant control issues or concerns? Has the agency performed corrective action to address the deficiencies? Has corrective action been completed, and have management and the auditor determined that the deficiencies are closed?
Risk Factor: Changes in Laws and Regulations
Description: The risk associated with changes in laws and regulations that may impact processing as well as the controls in place over processing.
High: The subprocess recently encountered significant modification due to the enactment of a new law or regulation. The impact of these changes requires significant change in how the subprocess is performed. In addition, as a result of the enactment of the new law or regulation, there is additional scrutiny over the process, which increases the profile of the process. The changes can have a material effect on the financial statements.
Medium: The subprocess recently encountered modification due to the enactment of a new law or regulation. The impact of these changes requires change in how the subprocess is performed. In addition, as a result of the enactment of the new law or regulation, there is some additional scrutiny over the process, which increases the profile of the process. These changes are not material in nature, and modifications can be made to processes without significant revision or retraining.
Low: The subprocess has not encountered significant modification due to the enactment of a new law or regulation. The requirements of the process remain relatively stable. The process is generally not viewed as high profile in nature.
Assessment questions: Has the subprocess been affected by any recent changes in laws and regulations? Have procedures been implemented to update the subprocess and ensure that it is in compliance with the new requirements? Did management perform an assessment to determine whether or not the updated subprocess meets the new requirements? Did new controls have to be added to meet the new requirements?

Risk Factor: Personnel
Description: The risk associated with the number and frequency of personnel changes; employee workload stress; the quality of assigned personnel; and the sufficiency of the number of people available to perform job functions.
High: There have been recent significant personnel changes or long vacancies for sensitive or management positions; staff are assigned to perform an increased workload, causing processing to fall behind schedule or creating high-stress conditions. Assigned personnel may lack adequate training to perform their jobs; an inadequate level of staff is assigned to carry out the process.
Medium: Due to turnover or vacancy, the staff assigned to perform this subprocess have multiple functions and are provided some basic training to perform the new functions; assigned personnel need additional training to support their job functions; staffing is low, but the workload is manageable with extended staff hours.
Low: The staff assigned to perform this subprocess has been stable. Personnel are very familiar with their specific job functions, and in some instances they perform multiple functions, but the workload is manageable. Staff are provided ongoing training to perform new functions.
Assessment questions: Were there any recent changes in the key personnel, or the number of personnel, responsible for carrying out the subprocess? Have there been any changes to the subprocess or number of personnel that have significantly increased the staff workload? Have the assigned personnel received proper training to perform their duties?
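The High/Medium/Low ratings across these factors can be combined into a single inherent-risk score to set testing priorities, as described in Section 1.b. The 3/2/1 weighting and factor names below are purely illustrative assumptions; the framework does not publish a numeric scoring formula.

```python
# Hypothetical weighting: High=3, Medium=2, Low=1, averaged across factors.
RATING_POINTS = {"High": 3, "Medium": 2, "Low": 1}

def risk_score(ratings):
    """Average per-factor High/Medium/Low ratings into one score."""
    return sum(RATING_POINTS[r] for r in ratings.values()) / len(ratings)

# Illustrative ratings for a hypothetical subprocess.
grants_processing = {
    "volume": "High",
    "complexity": "Medium",
    "automation": "Low",
    "fraud": "Medium",
    "management_override": "Low",
}
score = risk_score(grants_processing)  # (3 + 2 + 1 + 2 + 1) / 5 = 1.8
```

Subprocesses could then be ranked by score so that testing effort concentrates on the highest inherent-risk areas first.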
APPENDIX D: Acronyms
AICPA - American Institute of Certified Public Accountants
APIC - Accountability and Performance Integration Council
APIC/WG - Accountability and Performance Integration Council Working Group
ARRA - American Recovery and Reinvestment Act
BFA - Budget, Finance and Award Management
BPO - Business Process Owner
C&A - Certification and Accreditation
CFO - Chief Financial Officer
CIO - Chief Information Officer
COO - Chief Operating Officer
COSO - Committee of Sponsoring Organizations
DACS - Division of Acquisitions and Cooperative Support
DFM - Division of Financial Management
DGA - Division of Grants and Agreements
DIS - Division of Information Systems
FAM - Financial Audit Manual
FAR - Federal Acquisition Regulation
FAS - Financial Accounting System
FISCAM - Federal Information System Controls Audit Manual
FISMA - Federal Information Security Management Act
FFMIA - Federal Financial Management Improvement Act
FMFIA - Federal Managers Financial Integrity Act
GSA - U.S. General Services Administration
ICOFR - Internal Control over Financial Reporting
IPIA - Improper Payments Information Act
IT - Information Technology
NSF - National Science Foundation
OFPP - Office of Federal Procurement Policy
OIG - Office of Inspector General
OMB - Office of Management and Budget
PIMS - Program Information Management System
SAS 70 - Statement on Auditing Standards No. 70: Service Organizations
RAT Board - Recovery Accountability and Transparency Board
SMaRT - Senior Management Roundtable
SPO - Sponsored Project Office