
Mitigating Risk with Mass Data Change

In Winshuttle's conversations with dozens of business and IT managers, we have explored how companies manage mass change and mass data creation to mitigate the risk of incorrect data being entered into SAP. The following are some of the best practices these professionals have shared. This whitepaper is not meant to be prescriptive or exhaustive, but it should provide some ideas about the deployment and sustainment of mass change and mass data creation scenarios in the SAP environment.

© 2011 Winshuttle, LLC. All rights reserved. 6/11

Businesses have an undeniably voracious appetite for data change, and SAP systems are extremely accommodating when it comes to satisfying this appetite. IT departments develop imaginative ways to help the business engage in mass change of data, occasionally with an intimate understanding of the specific business requirements and sometimes with fixes that explicitly address the ad hoc requirements spelled out by business managers.

Effective mass create and change strategies, whether they are applied to transactional or master data, are typically bound not just by technology but also by organizational responsibility, accountability, and process. This means that all staff members engaged in mass create and change scenarios need to understand why they are performing any given action, the underlying dependencies within the business and technology, and finally the strengths and limitations of the technologies they will employ to perform subsequent actions.

As any experienced SAP user will attest, it's inevitable that bad data will make its way into SAP, either as a result of badly written mass create or change procedures or through an omission or mistake. Sometimes the bad data is the result of some upstream or downstream event, but more often than not, errors creep in because of a lack of understanding or diligence on the part of the data steward(s) or the person responsible for writing the script or report that will create or change the data.

In a security-conscious business environment, the desire to restrict access to SAP mass change transactions is understandable. However, simply preventing access doesn't make the data safer or improve data quality. Similarly, removing the ability to execute actions such as GUI scripting against SAP is technically possible, but this doesn't address the potential for developers to create BDC sessions or LSMW scripts.
It can be argued that such technical tools shouldn't be used by inexperienced staff, but it could also be argued that the most experienced people, and those most familiar with the data, are often the people who own that data, and they often don't have access to tools for creating BDCs and LSMW scripts. So who is left to automate mundane data processing tasks? For the most part, it falls to technologists. And as is often the case, IT staff are so thinly spread operationally that all they can do is respond directly to the requests of the business, using the tools at their disposal to achieve specific objectives as efficiently as possible.

Winshuttle products, like those of other SAP-certified vendors, offer immensely powerful data change and create capabilities for the SAP ERP space. Thousands of users employ technology such as Winshuttle's every day to perform mass actions on their SAP environments. When implementing mass change strategies, here are some thoughts from Winshuttle customers on things to consider and how Winshuttle products can help.

Functional Testing Rigor

Any well-designed mass change or create approach will have been adequately tested in unit and QA environments. If the data change method is to be syndicated to business users, proactively engage them in user acceptance testing. Developing scripts for data creation and change should follow the same rigor and discipline as any software development activity. Also, be sure to test system throughput and performance. These tests may reveal interesting facts about the SAP environment, including challenges with related systems (e.g., address checking or tax code jurisdiction systems). Most, if not all, SAP customers should have at the very least a development, testing, and production environment. Some will also have a staging or pre-production environment that is sized and populated with production-like data. Such environments, assuming they have the same configuration, are better choices for creating recordings and prototypes.


There really is no substitute for thorough testing of data change and create methods, which should always be done in non-production environments. If scripts are to be authored, consider testing just a few rows in the productive environment rather than trying to break the script in production. Generally, a script will execute without issue if it passes muster with just a few data lines. There are always exceptions, but it is better to be cautious than careless. The Winshuttle Transaction product protects production data and production environments through the production and non-production values in the Options settings. With a Winshuttle Central-based application licensing model, this protection is extended further: administrators can prevent the use of scripts against defined production systems unless the scripts are approved for use in such circumstances.
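The "few rows first" approach can be sketched in pseudocode terms. The following illustrative Python is not Winshuttle's mechanism; `post_row` is a hypothetical stand-in for posting one data line to SAP and returning a result dictionary:

```python
def run_with_pilot(rows, post_row, pilot_size=5):
    """Post a small pilot batch first; abort the full run if any pilot row fails."""
    pilot, remainder = rows[:pilot_size], rows[pilot_size:]
    results = [post_row(r) for r in pilot]
    # A result containing an "error" key is treated as a failure (assumption).
    if any(r.get("error") for r in results):
        raise RuntimeError("Pilot batch failed; aborting full run")
    results += [post_row(r) for r in remainder]
    return results
```

The point is simply that a handful of representative rows usually exposes mapping and screen-flow problems before the bulk of the data is committed.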

During testing and deployment, a script may not port optimally from DEV > QA > PROD. This may be because of personal parameter settings for a given transaction.


For Winshuttle Transaction scripts to be truly portable, at least for SAP transaction recordings, users running scripts configured or recorded on other systems need common configuration or environmental conditions. Some transactions that may require specific attention are FV60, FB60, FB65, FMBB, FMCIA, FMSA, FMSB, GJEE, FB50, FBCJ, FB10, FB70, FB75, FV70, FV75, and MM02/03. These transactions, among the many thousands in SAP, are commonly used and have screens that may or may not pop up depending on selections made earlier in the day or as part of preferred use of that transaction. Since transaction recordings are procedural, errors may be encountered on first runs if the environment is set up differently from that of the person who recorded the transaction. One option to address this issue is to use the "Skip screen if not found" feature in the expert mode of the Winshuttle Transaction mapper. Another option is to use the "If on first transaction" option, which wraps this condition around the screen containing the prompt for company code, etc.

It is critical to have a solid execution, backup and recovery, and risk mitigation plan, all dovetailed with a comprehensive testing plan. A reckless "fire and forget" attitude toward the execution of mass change and mass data creation activities in productive environments will invariably cause a data-quality problem. With Winshuttle Transaction's powerful bi-directional capability, it is possible to repurpose change scripts to back up data before performing any change. This is extra work (double execution: one read, one change), but it's the safest alternative, preferable to shutting down all other data processing on SAP while the change is performed and results are verified. Some mass change techniques do make copies or take snapshots of the data before the change. This can be a good mitigation strategy, but there are additional safeguards that can be used.
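The read-then-change pattern described above can be sketched as follows. This is illustrative Python, not Winshuttle's API: `read_record` and `change_record` are hypothetical placeholders for a read (Query-style) script and a change (Transaction-style) script:

```python
import json

def change_with_backup(keys, read_record, change_record, backup_path):
    """Snapshot current values for all keys before posting any change."""
    backup = {k: read_record(k) for k in keys}   # first pass: read current state
    with open(backup_path, "w") as f:
        json.dump(backup, f, indent=2)           # persist the pre-change snapshot
    for k in keys:
        change_record(k)                         # second pass: apply the change
    return backup
```

Writing the snapshot to disk before the first change is posted means there is always a restorable record of the pre-change state, even if the change run is interrupted partway through.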


Current versions of Transaction with mapping to Excel as a data source support backing up screen data before changing it. This feature may prove invaluable if large quantities of SAP data are frequently changed. Be aware that there are some limitations on this capability, such as it not working with GUI scripting and BATCH modes. Also consider that scripts may take a little longer to run because of the need to read the SAP structure before updating the fields. The peace of mind that flows from this feature may justify its use, however. The backup screen data feature is enabled as an advanced run option and can be set in the Rich Client but NOT in the Excel Add-In. The author of the script can enable this feature and save it as a characteristic of the syndicated or published script. Large data change workbooks may grow exponentially when this feature is enabled, so be cautious. The backup sheet is a replica of the sheet from which the data changes will flow and is merely a copy of the data in the SAP screens.

Always save Excel workbooks that are run as separate workbooks rather than recycling the same workbook over and over. This simple procedure will ensure the preservation of the run log associated with the data created or changed. In addition, the log shows the name of the script, the system it was run against, the credentials of the SAP user, and the date and time of the run. Preserving this artifact may assist with an internal or external audit.

Audit

External auditors often fixate on mass change transactions like MASS, MM17, CS20, CA85, etc., and as a result these are frequently taken away from ordinary users. It is possible to keep track of who executed these changes and when they occurred, but this information is often buried deep in the bowels of SAP and accessible only to BASIS administrators or users with access to some SM transactions.
This also assumes that massive logs are kept for enough time to study these executions when postmortems are conducted. If mass changes on specific data are important enough that a history needs to be maintained, then consider creating


a repository of change history over and above the change log related to the data object itself. For LSMW and BDC sessions, this means archiving the run logs, data file sources, and copies of the scripts. For third-party products, like those from Winshuttle, this may require enforced retention of the completed run logs.

Performance Assessment

As any IT administrator of a large-scale SAP implementation will tell you, performing mass changes on SAP systems can result in unexpected outcomes at the database level and in the application stack itself. Individual SAP environments are likely sized for standard operations and, unfortunately, the mass change of data may not come up as part of the SAP sizing exercise. Performance testing provides some, but not necessarily the best, information. There are so many variables in how a given SAP installation is run that the best way to minimize the impact on a given system is through an appropriate and routine sizing process. Using the Quick Sizer tool, it is easy to input expected transaction volumes and rates for most processes. The tool provides an SAPS value for whatever application is implemented. Discuss using this tool with hardware partners to determine whether there will be a need for additional CPU, memory, etc., or whether sufficient resources are already present. This assessment should be aggressive but realistic, and should carefully evaluate whether the system should be sized for peaks or steady state. It should also account for the frequency with which these actions are expected to be performed. The ability to process 25,000 incoming sales orders per day, for example, is not the same as the ability to reschedule 200,000 sales order lines in ten hours every six months.
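The gap between steady state and peak in that example is easy to quantify (the volumes are those from the example; orders and order lines are not strictly comparable units, so treat this as an order-of-magnitude illustration):

```python
# Steady state: 25,000 incoming sales orders spread over a 24-hour day.
steady_per_hour = 25_000 / 24              # roughly 1,042 per hour

# Peak event: 200,000 sales order lines rescheduled in a 10-hour window.
peak_per_hour = 200_000 / 10               # 20,000 per hour

# The semi-annual mass job demands roughly 19x the steady hourly rate,
# which is why sizing for steady state alone can be misleading.
ratio = peak_per_hour / steady_per_hour    # ~19.2
```

A system sized only for the steady-state number would be hit with nearly twenty times its normal hourly load during the mass change window.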
Other variables include deciding whether to run jobs during peak hours, off hours, against the central instance, against a particular application server or pool of servers, or whether to introduce some sort of delay or sleep function between record changes. Winshuttle Transaction allows users to place integer-based sleep statements between batch runs to minimize system impact. Consider shorter but parallel bursts of mass actions, but not when using concurrent ranges of data, as this may cause system lock conflicts. For example, additional database indices may improve system throughput but could result in massive volumes of undo logs that are difficult to manage. Also consider whether there are some sorely needed maintenance or patching activities that are long overdue or that should be accelerated.
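A throttled run of this kind can be sketched as follows. This is an illustrative Python sketch, not Winshuttle's actual mechanism; `post` is a hypothetical placeholder for posting one record to SAP:

```python
import time

def run_in_batches(records, post, batch_size=100, pause_seconds=2):
    """Post records in batches, sleeping between batches to reduce system load."""
    for start in range(0, len(records), batch_size):
        for rec in records[start:start + batch_size]:
            post(rec)
        if start + batch_size < len(records):
            time.sleep(pause_seconds)  # give SAP work processes breathing room
```

Tuning `batch_size` and `pause_seconds` trades total run time against the load placed on dialog processes and the database; the right values depend on the environment and are best found in a non-production load test.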

SAP systems are relatively tolerant when it comes to large volumes of data creation or change, but there are a few caveats: the SAP environment must be appropriately sized in terms of hardware, appropriately maintained and patched, and free of abnormalities in the way the system is configured. Even if all of these issues are cleared as potential red flags, consider users and all the other business processes that depend on the SAP system, and schedule particularly large jobs for off-peak times, weekends, or evenings when the system is in less use. Another option is to keep administrators and other key stakeholders informed of plans to execute large jobs. In environments where Winshuttle Central licensing is applied to Transaction users, it is possible to restrict large job execution centrally. Large parallel runs of mass creation and change of data not only consume SAP dialog processes; they can also result in high SAP server resource utilization, database process thread pegging, and extraordinary record-locking conflicts. Testing under load in non-production environments will provide insight into whether scripts may cause any issues.

Activity Approval

Consider an additional layer of governance, especially when dealing with operators who are less experienced or when trying to control data that is particularly sensitive, volatile, or pivotal to operational effectiveness. The classic control mechanism requires users to make requests to IT and produce their data files using a structure or template provided by IT. Super users or IT staff then typically execute scripts against that data. While this approach may work well in some environments, it is often unsustainable when scaling up.


Some companies use helpdesk ticketing systems to initiate these types of requests, with IT assigning the task to a resource, often an expensive ABAP developer who created the original recording or script. When that individual is on vacation or out of the office, the task is often deferred, and eventually a backlog of requests results. Occasionally, someone else (perhaps less technical) becomes the gatekeeper of the process, often leading to an inability to update or correct any errors in the program logic.

Unless the creation of the data artifact has a workflow approval embedded in it, the lack of organizational approval is a difficult obstacle to overcome in terms of control and transparency of purpose. Some ways this can be dealt with include SharePoint workflows wrapped around data documents, or email attachments that need to be forwarded by specific individuals. Both are poor cousins compared to approaches that rely on a data approval workflow embedded in the end-to-end process. Moreover, these approaches tend to be cumbersome, and users often try to short-circuit them. A better alternative is to choose a product that has workflow as part of the overall data document submit, approve, and execute process.

In environments that don't have Winshuttle Central-licensed Transaction and Runner management, it may be necessary to rely on the basic options of publishing Transaction and Query scripts in workbooks. Employees must then share workbooks via email or shared drives on the network. For smaller organizations, this approach may be effective; however, it doesn't scale very well. Central comes with a basic workflow that allows one person to create a script and another to approve that script for production use. Additionally, scripts can have the data review process enabled, which requires one person to create the data and another to approve its use in SAP.
Winshuttle Central, when used in conjunction with the Winshuttle Transaction, Query, and Runner products, allows the implementation of three basic approval strategies:

1. Script Approval
2. Data Approval with decoupled execution
3. Data Approval coupled with execution

For more complex requirements, Central with advanced workflow options provides many benefits. The advantage of a three-step data review process is that many stakeholders are involved in vetting the automation. For less experienced employees who create data, it is desirable to curtail the risks to the SAP system by always having a more experienced supervisor check the data before it is posted. For large mass change and mass data creation activities in highly regulated industry sectors, this may be an unavoidable step that must be put in place. The good news is that Winshuttle Central will help do this for Transaction and Runner.
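The core idea behind the data review process, that the person who creates the data cannot also approve it or post it unreviewed, can be sketched as a minimal separation-of-duties model. This is illustrative Python, not Central's actual implementation:

```python
class DataDocument:
    """A data file with a create / approve / execute separation of duties."""

    def __init__(self, creator, rows):
        self.creator, self.rows = creator, rows
        self.approved_by = None

    def approve(self, approver):
        # Four-eyes principle: creators may not approve their own data.
        if approver == self.creator:
            raise PermissionError("creator may not approve their own data")
        self.approved_by = approver

    def execute(self, post):
        # Posting to SAP is blocked until an independent approval exists.
        if self.approved_by is None:
            raise PermissionError("data not approved for posting")
        return [post(r) for r in self.rows]
```

Decoupled versus coupled execution (strategies 2 and 3) then differ only in whether the approver's action also triggers `execute`, or whether a separate operator runs it later.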


Data Pre-Validation
To avoid rework of data for creation or change in the SAP environment, implement approaches and methods that allow pre-validation of the data. This may involve protecting workbooks, restricting field entry values in a data file, or something similar. The validation strategy may simply be "let's see what SAP requires" and then building data capture and entry rules around that.

A feature in Winshuttle Transaction assists with the quality of data creation and data change activities by enabling pre-validation of data. It can be selected in the expert tab of the mapper and results in the appearance of the Validation button at run time. There are some limitations associated with this feature. For PA30 and PA40 transactions in particular, the multi-commit characteristic of these transaction screens may prevent the feature from working predictably. Additionally, scripts that have been recorded with multiple commit statements in the end-to-end process will not validate properly. Examples of times the feature may be of limited use are transactions where a header needs to be created and saved before lines can be added. Barring these scenarios, the advantage of this feature is the ability to validate material numbers, attributes, string lengths (descriptions), and account posting codes. Experiment with this feature for best results.

Financial users may be frustrated by the fact that SAP doesn't conduct full debit/credit checking until an attempt is made to actually post the document, but there are a variety of ways to identify this problem using formulae in the workbook. The Validate feature essentially runs the full transaction with the commit or save statement disabled. From a system administration perspective, this may show up in SM20 as a canceled transaction.
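Workbook-side rules of the "see what SAP requires" kind can be sketched like this. The field names and length limits below are illustrative assumptions for the sketch, not SAP's authoritative field metadata; always derive real limits from the target transaction's screens:

```python
# Hypothetical per-field rules (required flag and maximum length).
RULES = {
    "material":    {"required": True, "max_len": 18},
    "description": {"required": True, "max_len": 40},
    "plant":       {"required": True, "max_len": 4},
}

def validate_row(row):
    """Return a list of rule violations for one workbook row."""
    errors = []
    for field, rule in RULES.items():
        value = str(row.get(field, "") or "")
        if rule["required"] and not value:
            errors.append(f"{field}: required")
        elif len(value) > rule["max_len"]:
            errors.append(f"{field}: exceeds {rule['max_len']} characters")
    return errors
```

Catching a missing plant or an over-length description before the run starts is far cheaper than cleaning up a half-posted data set afterwards.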

Given the vast amounts of data that modern multi-national companies need and produce to do business in a global market, mass data creation and change tools are the only realistic way to operate efficiently and remain competitive. Implementing tools and practices that minimize the data risks inherent in any mass approach is something every large SAP customer should consider.


Winshuttle is the ERP Usability Company, providing software products that enable business users to work with SAP directly from Excel, Web forms, and other interfaces without any programming. Winshuttle focuses on a simple fact: when using SAP applications, time is money. Winshuttle's usability solutions radically accelerate SAP user transactions, saving and redirecting millions of dollars for SAP's customers every day. These financial benefits are achieved by significantly reducing employee and contractor costs and freeing resources to address more strategic priorities. Thousands of customers use Winshuttle to make their SAP lives easier. Headquartered in Bothell, Washington, Winshuttle has offices in the United Kingdom, France, Germany, and India. For more information, visit

Corporate Headquarters
Bothell, WA Tel + 1 (800) 711-9798 Fax + 1 (425) 527-6666

United Kingdom

London, U.K. Tel +44 (0) 208 704 4170 Fax +44 (0) 208 711 2665


Germany

Bremerhaven, Germany Tel +49 (0) 471 140840 Fax +49 (0) 471 140849


France

Maisons-Alfort, France Tel +33 (0) 148 937 171 Fax +33 (0) 143 683 768


Research & Development Chandigarh, India Tel +91 (0) 172 465 5941
