Volume II June 2007
SAP BI COMPETENCY
June 2007 - For CSC Internal Distribution Only
Welcome to “The Catalog,” the second edition of the SAP BI Competency monthly newsletter. Each month, this newsletter focuses on the latest offerings and technologies in the SAP BI arena. Our goal is to present a concise view of the industry: news, information about solutions and industry frameworks, and links to third-party articles you will find interesting. Over the coming months, we look forward to providing you with even richer content and insight into the world of SAP. We hope this newsletter provides valuable insights into SAP BI, both for experienced consultants and for everyone else.
Top stories of 2006
As usual, it was a very busy year for SAP. The company released SAP CRM On-Demand, Master Data Management (MDM) continued to capture the interest of organizations hoping to solve their data problems, and SAP jobs were, as always, a hot topic. Of course, the battle with Oracle raged on. To bring the year to a close, here is a look at some of 2006's top stories.

CRM On-Demand
SAP made a big splash in the CRM market with CRM On-Demand this past year. It was a shift for SAP, which had held off offering Software as a Service despite the success of startups such as Salesforce.com. Once it introduced CRM Sales On-Demand, SAP didn't drag its feet, releasing service and marketing modules during the year while also adding some big-name customers. The simplified pricing model attracted some of SAP's early customers, but there are other costs customers should be aware of.

SAP Jobs
Every year brings changes that affect the SAP jobs market. This year was no different, with acquisitions, products, and new focus areas presenting challenges and opportunities for the SAP job seeker. SAP knowledge and certification are still in demand, and job seekers are advised to keep their SAP skill sets updated.

Master Data Management
With more companies attempting to get their data in order, 2006 saw a continuing upward trend for Master Data Management (MDM). Making use of the technology it had acquired, SAP landed some big customer projects during the year. SAP still has to educate the market, however, and many companies still prefer independent vendors.

Business Process Management
With so many large vendors staking a claim in the market, Business Process Management (BPM) increasingly came under the spotlight in 2006. Companies such as SAP, Oracle, IBM, and Microsoft all have their hats in the ring and, according to some, have a way to go to catch up with pure-play BPM vendors. In 2006, NetWeaver adoption and increasing interest in service-enabled applications drove companies to look more closely at their business processes. That trend should continue. As part of its BPM push, SAP announced the formation of a BPM Expert Community to help add new tools and functionality for business analysts.

SMB Focus
SAP pledged in 2006 to increase its customer base to 100,000 by 2010, up from 35,000 in 2006. To do this, the company will focus on Small and Midsized Businesses (SMBs). Some think that SAP may eventually offer ERP via Software as a Service (SaaS).
Enterprise SOA: The Key to Executing Business Strategy
SAP's Annual Report 2006
Enterprise Data Unification with SAP NetWeaver MDM Webcast
Enterprise SOA
Enterprise service-oriented architecture (enterprise SOA) helps organizations quickly implement business strategies through rapid model-driven composition of business processes, increased flexibility for redesign, and improved business processes across organizational boundaries. The link below provides full details:
http://www.sap.com/community/pub/flash/kw16_07_story_2.epx
SAP's Annual Report 2006
You may read the report online or download it:
http://www.sap.com/mk/get?_EC=fB0uF37IScEAHwG9mLu2Eq
Enterprise Data Unification with SAP NetWeaver MDM Webcast
Gain greater knowledge of how the SAP NetWeaver platform helps businesses implement IT practices using a low-cost, phased data unification approach that streamlines the supply chain, enables global data synchronization, supports global spend analysis, and improves analytical accuracy.
http://www.sap.com/mk/get?_EC=bfZUywbK33q8AxGSxdVlIm
Each year SAP offers several key opportunities for current and future customers, as well as partners, consultants, and users, to learn about SAP solutions. http://www.sap.com/company/events/index.epx
The following web pages host a pool of documents for every module (SAP ABAP, Basis, SD, FI/CO, and so on), all of which can be downloaded free of cost.
Success Stories & Tips:
BW Design and Data Modeling Tips for Optimal ETL

The extract, transform, and load (ETL) process is a key aspect of designing and maintaining a data warehouse. It might constitute 60 to 75 percent of the work and cost in a business intelligence project. ETL also represents most of the risks in terms of data loss, recovery, system downtime, and untimely reporting.

The benefits of ETL optimization are greatest during the design stage and decrease exponentially as the implementation progresses into development and production. Conversely, the costs of ETL optimization are lowest during design and increase exponentially as the project progresses into development and production. Therefore, ETL optimization is most effective, in terms of both cost and time required, during the design and data modeling phases.

What follows is a series of practical tips and techniques for ETL optimization that you can apply during the design and data modeling phases of a BW project. We will not cover other ETL optimization techniques such as tuning of source systems, ETL configuration, and Basis parameters. ETL optimization is a complex topic and involves various highly interdependent factors such as data modeling, query design, and Basis parameters. As a result, some of these tips come with trade-offs, especially between query and ETL performance. For instance, aggregates are tools for improving reporting performance, but they create additional processing and overhead at data-load time, which might result in suboptimal ETL performance.

These optimization tips and techniques, 24 in all, are organized into six categories:
- General design and data modeling
- PSA design
- InfoCube design
- ODS design
- Dimensions and characteristics design
- Aggregates
General Design and Data Modeling

1. Always start with Business Content. Using standard Business Content considerably reduces ETL development and maintenance effort. Delivered extractors use standard extraction and recovery methods and can handle extraction of R/3 data in the required way. These extractors are easier to design and maintain than custom-built ones, and they have less potential for the suboptimal coding that could cause data and performance issues. They are application-specific and can transform data from multiple tables into meaningful business information. As an added benefit, most Business Content extractors have built-in delta load capability, reducing ETL volume by avoiding full loads. This delta change capability is difficult to duplicate on a custom basis or with a third-party tool, and doing so could be expensive from a performance perspective.

2. Look for data requirements that are good candidates for BW. Not all data requirements are good candidates for BW, including:
- Where up-to-the-minute data is required, such as profitability calculations or bill of material (BOM) explosions. We suggest you avoid operational reporting through BW.
- Where forms or documents are produced, e.g., invoices or purchases.
- Where the data is neither held in BW nor required in BW.
Also, leverage existing reports in the source systems, e.g., R/3, to meet some of your reporting requirements. In many projects, considerable effort and money are spent duplicating or reimplementing existing reports without fully evaluating the standard reports already available. Note that the additional reporting activity may increase overhead on the source system.

3. Avoid large data loads and complex calculations. Make sure that all the data is needed in BW before you load and replicate it. Where complex calculations are required, it may be easier and more reliable to perform the calculations or aggregations in the source system. This is often the case for complex calculations such as profitability calculations and BOM explosions, which require data from extremely large tables that may not need to be stored in BW. If the calculation is done in BW using several different large tables, you need to confirm that referential integrity is maintained. If the calculations are done in the source system instead, they remain up to date and accurate there, which makes reconciliation of BW data straightforward. Most of these calculations can be done in the load user exit EXIT_SAPLRSAP_001; note that the additional processing overhead might have a negative impact on the source system. Rather than loading detailed data that is not required for reporting into BW, bring in aggregated data wherever possible. This not only improves data loading and staging, but also minimizes the need to store large amounts of data in BW.
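The pre-aggregation idea in tip 3 can be sketched outside of SAP as well. The following is a minimal, hypothetical illustration (not SAP code; the record layout and field names are invented for the example): detailed line items are summarized by a set of grouping keys in the source so that only the summary rows, rather than every line item, need to be loaded into the warehouse.

```python
# Hypothetical sketch of pre-aggregation before load (not SAP code).
# Field names ("plant", "material", "qty") are invented for illustration.
from collections import defaultdict


def pre_aggregate(records, group_keys, measure):
    """Summarize detailed records by group_keys, summing the measure,
    so only the aggregate needs to be loaded into the warehouse."""
    totals = defaultdict(float)
    for rec in records:
        key = tuple(rec[k] for k in group_keys)
        totals[key] += rec[measure]
    # Rebuild one summary row per distinct key combination.
    return [dict(zip(group_keys, key), **{measure: value})
            for key, value in sorted(totals.items())]


# Three detailed line items collapse into two summary rows.
line_items = [
    {"plant": "P100", "material": "M1", "qty": 5.0},
    {"plant": "P100", "material": "M1", "qty": 3.0},
    {"plant": "P200", "material": "M2", "qty": 7.0},
]
summary = pre_aggregate(line_items, ["plant", "material"], "qty")
```

The load volume shrinks from three records to two here; on real transactional tables the reduction is typically far larger, which is the point of the tip.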
We welcome any comments and constructive feedback to improve this newsletter further. Please contact Charan (slakkaraju2@csc.com) and Vijay Bhaskar (vbodanapu@csc.com).