
How to Setup and Run Data Collections

Contents
Purpose
Understanding Centralized vs. Distributed Installation
Understanding the Data Collections Process
3 Key Stages to the Data Collections Process
Refresh Snapshots Process
Planning Data Pull
ODS Load
Complete, Net Change and Targeted Refresh
Setup Requests for Data Collections
Purge Staging Tables - A Special Note
ATP Data Collections vs. Standard Data Collections
Fresh Installation or Upgrade - Basic Steps
For 12.1
For 12.0.6
For 11.5.10.2
Upgrading EBS Applications - Special Considerations
DBA Checklist
VCP Applications Patch Requirements
RDBMS Patches and Setups for Known Issues
Other RDBMS Setups / Maintenance
RAC Database - Special Requirements
System Administrator Setups
Applications Super User Checklist
VCP Applications Patch Requirements
Fresh Install/Upgrade Setup Steps
Running Data Collections for the First Time in a New Instance
Parameters for First Run of Data Collections
Preventing Data Collections Performance Issues
How Can I Check on Performance When Data Collections is Running?
Cloning and Setup after Cloning for Data Collections
I Have a Centralized Installation – What should I do?
Steps to Consider after Cloning when Using Distributed Installation
What Happens when I Setup a New Instance in the Instances Form to Connect an APS Destination to the EBS Source?
Key Points
I Am Going to Use ATP Data Collections for Order Management ATP, What Do I Need to Do?
Upgrade and Patching
DBA’s Information
Applications Super User’s Information
Appendix A
List of Setup Requests
References
Revisions

Purpose
This document offers recommendations for customers using Value Chain Planning - VCP (aka
Advanced Planning and Scheduling - APS) who are performing:

1. Fresh Installation of EBS Applications,
2. Upgrading EBS Applications, OR
3. Cloning the EBS Source instance and/or APS Destination instance.

We will explain the difference between Centralized and Distributed installations and other
terminology used in this document.

We will help you ensure the patching and setups are correct so you can launch and complete
Data Collections successfully.

We will address this functionality for 11.5.10 and above, with a concentration on R12.1.

Understanding Centralized vs. Distributed Installation
Centralized installation - a single database where the EBS Source applications for the OLTP
transactions and the setups used by the VCP applications all reside. Data Collections
moves the data for INV, PO, BOM, WIP, OE, etc. from these base tables to the MSC tables,
where it can be used by the VCP applications.

Patching for a Centralized Installation - All patches are applied to the single
instance. You can ignore references in the patch readme to centralized and distributed
installation and apply all patches to your single instance. Most patch readmes simply
instruct you to apply the patch to your instance when Centralized.

Distributed Installation - Two (or more) databases located on different machines. One is the
EBS Source instance, where all OLTP transactions and setups are performed. The APS Destination
is where the Data Collections process is launched; it uses database links to communicate with
the source and move the data into the MSC tables on the APS Destination.

Why do many customers use Distributed Installation?

1. The processing performed for Planning in ASCP, IO, DRP, etc. can consume huge amounts of
memory and disk IO. This can slow down OLTP processing while the plan processes
are running. It is not unusual for large plans to consume anywhere from 5-20+ GB of
memory while running and to move millions of rows into and out of tables to complete the
planning process.
2. It is possible to have the APS Destination on a different release. The APS Destination
could be 12.1 with the EBS Source running 11.5.10 or 12.0.6.
a. Note that you may NOT have an APS Destination on a lower version than the EBS
Source (example - APS Destination on 12.0 or 11.5 with EBS Source on 12.1).
3. It is possible to have more than one EBS Source connected to the APS Destination. This
requires careful consideration, as patching for all instances must be maintained.
Example: 2 EBS Source instances connected to 1 APS Destination instance, all on
the same EBS applications release. For VCP patches that affect Collections, all three
instances must be patched at the same time in order to prevent breaking this
integration. Or a customer may have 2 EBS Sources on different releases: APS Destination
on 12.1 with an EBS Source on 12.1 and another on 11.5.10.

Patching for a Distributed Installation - This requires careful attention to the patch
readme to make sure the patches are applied to the correct instance and that pre-
requisite and post-requisite patches are applied to the correct instance. Patch readme
should be clear on where the patch is to be applied. Usually the patch will refer to the
EBS (or ERP) Source (or 'source instance') and to the APS Destination (or 'destination
instance') for where the patch is to be applied.

12.1.x – For our 12.1 CU (cumulative update) patches, we release only a single
patch, and it is applied to both the EBS and APS instances.
Demantra Integration patches have a separate EBS Source side patch
requirement.

12.0.x and 11.5.10 – For these releases we have separate patches for different
parts of the VCP applications.

• ASCP Engine/UI patches are applied only to the APS Destination. There
is also a Source Side UI patch to be applied to the EBS Source only - see
the readme.
• Data Collections patches are applied to BOTH the EBS and APS
instances. The readme may list specific patches to be applied to the EBS
Source for certain functionality.
• GOP (or ATP) and Collaborative Planning are applied to BOTH the EBS
and APS instances.
• For Demantra, the main patch is applied to the APS Destination
instance. And there is usually a Source side patch listed in the readme.

Understanding the Data Collections Process


Note 179522.1 has an attachment showing critical information on Data Collection parameters,
profiles, examples of requests launched, and the objects used for Data Collections.

3 Key Stages to the Data Collections Process

Refresh Snapshots Process

1. This process is launched by the stage Planning Data Pull to refresh the snapshots
used to collect the data.
2. The snapshots work in conjunction with the views to provide the Planning Data Pull
with the information that will be loaded into the staging tables.
3. If the setup profile MSC: Source Setup Required = Yes (or Y in 11.5.10 and below),
then this process will also launch the Setup Requests that create all the objects used
by Data Collections in the EBS Source applications.
4. When deployed with Distributed installation, these processes are launched on the
EBS Source instance. In a Centralized - single instance installation all requests are
launched in the same instance.
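
The profile check in step 3 can be verified directly in SQL*Plus. This is a sketch; the internal profile option name MSC_SOURCE_SETUP_REQUIRED is an assumption — confirm it against FND_PROFILE_OPTIONS_VL in your instance before relying on it.

```sql
-- Sketch: check the site-level value of MSC: Source Setup Required on the
-- EBS Source. The internal name MSC_SOURCE_SETUP_REQUIRED is an assumption;
-- verify it in FND_PROFILE_OPTIONS_VL (user_profile_option_name =
-- 'MSC: Source Setup Required') before relying on this query.
select v.profile_option_value
from   fnd_profile_options o,
       fnd_profile_option_values v
where  o.profile_option_name = 'MSC_SOURCE_SETUP_REQUIRED'
and    v.profile_option_id   = o.profile_option_id
and    v.level_id            = 10001;  -- 10001 = Site level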

Planning Data Pull

1. When the Snapshot process is finished, then the Planning Data Pull will move the
data from the Snapshots and Views to the Staging Tables - all named MSC_ST%
2. This will spawn workers to help manage the many tasks (40+) that gather the data
into the staging tables.

ODS Load

1. This stage performs Key Transformations of critical data into unique keys for
the VCP applications. For instance, since we can collect from multiple instances, the
INVENTORY_ITEM_ID in MSC_SYSTEM_ITEMS must be given a unique value. This
happens for several entities. We store the Source instance key values in columns
with an SR_ prefix, like SR_INVENTORY_ITEM_ID in MSC_SYSTEM_ITEMS.
2. Then ODS also launches workers to handle the many different load tasks that move
data from the MSC_ST staging tables to the base MSC tables.
3. During a complete refresh we create TEMP tables, copy data from the MSC_ST staging
tables into the TEMP tables, then use Exchange Partition technology to flip each
temp table into the partitioned table.
a. For instance, if your instance code is TST, you can see in the log file where
we create temp table SYSTEM_ITEMS_TST.
b. Then we move data into this table from MSC_ST_SYSTEM_ITEMS.
c. Then we exchange this table with the partition of MSC_SYSTEM_ITEMS used
for collected data (example: if the instance_id is 2021, the partition name is
SYSTEM_ITEMS__2021).
4. Lastly, the ODS Load launches Planning Data Collections - Purge Staging Tables to
remove data from the MSC_ST staging tables, and if Collaborative Planning is set up,
we launch the Collaboration ODS Load.
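
The exchange-partition mechanics in step 3 can be illustrated with a simplified sketch using the TST / instance_id 2021 example above. This is only an illustration of the technique — the DDL the ODS Load actually generates (column lists, hints, index handling) may differ:

```sql
-- Illustration only: how the ODS Load swaps collected data into place.
-- 1. Load the temp table from the staging table (column list simplified).
insert /*+ append */ into SYSTEM_ITEMS_TST
  select * from MSC_ST_SYSTEM_ITEMS;

-- 2. Swap the populated temp table with the instance's partition. The
--    exchange is a data-dictionary operation, so it is fast regardless
--    of the data volume involved.
alter table MSC_SYSTEM_ITEMS
  exchange partition SYSTEM_ITEMS__2021
  with table SYSTEM_ITEMS_TST
  including indexes without validation;
```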

Complete, Net Change and Targeted Refresh


1. Complete Refresh replaces all data for each entity, with a few exceptions:
a. Items are never deleted.
b. Trading Partners are never deleted.
c. Sales orders are a special entity. The Sales Orders parameter defaults to No in a
complete refresh, and sales orders are collected in Net Change mode instead. This is done
for performance reasons. If you need to refresh sales orders in Complete mode,
then you must set this parameter to Yes explicitly, OR you can run a
standalone Targeted Refresh of only Sales Orders.
2. In order to choose Refresh Mode = Net Change or Targeted, you must set Planning Data
Pull parameter Purge Previously Collected Data = No.
3. Net Change picks up only the changes to the entities.
a. This can only be run for transactional entities, like WIP, PO, OM, and some setup
entities like Items, BOM, etc. The XLS attachment in Note 179522.1 explains
which entities can be collected in Net Change mode.
b. If you set the parameter to Yes for a setup entity - like ATP Rules - it is ignored when
running Net Change collections.
4. Targeted Refresh is used to collect one or more entities in complete refresh mode. This
is very useful for synchronizing data for certain business purposes, collecting certain setup
data, or quickly collecting data to set up a test case.
Examples:
a. Business Purpose - A customer runs an ASCP Plan intraday at 1 pm; after running
Net Change collections, they run a Targeted Refresh of Customers, Suppliers, and
Orgs to pick up all new customers created that morning.
b. We run Net Change Collections during the week and Complete Collections on
weekends, but during the week we need to collect new Sourcing Rules and
Customers daily, so we run a targeted refresh each night.
c. I need to change Sourcing Rules, set up a Supplier capacity calendar, and run a
test case. You can collect Approved Supplier List, Calendars, and Sourcing Rules
to get these changes quickly into the MSC tables for testing without running a
complete refresh of all entities.
5. OPM customers cannot use Targeted Refresh. They must use Complete Refresh to
collect OPM data.

Setup Requests for Data Collections


1. Profile MSC: Source Setup Required = Yes (or Y in 11.5.10) is set by patch installation,
or by the user if required.
a. In a distributed installation, the profile should be set on the EBS Source. Refresh
Snapshots and the Setup Requests will launch on the EBS Source.
2. Data Collections will launch the Refresh Snapshots process, which detects this setting and
launches the Setup Requests. See the list in Appendix A.
3. When the Setup Requests complete successfully, the profile is automatically
set to No (or N in 11.5.10).
4. If running Refresh Snapshots as a standalone request, choose Refresh type - Fast or
Complete. If you use Automatic, this profile is not checked.
5. The Request ID and Request Name are listed in the Refresh Collection Snapshots log
file when these requests are launched as part of Data Collections.
6. If any Setup Requests fail, run Planning Data Collections - Purge Staging Tables with
parameter Validation = No, then launch again. We have seen many times where initial
failures in the Setup Requests are resolved by running 2 or 3 times.

Purge Staging Tables - A Special Note


1. When Data Collections fails, you must run the request Planning Data Collections - Purge
Staging Tables, to:
a. Reset the value in the ST_STATUS column of MSC_APPS_INSTANCES.
b. Remove the data from the MSC_ST% staging tables that are populated by
the Planning Data Pull.
2. Always use parameter Validation = No.
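
You can check the status flag that Purge Staging Tables resets with a quick query on the APS Destination. The interpretation of specific ST_STATUS codes is internal to the product, so treat a non-zero value after a failed run as "needs purge" only as a rule of thumb:

```sql
-- Check the collection status flag per instance on the APS Destination.
-- Planning Data Collections - Purge Staging Tables resets ST_STATUS;
-- the meaning of individual non-zero codes is internal to the product.
select instance_code, instance_id, st_status
from   msc_apps_instances;
```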

ATP Data Collections vs. Standard Data Collections


1. ATP Data Collections can only be used by customers who:
a. Have a Centralized Installation
b. Use ATP for Order Management only and do NOT use ANY other Value Chain
Planning applications
c. CAUTION: If you run ATP Data Collections in Mode - Complete Refresh while you
are using Value Chain Planning applications, you will corrupt the data in the MSC
tables. You will have to run Standard Data Collections with a Complete Refresh to
recover from this mistake.
2. The ATP Data Collections section in the Note 179522.1 XLS attachment includes
information on how to run Standard Data Collections to mimic ATP Data Collections.
a. This can be useful if there is a problem with ATP Data Collections. You can try
standard collections to see if this works to overcome the issue.
b. Understanding how to run a Targeted Refresh using Standard Data Collections
can be useful to quickly collect changes for only certain entities.
c. Running a Targeted Refresh of just Sales Orders can be useful to synch changes
if it appears that sales order changes are not being collected properly.
d. A Targeted Refresh is like Complete, but only collects changes for any entity set
to Yes. CAUTION: if you make a mistake and run Complete Refresh instead of
Targeted Refresh and set wrong entities to No, then data will be lost and errors
will occur. Then a Complete Refresh would be required to recover from this
mistake.
3. Note 436771.1 is an ATP FAQ which will be helpful to any customer using GOP/ATP.
a. For understanding when ATP will not be available during Data Collections, see
Q: Why do we get the message 'Try ATP again later'?

Fresh Installation or Upgrade - Basic Steps


When installing and using VCP applications, you are installing the entire EBS Application
suite. There is no standalone install package limited to VCP applications.

For 12.1
Customers must install the latest release of EBS Applications, and if a new release of EBS
becomes available during the implementation cycle, plan to install that release (example: you
installed EBS 12.1.2, and three months later EBS 12.1.3 becomes available; it is then very
important to plan a move to this latest release).

For the applications install, review Note 806593.1 Oracle E-Business Suite Release 12.1 - Info
Center.

For VCP Applications the following Critical Notes are required

1. Note 746824.1 12.1 - Latest Patches and Installation Requirements for Value Chain
Planning. Use this note to reference:

a. The latest Installation Notes
b. The Latest available VCP patch for your release must be applied after the
installation.
c. Understand the Patching requirements if you have a separate EBS Source
instance.
i. This includes R12.1 EBS Source
ii. OR if you intend to run EBS Source Applications on R12.0.6 or 11.5.10.2
d. Links to other critical notes about 12.1 VCP Applications
e. List of limitations on new functionality if using R12.0.6 or 11.5.10.2 for the EBS
Source instance
2. Note 552415.1 INSTALL ALERT - Setting Up The APS Partitions and Data Collections
Instance in R12
3. Latest User Guides are in Note 118086.1
4. Note 763631.1 Getting Started With R12.1 Value Chain Planning (aka Advanced Planning
and Scheduling - APS) - Support Enhanced Release Content Document (RCD)
a. If you also need to review new features for R12.0, then also see Note 414332.1
5. If you are installing to use Rapid Planning, then see Note 964316.1 - Oracle Rapid
Planning Documentation Library

For 12.0.6
Customers should NOT be planning to use this release, but should move to 12.1.

Note 401740.1 - R12 Info Center is available for customers who must perform this installation.

For VCP Applications, critical notes are:

1. Note 421097.1 R12 - Latest Patches and Critical Information for VCP - Value Chain
Planning (aka APS - Advanced Planning and Scheduling)
a. You must apply the latest CU Patches available after the installation and before
releasing the instance to users.
b. If you are using only ATP Data Collections, then apply the GOP Patch and Data
Collections patches.
2. Note 412702.1 Getting Started with R12 - Advanced Planning and Scheduling Suite FAQ
3. Note 414332.1 Getting Started With R12 - Advanced Planning Support Enhanced RCD
4. Note 552415.1 INSTALL ALERT - Setting Up The APS Partitions and Data Collections
Instance in R12

For 11.5.10.2
In the rare case that an upgrade to 11.5.10.2 is being considered, AND it is not an intermediate
step on the way to R12.1, it is critical that the latest patches be applied.

Review Note 223026.1 and apply the latest Rollup patches for at minimum:

1. Using ASCP - then apply ASCP Engine/UI and Data Collections rollup

2. Using any other applications, then also apply those patches.
3. Using ATP Data Collections only - then apply GOP Patch and Data Collections Patch

Also review Note 883202.1 - Minimum Baseline Patch Requirements for Extended Support on
Oracle E-Business Suite 11.5.10

Upgrading EBS Applications - Special Considerations


1. If upgrading from 11.5.10.2 WITH the latest rollup patches applied, you should not
encounter the INSTALL ALERT mentioned in Note 552415.1.
a. You can run the SQL in the note to verify that tables MSC_NET_RES_INST_AVAIL and
MSC_SYSTEM_ITEMS have matching partitions.
2. After the upgrade, applying the latest patches is a very critical step that cannot be postponed.
Users should not start setup and testing on the base release of your upgrade.
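
The SQL in Note 552415.1 is the authoritative check; a data-dictionary approximation of the "matching partitions" verification looks like this (it assumes the tables are owned by the MSC schema and that you have access to the DBA views):

```sql
-- Approximate check for Note 552415.1: both tables should report the
-- same number of partitions. Run on the APS Destination.
select table_name, count(*) as partition_count
from   dba_tab_partitions
where  table_owner = 'MSC'
and    table_name in ('MSC_SYSTEM_ITEMS', 'MSC_NET_RES_INST_AVAIL')
group  by table_name;
```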

DBA Checklist

VCP Applications Patch Requirements


1. Latest VCP Patches applied for your release as noted above in Fresh Installation or
Upgrade - Basic Steps
2. If Demantra integration is required, then review Note 470574.1 List Of High Priority
Patches For Oracle Demantra Including EBS Integration

Note: After patching, there may be invalid or missing objects used for Data Collections.
The patch should set profile MSC: Source Setup Required = Yes (or Y in 11.5.10).
Then, with the first run of Data Collections, the Setup Requests will be launched and build all
the required objects for Data Collections, resolving any invalid or missing objects. As noted
above, you can find more information on the Data Collection objects and the Setup Requests
in Note 179522.1.

RDBMS Patches and Setups for Known Issues


In addition to any RDBMS patches recommended in the patch readme, please log an RDBMS SR
to get patches for the following RDBMS bugs:

All three of these RDBMS bugs relate to some data not being collected; the issue has been
seen in different entities under different conditions in different database versions, so we
recommend all of these patches.

If you have this issue NOW, see the workaround in step #3.d below and execute it immediately.
It has resolved most issues, but there will still be exposure to the other RDBMS bugs below.

1. RDBMS bug 6720194 fix should be applied - Wrong results from SQL with semijoin and
complex view merging

a. Fixed in version - 10.2.0.5 and 11.1.0.7 and 11g R2
2. RDBMS bug 6044413 fix should be applied
a. Fixed in version - 10.2.0.4 and 11g R1
b. Final fix for Table Prefetching causes intermittent Wrong Results in 9iR2,10gR1
and 10gR2 Note 406966.1
3. RDBMS bug 5097836 and 6617866 fixes should be applied
a. Both relate to queries run in parallel not fetching all rows.
b. 5097836 - Fixed in version 10.2.0.4 and 11g R1
c. 6617866 - Fixed in version 10.2.0.5 and 11g R2
d. As a workaround, you can also set all snapshots for Data Collections to use
NOPARALLEL to prevent missing data.
e. Run this SQL:
select 'alter table ' || owner || '.' || table_name || ' NOPARALLEL;'
from all_tables
where table_name like '%_SN'
and degree > 1;
f. If any rows are returned, run the generated statements; this sets DEGREE = 1.
Example: alter table MRP.MRP_FORECAST_DATES_SN NOPARALLEL;
4. Set init.ora parameter _mv_refresh_use_stats = FALSE
a. This is part of our performance tuning for MLOGs per Note 1063953.1
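
Setting the parameter can be done with ALTER SYSTEM as SYSDBA; hidden (underscore) parameters should only be changed as directed by Oracle Support, and the scope clause below is a sketch — adjust for spfile vs. pfile as appropriate:

```sql
-- Set the MV refresh parameter per Note 1063953.1 (requires SYSDBA).
-- With an spfile, scope=both applies immediately and persists across
-- restarts; with a pfile, edit init.ora and restart instead.
alter system set "_mv_refresh_use_stats" = FALSE scope=both;
```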

Other RDBMS Setups / Maintenance


1. Ensure that the RDBMS setups for Oracle Applications in Note 396009.1 (or for 11i,
Note 216205.1) are completed and all relevant settings for the RDBMS are correct.
In the section Database Initialization Parameter Sizing, for large data volumes we
recommend the minimum sizing settings for 101-500 users.
2. Database links are required for the distributed installation
a. See Note 813231.1 Understanding DB Links Setup for APS Applications - ASCP
and ATP Functionality.
b. Test and confirm that database links are working for APPS User in SQL*Plus.
c. These database links will be used in Advanced Planning Administrator / Admin /
Instances form when setting up the application.
3. Note 137293.1 - How to Manage Partitions in the MSC Schema
a. This note is required reading for both DBAs and Applications Super Users
b. When cloning distributed databases, manual manipulation of certain tables is
required in order to maintain proper database links and instance information.
4. Note 1063953.1 Refresh Collection Snapshots Performance - Managing MLOG$ Tables
and Snapshots for Data Collections
a. There will be performance problems in Refresh Collection Snapshots process if
this note is ignored.

b. Review the note to understand and manage the MLOGs for Data Collections and
related applications that share the same MLOGs we use for Data Collections
c. When a lot of data changes happen to the base tables, then we get large MLOGs
that must be managed
AND/OR
d. If the ENI and OZF snapshots mentioned in the note are on the system, then
they must be manually refreshed to keep the MLOG table maintained and
prevent performance issues.
e. You may see the initial run of Refresh Snapshots take a long time. During the
setup phases, this will usually settle down after the first 2-3 runs of Data
Collections, unless there is a lot of data processing happening between runs of
data collections and the MLOGs grow rapidly.
i. Review the note and use the steps in section - I Want to Check My
System and Prevent a Performance Problem (ANSWER #1) to prepare
the system and avoid performance problems.
ii. These steps are usually best accomplished after the first 2-3 runs of
Data Collections have completed, but can be executed earlier if
required.
5. PLEASE NOTE that we expect Data Collections to take significant time during its initial
run. This is because we could be moving millions of rows of data
from the OLTP tables to empty MSC tables.
a. Once Data Collections has completed for the first time and Gather Schema
Statistics is run for the MSC schema (and the MSD schema if using Demantra or Oracle
Demand Planning), the process should settle down and have better timing.
b. It is not unusual to have to set the Timeout parameter for Data Collections to 900
– 1600 minutes for the first run to complete without a timeout error.
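
Step 2b above calls for testing the database links as the APPS user in SQL*Plus. A minimal smoke test looks like the following, where A2E_LINK is a placeholder for your own link name from the Instances form:

```sql
-- Smoke-test a collections database link as APPS (A2E_LINK is a
-- placeholder - substitute the link name used in the Instances form).
select 1 from dual@A2E_LINK;

-- Optionally confirm which database the link reaches (needs select
-- privilege on v$instance across the link):
select instance_name, host_name from v$instance@A2E_LINK;
```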

RAC Database - Special Requirements


1. Review Note 279156.1 if you have a RAC database. Note that if this is a Centralized
installation, the setups are mandatory.
2. When using ATP with a Distributed Installation where the EBS Source is a RAC database,
you must see Note 266125.1 and perform additional setups.

System Administrator Setups


1. Set up the Standard Manager (or the specialized manager, if used) with as many
processes as possible (15 is the minimum; 25 or higher is recommended), since we
launch many requests for Data Collections and other VCP processes, and many other
processes also use the Standard Manager.
a. NAV: System Administrator / Concurrent / Manager / Administer to check the
setups
b. NAV: System Administrator / Concurrent / Manager / Define; query the
Standard Manager, then click Workshifts to increase the number of processes.

Applications Super User Checklist

VCP Applications Patch Requirements


1. The Super User must also be aware of the patching requirements mentioned in Fresh
Installation or Upgrade - Basic Steps
2. AND it is critical to review the patch readme of the latest patches applied to check for
any user-required steps/setups listed in the patches.

Fresh Install/Upgrade Setup Steps


1. For a Distributed installation, you must be familiar with Note 137293.1.
2. Note 179522.1 - Data Collection Use and Parameters:
a. Explains the parameters used for Data Collections
b. Provides performance tips for Data Collections based on parameters
c. Covers setup of the profiles - ref Step #8 below
d. Provides examples of the requests that run during Data Collections
e. Provides details on the objects created for Data Collections
3. Confirm that you have resolved the install issue mentioned in Note 552415.1
INSTALL ALERT - Setting Up The APS Partitions and Data Collections Instance in R12.
a. Running the SQL in the note should confirm that both tables have matching
partitions.
4. If you have RAC Database then see RAC Database - Special Requirements
5. Confirm that Standard Manager (or if used, the specialized manager) has
workshifts/processes as defined above in System Administrator Setups
6. Make sure profile MSC: Share Plan Partitions = No.
7. Run Request - Create ATP Partitions (APS Destination)
a. This has no parameters - do not confuse it with the Create APS Partitions request.
b. This will ensure that all partitions are created properly for the ATP tables; if
they are already created, it does no harm.
8. Download the attachment in Note 179522.1 - see the Profiles Tab to setup the profiles
for Data Collections.
9. Decide if you need to be running the Planning Manager in your Instance
a. The Planning Manager is an EBS Source process that performs interface tasks
between the Order Management, WIP, Purchasing, and MRP applications.
b. Under certain setups, this process must be running in order to collect data
properly.
c. Review Note.790125.1 When Does The Planning Manager Need To Be Running?
- And make a decision on your implementation.
10. Setup the Instance Form
a. For a distributed installation - see Note 813231.1 and confirm database links are
working properly.

b. More complete information on the Instances form in Note 137293.1 - Section IX
and download the Viewlet to review with the steps in the note.
11. Organizations setup in the instances form:
a. You MUST include the Master Organization.
b. You should collect from ALL Orgs for the first run of Data Collections, so do NOT
enter any extra Orgs that you plan to collect later in the implementation.
c. You can set up Collection Groups; just do NOT use a Collection Group for the first
run of Data Collections.

Running Data Collections for the First Time in a New Instance


We expect that Data Collections will take an unusually long time for the first run.
Collecting what can be millions of rows from the OLTP tables into the MSC tables -
which are empty - is a demanding process for the database.

It is not unusual for the first run to require a Timeout parameter of 600-1600
minutes to get all data collected from the EBS Source tables to the APS Destination MSC
tables. It does not matter whether you are using database links and two instances or a
single instance. Be sure to set this high value for both the Planning Data Pull and the
ODS Load.

Do not use a Collection Group during the first run; collect using parameter = All.
If you are not collecting from all Orgs defined in the EBS Source, then define only the
required Orgs in the Instances form. Be sure to include the Master Org in the Instances
form.

Once Data Collections has completed successfully for the first time, run Gather Schema
Statistics for the MSC schema; the timing should then settle down and be reasonable. The
default Timeout of 180 minutes for the Planning Data Pull and 60 minutes for the ODS Load
can be maintained.
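The Timeout guidance above can be summarized in a small helper. This is a hypothetical sketch for planning purposes only, not an Oracle API; the request names and the choice of 900 minutes (the low end of the suggested first-run range) are assumptions taken from this note.

```python
# Hypothetical helper encoding the Timeout guidance from this note.
# Names and values are illustrative, not part of any Oracle API.

def recommended_timeout_minutes(request: str, first_run: bool) -> int:
    """Suggested Timeout (minutes) for the two main collection requests.

    First run: 900-1600 minutes for both requests (low end returned here).
    Steady state: the defaults of 180 (Data Pull) and 60 (ODS Load).
    """
    defaults = {"Planning Data Pull": 180, "Planning ODS Load": 60}
    if request not in defaults:
        raise ValueError(f"unknown request: {request}")
    return 900 if first_run else defaults[request]
```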

Parameters for First Run of Data Collections


1. Check that profile MSC: Source Setup Required = Yes (or Y in 11.5.10)
2. Note 179522.1 explains the parameters and the Setup Requests that create all
the objects used for Data Collections
3. NAV: Data Collection / Oracle Systems / Standard Data Collections
4. If your EBS Source instance is 11.5.10 and you have just applied Data Collections
Rollup #32 or higher for the first time, then special steps are required.
Run Data Collections using Targeted Refresh for Planners only. This is a one-time
setup step for the very first run of Collections after the patch is applied.
Key Parameters are:
a. Purge Previously Collected Data - No
b. Collection Method - Targeted
c. Set only Planners = Yes and all others No (or equivalent)

5. If you have a lot of data to collect, the Refresh Snapshots process could
take several hours for this initial run.
6. Planning Data Pull Parameters - special considerations
a. Set Timeout parameter very high - suggest 900 - 1600 for the first run
b. Set the Sales Order parameter explicitly to Yes if using the Order Management
application.
1. When running a Complete Refresh, this parameter defaults to No
and runs a Net Change collection of Sales Orders for performance
reasons.
2. This is normal and recommended during regular operations.
c. All other parameters should be left at their defaults for the first run of
collections.
d. If other parameters will need to be Yes for your implementation, plan to
set those to Yes after the initial run of collections has completed
successfully.
7. ODS Load Parameters - special considerations
a. Set the Timeout parameter very high - suggest 900-1600 for the first run.
b. Set all other parameters to No for this first run.
c. If you need to set these to Yes, do so later, after the first run has
completed successfully and the timing for Data Collections is stable.
8. When the first run is completed, then run Gather Schema Statistics for the MSC
schema.
9. If any Setup Requests fail, then run Planning Data Collections - Purge Staging
Tables with parameter Validation = No, then launch again. We have seen many
cases where initial failures in the Setup Requests are resolved by launching 2 or 3
times.
10. If the Planning Data Pull or ODS Load fails, check for a Timeout error or other
specific error in the log file of the main request and in the log files of all the
Workers as well. You must run Planning Data Collections - Purge Staging Tables
with parameter Validation = No before you launch Data Collections again. If
you fail to run this request, then Data Collections will error with the message text
seen in Note 730037.1.
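The relaunch rule in step 10 can be sketched as a small decision helper. This is an illustration only; the function is hypothetical, and the request names are the ones used in this note.

```python
# Illustrative sketch of the relaunch rule above: after a failed Planning
# Data Pull or ODS Load, Purge Staging Tables (Validation = No) must run
# before Data Collections is launched again (see Note 730037.1).
# The function itself is hypothetical, not an Oracle API.

def requests_to_submit(previous_run_failed: bool) -> list:
    """Ordered list of requests to submit for the next collections run."""
    steps = []
    if previous_run_failed:
        # Mandatory cleanup step before any relaunch after a failure.
        steps.append(
            "Planning Data Collections - Purge Staging Tables (Validation = No)"
        )
    steps.append("Planning Data Collection")  # the request set itself
    return steps
```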

Preventing Data Collections Performance Issues


After the initial run to collect the data into the empty tables is completed, the timing of Data
Collections should settle down.
The profile settings in #1 can be applied after the initial run of Data Collections.
When a large volume of changes is made to source data via data loads or transactional
activity, active management steps are required - see #2.

1. The profiles in Note 179522.1 should be set up correctly for your instance.
Below are some suggestions, BUT you must review all the profiles for proper
settings.
Notes on the Profile Setups:
a. MSC: Collection Window for Trading Partner Changes (Days) - must be NULL or
0 (zero) for the first run, but should then be set to a smaller number. Generally 7
days is sufficient, as long as you run a Complete Refresh at least once
every 7 days so that all changes are collected.
b. Make sure the staging-table purge profile - MSC: Purge Staging and Entity Key
Translation Tables / MSC: Purge Stg Tbl Cntrl - is set to Yes if you are collecting
from only one EBS Source instance.
i. If you are collecting from multiple instances, then you must set this
profile to No.
ii. In that case, the MSC_ST staging tables may grow very large in
dba_segments, and the high water mark will not be reset when rows are
deleted from these tables.
iii. If you must set the profile to No, then we suggest that the DBA periodically
schedule a truncate of the MSC_ST% tables when Data Collections is
NOT running, to recover this space and help performance.
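For the multi-instance case in iii, a DBA could generate the truncate statements from a list of staging tables. A minimal sketch, assuming the table list is fetched separately (normally from DBA_TABLES) and that the staging tables live in the MSC schema; the helper is hypothetical and the generated DDL must only ever be run while Data Collections is NOT running.

```python
# Sketch of a DBA maintenance helper for the multi-instance case above:
# build TRUNCATE statements for the MSC_ST% staging tables so the DBA can
# run them (via SQL*Plus or a scheduler job) while Data Collections is NOT
# running. The table list is passed in; fetching it from DBA_TABLES and the
# MSC schema name are assumptions.

def truncate_staging_ddl(staging_tables, schema="MSC"):
    """Build TRUNCATE TABLE statements for MSC_ST% staging tables only."""
    ddl = []
    for table in staging_tables:
        if not table.upper().startswith("MSC_ST"):
            continue  # never touch non-staging tables
        ddl.append(f"TRUNCATE TABLE {schema}.{table.upper()};")
    return ddl
```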
c. MSC: Refresh Site to Region Mappings Event - this profile is relevant when
setting up Regions and Zones in the Shipping application. When new Regions or
Zones are defined, the mapping that occurs during Refresh Collection
Snapshots can take a long time. Otherwise, we only collect new supplier
information, which is not performance intensive.
d. Check to make sure Debug profiles are not setup except in case where they are
required for investigation of an issue.
e. If you are using Projects and Tasks and have many records, and the profile MSC:
Project Task Collection Window Days is available, it can be set to help performance.
2. MLOG management is the key to the performance of Refresh Collection Snapshots.
Note 1063953.1 discusses the important steps for managing MLOGs to keep the
program tuned to perform as fast as possible.
a. For customers actively using Data Collections, these performance problems
occur for two primary reasons:
i. MLOG tables are shared between the VCP applications and other
applications - usually Daily Business Intelligence applications or custom
reporting.
ii. Large source data changes via Data Loads and/or large transactional
data volumes.
b. Once the initial collection has been performed and most data collected, use
the steps in section -
I Want to Check My System and Prevent a Performance Problem - and ANSWER
#1: Yes, we are using Data Collections now.

c. When you execute these steps, you will be setting the MLOGs to provide the best
performance.
d. You must also be aware of any MLOGs that are shared with other
applications. In that case, the DBA will need to actively manage those other
snapshots, refreshing them manually - a cron job may be the best method.
3. If you have very large BOM tables and plan to use Collection Groups to collect
information only for certain Orgs, be aware that this can consume very significant
TEMP space during the ODS Load. We have seen reports where All Orgs takes only
6-12 GB of TEMP space, but when using a Collection Group, TEMP space
requirements exceed 30 GB.

How Can I Check on Performance When Data Collections is Running?


1. Note 186472.1 has SQL to check on specific requests that are running and show the SQL
running on the RDBMS.
a. When you run this SQL, you provide the Request ID and it finds the SQL
being run for that session.
b. If you have several Planning Data Pull workers (or ODS Load Workers) running,
then you may need to check several different requests.
c. Run this SQL every 10-15 minutes and check if the output changes.
2. Note 280295.1 has the requests.sql script, which will provide details on all the
requests in a set.
a. Enter the request id of the request that launched the set and it will provide
details of all the requests. The default name for the set is Planning Data
Collection (Request Set Planning Data Collection)
3. Note 245974.1 - #7 and #11 show how to setup trace for requests, should this be
required to get details on a performance issue.
a. #11 in that note is very useful for setting trace for Refresh Collection Snapshots
on the EBS Source when it is launched by Data Collections from the APS
Destination.
b. You can download guided navigation for both sections to better understand the
steps involved.
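The advice in #1 - poll the running SQL every 10-15 minutes and see whether the output changes - can be automated with a simple comparison of successive snapshots. A sketch under stated assumptions: the snapshots would come from the SQL in Note 186472.1, but here they are plain dicts, and the "stuck after 3 unchanged polls" threshold is an assumption, not an Oracle rule.

```python
# Sketch of the monitoring idea above: capture the SQL reported for each
# request session at every poll and flag a session as possibly stuck when
# the same statement is seen for several polls in a row. Snapshot capture
# (Note 186472.1) is left out; the threshold is an assumption.

def stuck_requests(snapshots, polls_unchanged=3):
    """snapshots: list of {request_id: sql_text}, one dict per poll."""
    if len(snapshots) < polls_unchanged:
        return []  # not enough history to judge
    recent = snapshots[-polls_unchanged:]
    stuck = []
    for request_id, sql_text in recent[-1].items():
        # Stuck if the same SQL appears in every one of the recent polls.
        if all(snap.get(request_id) == sql_text for snap in recent):
            stuck.append(request_id)
    return sorted(stuck)
```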

Cloning and Setup after Cloning for Data Collections


I Have a Centralized Installation – What should I do?
When you have a Centralized Installation, you do not need to consider any of the steps
here after cloning your instance and can proceed with normal operations of running
Data Collections and most Planning applications.

However, if you want to change the Instance Code in the Instances form, then you must
follow the steps in Note 137293.1 - Section X to clean up all the old data BEFORE you create
a new instance. If you just change the instance code in the form, this will cause
data corruption.

Steps to Consider after Cloning when Using Distributed Installation


There are many different strategies for cloning when you have a Distributed Installation
using separate databases for the VCP and EBS applications. It would be easy to find 10-20
different ways that configurations can be manipulated by cloning, so no single document
can cover all the strategies.

What Happens when I Setup a New Instance in the Instances Form to Connect an APS Destination to the EBS Source?

When you create the line in the Instances form on the APS Destination (NAV: Advanced
Planning Administrator / Admin / Instances) you are:

1. Using an instance_id from MSC_INST_PARTITIONS where FREE_FLAG = 1 to
store information on the setups you define in the form.
2. Inserting a line into MSC_APPS_INSTANCES on the APS Destination instance
with the Instance Code and Instance_id.
3. Inserting a line into the MRP_AP_APPS_INSTANCES_ALL table on the EBS source
Instance
4. Then when creating Orgs in the Organizations region of the Instances form you
are inserting rows in the APS Destination table MSC_APPS_INSTANCE_ORGS for
that instance_id.
5. In MSC_INST_PARTITIONS, FREE_FLAG = 1 means that the instance_id is not yet
used. Once the Instance_id is used the FREE_FLAG = 2.
6. You should NOT manually change the FREE_FLAG using SQL. To create a new
instance partition, use the Create APS Partitions request with parameters
Instance partition = 1 and Plan partition = 0. We do not recommend creating
extra instance partitions, since this creates many objects in the database that
are not required.
7. If the previous collected data and plan data from the old instance is not
required, then it should be removed from the system.
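The FREE_FLAG rule in points 1, 5, and 6 can be illustrated with a toy model. This is not the real table logic - just a simulation of the invariant: FREE_FLAG = 1 means the instance_id is unused, FREE_FLAG = 2 means it is taken, and only the Create APS Partitions request (never manual SQL) should add new free rows.

```python
# Toy model of the MSC_INST_PARTITIONS allocation rule described above.
# FREE_FLAG = 1 means the instance_id is not yet used; once the form takes
# an instance_id it becomes FREE_FLAG = 2. Illustration only.

def allocate_instance_id(partitions):
    """partitions: dict of instance_id -> free_flag (1 = free, 2 = used)."""
    for instance_id in sorted(partitions):
        if partitions[instance_id] == 1:
            partitions[instance_id] = 2  # mark as used, as the form would
            return instance_id
    # No free partition left: the fix is a concurrent request, not SQL.
    raise RuntimeError(
        "no free instance partition - run Create APS Partitions "
        "(Instance partition = 1, Plan partition = 0)"
    )
```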

Key Points
1. You will have to MANUALLY manipulate the EBS Source instance table
MRP_AP_APPS_INSTANCES_ALL.
2. You cannot have TWO instances which have the same INSTANCE_ID or
INSTANCE_CODE in EBS Source table MRP_AP_APPS_INSTANCES_ALL
3. You cannot have TWO instances which have the same INSTANCE_ID or
INSTANCE_CODE in APS Destination table MSC_APPS_INSTANCES

4. You can always use steps in Note 137293.1 - Section X to clean up the data in
the APS Destination and reset the EBS Source and collect all data fresh and new
into a new instance.
5. Once you have a VERY CLEAR and intimate understanding of Note 137293.1 and
the ways to manipulate Instance Codes and Instance IDs, you can use
Section XV of Note 137293.1 to come up with a custom flow for manipulation
and setup after cloning your instances.
6. AGAIN, if you have problems, you can always reset using Section X and collect all
data fresh into a new instance.
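Key Points #2 and #3 are a uniqueness check a DBA could script after cloning. A minimal sketch: rows are modeled as (instance_id, instance_code) tuples, and fetching them from MRP_AP_APPS_INSTANCES_ALL or MSC_APPS_INSTANCES is left out, so the helper is an illustration of the rule, not a supported tool.

```python
# Sketch of the uniqueness rule in Key Points #2 and #3: neither the EBS
# Source table MRP_AP_APPS_INSTANCES_ALL nor the APS Destination table
# MSC_APPS_INSTANCES may contain two rows with the same INSTANCE_ID or
# INSTANCE_CODE. Rows are modeled as (instance_id, instance_code) tuples.
from collections import Counter

def duplicate_keys(rows):
    """Return the INSTANCE_ID and INSTANCE_CODE values that repeat."""
    ids = Counter(instance_id for instance_id, _ in rows)
    codes = Counter(code for _, code in rows)
    return (
        sorted(k for k, n in ids.items() if n > 1),
        sorted(c for c, n in codes.items() if n > 1),
    )
```

An empty result from both lists means the table satisfies the rule; any hit must be resolved (for example via Note 137293.1 - Section X) before running Data Collections.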

I Am Going to Use ATP Data Collections for Order Management ATP, What Do I Need to Do?

For ATP Data Collections, the instance must be a Centralized installation on a single database.
You cannot run ATP Data Collections for a distributed installation.
You cannot run ATP Data Collections if you are using any Value Chain Planning applications
besides ATP for Order Management.

Upgrade and Patching


Make sure the latest patches are applied for 12.1.x; for 12.0.6 or 11.5.10, both the Data
Collections and GOP patches must be applied to your instance.

Make sure the INSTALL ALERT Note 552415.1 is resolved for R12 and R12.1 installations.

DBA’s Information
All steps in the DBA Checklist apply, EXCEPT that no database links are required, and Note
137293.1 will not be as important unless you move to a Distributed installation.
Patching is simpler, since all patches are applied to the same instance.

Applications Super User’s Information


All steps in the Applications Super User Checklist apply, with the following exceptions:

1. Note 137293.1 will not be as important.


2. When setting up profiles, we have noted which profiles need to be considered when
running only ATP Data Collections.
3. Review section ATP Data Collections vs. Standard Data Collections for more information

Appendix A
Note 179522.1 should be the primary resource for information about the requests,
parameters, profiles, and other Data Collections objects.

List of Setup Requests
The Request ID and Request Name are listed in the Refresh Collection Snapshots
log file when these requests are launched as part of Data Collections

Shortname Request Name New in 12.1 New in 12.0


MSCDROPS Drop Changed Snapshots
MSCWSMSN Create WSM Snapshots
MSCBOMSN Create BOM Snapshots
MSCINVSN Create INV Snapshots
MSCCSPSN Create CSP Snapshots x
MSCMRPSN Create MRP Snapshots
MSCPOXSN Create PO Snapshots
MSCONTSN Create OE Snapshots
MSCWSHSN Create WSH Snapshots
MSCAHLSN Create AHL Snapshots
MSCEAMSN Create EAM Snapshots x
MSCWIPSN Create WIP Snapshots
MSCSYNMS Collections Synonyms
MSCVIEWS Collections Views
MSCTRIGS Collections Triggers
MSCTRITM Create Collections Item Snapshot Triggers x
MSCTRBOM Create Collections BOM Snapshot Triggers x
MSCTRRTG Create Collections Routing Snapshot Triggers x
MSCTRWIP Create Collections WIP Snapshot Triggers x
MSCTRDEM Create Collections Demand Snapshot Triggers x
MSCTRSUP Create Collections Supply Snapshot Triggers x
MSCTROTH Create Collections Other Snapshot Triggers x
MSCTRRPO Create Collections Repair Order Snapshot Triggers x
MSCVWSTP Create Collections Setup Views x
MSCVWITM Create Collections Item Views x
MSCVWBOM Create Collections BOM Views x
MSCVWRTG Create Collections Routing Views x
MSCVWWIP Create Collections WIP Views x
MSCVWDEM Create Collections Demand Views x
MSCVWSUP Create Collections Supply Views x
MSCVWOTH Create Collections Other Views x
MSCVWRPO Create Collections Repair Order Views x

References
Note 179522.1 - Data Collection Use and Parameters
Note 137293.1 - How to Manage Partitions in the MSC Schema

Note 552415.1 - INSTALL ALERT - Setting Up The APS Partitions and Data Collections Instance in
R12
Note 1063953.1 - Refresh Collection Snapshots Performance - Managing MLOG$ Tables and
Snapshots for Data Collections
Note 813231.1 - Understanding DB Links Setup for APS Applications - ASCP and ATP
Functionality
Note 790125.1 - When Does The Planning Manager Need To Be Running?
Note 279156.1 - RAC Configuration Setup For Running MRP Planning, APS Planning, and Data
Collection Processes
Note 266125.1 - RAC for GOP - Setups for Global Order Promising (GOP) When Using a Real
Application Clusters (RAC) Environment
Note 396009.1 - Database Initialization Parameters for Oracle Applications Release 12
Note 216205.1 - Database Initialization Parameters for Oracle Applications Release 11i
Note 470574.1 - List Of High Priority Patches For Oracle Demantra Including EBS Integration
Note 806593.1 - Oracle E-Business Suite Release 12.1 - Info Center
Note 401740.1 - R12 Info Center
Note 746824.1 - 12.1 - Latest Patches and Installation Requirements for Value Chain Planning
Note 118086.1 - Documentation for Value Chain Planning
Note 763631.1 - Getting Started With R12.1 Value Chain Planning (aka Advanced Planning and
Scheduling APS) - Support Enhanced Release Content Document (RCD)
Note 964316.1 - Oracle Rapid Planning Documentation Library
Note 421097.1 - R12 - Latest Patches and Critical Information for VCP - Value Chain Planning
(aka APS - Advanced Planning and Scheduling)
Note 412702.1 - Getting Started with R12 - Advanced Planning and Scheduling Suite FAQ
Note 414332.1 - Getting Started With R12 - Advanced Planning Support Enhanced RCD
Note 223026.1 - List of High Priority Patches for the Value Chain Planning (aka APS - Advanced
Planning & Scheduling) Applications
Note 883202.1 - Minimum Baseline Patch Requirements for Extended Support on Oracle E-
Business Suite 11.5.10
Note 436771.1 - FAQ - Understanding the ATP Results and the Availability Window of the Sales
Order Form
Note 186472.1 - Diagnostic Scripts: 11i - Hanging SQL - Find the Statement Causing Process to
Hang
Note 280295.1 - REQUESTS.sql Script for Parent/Child Request IDs and Trace File Ids
Note 245974.1 - FAQ - How to Use Debug Tools and Scripts for the APS Suite
Note 730037.1 - Planning Data Pull Fails With Message - Another Planning Data Pull Process Is
Running

Revisions
Author: david.goddard@oracle.com
Rev 2.0 April 2010 – complete revision to replace note 145419.1
