Application Summary
For Master Production Scheduling
Revision: 1
Date: 28-Feb-2016
All Specifications, claims, features, representations, and/or comparisons provided are correct to the best of our knowledge as of the date
of publication, but are subject to change without notice. While we will always strive to ensure our documentation is accurate
and complete, this document may also contain errors and omissions of which we are not aware.
THIS INFORMATION IS PROVIDED BY KINAXIS ON AN “AS IS” BASIS, WITHOUT ANY OTHER WARRANTIES OR
CONDITIONS, EXPRESS OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, WARRANTIES OF MERCHANTABLE QUALITY,
SATISFACTORY QUALITY, MERCHANTABILITY OR FITNESS FOR A PARTICULAR PURPOSE, OR THOSE ARISING BY LAW,
STATUTE, USAGE OF TRADE, COURSE OF DEALING OR OTHERWISE. YOU ASSUME THE ENTIRE RISK AS TO THE RESULTS
OF THE INFORMATION PROVIDED. WE SHALL HAVE NO LIABILITY TO YOU OR ANY OTHER PERSON OR ENTITY FOR ANY
INDIRECT, INCIDENTAL, SPECIAL, OR CONSEQUENTIAL DAMAGES WHATSOEVER, INCLUDING, BUT NOT LIMITED TO,
LOSS OF REVENUE OR PROFIT, LOST OR DAMAGED DATA OR OTHER COMMERCIAL OR ECONOMIC LOSS, EVEN IF WE
HAVE BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES, OR THEY ARE FORESEEABLE. WE ARE ALSO NOT
RESPONSIBLE FOR CLAIMS BY A THIRD PARTY. OUR MAXIMUM AGGREGATE LIABILITY TO YOU AND THAT OF OUR
DEALERS AND SUPPLIERS SHALL NOT EXCEED THE COSTS PAID BY YOU TO PURCHASE THIS DOCUMENT. SOME
STATES/COUNTRIES DO NOT ALLOW THE EXCLUSION OR LIMITATION OF LIABILITY FOR CONSEQUENTIAL OR
INCIDENTAL DAMAGES, SO THE ABOVE LIMITATIONS MAY NOT APPLY TO YOU. All product, font and company names are
trademarks or registered trademarks of their respective owners.
Copyright © 2017 Kinaxis Inc. All rights reserved. This document may not be reproduced or transmitted in any form
(whether now known or hereinafter discovered or developed), in whole or in part, without the express prior written consent of
Kinaxis.
Kinaxis RapidResponse contains technology which is protected under US Patents #7,610,212, #7,698,348, #7,945,466, #8,015,044,
and #9,292,573 by the United States Patent and Trademark Office. It is also protected under the Japan Patent Office, Japanese
patent #4393993 and the Intellectual Property India Office, patent # 255768. Other patents are pending.
This document may include examples of fictitious companies, products, Web addresses, email addresses, and people. No association
with any real company, product, Web address, email address, and person is intended or should be inferred.
Printed in Canada.
The Master Production Schedule (MPS) is reviewed at the detailed part level. Through the MRP
process, the MPS drives the production planning and purchasing functions that support its
achievement. It is generated for the short-term horizon (usually 1-3 months, depending on production
lead times) in weekly, and sometimes monthly, buckets. The MPS aims to provide production planning
with a smooth demand signal, achieved by spreading unconsumed forecast and sales orders (month to
week) or by adjusting them manually. The finalized MPS is used as input to the Supply Plan in the next
S&OP planning cycle.
1.2 Functionality
The functional capabilities of the Master Production Scheduling application include:
1.3 Measurements
In addition to the standard corporate measures of revenue, margin, inventory value, and on‐time
delivery metrics associated specifically with master production scheduling, the following measures are
also included in the application’s out‐of‐the‐box dashboards. This allows for focused management of the
performance measures that are most applicable to the function at hand.
MPS Risks
1) MPS Exceptions: A summary of the quantity of master production scheduling‐related
exceptions presented in the current master production schedule.
2) Late Revenue Caused by MPS Orders: A summary of the revenue that is expected to be late due
to late MPS parts. Only reports late revenue within the MPS Window. Only renders data for the
MPS parts for which the User is responsible. The amount of late revenue is plotted in the period
that the demand is due.
3) Constraints Impacting MPS: A summary of the quantity of MPS Parts that will be late due to
insufficient load available at a constraint. Only reports constraints that are causing lateness
within the MPS horizon. Only renders data for the MPS parts for which the User is responsible.
The constraint causing the highest quantity of MPS parts to be late will be represented on the
top row of the chart. Constraints Impacting MPS applies only to constraints that are constrained.
Overloaded constraints defined as "Load only" will impact actual execution but will not appear
here; CRP capacity overloads are also not reflected.
4) Component Shortages Impacting MPS: A summary of the quantity of MPS Parts that will be late
due to late components. Only reports components that are causing lateness within the MPS
horizon. Only renders data for the MPS parts for which the User is responsible. The component
causing the highest quantity of MPS parts to be late will be represented on the top row of the
chart.
5) Component Shortages by MPS Start Date: A summary of the days to MPS start date that an
MPS part will be late, due to late components. Only reports components that are causing
lateness within the MPS horizon. Only renders data for the MPS parts for which the User is
responsible. The component causing lateness for the earliest scheduled build of the MPS part
will be represented on the top row of the chart.
Performance Metrics:
1) Master Scheduling Summary: A summary of the master production schedule for MPS parts, and
visual comparison of supply and demand within the current planning horizon. It is a graphical
representation of the Master Scheduling workbook at the family level in Summary form.
Stacked Bar represents the MPS Supply & Planned Orders. Lines represent the Demand and
Aggregate Supply Plan. Only reports Supplies within the MPS horizon.
2) Misaligned MPS Parts to Demand: A summary of the quantity of MPS Parts whose availability is
later than the required date. Reports MPS parts where the Supply DueDate is late to its Demand
Date within a defined tolerance.
3) MPS Attainment %: A summary of the count of on-time MPS orders as a function of the total
count of MPS orders, expressed as a percentage.
4) Number of Constraints Overloads/Underloads: A summary of the count of constraints which
are either overloaded or underloaded, based on the load from the master production schedule.
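The attainment metric above is a straightforward ratio; a minimal sketch in Python (the record fields are hypothetical stand-ins, not RapidResponse names):

```python
def mps_attainment_pct(orders):
    """Percentage of MPS orders whose available date is on or before the due date."""
    if not orders:
        return 0.0
    on_time = sum(1 for o in orders if o["available"] <= o["due"])
    return 100.0 * on_time / len(orders)

orders = [
    {"available": "2016-02-01", "due": "2016-02-01"},  # on time
    {"available": "2016-02-10", "due": "2016-02-05"},  # late
    {"available": "2016-02-03", "due": "2016-02-04"},  # on time
]
print(round(mps_attainment_pct(orders), 1))  # 66.7
```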
Role: Master Scheduler
Responsibility: Establishes the anticipated end item build schedule by disaggregating the supply
2.1.2 Outputs
The outputs for the MPS process include:
Approved Master Production Schedule (firm planned order changes)
Closed loop transactions back to host system, when applicable:
o Part Planning Properties
o Planning BOM changes
o Constraint Adjustments
o Inventory Transfers
o MPS quantity and date changes
o MPS New scheduled Receipts
o Purchase Order / Work Order quantity and date changes on sub‐assemblies or components
Note: The ‘Project Baseline’, ‘Supplier Collaboration Automation’ and ‘Supplier Collaboration’ scenarios are only installed when those
Applications are purchased.
When working with multi‐server environments containing a Data Integration server (or a server where separate business processes take place),
it is recommended that a minimal scenario tree be maintained and utilized on the Data Integration Server.
The following scenarios would be required:
Data from the ERP system will be regularly updated into Enterprise Data. Any automated data modification occurs in Approved Actions,
and data is updated into Baseline, where any user data modification occurs. Data update into the Production server (post data
integration server) happens from the Baseline scenario. Depending on the process(es) moved to the Data Integration server, additional
scenarios will likely be required as children of Baseline to accommodate them. If so, those scenarios will need to be committed to
Baseline prior to data update so all pertinent data moves across servers.
Master Scheduler
Revenue
Ending Inventory Value
Margin %
On Time Delivery to Request
Key Constraint Utilization
3.1.2 Scorecard
The following role has a Scorecard available to them within the Master Production Scheduling
application.
Master Scheduler
On Time to Commit
Revenue
Margin %
Key Constraint Utilization
Ending Inventory
3.2 Other Resources: Processes, Task Flows, Workbooks
3.2.1 Task Flows
The following task flows are provided in support of this application:
3.2.2 Workbooks
In addition to those workbooks referenced in the Business Processes section of this document, the
following workbooks are provided in support of this application:
Manual Closed Loop MPS – This workbook is used to report changes that were made to various
planning parameters during the Master Production Scheduling process that might need to be
manually adjusted in the host ERP. The workbook is designed to report changes to:
o Planning BOM: Effective In/Out, Option Ratio, Quantity Per
o Part Properties: Number of Days Supply
o Time‐Phased Safety Stock: Date, Quantity, Days
o Sourcing Properties: Lot Size Quantity Min/Max/Mult, Fixed Lead Time, Scrap, Effective
In/Out
o Order Priority
o Inventory Transfer: Quantity, Cost, Shipping Date, Transit Time
o Pre‐Plan Limit
Closed Loop MPS – This workbook is used in the automated closed‐loop process, and identifies
records that have been modified through the Master Production Scheduling process and will be
closed‐looped via automated transactions. It is the source workbook for those automated
transactions. The workbook is designed to report changes to:
o Planned Order Create
o Transfer Order Change, Cancel
o Transfer Requisition Create, Change, Cancel
o Work Order Change, Cancel
o Purchase Order Change, Cancel
o Purchase Requisition Create, Change, Cancel
o Safety Stock Qty Change
o Planning Time Fence Change
Data Reconciliation MPS – This workbook is used to check for duplicate scheduled receipt and
inventory transfer records that exist or may have been introduced. It is used in the closed‐loop
reconciliation process.
Data Integrity – Material – While there isn’t a Data Integrity workbook designed specifically for
the MPS Application, the Data Integrity – Material workbook can be used.
Write History Records – This workbook is used in the process to write the MPS Plan records to
historical tables. It uses the Supply Plan Category value = ‘MPSSupplyPlan’ for MPS orders.
Supply Waterfalls – This workbook is used to review historical actual supply and forecast over
time. Macro configuration is required to include the MPS plan. See the Macros section.
Prior to release 2014.1, RapidResponse supported approximately 1B input records across all input tables
in all scenarios. As of release 2014.1, RapidResponse supports approximately 4B input records across all
input tables in all scenarios. A single table should not come anywhere near this number on its own.
For example, forecasting will create a series of detailed ForecastDetail records for eligible PartCustomer
records even when working from a very high-level summary perspective. Forecasting at a high level does
not mean that detailed records are not created, and the scope of this is easily missed. If you are
forecasting for 200K parts with an average of 200 customers each, then you are forecasting 40M
PartCustomers. Further, if your DisaggregationCalendar (pre-2014.4, $DP_BaseCalendar) is set to
weekly and you are forecasting for 2 years, this results in 40M * 104, or more than 4 billion,
ForecastDetail records. This cannot be supported in RapidResponse, and even if it could, the
performance would be unusable.
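The arithmetic above can be checked directly:

```python
# Back-of-envelope check of ForecastDetail record volume (numbers from the text).
parts = 200_000
avg_customers_per_part = 200
weeks = 104  # 2 years on a weekly DisaggregationCalendar

part_customers = parts * avg_customers_per_part   # 40,000,000 PartCustomers
forecast_details = part_customers * weeks         # 4,160,000,000 ForecastDetail records
print(f"{part_customers:,} PartCustomers -> {forecast_details:,} ForecastDetail records")
# Exceeds the ~4B input-record capacity of RapidResponse 2014.1+.
```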
In addition to the input record limits mentioned above, there is a 4 billion Record ID limit per table
across all scenarios that can impact our customers on Historical and ForecastDetail tables. An input
record is assigned a record ID at creation that is simply a counter for each table. The Record ID values
are not reused and (except in one very special case) they are not recovered by a server restart or data
update operation. So, if you were to import or create 600M input records on a table (such as ForecastDetail
or, more often, HistoricalDemandSeriesDetail or HistoricalDemandActuals), and then delete them and
bring in another set on the next cycle, you would never hold 4 billion records at any given time.
However, you would still be consuming record IDs 600M at a time, and on your 7th update cycle you
would crash the RapidResponse server, having exhausted all available Record IDs.
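Because record IDs are consumed monotonically and never reused, the number of safe update cycles follows directly from the per-cycle volume:

```python
import math

def cycles_until_id_exhaustion(ids_per_cycle, id_limit=4_000_000_000):
    """Update cycles before the per-table Record ID counter is exhausted.
    IDs are never reused, so each cycle consumes ids_per_cycle regardless
    of any deletes performed in between."""
    return math.ceil(id_limit / ids_per_cycle)

print(cycles_until_id_exhaustion(600_000_000))  # 7
```

At 600M records per cycle, the 7th cycle pushes the cumulative count (4.2B) past the 4B limit, matching the scenario described above.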
The following profile variables are used in this application to support the rendering of other data
elements in the Master Scheduler Dashboard, and are defined in the RapidResponse Application
Settings workbook > Application Variables worksheet.
The following profile variables are used in this application to support the rendering of the Overloads and
Underloads and MPS Exceptions on the Master Scheduler Dashboard, and display counts of overloaded
or underloaded constraints exceeding either a Warning level or a Critical level. They are defined in the
Macros and Profile Variables workbook > Profile Variables worksheet. They are applicable if the
Capacity Planning (Constraint) application is being utilized.
The following profile variables are miscellaneous “Other” configuration variables related to Master
Production Scheduling resources. Each needs to be configured.
4.1.2.2 Macros
The following macros are indicated for use in the Master Production Scheduling application, and can be
configured in the Macros and Profile Variables workbook:
The following automation resources are provided with the Master Production Scheduling application:
Write MPS Plan to History (Scheduled Task) – Runs the “Write Historical MPS Plan” contained within the
MPS Write History Records workbook, which takes the records from the MPS Plan Records worksheet
(where Category = $MPS_Plan) and writes them into history. Recommended schedule: at minimum, at
the end of the MPS planning process, after the approved MPS plan has been finalized and published,
prior to beginning a new planning cycle.
4.1.2.4 Alerts
Seven standard Alerts are included for the Master Production Scheduling application. They support
Manual Closed‐Loop operations, and provide an ability to alert the user(s) to changes that were made to
various properties and parameters as part of the MPS process that might need to be changed in the
source ERP system. If applicable, these will need to be configured for scheduling and notification.
They are:
There is a standard filter that is shipped with RapidResponse called “NPI Items”. It is described as
follows:
This filter can be used to focus on items that are newly introduced, or are going to be introduced.
The idea of this filter is to list Part (or PartCustomer) records that have not yet made it into production.
The filter expression is designed to look for parts where the sourcing data has not yet come into effect.
This may not be an appropriate test. For example, if the process for defining these new parts is to create
PartSource records with effectivity that runs from past to future, then the filter expression may need
some other means to identify these parts.
Careful thought needs to be put into what the expression will look like if it needs to be redefined to
meet the customer's need. Ideally, NPI parts should be identified by a discrete code value in a field (no
wildcards), preferably in a referenced table.
4.1.2.6 Hierarchies
Hierarchies are used extensively throughout the S&OP, Demand Planning, Aggregate Supply Planning,
Capacity Planning and Master Production Scheduling applications to allow the entry of data values at a
specific intersection, and allow the forecast detail records to then disaggregate to the lowest level,
PartCustomer. As such, any hierarchies defined must be compatible with (have a direct path to) the
PartCustomer table.
Site
Constraint Group
Constraint Name
Region Group
Region
Constraint
Customer Group
Customer Name
Region Group
Region
Customer
Region Group
Region
Site
Supplier Group
Supplier
Part
Region Group
Region
Supplier
If additional hierarchies are requested by the customer, they can be defined as required; however, if the
intended use is within the context of S&OP or related processes, there needs to be a direct link to the
PartCustomer table to support disaggregation logic.
Warning: do not use concatenated values. If you do, some disaggregation sheets will not work properly
if the sheet displays the “level below” the selected hierarchy. Also, the system cannot use the
referenced table to enhance query performance. If you need a user‐friendly value, add a custom field
and do the concatenation in the data load or with a data change.
4.1.2.7 Collaboration and Responsibility
Collaboration is achieved through the creation and sharing of scenarios with all responsible parties to
review and determine if the necessary action is acceptable. Therefore, in order to collaborate, one first
needs to be able to determine who is responsible for the parts or constraints that require review. To
achieve this, a user must be assigned to a particular part or constraint to be the Responsible individual.
A simplified responsibility model was introduced in 2014.4. A new type of resource is available called
Responsibility Definitions. This resource makes it possible for non‐administrators to assign responsibility
for data. Users or groups can be granted permission to create and share responsibility definitions, the
same as other RapidResponse resource types.
Note that to use this new type of resource, users must be able to view the responsibility scenario. It is
therefore recommended that a responsibility scenario is selected as one that is kept current and can be
shared with all users involved in creating responsibility definitions or assigning responsibility for data.
The responsibility model allows any workbook author to include responsibility information in workbook
columns, and format it so that users can easily contact the person responsible for data. This way, rather
than relying on an administrator to build a responsibility workbook to define which data changes are
important, users can decide on a case‐by case basis what is important.
It must be a scenario that everyone who needs to use responsibility definitions can be given
View access to.
It must be a scenario that is kept reasonably current, so that when a new record is added,
responsibility for that data can be assigned.
Choose a permanent scenario to reduce the risk of the responsibility scenario being accidentally
deleted.
System administrators can change the responsibility scenario in the Administration pane > System
Settings > Global Settings, listed in the Responsibility scenario under the Scenarios category:
4.1.2.7.2 Responsibility Definitions
Responsibility Definitions that are accessible to a user are used to view responsibility assignments, or to
assign responsibility data to either themselves or other users.
Multiple Responsibility tables can be identified if multiple variables exist on which to set Responsibility.
For example, the Sales Responsibility Definition, based on the PartCustomer table, is configured to be
associated with four responsibility tables:
The standard data model contains a UserId field on the PlannerCode and BuyerCode tables that is a
string data type. The Part table references both of these tables, while the Constraint table references
the PlannerCode table. The field on these two tables allows you to assign a user to a specific Planner
code or Buyer code.
Responsibility assignments are listed in the Assignments table, which can be seen when the
Responsibility definition is opened.
The Resources workbook now contains a Responsibilities worksheet where all responsibility definitions
are listed.
A Responsibility Roles worksheet has also been added to the RapidResponse Administration workbook.
This worksheet is used to map Responsibility definitions to variables that are used in predefined
resources.
4.1.2.7.5 Collaboration
Once responsibility is assigned, the name of the responsible person can be rendered in a worksheet
using the RESPONSIBILITY function and by formatting the column to ‘Display as User’.
When the user is displayed in a worksheet, you can hover over the name to display a contact card with
pertinent information about the user, such as email address and phone number. You can also ‘Send a
Message’ through the displayed contact card. The message is sent to the user’s message center, and can
additionally be delivered to the user’s email address if the user has enabled this in their user options
(assuming the system administrator has configured the installation to send email messages).
In order to initiate collaboration with the users, one would create a scenario and share it with the
appropriate responsible users and include a message about what data they require assistance/feedback
for. To initiate this collaboration with multiple users directly from the worksheet, the user does a multi‐
select on the names in the Responsible column, right clicks and selects “Share Scenario…”.
On install, responsibility is exposed and collaboration through user name is present in a number of
workbooks. In order to view resources where responsibility is exposed and collaboration through user
name is present, use the Tools > Search Resources function, and search for “Responsibility” in
expressions.
When identifying collaborators, take care to ensure that multiple collaborators are not working to
resolve the same part or component issue, as their changes could conflict with each other.
The selected collaborators will receive notification that a collaboration scenario has been shared with
them, requiring them to review the data and possibly make changes to it in order to support the
request. Each collaborator should create a child scenario of the parent collaboration scenario in order
to simulate any required actions. If the issue is resolvable, the collaborator should commit the private
scenario with data changes and select to accept the collaboration request, while including any message
required to the requestor.
If the issue cannot be resolved, the collaborator would not commit their private scenario, but would
instead select to reject the collaboration request, including any message required to the requestor. In
this manner, the requestor can keep track of the collaboration progress and which changes can be
made. The private scenario can be deleted once it is no longer required.
If the collaboration was successful and the required changes can be implemented, the collaboration
owner will commit the collaboration scenario into its parent. This should only be done when it is
confirmed that the required actions will be implemented, as the changes may be sent via closed‐loop to
the transactional ERP system.
If the collaboration was unsuccessful and the required changes cannot be implemented, the
collaboration owner will change the status of the collaboration scenario to the appropriate status. Once
the unsuccessful collaboration scenario is no longer required, it should be deleted.
The collaboration scenario owner can decide how to be notified of changes and updates throughout the
collaboration process. This can be done when sharing the scenario by selecting the appropriate check‐
boxes in the “Notify me when” section on the Notify tab of the Share Scenario dialog. It can also be
changed throughout the collaboration process, if required, by accessing the Share Dialog at any time.
The following describes the functionality of each selection and the bulleted points below identify where
the messages appear:
Anyone responds – responses will be logged in the Scenario Properties Activity Log.
Everyone has responded ‐ responses will be logged in the Scenario Properties Activity Log.
Anyone modifies data in this scenario—sends a message to your Message Center when anyone
modifies data in the scenario.
At any time, the collaborators can send a note to the scenario owner indicating progress or issues
arising. This can be done by accessing the collaboration scenario properties and Responding via the
toolbar icon.
An Activity report can also be distributed to highlight scenario changes and activity, by accessing the
collaboration scenario properties and clicking on the ‘Distribute Activity Report’ icon.
4.1.2.8 Miscellaneous
Below is a list of other configuration points to be aware of for this application:
Master Scheduling Workbook, MPS Attainment Treemap, MPS Metrics, MPS Widgets Workbook and
MPS Widget Details
Specifically for the Master Production Scheduling application, the measure of On Time Delivery to
Commit was deemed likely more relevant to the Master Scheduler. As such, the OTD Commit metric
was included in the Master Scheduler scorecard.
Changes to either the Corporate Metrics tab of the Master Scheduler dashboard or the Master
Scheduler scorecard will be required if the customer wants equivalent measures across both resources.
For example, users in the Master Schedulers role would have the Master Production Scheduling task
flow selected to open on sign in as shown below:
In support of the Master Production Scheduling Application, Users in the following groups should have
the task flow noted below set to open on sign in:
4.1.3.2 PartSourceType
This control table defines the characteristics of each potential supply source (for example, Make, Buy,
and Transfer). For example, it controls the effectivity of sourcing features and determines the type and
status of planned orders when supplies are required using this source. This table is referenced from the
PartSource table.
CostRule: Specifies how the enterprise system calculates the effective cost of a part. Valid
values are:
o UnitCost—sets the PartSource.EffUnitCost to a fixed value equal to the value imported
into the PartSource.UnitCost field or Part.StdUnitCost if UnitCost is 0.
o MaterialLaborOverheadCosts—calculates the PartSource.EffUnitCost by adding the
material, labor, and overhead costs.
Note that if the PartSourceTimePhasedAttributes table has an effective record for the
part source, its values are used instead of those defined on the PartSource record.
Note also that this setting impacts the calculation of the NewPurchaseCost fields
found on many tables in the data model.
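As a rough sketch of the two CostRule behaviors (simplified: the function and argument names are illustrative, and the PartSourceTimePhasedAttributes override is ignored):

```python
def effective_unit_cost(cost_rule, unit_cost=0.0, std_unit_cost=0.0,
                        material=0.0, labor=0.0, overhead=0.0):
    """Sketch of PartSource.EffUnitCost under the two CostRule values."""
    if cost_rule == "UnitCost":
        # Falls back to Part.StdUnitCost when PartSource.UnitCost is 0
        return unit_cost if unit_cost != 0 else std_unit_cost
    if cost_rule == "MaterialLaborOverheadCosts":
        return material + labor + overhead
    raise ValueError(f"unknown CostRule: {cost_rule}")

print(effective_unit_cost("UnitCost", unit_cost=0, std_unit_cost=12.5))  # 12.5
print(effective_unit_cost("MaterialLaborOverheadCosts",
                          material=5, labor=3, overhead=2))              # 10
```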
EffectivityRule: Specifies the effectivity rule that is used by the enterprise system to interpret
the dates on PartSource records (which are time phased). Valid values are:
o InclusiveExclusive *recommended
o Always
o Inclusive
o Exclusive
o Never
PlannedOrderSupplyStatus: Specifies how to control Order.Line level configuration settings
(such as the generation of dependent requirements) for planned orders. References the
SupplyStatus table.
PlannedOrderSupplyType: Specifies, for planned orders, how to control order level
configuration settings (such as rescheduling logic and whether the order is a Make, Buy or
Transfer). References the SupplyType table.
PlannedOrderToSRSupplyType: Specifies the supply type to use when converting planned
orders to scheduled receipts, and used when assigning a part source to a scheduled receipt.
o It is likely that the Type values will be different between PlannedOrderSupplyType and
PlannedOrderToSRSupplyType in order to distinguish between a calculated planned
order versus a planned order that was converted to a Scheduled Receipt
PlannedOrderToSRSupplyStatus: Specifies the supply status to use when converting planned
orders to scheduled receipts, and used when assigning a part source to a scheduled receipt.
TransitRules: Specifies how the enterprise system calculates lead time. Lead time values are
considered in calculating a “start date” (PlannedOrder.StartDate or the equivalent ScheduledReceipt
field). Valid values are:
o Cumulative: Essentially, use this one if the Dock‐To‐Stock, TransitTime and PreShipLT
time is not included in the Fixed‐Lead‐Time (rough rule of thumb).
o Inclusive: Essentially, use this one if the Dock‐To‐Stock, TransitTime and PreShipLT time
is included in the Fixed‐Lead‐Time (rough rule of thumb).
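The two TransitRules can be sketched as follows (a simplification that treats all lead-time elements as plain calendar days; the function and argument names are illustrative, not engine fields):

```python
from datetime import date, timedelta

def start_date(due, fixed_lt, dock_to_stock=0, transit=0, pre_ship=0,
               transit_rule="Cumulative"):
    """Back-schedule a start date from a due date under the two TransitRules."""
    if transit_rule == "Cumulative":
        # Dock-to-stock, transit, and pre-ship lead time are NOT inside
        # the fixed lead time, so all elements are added together.
        total = fixed_lt + dock_to_stock + transit + pre_ship
    elif transit_rule == "Inclusive":
        # The extra elements are already included in the fixed lead time.
        total = fixed_lt
    else:
        raise ValueError(transit_rule)
    return due - timedelta(days=total)

due = date(2016, 3, 28)
print(start_date(due, fixed_lt=10, dock_to_stock=2, transit=3))  # 2016-03-13
print(start_date(due, fixed_lt=10, transit_rule="Inclusive"))    # 2016-03-18
```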
4.1.3.3 BOMType
This control table specifies the different types of bill of material records and the processing rules for
them. The BOMType.Value is a text string assigned to each parent‐child record in the BillOfMaterial
table. Each BOMType.Value defined represents a set of configuration rules that control the behavior for
that parent‐child relationship.
Since Master Production Scheduling is done at the end item level, this is relevant for planning BOMS.
Fields of particular interest to Master Production Scheduling may include:
DateRule: Specifies which date to test for in determining bill of material date effectivity. Valid
Values are:
o AnchorDate
o BlowThroughCalcDueDate
Note that unless the assembly is a phantom, this option is identical
to CalcDueDate.
o CalcDueDate
o CalcDockDate
o CalcStartDate
o Ignore (Default)
EffectivityRule: Specifies for each BOMType how ranges of units and dates are interpreted
when determining effectivity. This field applies to all effectivity rules. Valid values are:
o InclusiveExclusive *recommended and default
o Never
o Always
o Exclusive
o Inclusive
MPSConfigDemandSource: Specifies which components on the BillOfMaterial table are
considered when looking for component actual sales to consume assembly forecast within a
PlanningBOM relationship. Four conditions must be satisfied for this to occur:
o The assembly part has a PartType.ProcessingRule of ‘MPSConfig’
o The BillOfMaterial record defining the relationship between the Assembly and
components must have a BOMType.MPSConfigDemandSource set to ‘Y’
o The component has independent demand records where the
DemandType.ProcessingRule is ‘SalesActual’
o The Assembly has independent demand records where the DemandType.ProcessingRule
is ‘SalesForecast’ or ‘ProductionForecast’
o Note that this setting is ignored on all BOMType records that define a co‐product
or by‐product configuration.
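The four conditions above can be expressed as a single predicate; a sketch using plain dictionaries in place of the real table records (all field names here are illustrative):

```python
def consumes_assembly_forecast(assembly, component, bom_type):
    """True when component actual sales may consume assembly forecast
    within a PlanningBOM relationship (the four conditions above)."""
    return (
        assembly["processing_rule"] == "MPSConfig"            # PartType.ProcessingRule
        and bom_type["mps_config_demand_source"] == "Y"       # BOMType setting
        and "SalesActual" in component["demand_rules"]        # component actual demand
        and any(r in ("SalesForecast", "ProductionForecast")
                for r in assembly["demand_rules"])            # assembly forecast demand
    )

assembly = {"processing_rule": "MPSConfig", "demand_rules": ["SalesForecast"]}
component = {"demand_rules": ["SalesActual"]}
bom_type = {"mps_config_demand_source": "Y"}
print(consumes_assembly_forecast(assembly, component, bom_type))  # True
```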
RatioRule: Specifies how the BillOfMaterial.OptionRatio is used. Valid values are:
o Ignore
o Fraction (default)
o Percent
RoundingRule: Specifies for each BOMType the rules for rounding numbers in calculations.
Valid values are:
o None (default)
o Up
o Down
o Nearest
o QtyPerOrder
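A sketch of the rounding options follows. The QtyPerOrder behavior shown here (rounding up to a given multiple) is an assumption for illustration, not the documented engine rule:

```python
import math

def round_component_qty(qty, rule, multiple=1):
    """Apply a RoundingRule to a calculated component quantity."""
    if rule == "None":
        return qty
    if rule == "Up":
        return math.ceil(qty)
    if rule == "Down":
        return math.floor(qty)
    if rule == "Nearest":
        return round(qty)
    if rule == "QtyPerOrder":
        # Assumed semantics: round up to the nearest order multiple.
        return math.ceil(qty / multiple) * multiple
    raise ValueError(rule)

print(round_component_qty(7.2, "Up"))              # 8
print(round_component_qty(7.2, "QtyPerOrder", 5))  # 10
```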
4.1.3.4 MPSConfiguration
*New to 2015.3* Control table that contains settings that support the Master Production Scheduling
application. It is expected that this table be configured for the MPS application, and will contain exactly
one record, which can be modified on a scenario by scenario basis. The settings in this table act as
global defaults, some of which can be overridden at the Part level using corresponding fields in the
PartSolution table.
The settings are used in multiple resources, including Master Scheduler dashboard widgets and Master
Scheduling workbook, but do not impact any analytic calculations.
AutoFirmWindow: Defines the auto‐firm window, which is the period within the MPS planning
horizon where planned orders are converted into firm MPS orders. The value identified in this
field is added from the start of the latest SOPAnalyticsConfiguration.CycleCalendar interval that
is contained in the MPS horizon.
o Should be expressed in CycleCalendar units
o This value is used as a global default for all parts where the
PartSolution.MPSAutoFirmWindow is set to ‐1.
o If a non‐negative value is provided in the PartSolution.MPSAutoFirmWindow, it is used
for the part and overrides the value provided in this table.
o Used in the Master Scheduling workbook command “Auto Firm MPS”
Calendar: Reference to the Calendar whose intervals define the bucket size in the MPS
application (e.g., Weekly)
FrozenWindow: Indicates the length of the MPS frozen window in Calendar intervals. The MPS
frozen window is the period of time in which it is strongly recommended that changes not be
made to MPS orders, usually for lead time or cost reasons.
o The value provided in this field is used as a global default for all parts where the
PartSolution.MPSFrozenWindow is set to ‐1.
o If a non‐negative value is provided in the PartSolution.MPSFrozenWindow, it is used for
the part and overrides the value provided in this table.
o Used in several MPS resources, including the Master Scheduling, MPS Widgets, and
Planning Parameters workbooks.
Horizon: Indicates the length of the MPS planning horizon in Calendar intervals.
o The value indicated here defines the intervals after the CalcStartDate, which make up
the planning horizon.
o From a business process perspective, it is assumed that all MPS orders within this
horizon are fully under the control of the master scheduler.
o The value provided in this field is used as a global default for all parts where the
PartSolution.MPSHorizon is set to ‐1.
o If a non‐negative value is provided in the PartSolution.MPSHorizon, it is used for the
part instead of the value provided in this table.
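The fallback behavior shared by AutoFirmWindow, FrozenWindow, and Horizon (a part‐level value of ‐1 defers to the global default) can be sketched as follows; the function name is an illustrative assumption:

```python
def resolve_mps_setting(part_value: int, global_default: int) -> int:
    """Return the effective value for settings such as
    PartSolution.MPSHorizon: a negative part-level value (-1) defers to
    the MPSConfiguration global default; a non-negative value overrides it.
    """
    return part_value if part_value >= 0 else global_default
```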
RunDate: Determines the starting point of the MPS planning horizon. This field is a
reference to the calendar used to set the current system date.
Tolerance: Specifies whether tolerance settings contained in the ToleranceInterval and
ToleranceCalendar fields are used by MPS resources in determining whether a supply is
considered to be late. Valid values are:
o N (default): Tolerance settings are not used, and a supply is considered late if its
available date is after the due date of the demand it satisfies
o Y: Tolerance settings are used, and a supply is considered late only if its available date is
more than the allowed number of ToleranceCalendar intervals after its demand due date.
ToleranceCalendar: A reference to the calendar that determines how the ToleranceInterval
value is interpreted
ToleranceInterval: Specifies the maximum limit on lateness for MPS orders.
o Used in MPS resources to report whether the master schedule is being achieved: if an
MPS order is more than this number of ToleranceCalendar intervals late, it is
considered outside of tolerance and reported as late.
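The lateness test described by Tolerance, ToleranceCalendar, and ToleranceInterval might look like the following sketch, which assumes daily ToleranceCalendar intervals for simplicity (the function and parameter names are illustrative):

```python
from datetime import date, timedelta

def supply_is_late(available: date, due: date,
                   tolerance: str = "N", tolerance_intervals: int = 0) -> bool:
    """Sketch of the MPS lateness check. With Tolerance='N', a supply is
    late if its available date is after the demand due date; with 'Y',
    only if it is more than tolerance_intervals (here, days) after it."""
    if tolerance == "Y":
        return available > due + timedelta(days=tolerance_intervals)
    return available > due
```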
StartOffset: Specifies the number of Calendar intervals after the current system date on which
to begin the MPS planning horizon
4.1.3.5 SOPAnalyticsConfiguration
Control table that contains settings that support S&OP, demand and supply planning. It is primarily used
to set up parameters for disaggregation and statistical forecasting; however, some of its fields are also
used to define calendar intervals for the MPS application.
4.1.3.6 SupplyType
Defines processing rules for handling a particular supply order and its associated line items; for
example, whether allocations can be generated for the line items. Also used by the PartSourceType
control table to define characteristics for processing planned orders using this part source.
ProcessingRule: Specifies a processing rule for each scheduled receipt. The processing rule
controls, at the order number or header level, both the reschedule recommendations as well as
whether the supply is ignored. Valid values are:
o ExplodeOnly
o Ignore
o InTransit
o InTransitExact
o NonReschedulable
o RecommendOnly
o Reschedulable (Default)
Source: Specifies a source for each SupplyType. The Source value works in conjunction with the
BillOfMaterials to control the generation of dependent requirements (allocations) on
components. It also works in conjunction with the PartSource, in the case of ‘Transfer’, to
handle inter‐site requirements. Valid values are:
o Make
o Buy
o Transfer (Default)
4.1.3.7 SupplyStatus
Defines processing rules for handling a particular line item (scheduled receipt); for example, whether a
line item is reschedulable. Used by the PartSourceType control table to define characteristics for
processing planned orders using this part source. Referenced by the ScheduledReceipt table.
4.1.3.8 AvailableRule
This control table determines how available dates for a part’s supplies and demands are calculated.
Using this table, the default behavior of determining available dates with all capable‐to‐promise rules
can be bypassed for parts considered unimportant in the calculation of availability. The values in this
table apply only to parts whose Part.AvailableRule reference field is populated. Otherwise, the
part’s availability rule comes from the PartType.AvailabilityRule.
ProcessingRule: Specifies how the availability of a part’s supplies and demands is calculated.
Valid values are:
o FromType (default)
o Ignore
o Normal
4.1.3.9 OrderPolicy
This control table specifies the rules for determining the size and dates for placing orders.
Fields of particular interest in Master Production Scheduling may include:
4.1.3.10 OrderPriority
The OrderPriority table can be important in prioritizing the order in which demands are satisfied
or the order in which supplies are consumed.
If the customer plans with order prioritization, it is important that this table be verified and configured
appropriately based on the business rules. Fields of interest include:
AllocationRule
MaintainCommitments
PeriodEffective
PlanningPriority
TwoDatePlanningRule and TwoDatePriorityLimit (New to 2015.3)
4.1.3.11 OnHandType
This control table specifies whether stock or inventory is considered in netting, and the rules for
determining the availability of OnHand records in netting.
ProcessingRule: Specifies whether the stock is used in netting. Valid values are:
o Nettable — considered in inventory netting calculations.
o NonNettable — not considered in inventory netting calculations.
4.1.3.12 SourceRule
This control table determines how PartSource values are interpreted when a part has more than one
active part source eligible to satisfy a given requirement. It can be used to determine how the target
percentage splits between suppliers are interpreted, and thus how much of the planned requirements
is sourced from each.
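For example, a target‐percentage interpretation might split a planned requirement across sources as in this sketch (the function name, table shape, and normalization behavior are assumptions for illustration):

```python
def split_by_target(qty: float, targets: dict) -> dict:
    """Split a planned requirement quantity across active part sources in
    proportion to their target percentages (normalized so the split is
    well-defined even if the targets don't sum to exactly 100)."""
    total = sum(targets.values())
    return {source: qty * pct / total for source, pct in targets.items()}
```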
Assumption Statuses:
Closed
Open
4.1.3.14 HistoricalDemandCategory
The categories of demand must be defined here. There is a standard set of categories defined with a
variety of types, which would be defined in HistoricalDemandCategoryType.
To determine Supply Plan records and write them to history, the Supply Plan
category must be defined, where Value = $DP_SupplyPlanCategory as outlined in the Profile Variables
section. It would typically have a ProcessingRule set to “None”.
Supply Planning targets (eg. SupplyPlanUnitTarget) are set in HistoricalDemandCategory as well, with
ProcessingRule set to “Target”.
Different types have differing AggregationRule, ProcessingRule and UnitType settings. It is the
ProcessingRule that typically defines the purpose of the category. It can be set to:
“None”: Use this to ignore standard Target categories that you are not using. This will keep
them out of workbooks and drop‐down lists.
“Actual”: This is used for storing historical demand actuals. Values in this category can be used
in calculating the statistical forecast.
“Target”: This is used for defining target metric values against which the S&OP annual plan is
measured.
“Forecast”: This is used for storing streams of forecast demands or forecast adjustments. Values
in this category can be used in calculating the consensus forecast.
“ForecastOverride”: This is used for specifying values to override the calculated consensus
forecast.
“ReBalancingForecastOverride”: This is used for specifying values to override the calculated
consensus forecast based on demand and supply balancing.
All Part Types loaded for this must have a ProcessingRule of either MPS or MPSConfig in order
for the consensus forecast to be generated and used in supply planning.
AverageSellingPrice: In order to use any of the currency disaggregation or reporting, it is
important that this field be populated.
SourceRule: If there are multiple active part sources defined for a part, the SourceRule
reference defines rules outlining how the source is chosen when satisfying a planned
requirement
AvailableRule: This reference overrides the setting that comes from the
PartType.AvailableRule, and allows control at the part level
DemandTimeFence: Indicates the number of units of time after the
Part.PlanningCalendar.RunDate.FirstDate used to establish a fence for controlling the use of
unconsumed forecast demands.
o Within the demand time fence, any unconsumed forecast is ignored and not planned for
by Netting; that is, it does not contribute to the total demand picture
o Outside the demand time fence, any unconsumed forecast does contribute to the total
demand and will be planned for by Netting.
o The demand time fence applies only to SalesForecast, not ProductionForecast.
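The fence’s effect on netting can be sketched as follows, assuming daily time units and simple dict records (both are illustrative assumptions):

```python
from datetime import date, timedelta

def forecasts_planned_by_netting(unconsumed, fence_start: date,
                                 fence_units: int):
    """Return only the unconsumed forecast records outside the demand
    time fence; records inside it are ignored and do not contribute to
    total demand. Daily units are an illustrative assumption."""
    fence_end = fence_start + timedelta(days=fence_units)
    return [f for f in unconsumed if f["date"] >= fence_end]
```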
IncrementalRule: Controls whether incremental availability calculations can be performed for
orders for this part (Referenced table: IncrementalRule)
PartClass: *New field in 2015.3* Indicates the name of the part class, if any, associated with
this part. These classes are used within the MPS application to classify parts and assign
planning, safety stock and replenishment strategies. Examples of values include Finished Goods,
Work In Progress, Raw Materials, and Spare Parts.
o It references the nullable Class table (new to 2015.3) and is in the Solutions namespace
o Replaces the PartSolution.PartClass field in previous versions, in order to facilitate
configuration of classes of parts. This option still exists for backward compatibility.
It is recommended that this table be set to not allow data updates to delete records (found in the
table properties of the data model) in order to avoid unintentional cascading deletes in the
PartCustomer and Historical tables.
4.1.4.2 PartSolution
This table stores application‐specific Part information, and is used in resources specific to Master
Production Scheduling (and Order Fulfillment). Note that the values in this table have no impact on
analytic results. New fields were added in 2015.3 to support the MPS application.
Fields of interest in the Master Production Scheduling application include:
4.1.4.3 Class
New nullable table introduced in the Solutions namespace in 2015.3SU6. It is used to define standard
part classes (Finished Goods, Work in Progress, Raw Materials, Spare Parts, etc.) and custom classes
(Consumables). Its intent is to provide functionality similar to the Solutions::PartSolution.PartClass field,
where categories of parts can be identified and used in reporting. It provides more flexibility in
configuration, as it is not an enum field. It is a reference from the Mfg::Part.PartClass field.
Value: specifies the category of parts; it is expected that these values will vary by business, but
will likely consist of FinishedGoods, RawMaterials, WorkInProgress, SpareParts, Consumables,
etc.
The Solutions::PartSolution.PartClass Enum field still exists for backward compatibility, but it is
expected that part Class values be configured in this table. Resources within the Master Production
Scheduling application have been configured to display the values defined in this table.
It is important that the Class.Value categories be configured during integration; otherwise, the
resources referencing this table won’t be populated and widgets won’t render correctly. The main
reference is found in the Planning Parameters workbook.
4.1.4.4 PartSource
All parts should have valid Part sources defined, with supply planning fields populated. Some fields of
interest to Master Production Scheduling include:
MaximumQty, MinimumQty, MultipleQty: needed to define order quantities and lot sizing for
allocation and availability calculations
PlanningTimeFence: required to determine the timing of generation of new planned orders
OrderPolicy: reference to OrderPolicy table required to specify rules for determining size and
dates for placing orders
Priority: required to dictate the selection of a PartSource to supply a requirement; important
when there are multiple active PartSources that can either provide the supply on time or equally
late
FixedLeadTime, DockToStockLeadTime, VarLeadTime, SafetyLeadTime: used when determining
planned order due dates
Target: required to determine the percentage split of planned requirements among active part
sources (of the same priority)
EffectiveInDate and EffectiveOutDate: define the period over which the part source record is
valid
4.1.4.5 PartCustomer
PartCustomers must be defined for every required Part and Customer combination. This is the base
table for the S&OP process. It is not enough to just have the Part and Customer records defined
separately. Hierarchies require the presence of valid PartCustomer records.
ForecastItem: This is a specific reference to the ForecastItem that will manage the statistical
forecast for this PartCustomer. This is normally maintained by the “Statistical Forecasting Setup”
dialog (or equivalent) and is left pointing to the blank ForecastItem on import.
BaseKey: Always leave this field blank.
DemandType: This nullable reference to the DemandType may be left null. However, if the
calculated Consensus Forecast is expected to be spread from the DP_BaseCalendar to
something smaller (Month to Week or Day), set this to a DemandType with the SpreadRule
set to Spread and a SpreadProfile defined.
Pool: If forecast consumption by Pool is required, then you can set the Pool for this
PartCustomer on this reference. Usually, we expect the Pool to be set to the same value as the
Customer.Id. If the Pool is not equal to the Customer and one customer should belong to more
than one Pool, then you will need to create new Customer records with the Id decorated to
include the Pool. This is because the Pool reference is not a key reference on the PartCustomer
table.
OrderPriority: You can override the Part.DefaultPriority for forecasts from this PartCustomer
here. Used for allocating limited supply to demands.
MinimumShelfLife: You can override the Part.MinimumShelfLife for forecasts from this
PartCustomer here.
At least one aspect of hierarchies is usually associated with customer characteristics, resulting in custom
fields or references from either Customer or CustomerGroup tables.
4.1.4.7 BillOfMaterial
A fundamental table used to show parent and component relationships, and is used in both demand and
supply planning.
Note that the Master Production Scheduling application does not, at this time, support Feature BOMs.
Records where PartType.FeatureBOMRule and/or BillOfMaterial.FeatureRule indicate a feature BOM
are filtered out of the Master Scheduling resources.
4.1.4.8 PartUOMConversion
If a user is disaggregating a target from a higher level with a particular UOM, that target value will only
end up disaggregating to part/customers where there is a valid UOM conversion factor for the UOM
currently chosen.
Similarly, if the targets are revenue targets, there must be either a Part.AverageSellingPrice defined or
effective records in the CustomerPrice table for the Part/Customers for the targets to include those
PartCustomers.
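The conversion‐factor filter described above might be sketched as follows (the function name and table shapes are assumptions for illustration):

```python
def eligible_for_disaggregation(part_customers, uom_factors, target_uom):
    """Keep only the part/customer rows that have a valid UOM conversion
    factor for the currently chosen UOM; a target disaggregates only to
    these rows."""
    return [pc for pc in part_customers
            if (pc["part"], target_uom) in uom_factors]
```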
4.1.4.9 OnHand
This table stores the amount of inventory and other details for a part in a particular location, including
the reference to the OnHandType table and associated processing rule. It is used to determine the
available On Hand quantities during the supply planning process.
The Supplier table is the repository for supplier information, such as a vendor for a purchase order. The
Site field is optional, and can be configured to either uniquely identify records or be ignored in queries.
The ScheduledReceipt table contains supply for a single part that has an assigned due date. This table is
used during the supply planning process to determine scheduled supply available to satisfy demand.
Some fields of interest in Master Production Scheduling may include:
Part
Site
DueDate
StartDate
Qty
Order
Line
OriginalRecordId
Other demands that should not consume forecast may be loaded here as well with an
Order.Type.ProcessingRule of “Regular”.
Occasionally, one DemandOrder is required to have lines (IndependentDemand records) that are a mix
of backlog that can consume forecast and others that may not. When this is true, define a set of
DemandStatus values that can be set on the IndependentDemand records with
ForecastConsumptionOverride set to either Normal or DoNotConsume, as required.
If the backlog is past due, then the date where it consumes forecast from can be either the DueDate of
the backlog or the DataDate depending on the PartType.ForecastConsumptionDateRule and the
Status.ForecastConsumptionDateRule.
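The date selection for past‐due backlog can be sketched as follows; the rule values “DueDate” and “DataDate” are illustrative names, not necessarily the exact enum values of the ForecastConsumptionDateRule fields:

```python
from datetime import date

def forecast_consumption_date(due: date, data_date: date, rule: str) -> date:
    """Pick the date from which past-due backlog consumes forecast.
    Demand that is not past due always consumes from its DueDate."""
    if due >= data_date:
        return due
    return data_date if rule == "DataDate" else due
```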
4.1.4.13 ForecastDetail
This table holds current forecast data at the detail level (after any disaggregation calculations have been
performed). Each record pertains to a given part, customer, and category combination, and shows
details such as the forecast quantity and date. Different types of forecasts can be stored in this table
such as the statistical forecast, sales forecast, marketing forecast, and so on.
The values shown in this table are based on aggregate values entered or maintained through various
resources in RapidResponse.
This table is also used to store Target records, such as annual plan targets, when the
Header.Category.Type.ProcessingRule = “Target”.
4.1.4.14 HistoricalDemandHeader
This table identifies each unique part, customer and historical demand category combination. Typically
it is referenced by multiple groups of historical demands, forecasts and other items, including
ForecastDetails records.
4.1.4.15 HistoricalSupplyHeader
This table identifies each unique part, supplier, and historical supply category combination. Each of the
entries in this table are typically associated with multiple historical supply series or historical supply
actual records.
References the HistoricalSupplyCategory and PartSupplier tables
4.1.4.16 HistoricalSupplyCategory
This table identifies different categories of historical supply series and historical supply actuals, in the
HistoricalSupplySeries and HistoricalSupplyActual tables.
4.1.4.17 HistoricalSupplyActual
HistoricalSupplyActual records are stored as a set of vector data on the HistoricalSupplyHeader table.
Note that the currency of these records must be defined through the Header and can’t be defined on
the specific HSA record.
OrderDueDate: Specifies the DueDate associated with the historical supply order.
o Contained within the Solutions namespace
o Should be populated for the MPS application in order to properly report on MPS
attainment details (widget and treemap)
The records from these tables are used in the reporting of the Supply Plan in the Master Production
Scheduling workbook and Master Scheduling Summary widget, in addition to reporting MPS plan
records to history in the Write History Records workbook.
4.1.4.19 PlannerCodes
This table contains a list of valid planner codes, which are used to identify the person or job
responsibility for a part. Responsibility is used heavily in the Master Production Scheduling application
in the resolution of MPS issues; therefore, valid planner codes should be populated.
This includes the configuration of the Constraint.KeyConstraint field, in order to properly report on Key
Constraint overloads and underloads in the MPS widgets.
NOTES:
1) If integrating with SAP, Supply Types of ‘FPO’ and ‘NB’ must be created to support Firm Planned
Order create and Purchase Requisition create transactions.
2) In order to update Unit Price on ScheduledReceipts that are created and sent back to the host
system (the Firm Planned Order create and Purchase Requisition create transactions above), the
value needs to be provided in the RAW value in the EBM. As such, the consultant needs to create
a custom field to hold this information until the ability to generate transactions from worksheet
payloads is available. In conjunction with this, the two Business Message definitions will need to
be updated to use the custom field rather than UnitPrice as they are defined out of the box.
As noted in the Scenario Structure section, these closed loop transactions are initiated from specific
scenarios.
Within an S&OP Cycle, upon publishing the S&OP plan, the S&OP Candidate scenario is committed to the
S&OP Intermediate scenario, which then updates the Current S&OP scenario. Prior to committing the
S&OP Candidate scenario, a scenario compare is performed against the S&OP Intermediate scenario.
Those records that have changed (Consensus Demand Plan, Safety Stock Quantity, or Supply Plan
records) are captured in the Closed Loop Within Cycle workbook. The Run Closed Loop Within Cycle
Script Scheduled Task is initiated by the Process Owner prior to publishing the S&OP Cycle. The
Closed Loop Within Cycle script performs three actions:
1) Modifies the OriginalRecordId on newly created scheduled receipt records (used for
reconciliation purposes, discussed further down in this section);
2) Performs a cross‐scenario update of the Consensus Demand Plan records into the
IndependentDemand table in the Baseline scenario (used as the demand to drive
requirements for non‐S&OP applications); and
3) Triggers RI (Rapid Integration) transactions to be sent back to the host system for
new/changed/cancelled ScheduledReceipts or modified Part.SafetyStockQty records. See figure
1 below.
Outside of an S&OP Cycle, prior to committing the Master Production Scheduling or the Order
Management scenarios, a scenario compare is performed against the Baseline scenario. Those records
that have changed (Purchase Requisitions/Orders, Work Orders, Sales Orders, Transfer
Requisitions/Orders, Inventory Transfers, Safety Stock Quantity, or Demand Planning Parameters) are
captured in the Closed Loop Outside Cycle Order Management workbook, Closed Loop MPS workbook,
or Closed Loop OF workbook, depending on the application(s) purchased. The Closed Loop Outside
Cycle Order Management Script and Closed Loop MPS Script run on a pre‐determined schedule,
depending on the scenario, prior to the scenario commit into Baseline. These scripts perform
two actions:
1) Modifies the OriginalRecordId on newly created scheduled receipt records (used for
reconciliation purposes, discussed further down in this section); and
2) Triggers RI (Rapid Integration) transactions to be sent back to the host system for
new/changed/cancelled ScheduledReceipts, changed/cancelled Sales Orders, modified
Part.SafetyStockQty, new Inventory Transfers, or modified Demand Planning Parameter records.
See figure 2 below.
Those transactions marked as ‘M’ or ‘Manual’ are identified to the user through five workbooks: Manual
Closed Loop Constraint – identifies constraint property edits; Manual Closed Loop Demand Planning –
identifies planning BoM edits; Manual Closed Loop Inventory Planning – identifies order policy,
inventory disposition and safety stock policy edits; Manual Closed Loop MPS – identifies planning BoM,
Number of Days Supply, Part Source properties, and Order Priority edits; and Manual Closed Loop OF –
identifies independent demand record edits. If the edited records identified in these workbooks are not
manually entered into the ERP system prior to the next Data Update, the changes will be lost.
Records created in RapidResponse and sent back to the ERP system via RI transactions do not exist in
the host system until they are sent; therefore, there needs to be a way to identify those records once
they are inserted into the ERP system so that duplicate records do not result on the next Data Update
into RapidResponse. To identify each record, a concatenation of the RapidResponse Record Id + Date is
copied into a field called OriginalRecordId on the applicable table and is sent on the outbound
transaction.
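The Record Id + Date concatenation might be built as in this sketch; the separator and date format are assumptions, since only the concatenation itself is described above:

```python
from datetime import date

def build_original_record_id(record_id: str, as_of: date) -> str:
    """Concatenate the RapidResponse record Id and a date to form the
    OriginalRecordId sent on the outbound transaction. Separator and
    date format are illustrative assumptions."""
    return f"{record_id}-{as_of.strftime('%Y%m%d')}"
```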
This OriginalRecordId needs to be kept on the record when it is imported into the ERP system;
therefore, it needs to be defined as a custom field in the host ERP system in every table that will have
new records created via closed loop feeds. Also, in the case of creating a new Purchase Requisition or
Firm Planned Order, the host system needs to carry this OriginalRecordId to the Purchase Order or
Work Order if it is being auto‐generated. The OriginalRecordId then needs to be included in the data
extract for the inbound transaction. Once the data update is performed, reconciliation within
RapidResponse is completed using either the Data Reconciliation Within Cycle or Data Reconciliation
Outside Cycle workbooks.
Within an S&OP cycle, this reconciliation is done through a multi‐scenario compare between the Baseline
scenario and the S&OP Intermediate scenario. When a duplicate record is identified, a command runs
to delete the record from the S&OP Intermediate scenario, and then the S&OP Intermediate scenario
can be updated. See figure 1 above.
Outside an S&OP cycle, this reconciliation is done through a multi‐scenario compare between the
Approved Actions scenario and the Baseline scenario. When a duplicate record is identified, a
command runs to delete the record from the Baseline scenario, and then the Baseline scenario can be
updated. See figure 2 above.
An important note about the closed loop and reconciliation processes: there must be a clear definition
of where planning/execution is done, in RapidResponse or in the ERP system; otherwise,
differences in planning policies, calendars, etc. could cause unexpected results from closed loop
transactions.
More detailed information about the closed loop process offered in support of all applications is
outlined in the Closed Loop Process training course. It is mandatory that users implementing this
solution participate in this training course.