Copyright © 2018, 2019, Oracle and/or its affiliates. All rights reserved. | Confidential – Oracle Internal/Restricted/Highly Restricted
Anti-Patterns
• Don’t Starve Scheduled Jobs
• Stressed Too Much on Periodic Analysis of Scheduled Flows
• Susceptible to Timeouts – Synchronous Pattern
• I Skipped the Spring Cleaning – Flows
• Don’t Hold Resources
• Don’t Keep the Target System under Duress
• More than what I could chew
• Consolidate Integration Flows
• Delete Flows No Longer Needed
• Use Case
– Sync up records in a file or large dataset with an external system
(e.g. synchronizing journal transactions, uploading employee records into HCM)
• Anti-pattern
– Using an invoke activity within a looping construct to call external APIs for every record
• Why?
– Downstream applications receive a large number of atomic requests, putting the
entire system under duress
– A usage-based pricing model translates to high costs
The integration that talked too much
• Best Practice
– Leverage applications’ capabilities to accept multiple records in a single request
• Salesforce: 200 records, OSC / ERP Cloud: 100 records, RightNow: 1000 records
– Leverage adapters’ capabilities to send large data sets as attachments / files
• Salesforce Adapter: 10,000 records / 10 MB files; the ERP Adapter supports FBDI files
– Use the Stage File action to append to a file, and send the file to the destination at the end
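The batching idea above can be sketched in plain Python. This is an illustrative helper, not OIC code; the batch size of 200 is the Salesforce limit mentioned above, and the record shape is invented for the example.

```python
# Sketch: instead of one downstream call per record (the anti-pattern),
# group records into batches sized to the target application's limit
# (e.g. 200 for Salesforce, 100 for OSC / ERP Cloud, 1000 for RightNow).

def batches(records, batch_size):
    """Yield successive fixed-size batches from a list of records."""
    for start in range(0, len(records), batch_size):
        yield records[start:start + batch_size]

records = [{"id": i} for i in range(450)]   # invented sample data

# Anti-pattern: 450 atomic requests, one per record.
# Best practice: 3 requests of up to 200 records each.
calls = list(batches(records, 200))
print(len(calls))                    # 3
print(len(calls[0]), len(calls[-1]))  # 200 50
```

Under a usage-based pricing model this reduces billable requests roughly by the batch size.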
[Diagram: Flow 2 – staged source file; get status; callback upon completion; send notification; on-prem]
• Anti-pattern
– A scheduled integration looks for all files to process and loops over them,
processing sequentially until no files are left
• Why?
– If a large number of files exist, one run of the scheduled job executes for a long
time, starving other jobs, and may get killed by the framework
– Processing is pinned to a single server and does not leverage multiple nodes in
a cluster
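One common fix is to cap the work done in a single scheduled run so the job finishes quickly and leftover files roll over to the next run. A minimal sketch, with an assumed cap of 10 files and invented file names:

```python
# Sketch: bound the work of one scheduled run instead of looping until
# no files are left. Remaining files are picked up by the next run,
# which may execute on a different node in the cluster.

MAX_FILES_PER_RUN = 10  # assumed cap, tune to your job frequency

def pick_files_for_run(all_files, max_files=MAX_FILES_PER_RUN):
    """Return a bounded, deterministic slice of pending files."""
    return sorted(all_files)[:max_files]

pending = [f"order_{i:03}.csv" for i in range(37)]  # invented names
this_run = pick_files_for_run(pending)
print(len(this_run))  # 10 – the other 27 wait for subsequent runs
```

The run stays short, so it neither starves other jobs nor risks being killed for exceeding its time budget.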
[Diagram: write file to SFTP; send purge notification to UCM]
• Anti-pattern
– Updating the IAR file externally and then importing it into OIC
• Why?
– Can lead to metadata inconsistency, resulting in validation failures
– Activation failures may also occur
• Anti-pattern
– Gigantic synchronous flows modeling a large number of invokes / conditional logic
– A synchronous flow with invokes inside a loop with a large number of iterations
• Why?
– Susceptible to timeouts: any marginal slowdown adds up across iterations
– Blocking calls hold resources and starve other integration flows
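The "marginal slowdown adds up" point is easy to quantify. The numbers below are assumptions for illustration (per-invoke latency, slowdown, and a 120-second synchronous timeout budget), not OIC limits:

```python
# Illustrative arithmetic: a small per-call slowdown, multiplied by many
# loop iterations inside a synchronous flow, blows the timeout budget.

iterations = 500            # invokes inside the loop (assumed)
base_latency_s = 0.20       # normal downstream response time (assumed)
slowdown_s = 0.05           # marginal downstream slowdown (assumed)
sync_timeout_s = 120        # synchronous timeout budget (assumed)

normal_total = iterations * base_latency_s                    # 100 s, fits
degraded_total = iterations * (base_latency_s + slowdown_s)   # 125 s, times out
print(normal_total, degraded_total, degraded_total > sync_timeout_s)
```

A 50 ms slowdown that would be invisible on a single call turns into 25 extra seconds across the loop, while the blocked flow holds its resources the whole time.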
• Anti-pattern
– Every developer creates their own connection using a different set of
configuration/credentials
• Why?
– A high number of connections makes manageability painful
• Especially when you need to update endpoints, credentials, configuration, etc.
– Complicates impact analysis when there is an app upgrade or a metadata/coordinate
change
Too many keys for the same door!
• Best Practices
– Assign a custodian to create the needed connections and ensure duplicate connections
of the same type are not created
• Establish naming conventions and maintain a standard set of configurations
– Use the Configurator Tool to edit/replace connections in an integration (coming soon!)
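A naming convention is easiest to keep when it can be checked mechanically. The pattern below (`<APP>_<ENV>_<PROTOCOL>`, e.g. `SALESFORCE_PROD_REST`) is an invented example of such a convention, not an OIC requirement:

```python
# Sketch: validate connection names against an assumed convention so the
# custodian can spot ad-hoc duplicates at a glance.
import re

# Assumed convention: <APP>_<ENV>_<PROTOCOL>, all uppercase.
CONNECTION_NAME = re.compile(r"^[A-Z0-9]+_(DEV|TEST|PROD)_[A-Z]+$")

def is_valid_connection_name(name):
    """Return True if the name follows the assumed convention."""
    return bool(CONNECTION_NAME.fullmatch(name))

print(is_valid_connection_name("SALESFORCE_PROD_REST"))  # True
print(is_valid_connection_name("my_sf_conn_v2"))         # False
```

With one well-named connection per app/environment pair, updating an endpoint or credential touches a single place, and impact analysis after an app upgrade becomes a name lookup.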
• Anti-pattern
– Reading the whole file into memory using "read file" and processing it record
by record
• Why?
– Consumes a large amount of memory, impacting other processing in the system
– Sequential processing does not leverage built-in "map reduce" capabilities
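The difference can be sketched in plain Python. This is an analogue of segment-wise reading (as the Stage File action does), not OIC code; the 200-line chunk size is an assumption:

```python
# Sketch: stream a file in bounded chunks instead of reading it whole.
# Memory stays flat, and chunks are natural units for parallel
# (map/reduce-style) processing.

def records_streamed(path, chunk_lines=200):
    """Yield records in fixed-size chunks; never holds the whole file."""
    chunk = []
    with open(path) as f:
        for line in f:
            chunk.append(line.rstrip("\n"))
            if len(chunk) == chunk_lines:
                yield chunk
                chunk = []
    if chunk:
        yield chunk  # trailing partial chunk

# Anti-pattern equivalent, for contrast (loads everything at once):
#   all_records = open(path).read().splitlines()
```

Each yielded chunk can be handed to a separate worker, which is exactly what whole-file, record-by-record processing gives up.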
• BI Cloud Connector – used mainly for data lake or warehouse applications where you want to periodically extract
incremental data from ERP Cloud. This applies more to data integration (ETL) than to application integration.