
Testing Strategy

Workshop

Presented by:

1
BREAKFAST

2
Agenda (8 hrs.)
EST 09:00 AM - 05:00 PM | IST 06:30 PM - 02:30 AM

• Breakfast -------------------------------- 30 min


• Introduction -------------------------------- 30 min
• Testing Strategy -------------------------------- 30 min
• SolMan Tools & Documentation -------------------------------- 30 min
• Break (10:30 AM - 10:45 AM) -------------------------------- 15 min
• RTR -------------------------------- 30 min
• Vistex -------------------------------- 30 min
• Lunch (11:45 AM - 12:45 PM) -------------------------------- 60 min
• BPC -------------------------------- 30 min
• PTD -------------------------------- 30 min
• PTP -------------------------------- 30 min
• Break (02:15 PM - 02:30 PM) -------------------------------- 15 min
• Fiori -------------------------------- 30 min
• GRC/Security -------------------------------- 30 min
• Break (03:30 PM - 03:45 PM) -------------------------------- 15 min
• OTC -------------------------------- 30 min
• Boundary Systems -------------------------------- 30 min
• CSV -------------------------------- 15 min
3
Testing

4
Test Management Strategy

5
Testing Approach
• Identify Scenarios
• The Integrated (Boundary) Systems should be connected to the SAP Quality System before starting SIT1 and SIT2
• Some external systems will conduct SIT & UAT at the same time during SIT (e.g., CTSI, Demand Solutions)
• End-to-End Scenario Test Data Selection Meeting - prior to SIT testing (tentatively the week of May 22)
• After post-upgrade remediations, we will run Automated Test Scripts - workstream assistance will be needed for data preparedness
• The Test Data in the TDC and the Test Plan will be ready in Solution Manager before starting SIT1 and SIT2
• Execute Manual Test Scripts
• Capture Defects in Solution Manager
• Publish Test Status

6
Testing Schedule with Timeline

• Below is the Schedule with Timeline

| Testing Phases | Cycle | SAP System | From Date | To Date | Allocated Weeks |
| UT | Cycle 1 | SAP Development | 05/01 | 05/12 | 2 |
| UT | Cycle 2 | SAP Development | 05/22 | 06/06 | 2 |
| Integrated Test Data Session for SIT & UAT | | | 05/22 | 05/26 | 1 |
| SolMan (Solution Manager) Training - IT | | | 06/05 | 06/09 | 1 |
| SIT1 | Cycle 1 | SAP Quality | 06/12 | 07/16 | 5 |
| SIT2 | Cycle 1 | SAP Quality | 08/07 | 09/03 | 4 |
| SolMan (Solution Manager) Training - Business Users | | | 08/14 | 08/18 | 1 |
| UAT | Cycle 1 | SAP Quality | 09/04 | 10/29 | 8 |
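The Allocated Weeks column follows from the From/To dates. A quick sketch of that arithmetic; the year 2023 comes from the timeline slide, and rounding inclusive day counts up to whole weeks is an assumption about the deck's convention:

```python
# Derive allocated weeks from From/To dates (MM/DD strings).
# Assumption: all dates fall in 2023, per the timeline slide.
from datetime import date
import math

def allocated_weeks(start_mm_dd, end_mm_dd, year=2023):
    sm, sd = map(int, start_mm_dd.split("/"))
    em, ed = map(int, end_mm_dd.split("/"))
    days = (date(year, em, ed) - date(year, sm, sd)).days + 1  # inclusive span
    return math.ceil(days / 7)

# e.g. SIT1 Cycle 1 runs 06/12 through 07/16 -> 5 weeks
sit1_weeks = allocated_weeks("06/12", "07/16")
```

The SIT1, SIT2, and UAT rows reproduce under this rule; the deck may round other rows differently.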

7
Roles and Responsibilities

| Project Activities | Responsible | Accountable | Consulted | Informed |
| Prepare Test Scenarios / Training Materials | S2 & Avanos Functional Team | S2 & Avanos Functional Team | S2 & Avanos Testing Team | S2 & Avanos Stakeholders |
| Review and Selection of Test Scenarios | S2 & Avanos Functional Team | S2 & Avanos Functional Team | S2 & Avanos Testing Team | S2 & Avanos Stakeholders |
| Updating of Test Scripts (new/changes) | S2 Functional Team | S2 & Avanos Functional Team | S2 & Avanos Testing Team | S2 & Avanos Stakeholders |
| Finalize Testing Strategy | S2 Testing Team | S2 Testing Team | Avanos Testing Team | S2 & Avanos Stakeholders |
| Migration of Test Scripts into Solution Manager | S2 Testing Team | S2 & Avanos Testing Team | S2 & Avanos Functional Team | S2 & Avanos Stakeholders |
| Solution Manager - Management - Test Plan, Defects (Resolution & Management) | S2 & Avanos Functional and Testing Team | S2 & Avanos Functional and Testing Team | S2 & Avanos Testing Team | S2 & Avanos Stakeholders |
| Unit Testing | S2 & Avanos Functional Team | S2 & Avanos Functional Team | S2 & Avanos Testing Team | S2 & Avanos Stakeholders |
| Integration Testing (SIT1 & SIT2) | S2 & Avanos Functional Team | S2 & Avanos Functional Team | Avanos Business Users, S2 & Avanos Testing Team | S2 & Avanos Stakeholders |
| Perform UAT | Avanos Business Users | Avanos Business Users | S2 & Avanos Functional Team, S2 & Avanos Testing Team | S2 & Avanos Stakeholders |
| UAT - Coordination | S2 & Avanos Testing Team | S2 & Avanos Testing Team | S2 & Avanos Functional Team | S2 & Avanos Stakeholders |
| Confirmation of User Acceptance Testing | S2 Functional Team | S2 & Avanos Functional Team | Avanos Business Users | S2 & Avanos Stakeholders |
| UAT - Defect Resolution | S2 & Avanos Functional Team | S2 & Avanos Functional Team | Avanos Business Users, S2 & Avanos Testing Team | S2 & Avanos Stakeholders |

8
Test Data Strategy

• Test Data will be obtained from


• Production System Data (If available)
• Any Legacy Data provided by the Data Migration Team
• Data entered manually into the system by the testers while performing the testing
• Data generated using automated scripts

• Test Data Repository


• The S2 and Avanos Functional Teams will use the Test Data Container (TDC) of Solution Manager to create and maintain the Test Data

9
Test Management Process
• The Test Management Process covers the Test Scripts in four stages, i.e., Test Preparation, Test Planning, Test Execution, and Test Reporting, with the help of Business Process Experts, the Test Manager, and Testers

10
Test Reporting

Once testing is in progress, the reports below will be created:

• Completeness and Gap Reports - Test Cases not included in a Test Plan
• Overview Report as Test Suite
• Test Execution Analytics with Multiple Test Plan Status
• Status and Progress Analytics
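The status-and-progress idea above can be sketched as a small aggregation over execution results. This is purely illustrative: the status values and record shape are assumptions, not Solution Manager's actual data model.

```python
# Hypothetical sketch: aggregate manual test execution results into a
# status/progress summary. Status names ("Passed", "Failed", "Not Run")
# are assumptions for illustration.
from collections import Counter

def progress_report(executions):
    """Summarize test case statuses into counts and a completion percentage."""
    counts = Counter(e["status"] for e in executions)
    total = len(executions)
    done = counts["Passed"] + counts["Failed"]  # executed either way
    return {
        "total": total,
        "passed": counts["Passed"],
        "failed": counts["Failed"],
        "not_run": counts["Not Run"],
        "executed_pct": round(100 * done / total, 1) if total else 0.0,
    }

executions = [
    {"case": "RTR-001", "status": "Passed"},
    {"case": "RTR-002", "status": "Failed"},
    {"case": "OTC-001", "status": "Not Run"},
    {"case": "OTC-002", "status": "Passed"},
]
report = progress_report(executions)
```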

14
SIT1 / SIT2 - Sign Off

Test Script Types: Automated and Manual Test Scripts are considered for testing

Entry Criteria:
• Identified Test Scripts aligned with Avanos
• Test Data Repository
• SAP Quality System access with required Roles
• Finalized Test Plan (with Resource mapping)
• Results and Defects tracking in Solution Manager

Test Scripts Category:
• A - "Must Test"
• B - "Should be tested"
• C - "Test if Time Permits"

Exit Criteria: Passing of Test Scripts with resolution of Defects
• Category A - Test Script pass percentage is 100% and no Defects are pending
• Category B - Test Script pass percentage is 100% and no Defects are pending
• Category C - If executed, the Test Script pass percentage is 50% and no Defects are pending
• Resolution of all high-priority defects
• Validation and Sign-off by Avanos Functional Workstream Leads

Environment: SAP Quality System

Authorizations: QS5 SAP Quality System - no authorization role-based testing

Boundary Systems: Considered

Test Data: Test Data will be created in the SAP System while performing testing; if an SAP Workstream requires maintained Test Data, it will be created in Excel

Reviewer: S2 SAP Workstream Lead

Approver: Avanos SAP Workstream Lead

Sign off: Paul Martin (Associate Director - Testing)
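The exit-criteria gate above can be expressed as a simple check: Categories A and B must pass 100%, Category C (only if executed) must pass at least 50%, and no defects may remain open. A minimal sketch, assuming a simple record shape for script results:

```python
# Hedged sketch of the SIT exit-criteria gate. Thresholds come from the
# slide; the data model (category/executed/passed flags) is an assumption.
def exit_criteria_met(scripts, open_defects):
    thresholds = {"A": 100.0, "B": 100.0, "C": 50.0}
    for category, minimum in thresholds.items():
        in_cat = [s for s in scripts if s["category"] == category]
        if category == "C" and not any(s["executed"] for s in in_cat):
            continue  # Category C is "test if time permits"
        if not in_cat:
            continue
        # Unexecuted scripts count against the gate for A and B.
        passed = sum(1 for s in in_cat if s["executed"] and s["passed"])
        if 100.0 * passed / len(in_cat) < minimum:
            return False
    return open_defects == 0
```

The same shape covers the UAT gate by raising the Category C threshold to 100%.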

15
UAT - Sign Off

Test Script Types: Automated and Manual Test Scripts identified and finalized by Avanos Business Users

Entry Criteria:
• Identified Test Scripts aligned with Avanos Business Users
• Test Data Repository
• SAP Quality System access with required Roles
• Finalized Test Plan (with Resource mapping)
• Results and Defects tracking in Solution Manager

Test Scripts Category:
• A - "Must Test"
• B - "Should be tested"
• C - "Test if Time Permits"

Exit Criteria: Passing of Test Scripts with resolution of Defects
• Category A - Test Script pass percentage is 100% and no Defects are pending
• Category B - Test Script pass percentage is 100% and no Defects are pending
• Category C - If executed, the Test Script pass percentage is 100% and no Defects are pending
• Resolution of all high-priority defects
• Validation and Sign-off by Avanos Functional Workstream Leads

Environment: SAP Quality System

Authorizations: Authorization role-based testing with Business User IDs

Boundary Systems: All external systems are considered Boundary Systems

Test Data: Test Data Container of Solution Manager

Reviewer: S2 SAP Workstream Lead

Approver: Avanos Business Process Lead

Sign off: Dean Bergman (Director, Operations Management and Finance Enablement, Global IT Services)

16
Solution Manager - Tools & Documentation

17
Test Management Overview
Solution Manager Test Suite
• It is a complete Test Management application for manual and automated testing.
• It is integrated with other SolMan applications: Solution Documentation, Incident and Defects Management, Requirements Management, and Change Request Management.

[Diagram: Test Management at the center, integrated with Solution Documentation, Requirements Management, Change Request Management (ChaRM), and Defect Management (ITSM)]

18
Test Management Overview
Solution Manager Test Suite Capabilities
• The Test Suite in SolMan supports all test phases, from Preparation and Planning through Execution and Reporting
• The CBTA (Component Based Test Automation) tool will be used for Test Automation

[Diagram: Solution Documentation → Change Impact Analysis → Test Planning → Test Execution and Analytics]

19
Test Preparation
Solution Documentation - Test Case Assignment Options
• Test Cases will be assigned directly to process structures in Solution Documentation.
• Assign generic test cases to Process Steps in the Process Step Library (Process Step Original).
• Process specific test cases should be assigned to Process Steps Reference.
• End-to-End test cases should be assigned directly to the Business Processes.

20
Test Preparation
Solution Documentation - Test Case Types
• Manual Testing
• File Based - Word, Excel etc.
• Focus Build - Provides built-in tool to define test case - Test Step Designer.
• Automated
• CBTA (Component Based Test Automation)

21
Terminology

| Traceability Matrix | Solution Manager - Test Suite |
| Cycle | Test Plan |
| Instance | Test Package |
| Script / Test Case | Script |

22
Test Planning
Overview
• Define the scope of testing, define the sequence, segregate the scope into small packages, and assign them to Testers for execution

23
Test Planning
Test Data Container
• The TDC is a central repository for all test data.
• It contains the parameters, attributes, and variants used in test scripts.
• Test data stored in the TDC can be used in multiple test scripts, and can be assigned to both Manual and Automated Test Cases
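The TDC idea, parameter sets defined once as named variants and reused by many test cases, can be illustrated with a small sketch. This is not the SolMan API, just the concept:

```python
# Conceptual sketch of a Test Data Container: named variants of
# parameter values, defined once and shared by manual or automated
# test cases. Class and parameter names are illustrative assumptions.
class TestDataContainer:
    def __init__(self, parameters):
        self.parameters = list(parameters)
        self.variants = {}

    def add_variant(self, name, values):
        """Register a named set of values covering every parameter."""
        missing = set(self.parameters) - set(values)
        if missing:
            raise ValueError(f"variant {name!r} missing parameters: {missing}")
        self.variants[name] = dict(values)

    def get(self, variant, parameter):
        return self.variants[variant][parameter]

tdc = TestDataContainer(["company_code", "customer", "currency"])
tdc.add_variant("US_STANDARD",
                {"company_code": "1000", "customer": "CUST01", "currency": "USD"})
```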

24
Test Execution
Test Execution - Basics
• Test Packages and Test Cases assigned to a user can be viewed using the Tester Worklist application.
• Test execution can be started if the assigned test case status is 'Ready to Test'.

25
Test Execution
Tester Worklist Application
• Tester Worklist is the application used by users to check the Test cases assigned to them for execution
• It shows Test Packages and Test Cases assigned to current user id, status, log and any defects created for Test case
• Test execution can be triggered from this screen

26
Test Execution
Documenting Execution Status and Results for Manual Test Case
• Test results can be documented on the Test Results tab.
• The starting point for Test Result creation is using the template in the system or a local file.
• Multiple Test Results can exist per Test Case execution.
• Create or attach a test results document from Test Results tab.

27
Test Execution
Documenting Execution Status and Results for Automated Test Case
• Once the automated test case execution is completed, a Test Result and its log are generated automatically
• The execution status report is shown; click the Log entry in the 'Link Log' column to access the HTML report with execution details

28
Test Execution
Managing Test Defects
• Defect Management, part of the SolMan Incident Management tool, supports reporting and processing of errors encountered during test execution
• Test Defects can be created from the 'Tester Worklist' or the 'Manual Test Case Execution' screen
• Test Defects will then be processed through the Incident Management tool
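The defect flow described above behaves like a small state machine: a defect moves through a fixed set of statuses, and only certain transitions are legal. A sketch with assumed status names (the exact SolMan/ITSM statuses may differ):

```python
# Hedged sketch of a defect lifecycle. Status names and the allowed
# transitions are assumptions for illustration, not SolMan's exact model.
ALLOWED = {
    "New": {"In Process"},
    "In Process": {"Proposed Solution", "New"},
    "Proposed Solution": {"Confirmed", "In Process"},  # tester confirms or rejects
    "Confirmed": set(),  # terminal: defect closed by the tester
}

class Defect:
    def __init__(self, defect_id):
        self.defect_id = defect_id
        self.status = "New"

    def move_to(self, new_status):
        if new_status not in ALLOWED[self.status]:
            raise ValueError(f"illegal transition {self.status} -> {new_status}")
        self.status = new_status
```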

[Screenshot: Tester Worklist]

29
Test Reporting
Test Reports
• Various reports are available for tracking the progress of Test Preparation and Test Execution

30
OVERALL

31
Overall Functional Leads

Paul Martin, Karthikeyan Chellamuthu

32
Project Kansas - Testing Schedule

Timeline: May - December 2023

01 Unit Testing
• Manual Testing; Automation Testing (possible)
• SOLMAN not in Scope
• Open Access
• Boundary Systems (minimum)

02 / 03 SIT1 & SIT2
• Automated & Manual Testing
• SOLMAN in Scope
• Role-based Testing for FIORI
• All Boundary Systems in Scope

04 UAT
• Scenarios to be selected by Business
• SOLMAN for Script execution
• Role-based Testing
• All Boundary Systems in Scope

05 Go-Live

33
Functional Approach

Critical Process
• Identify & align Critical processes
• Ensure it is prioritized for Testing

Fiori
• Finalize Scope & align with Business
• Validate roles with the GRC team

Boundary Systems
• Identify list of all 3rd-party systems
• Ensure Test Scripts for same

Test Data
• Leverage data from PS4 Copy
• Use Data Repository, as recommended

Test Scripts
• FIT/GAP Analysis on existing Scripts
• Create New Test Scripts, as identified

Cross Module Integration
• Identify key processes across modules
• Plan resources & support execution

34
Critical Processes
• Identified priority processes to be covered in Testing / Functional processes

Key Metrics - No. of Critical Scenarios:

| Workstream | Scenarios | FIORI (Priority A) |
| RTR | 149 | 23 |
| OTC | 109 | 28 |
| PTD | 164 | 29 |
| PTP | 34 | 34 |
| BPC | 67 | 14 |
| Vistex | 97 | 3 |

[Bar chart: critical scenario counts by workstream]

35
FIORI Apps - Summary

[Bar chart: Fiori app counts per workstream (RTR, OTC, PTD, PTP, BPC, Vistex), split into New, Existing - No Change, Existing - Change, and Priority A]

36
RTR

37
RTR Team

Todd Hagan, Akanksha Gupta, Bharath Arjunan, Lissy Chinchilla, Radha K Vemannagiri, Chandra Mohan, Sathyaraj Rajgopal, Harini Thyagaraja, Yasmin Begum

38
RTR - Test Scenarios

• Execution of Critical processes based on Priority
• Logging of defects for any issues identified
• Based on the lessons learned in sandbox testing, ensure those issues are fixed and tested thoroughly

| Type | SIT | UAT |
| Critical | 106 | 106 |
| Non-Critical | 65 | 65 |

| Prioritization | SIT | UAT |
| Priority A | 134 | 134 |
| Priority B | 37 | 37 |
| Priority C | 0 | 0 |

| Execution | SIT | UAT |
| Automated | 8 | 8 |
| Manual | 163 | 163 |

39
RTR - Integrated Test Scenarios
Workstream SIT UAT
SAP Concur 1 1
Citibank 1 1
JPM Chase BANK 1 1
MITSUBISHI BANK 1 1
HSBC BANK 1 1
BBVA BANK 1 1
BMG BANK 1 1
RBC BANK 1 1
US BANK 1 1
MUFG BANK 1 1
Vertex 1 1
High Radius 1 1
Axosnet (MX A/P) 1 1
Blackline 1 1
Thomson Reuters Fx 1 1
Total 15 15

40
RTR - Integrated (Boundary) Systems

| Connected System | Interface Type | DEV | QUAL | Strategy with DS4 |
| SAP Concur | RFC | - | - | Validate employee output and inbound financial posting file - manually (1 in and 1 out) |
| Citibank | FILE FTP | - | X | Ingest inbound bank statement |
| JPM Chase Bank | FILE FTP | - | X | Ingest inbound bank statement |
| Mitsubishi Bank | FILE FTP | - | X | Ingest inbound bank statement |
| HSBC Bank | FILE FTP | - | X | Ingest inbound bank statement |
| BBVA Bank | FILE FTP | - | X | Ingest inbound bank statement |
| BMG Bank | FILE FTP | - | X | Ingest inbound bank statement |
| RBC Bank | FILE FTP | - | X | Ingest inbound bank statement |
| US Bank | FILE FTP | - | X | Ingest inbound bank statement |
| MUFG Bank | FILE FTP | - | X | Ingest inbound bank statement |
| Vertex | RFC THRU SIC | X | X | Create full connection with DS4 |
| High Radius | WEB SERVICE | X | X | Connect to DS4 - enable SOA services. Does not disrupt QS5 connectivity. Can execute a credit card transaction |
| Axosnet (MX A/P) | SAP ROUTER | - | X | Take transports into DS4 from QS4. There are two environments, prod and non-prod; DS4 will not be connected to non-prod. Validate file formats only |
| Blackline | FILE FTP | - | X | Compare outputs |
| Thomson Reuters Fx | FILE FTP | - | X | Take a file manually and attempt to process inbound |
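Several of the DS4 strategies above reduce to "compare file format" before ingesting a file. A minimal sketch of such a layout check, assuming simple pipe-delimited files; real bank statement formats (e.g., BAI2 or MT940) would need format-specific parsing:

```python
# Hypothetical layout check: confirm a candidate file shares the header
# row and column count of a known-good sample before ingestion.
# The pipe-delimited format here is an assumption for illustration.
import csv
import io

def same_layout(sample_text, candidate_text, delimiter="|"):
    """True if both files share the header row and per-row column count."""
    sample = list(csv.reader(io.StringIO(sample_text), delimiter=delimiter))
    candidate = list(csv.reader(io.StringIO(candidate_text), delimiter=delimiter))
    if not sample or not candidate:
        return False
    return sample[0] == candidate[0] and all(
        len(row) == len(sample[0]) for row in candidate
    )
```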

41
RTR - Fiori Apps

| Fiori Apps | SIT | UAT |
| New | 9 | 9 |
| Existing - No Change | 13 | 13 |
| Existing - Changed | 0 | 0 |
| Existing - Name Changed | 1 | 1 |
| Priority A | 23 | 23 |

[Bar chart: RTR Fiori app counts for SIT and UAT]

42
RTR - Critical Processes
(Supporting Process modules: PTP, OTC, HR)

| Leading Module | Critical Process |
| RTR | Sales Billing |
| RTR | Procurement Billing |
| RTR | Assets Acquisition |
| RTR | Payments |
| RTR | Payroll |
| RTR | Product Cost |
| RTR | Period Close |

43
RTR - Testing Leads and Testers

• Below are the Testing Leads and Testers

| Testing Phase | Testing Leads | Testers |
| SIT | Todd / Akanksha | Todd, Akanksha, Harini, Yasmin, Bharath |
| UAT | TBD | Avanos Business Users |
44
Vistex

45
Vistex Team

Longin Jurkovic, Ross Yerden, Kishore Pinnamaneni, Prasad Kanchu, Rosh Raj Samy Singamaneni, Karthik Thammisetty, Madhusudhan Attada, Jagadeesh Kenguva, Geetha Madhavi Tikka, Mohammed Ibrahim

46
Vistex - Test Scenarios

• Approach to prioritize the critical processes:
• Run the Chargebacks using the manual claims approach
• Run the Customer Incentives process using the existing source data but new agreements
• Run the price agreements maintenance via master maintenance using the Vistex UI

| Type | SIT | UAT* |
| Critical | 8 | 8 |
| Non-Critical | 89 | 89 |

| Prioritization | SIT | UAT* |
| Priority A | 97 | 97 |
| Priority B | 0 | 0 |
| Priority C | 0 | 0 |

| Execution | SIT | UAT* |
| Automated | 24 | 24 |
| Manual | 73 | 73 |

[Pie chart: Automated 24.74% vs Manual 75.26%]

47
Vistex - Fiori Apps

| Fiori Apps | SIT | UAT |
| New | 0 | 0 |
| Existing - No Change | 3 | 3 |
| Existing - Changed | 0 | 0 |
| Existing - Name Change | 0 | 0 |
| Priority A | 3 | 3 |

[Bar chart: Vistex Fiori app counts for SIT and UAT]

• 2 Standard Fiori Apps with Search, Display and Maintain activities
• 1 Vizi Content Area with a UI5/Fiori screen with a catalog/link to different reports (data sources) by process area

48
Vistex - Integrated Test Scenarios

• Below are the Test Scenarios identified for SIT and UAT boundary testing

| Integrated Scenarios by Workstream | SIT Automated | SIT Manual | UAT Automated | UAT Manual |
| Vistex | 0 | 7 | 0 | 7 |

49
Vistex - Integrated (Boundary) Systems

• Below are the Integrated (Boundary) Systems considered:


• SFDC
• EDI (Gentran) for POS inbound data and Price agreements outbound
• Data integration through BW for subsequent systems/processes
• Territory data for Commissions
• Vistex data for Analytical data lakes

50
Vistex - Critical Processes
(Supporting Process modules: RTR, OTC)

| Leading Module | Critical Process |
| Vistex | Agreement / Contract maintenance |
| Vistex | Price agreements for Direct Customers |
| Vistex | Chargeback - Settlements |
| Vistex | Customer incentives - Accruals |
| Vistex | Customer incentives - Settlements |

51
Vistex - Testing Leads and Testers

• Below are the Testing Leads and Testers

| Testing Phase | Testing Leads | Testers |
| SIT | Ross Yerden, Prasad Kanchu | IT - Vistex team (Avanos and S2) |
| UAT | Vicki Catt, Ben Waldhauer, Veronica De Garde, Vittorio Pellizzari | Business users |

52
LUNCH BREAK

53
BPC

54
BPC Team

Kiran Salikreddy, Bharath Arjunan, Praveen Kasa, Veera Kommineni

55
BPC - Test Scenarios

• Execution of Critical processes based on Priority
• Logging of defects for any issues identified
• Based on the lessons learned in sandbox testing, ensure those issues are fixed and tested thoroughly

| Type | SIT | UAT |
| Critical | 67 | 67 |
| Non-Critical | 58 | 58 |

| Prioritization | SIT | UAT |
| Priority A | 67 | 67 |
| Priority B | 58 | 58 |
| Priority C | 0 | 0 |

| Execution | SIT | UAT |
| Automated | 2 | 2 |
| Manual | 123 | 123 |

56
BPC - Integrated Test Scenarios
Workstream SIT UAT
Power BI Report 1 1

57
BPC - Integrated (Boundary) Systems

• Below are the Integrated (Boundary) Systems considered:

• Power BI
• There is a financial report which is executed outside of BPC. The dimensions are in the free selection, and the key figures are in local currency and group currency. The report is extracted using a BEx query on the fly, and the output is shown in Analysis for Office
• The BEx query name is ZSFIN_C008_Q001 - Ad hoc Query for Analysis
• This query is used as a source for Power BI (PBI), and data is pulled as-is from the source into the PBI report

58
BPC - Fiori Apps

| Fiori Apps | SIT | UAT |
| New | 5 | 5 |
| Existing - No Change | 9 | 9 |
| Existing - Changed | 0 | 0 |
| Existing - Name Changed | 0 | 0 |
| Priority A | 14 | 14 |

59
BPC - Critical Processes

| Leading Module | Critical Process |
| BPC | Consolidation |
| BPC | Net Sales Planning |
| BPC | Cost of Sales Planning |
| BPC | Cost Center Planning |
| BPC | Corporate Forecast Planning |
| BPC | Capital Expenditure Planning |

60
BPC - Testing Leads and Testers

• Below are the Testing Leads and Testers

| Testing Phase | Testing Leads | Testers |
| SIT | Kiran | Veera, Praveen, Harini, Yasmin, Bharath |
| UAT | TBD | Avanos Business Users |

61
PTD

62
PTD Team

Paul Martin, Mani Aravind, Lars Rasmussen, Kalyan Kolaketi, Karthikeyan Pandiyan, Ali Surti, Julie Browning, Cheo Walker, Dhaval Patel, Ananthakrishnan Udayasankar, Kalyan Polamreddy, Ravi Maruthayya Kumaran, Srinivasulu Puli, Aditi Bora, Mohammed Shahinsha Maleswaran, Prakash Nandakumar Padmaja

63
PTD - Test Scenarios

• Approach to prioritize the critical processes:
• MRP Run in S4
• MRP through Demand Solutions
• Production Confirmation
• Picking and Packing for Shipment

| Type | SIT | UAT |
| Critical | 148 | 148 |
| Non-Critical | 62 | 62 |

| Prioritization | SIT | UAT |
| Priority A | 164 | 164 |
| Priority B | 31 | 31 |
| Priority C | 15 | 15 |

| Execution | SIT | UAT |
| Automated | 29 | 29 |
| Manual | 181 | 181 |

64
PTD - Integrated Test Scenarios

Workstream SIT UAT

Integration Point (EDI, Idocs, FTP) 10 10

Spec & Revision from EtQ to Ma  & Quality hold inter 2 2

Demand Solutions Interface from S4 to DSX  5 5

EWM Vendor receipt, OB process , returns 8 8

Total 25 25

Key Test Scenarios:

• GC Interface  BOM Costing Interface from S4 to DSX

• IP GTM Classification Origin FTA   IP IMMEX Inventory Extract   FIFO Interfaces 

• PTD  APAC/EMEA  10900  N  ZDSI- Demand Solutions Interface from S4 to DSX (Forecast)

• PTD  APAC/EMEA  10901  N  ZDSI- Demand Solutions Interface from S4 to DSX (Replenishment)

65
PTD - Integrated (Boundary) Systems
Boundary Systems Interface Type Dev QAS

SAP Peppol Service WEB SERVICE X X


OMP PLN Namespace W/ rfc X X
OMP FCT Namespace W/ rfc X X
HCPT(In house SQL db) FLAT FILE TRANSFER X X
Demand Solutions FILE FTP
CTSI Transportation IDOC TO FILE FTP X
DHL US EDI X
KTN EDI X
Livingston Denied Party Screen FILE FTP X
Gentran EDI Sub System EDI X X
Net EDI (Europe) File FTP X
GHX EDI EDI X X

66
PTD - Fiori Apps

| Fiori Apps | SIT | UAT |
| New | 19 | 19 |
| Existing - No Change | 4 | 4 |
| Existing - Changed | 14 | 14 |
| Priority A | 29 | 29 |
| Priority B* | 8 | 8 |

• 34 Standard Fiori Apps and 3 Custom Fiori Apps
• 16 Apps with "Display Only" functionality, which are applicable and extended to other workstreams, are not part of testing
• 21 Apps with Update functionality
• *Priority B Apps to be included in Priority A based on Unit Testing (DS4)

67
PTD - Critical Processes
(Supporting Process modules: PTP, OTC, RTR)

| Leading Module | Critical Process |
| PTD | Material Master |
| PTD | MRP |
| PTD | Discrete Manufacturing |
| PTD | Contract Manufacturing |
| PTD | Inbound Warehouse |
| PTD | WM Operations and Outbound |
| PTD | Third-Party Logistics |
| PTD | TMS |
| PTD | Inventory Transactions |
68
PTD - Testing Leads and Testers

• Below are the Testing Leads and Testers

| Testing Phase | Testing Leads | Testers |
| SIT | Paul / Mani | Mani, Srinivasalu, Aditi, Shahinsa, Paul, Willi, Ali, Ananth, Julie, Lars, Cheo, Kalyan |
| UAT | Paul / Mani | TBD |

69
PTP

70
PTP Team

Wili Thomason Jr., Sharnesh Sekar, Anil Vipparti, Janani D, Suresh Natarajan

71
PTP - Test Scenarios

• Approach to prioritize the critical processes:
• Service PO Approval & Delegation
• Service PO Change
• Manage Purchase Order & Purchase Requisition
• Create Purchase Order & Purchase Requisition - Advance
• Manage PIR
• Vendor Creation & App Workflow

| Type | SIT | UAT |
| Critical | 15 | 15 |
| Non-Critical | 55 | 55 |

| Prioritization | SIT | UAT |
| Priority A | 34 | 34 |
| Priority B | 24 | 24 |

| Execution | SIT | UAT |
| Automated | 29 | 29 |
| Manual | 81 | 81 |

72
PTP - Integrated Test Scenarios

| Workstream | SIT | UAT |
| Vendor Invoice Management | 8 | 8 |
| ETQ to S4 Interface | 7 | 7 |
| Total | 15 | 15 |

Key Test Scenarios:

• Supplier Invoices are generated against the Purchase Order using Vendor Invoice Management (VIM)

• When a Vendor is initially created, the status will be updated through ETQ to S4

73
PTP - Integrated (Boundary) Systems

Boundary Systems Interface Type Dev QAS


Vendor Invoice Management X X
Quality’s Vendor Management ETQ X X
Demand Solution X X
OMP - Planning Software X X
AXOSNET X X
VERTEX X X

74
PTP - Fiori Apps

| Fiori Apps | SIT | UAT |
| New | 32 | 32 |
| Existing - No Change | 19 | 19 |
| Existing - Name Changed | 1 | 1 |
| Existing - Replaced | 2 | 2 |
| Existing - Changed | 4 | 4 |
| Priority A | 34 | 34 |
| Priority B* | 24 | 24 |

• 25 Standard Fiori Apps and 9 Custom Fiori Apps (including VIM apps)
• 12 Apps with Create and Change functionality
• 9 Apps with Display functionality, which are applicable and extended to other workstreams
• *Priority B Apps to be included in Priority A based on Unit Testing (DS4)

75
PTP - Critical Processes
(Supporting Process modules: ETQ, VIM)

| Leading Module | Critical Process |
| PTP | Service PO Approval |
| PTP | Service PO Change |
| PTP | Service Approval Delegation |
| PTP | Manage Purchase Order - New |
| PTP | Create Purchase Order - Advanced |
| PTP | Manage Purchase Requisition |
| PTP | Manage PIR |
| PTP | Vendor Proposal |
| PTP | Vendor Create App Workflow |
| PTP | Create Purchase Requisition - Advance |
| PTP | Create Vendor |
76
PTP - Testing Leads and Testers

• Below are the Testing Leads and Testers

| Testing Phase | Testing Leads | Testers |
| SIT | Wili Thomason Jr. / Sharnesh | Janani, Suresh |
| UAT | Wili Thomason Jr. | TBD |

77
BREAK TIME

78
Fiori

79
Fiori Team

Deepak Manjunath, Saket Amraotkar

80
Fiori / Master Data Workflows

• GUI Launch Test


• Maintain user profile
• My Inbox
• SSO GUI
• Common Apps for all Workstreams (ex. Display Reports)
• Master Data Workflows:
• Vendor Master
• Material Master
• Customer Master

81
GRC/Security

82
Security Team

Mohan Kalla, Sathish Kumar, Jabez Tatapudi, Vikram Dhoolipala, Jayanthi Rethinachalam, Prabhakar Pendyala, Sandeep Kumar, Krishna B, Dinesh Kumar, Balakumar KS, Prasanna Muttanapalli, Pavan Senthil Kumaran, Umesh B, Charan Namburi, Bhuvana Kokila

83
GRC/SSO - Test Scenarios

• Below are the Test Scenarios identified for UT2, SIT2 and UAT

| Scenarios by Workstream | UT2 Automated | UT2 Manual | SIT2 Automated | SIT2 Manual | UAT Automated | UAT Manual |
| GRC | 0 | 75 | 0 | 75 | 0 | To be picked based on availability of testers |
| SSO | 0 | 5 | 0 | 5 | 0 | 5 |

• There are no Fiori Apps or Integrated Test Scenarios with other process streams
• Systems included in the testing are GRC and S4

84
TEA BREAK

85
OTC

86
OTC Team

Michelle Smith, Sujeet Kumar, Timothy Wanyoike, Prakash Kumaran, Madhusudhan Beari, Koteswara Vintha, Dhaval Patel, Deepak Krishnamurthy, Rajani Brunda

87
OTC - Test Scenarios

• Execution of Critical processes based on Priority:
• Product Availability Check
• Credit Management
• Material Determination
• Pricing Determination
• Billing & Output Management
• Route & Shipping Point Determination
• Warranty Process
• EDI order creation with different order reasons

| Type | SIT | UAT |
| Critical | 188 | 188 |
| Non-Critical | 25 | 25 |

| Prioritization | SIT | UAT |
| Priority A | 124 | 124 |
| Priority B | 35 | 35 |
| Priority C | 5 | 5 |

| Execution | SIT | UAT |
| Automated | 59 | 59 |
| Manual | 127 | 127 |

88
OTC - Integrated Test Scenarios

Workstream SIT UAT

Integration Point (EDI, Idocs, FTP) 14 14

Cross module dependencies 84 84

• EDI 850 - Inbound IDOCs Order creation in SAP


• EDI 855 - Outbound IDOCs Order acknowledgment from SAP
• EDI 867 - Inbound Rebate processing
• EDI 845 - Outbound IDOCs Pricing details from SAP 
• SFDC to S4 Vistex Price Deviation Request, Invoice Data
• S4 to SFDC - Warranty data, Customer, Sales Order, Price Simulation, OnDemand Present Future Pricing for
Distributors
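The inbound/outbound EDI flows listed above amount to a dispatch from transaction set and direction to a processing step. A sketch mirroring those bullets (850 order in, 855 acknowledgment out, 867 rebate in, 845 pricing out); the handler names are illustrative assumptions, not actual system components:

```python
# Hypothetical dispatch table from EDI transaction set + direction to a
# handler name. The mapping follows the bullets above; handler names
# are invented for illustration.
HANDLERS = {
    ("850", "inbound"): "create_sales_order",
    ("855", "outbound"): "send_order_acknowledgment",
    ("867", "inbound"): "process_rebate",
    ("845", "outbound"): "send_pricing_details",
}

def route(transaction_set, direction):
    """Return the handler name for an EDI message, or fail loudly."""
    try:
        return HANDLERS[(transaction_set, direction)]
    except KeyError:
        raise ValueError(f"no handler for EDI {transaction_set} {direction}")
```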

89
OTC - Fiori Apps

| Fiori Apps | SIT | UAT |
| New | 39 | 39 |
| Existing - No Change | 39 | 39 |
| Existing - Changed | 5 | 5 |
| Existing - Replaced | 3 | 3 |
| Priority A | 56 | 56 |

[Bar chart: OTC Fiori app counts for SIT and UAT]

• 86 Standard Fiori Apps and 0 Custom Fiori Apps
• 26 Apps with "Display Only" functionality, which are applicable and extended to other workstreams, are not part of testing
• 57 Apps with Update functionality

90
OTC - Integrated (Boundary) Systems

| Connected System | DS4 In Scope? (Yes/No) | Interface Type | Responsible | DEV | QUAL | Strategy with DS4 |
| OMP PLS | Yes | Namespace w/ RFC | Cheo | X | X | Connect to OMP DEV |
| OMP FCT | Yes | Namespace w/ RFC | Cheo | X | X | Connect to OMP DEV for FCT and PLS |
| Demand Solutions | No | FILE FTP | Paul | | | No non-prod for DS. For QS5, manually create files for comparison / test ingestion and coordinate with Allen |
| Livingston Denied Party Screen | Yes | FILE FTP | Ali | - | X | Compare file format |
| Gentran EDI Sub System | Yes | EDI | Julie | X | X | Compare IDoc file formats |
| Net EDI (Europe) | No | FILE FTP | Julie | - | X | Wait till QS5 |
| Open Text VIM | Yes | IDOC / ODATA | Paul | X | X | Connected to DS4, but must validate NEW transactions only |
| SAP Peppol Service | Yes | WEB SERVICE | Julie | - | X | Peppol connection exists for DS4 - just need to verify |
| HCPT (in-house SQL DB) | No | FLAT FILE TRANSFER | Cheo | X | X | To be tested in QS5 |
| CTSI Transportation | Yes | IDOC TO FILE FTP | Ali | - | X | Connection to be changed to DEV temporarily |
| DHL US | Yes | EDI | Julie | - | X | Compare file format |
| KTN | Yes | EDI | Julie | - | X | Compare file format |
| GHX EDI | Yes | EDI | Julie | X | X | Compare file format. Need to test for EMEA. Attached PDF process; the PDF can be tested in QS5 only. Deepak will manually copy the PDF file into the folder and process it |
91
OTC - Critical Processes

(Supporting Process modules: OTC, PTP, PTD, RTR, VISTEX, SFDC)

| Leading Module | Critical Process |
| OTC | Product Availability Check |
| OTC | Credit Management |
| OTC | Order Management |
| OTC | Logistics Execution |
| OTC | Billing |
| OTC | Master Data |

92
OTC - Testing Leads and Testers

• Below are the Testing Leads and Testers

| Testing Phase | Testing Leads | Testers |
| SIT | Tim, Michelle, Sujeet | Madhu, Prakash, Koteswara, Dhaval, Rajani, Deepak |
| UAT | Michelle | Avanos Business Users |

93
Boundary Systems

94
Boundary Systems

• As per the current landscape of S/4HANA 1709, the Integrated (Boundary) Systems are shown on the next slide.

• Interfaces & Bolt-Ons Testing Strategy.xlsx

95
SAP LANDSCAPE

[Diagram: SAP Landscape - current system version S/4HANA 1709 at the core, connected (bi-directionally, one-directionally, or via RFC) to SAP PO, SAP BW/4HANA, Vistex, SAP Concur, Vertex, Open Text, SAP Solution Manager, SAP GRC, SAP BI, SAP FIORI (Hub), and SAP SuccessFactors. Middleware includes SAP BTP (SCP/CPI), the Gentran EDI Sub System, and the SAP Cloud Connector (Salesforce). Other connected systems: Winshuttle, multiple bank interfaces, Livingston Denied Party Screen, Thomson Reuters Fx, High Radius, Deloitte Cortex, PWC Enterprise Insights, DEM+, Salesforce, Axosnet (MX A/P), Demand Solutions, Blackline, HCPT (in-house SQL DB), CTSI Transportation, Net EDI, other EDI (distribution centers, suppliers, customers), STIBO PIM, MS Power BI, ETQ, Thomson Reuters Trade Compliance, Libelle, OMP, SAP Peppol Services, and GHX EDI]
96
CSV

97
CSV Team

Susan Hsu

98
WHO is required to conduct Computer System Validation?

99
WHAT is CSV?

• Confirmation by examination and provision of objective evidence that computer system specifications conform to user needs and intended uses, and that all requirements can be consistently fulfilled.

• CSV is a continuous process, from conception to system retirement.

• To objectively confirm that the software is validated for its intended use, all prospective CSV
efforts must include at minimum the following deliverables/activities:
• Defined User Requirements
• Creation and approval of a Validation Protocol
• Execution of the validation activities and tests
(qualification stages)
• Creation and approval of a Validation Report
100
WHAT Software needs Validation?

Software that requires Validation:

Medical Device Software
• Software used as a component, part, or accessory of a medical device
• Software that is itself a medical device

Production Software
• Software used in the production of an FDA-regulated product

Quality Management Software
• Software used to implement the FDA-required quality management system

Software for FDA-Regulated Records
• Software used to create, modify, maintain, archive, retrieve, or transmit FDA-required records, and electronic records submitted per FDA requirements

101
WHY Validate?

• Safer Devices and Processes


• More Effective Devices and Processes
• Can Speed the Development Process
• Assures Accuracy

• And yes, because it is a Regulation and required by external standards

102
Other Requirements

• CSV Results need to be documented as part of UAT and not SIT
  o Unless it will only be tested during SIT and not retested during UAT

• CSV Training is required for individuals performing CSV Tests

103
Questions?

104
THANK YOU

105
