
Blueriver QA Assessment Results

Detailed Results


Authors: Tony Zapata, Carlos Choc, Galib Saab, Alexis Gutierrez and Erick Tescum
Version: 1.0

Contents
Foreword
    Confidentiality
Assessment Checklist
    Key items
    Documentation
    Technology and Tools
    Accessibility
Automation Review
Performance Review
    Documentation
    Technology and Tools
    Boundaries
Mobile Review
Glossary
    Performance Definitions


Foreword

Confidentiality

The material contained in this template represents proprietary, confidential information pertaining to Blueriver products
and methods. The information in this assessment template shall not be disclosed outside the company unless explicitly
stated otherwise, and shall not be duplicated, used, or disclosed for any purpose other than internal analysis.


Assessment Checklist
The following checklist uses status scores to identify opportunity areas; along with the score, a detailed comment is
expected for reference. The scoring system is defined as follows:

Status Score:
5 (Everything is well documented; no further questions are needed.)
4 (Most of the information is clear; a minimum of follow-up questions will be required.)
3 (Information is good; a few clarifications and follow-up questions are required.)
2 (Information is not good; clarifications, research, and follow-up questions are required.)
1 (Lack of information; corrective measures are needed.)

Project: Aventiv

Client: Aventiv

Date: 10/25/2022

*Disclaimer: The following items, up until the Accessibility section, are to be used as a general template for all QA
assessments; the Automation, Performance, and Mobile sections should be used as needed, based on the type of QA assessment
performed.

Key items
Result ID | Concept | Status | Comments

1 | Identify stakeholders/escalation contacts | | The core stakeholders in the project must be identified, not only those within the project but also those who interact with it: external systems sometimes affect or interact with the project, and information may be required from those parties.

2 | Testing goals/milestones | |

3 | Desired timeline | |


Documentation

Result ID | Concept | Status | Comments

4 | Business Rules | |

5 | Functional requirements | | If it is possible to obtain the initial proprietary requirements of the project, these will give a full picture and a better understanding of it:
    • Technology used for application development
    • Product & domain knowledge (including functional specifications and product roadmap)
    • Product architecture (APIs, web services, or DB connections)
    • Automation tool preferences
    • List of any challenges faced during earlier automation efforts
    If it is not possible to gather the requirements, they can be extracted from existing test cases or from the application itself.

6 | Training manuals | |

7 | System architecture diagram/manual | |

8 | Installation guide | |

9 | Test cases | |

10 | Test plan | |

11 | Testing reports | |

12 | Regression suites | |

13 | Deliverables | |

Technology and Tools

Result ID | Concept | Status | Comments

14 | Test manager tool | |

15 | Defect tracking tool | |

16 | IDE (with license) | |

17 | CI/CD tools | | Example: Docker, Jenkins

18 | Hardware tools | | Example: client-expedited laptop

19 | Other | |

Accessibility

Result ID | Concept | Status | Comments

20 | VPN Credentials | |

21 | FTP Accounts | |

22 | Virtual Machine credentials | |

23 | Corporate email credentials | |

24 | Communication groups (Slack, Teams, Skype) | |

25 | Repositories/DevOps server(s) access | |


Automation Review
Result ID | Concept | Status | Description

26 | Initial Meeting | | Automation team leads, developers, and other stakeholders meet to discuss whether automation could be implemented in the project, and then the purpose, needs, requirements, and plans for test automation.

27 | Use Cases list & review | | The development team provides a list of test cases and their priorities, as well as the most critical transactions. With this, the automation team can make decisions early on about the framework strategy and tool selection, avoiding future rework or misinterpretation of the system's functionality.

28 | Estimation on scenarios to be automated | | It is important to define which scenarios are the most robust, to determine which ones will require most of the automation effort. Based on this, the time required to automate the solution can be estimated.

29 | Test Strategy | | The automation team is now able to identify how the tests will be prepared, what test language to use, the test interface, and the test data (inputs/outputs), to ensure a maintainable and portable automation solution.


30 | Generate/Obtain quality data | | Once the scope of the automation is determined, it is good to define, where possible, which data could be generated in case it is required. This data should match the criteria required by the solution and be created with enough quality to give better coverage during test execution.

31 | Automation Framework | | If an automation framework that applies to the project already exists, that will save some time; if not, the automation framework should be defined using all the information already acquired, since it will ease the implementation and maintenance of all the scripts for the project.

32 | Define Reporting Strategy | | The client should be informed of the results of the automation. This could include the direct results of the automation execution, or also the impact on cost savings and execution time in the project. (A sketch showing how items 29-32 might fit together in practice follows this table.)
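To make items 29-32 concrete, the sketch below shows one way the pieces could fit together, assuming a Python stack with pytest, Selenium, and Faker. The LoginPage object, BASE_URL, element locators, and the post-login assertion are hypothetical placeholders, not the client's actual application.

```python
# Minimal sketch: a page-object framework (item 31), generated test
# data (item 30), and a strategy-conformant test (item 29).
# BASE_URL, LoginPage, and all locators are hypothetical.

import pytest
from faker import Faker                       # quality test-data generation
from selenium import webdriver
from selenium.webdriver.common.by import By

BASE_URL = "https://app.example.com"          # hypothetical application URL


class LoginPage:
    """Page object: locators live in one place, easing maintenance."""

    def __init__(self, driver):
        self.driver = driver

    def open(self):
        self.driver.get(f"{BASE_URL}/login")

    def login(self, email, password):
        self.driver.find_element(By.ID, "email").send_keys(email)
        self.driver.find_element(By.ID, "password").send_keys(password)
        self.driver.find_element(By.ID, "submit").click()


@pytest.fixture
def driver():
    drv = webdriver.Chrome()                  # assumes a local Chrome setup
    yield drv
    drv.quit()


def test_login_with_generated_data(driver):
    fake = Faker()
    page = LoginPage(driver)
    page.open()
    page.login(fake.email(), fake.password(length=12))
    # Illustrative expected postcondition; a real suite would assert
    # against the application's actual post-login state.
    assert "dashboard" in driver.current_url
```

For the reporting strategy (item 32), pytest's built-in --junitxml=report.xml flag, or a plugin such as pytest-html, yields machine-readable results that can be summarized for the client alongside cost-saving and execution-time figures.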


Performance Review
Documentation

Result ID | Concept | Status | Comments

33 | Business Rules | |

34 | Non-Functional requirements (SLAs) | |

35 | Training manuals | |

36 | System architecture diagram (Prod and Testing environments) | |

37 | Business critical transactions | |

38 | Workload related data | |

39 | Performance Test plan | |

40 | Performance testing reports / historical data | |

41 | Documentation: use cases, test plan, scalability forecast | |


Technology and Tools

Result ID | Concept | Status | Comments

42 | Performance Testing tool | |

43 | APM Monitoring tool | |

44 | CI/CD tools | |

45 | Hardware tools | |

46 | Performance Testing environment | |

47 | Other | |

Boundaries

Result ID | Concept | Status | Comments

48 | Testing goals | |

49 | Project timeline | |

50 | Performance Testing types | |

51 | Performance Exit Criteria | |

52 | Performance Testing environment setup | |


Mobile Review
Result ID | Concept | Status | Comments

53 | Operating systems | | It must be defined on which platforms the app will be released (iOS, Android, etc.).

54 | Earliest versions of the relevant OS | |

55 | Most popular mobile devices among your target audience | |

56 | Feature functionality | | Mobile apps usually interact with a number of features, both built into devices and built into the app. These interactions are assessed, documented, and thoroughly tested. The approach is to test on a single device first and then conquer the various platforms during compatibility testing.

57 | Type of application | | There are three main types of mobile apps: native, mobile-web, and hybrid.
    • Mobile-web: the website opened on the device through a web browser.
    • Native: the application is developed specifically for individual platforms.
    • Hybrid: a mix between native and mobile-web applications.

58 | Front-end testing | | Checks anything that is visible client-side, also known as the Graphical User Interface (GUI). Test types to be performed on the GUI of an app include:
    • Regression tests
    • Performance checks
    • Changes or updates to app files that might break front-end functionality

59 | Back-end testing | | Checks the server side of the mobile app. Anything that is entered and/or stored in the front-end is tested in the back-end. The security and performance of the mobile application are tested at this stage.

60 | Multiple network compatibility | | The ever-growing popularity of smartphones and IoT devices has led to an explosion of different brands and platforms. While it is impossible to perform every test on all possible devices, mobile compatibility testing is indispensable. This process includes tests such as:
    • Install and uninstall
    • Functionality
    • Traversal
    • Data exchange

61 | Storage | | Today's mobile devices do not have enough storage for the vast amounts of games, music streaming services, and hi-res photos competing for space. From how much data your app requires to how this might affect monthly data plans, these limitations must be kept in mind during mobile app testing.

62 | Data exchange | | Load times are a major source of frustration for users; the team must therefore create tests that represent network conditions, specific devices, and geographic locations in order to accurately represent real users.

63 | Application flow | | Testing the flow of the mobile app architecture is necessary to find elements of the design that would impede users from completing their desired tasks. (A sketch of a cross-device smoke test follows this table.)
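As an illustration of how front-end and compatibility testing might be combined, below is a minimal sketch assuming an Appium 2 server on localhost:4723 and the Appium Python client; the emulator names, APK path, and element locator are hypothetical.

```python
# Minimal sketch of a cross-device GUI smoke test. Device names,
# the APK path, and the locator are hypothetical placeholders.

import pytest
from appium import webdriver
from appium.options.android import UiAutomator2Options
from appium.webdriver.common.appiumby import AppiumBy

DEVICES = ["Pixel_6_API_33", "Galaxy_S22_API_32"]   # hypothetical emulators


@pytest.fixture(params=DEVICES)
def driver(request):
    options = UiAutomator2Options()
    options.device_name = request.param
    options.app = "/path/to/app-under-test.apk"     # hypothetical path
    drv = webdriver.Remote("http://localhost:4723", options=options)
    yield drv
    drv.quit()


def test_main_screen_loads(driver):
    # The same front-end check runs once per device in DEVICES: test on
    # a single device first, then conquer platforms during
    # compatibility testing.
    title = driver.find_element(AppiumBy.ID, "com.example:id/title")
    assert title.is_displayed()
```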


Glossary
Requirements

A Functional Requirement (FR) is a description of the service that the software must offer.

Business requirements

These are high-level requirements taken from the business case of the project. For example, a mobile banking
service system provides banking services to Southeast Asia. The business requirement decided for India is account
summary and fund transfer, while for China account summary and bill payment are decided as business requirements.

Architectural and Design requirements

These requirements are more detailed than business requirements. They determine the overall design required to implement the
business requirement.

Example requirement shown below:

Banking use case

Bill Payment. This use case describes how a customer can log in to net banking and use the Bill Payment facility.

Requirement

The customer can see a dashboard of outstanding bills from registered billers. He can add, modify, and delete biller details.
The customer can configure SMS and email alerts for different billing actions. He can see a history of past paid bills.

The actors starting this use case are bank customers or support personnel.

System and Integration requirements

At the lowest level, we have system and integration requirements. These are detailed descriptions of each requirement, which
can take the form of user stories written in everyday business language. The requirements are detailed enough that
developers can begin coding.

Non-Functional Requirements


A Non-Functional Requirement (NFR) defines a quality attribute of a software system; for example, performance.

Maintenance

Regression and maintenance suites.

Specifications/Technical specifications (backend testing, frontend testing)

Difference between Specifications and requirements:

Requirements are what your program should do, the specifications are how you plan to do it.

The specification represents the application from the perspective of the technical team. Specifications and requirements
roughly communicate the same information, but to two completely different audiences.

Timeline for the client

The time span defined by the nature of the testing estimation; this depends on the amount and seniority of the resources
that the proposal contains to complete the project.

Quality Assurance

The key concept of manual testing is to ensure that the application is error-free and works in conformance with the
specified functional requirements.

Training materials

Research whether the client possesses a library of documents for training their employees or customers on the product's
usage and setup.

Current state of testing analysis

The degree of test formality depends on:

· The type of application under test

· Standards followed by your organization


· The maturity of the development process.

Current coverage, documentation, and the involved team must be analyzed or defined.

How many test documents do they have?

Here are the important types of test documentation:

Test policy

It is a high-level document which describes principles, methods, and all the important testing goals of the organization.

Test strategy

A high-level document which identifies the Test Levels (types) to be executed for the project.

Test plan

A test plan is a complete planning document which contains the scope, approach, resources, schedule, etc. of testing
activities.

Requirements Traceability Matrix

This is a document which connects the requirements to the test cases.
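As a minimal illustration, a traceability matrix can be kept as plain data and checked for coverage gaps; the requirement and test-case IDs below are hypothetical, reusing the banking examples from this glossary.

```python
# Hypothetical requirements-to-test-cases mapping.
rtm = {
    "FR-001 Bill payment":   ["TC-101", "TC-102"],
    "FR-002 Fund transfer":  ["TC-201"],
    "FR-003 Billing alerts": [],   # gap: requirement without test cases
}

uncovered = [req for req, cases in rtm.items() if not cases]
print("Requirements without test coverage:", uncovered)
```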

Test Scenario

A test scenario is an item or event of a software system which could be verified by one or more test cases.

Test case

It is a group of input values, execution preconditions, expected execution postconditions, and results. It is developed for a test
scenario.
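Expressed as code, the same elements map naturally onto an automated test; in the sketch below (assuming pytest), transfer() is a hypothetical stand-in for the system under test.

```python
import pytest


def transfer(balance, amount):
    """Hypothetical system under test: debit an account."""
    if amount > balance:
        raise ValueError("insufficient funds")
    return balance - amount


def test_transfer_within_balance():
    balance = 100                        # execution precondition
    new_balance = transfer(balance, 40)  # input values
    assert new_balance == 60             # expected execution postcondition


def test_transfer_over_balance_is_rejected():
    with pytest.raises(ValueError):      # expected result: rejection
        transfer(100, 150)
```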

Test Data

Test data is data that exists before a test is executed. It is used to execute the test case.

Defect Report


A defect report is a documented report of any flaw in a software system that fails to perform its expected function.

Test summary report

A test summary report is a high-level document that summarizes the testing activities conducted as well as the test results.

General access to client’s application infrastructure

Once the technology infrastructure has been identified, access to every level related to the needs of testing must be provided to
guarantee the committed coverage.

Initial scope of testing

The set that contains the coverage of the requirements/test cases defined; this can be divided into prioritized categories.

Risk analysis: A 3-Step process

 Identify the Risks

o Organizational: A risk related to your human resources or your testing team. For example, in your
project, a lack of technically skilled members is a risk; not having enough manpower to complete the project
on time is another.

o Technical Risk: The probability of loss incurred during the execution of a technical process, such as
untested engineering or a wrong testing procedure.

o Business Risk: A risk involving an external entity; it may come from your company or your customer,
but not from your project.

 Analyze the Impact of each Identified Risk

 Take Countermeasures for the Identified & Analyzed Risks (see the sketch below)
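The three steps can be captured in a simple prioritized risk register, as in the sketch below; the risks, probabilities, and impact scores are hypothetical.

```python
# Step 1: identify risks as (description, category, probability, impact).
risks = [
    ("Lack of technically skilled members", "organizational", 0.4, 4),
    ("Wrong testing procedure",             "technical",      0.2, 5),
    ("Customer-driven scope change",        "business",       0.5, 3),
]

# Step 2: analyze impact; here exposure = probability x impact (1-5).
analyzed = sorted(risks, key=lambda r: r[2] * r[3], reverse=True)

# Step 3: take countermeasures, starting with the highest exposure.
for description, category, probability, impact in analyzed:
    print(f"{description} ({category}): exposure = {probability * impact:.1f}")
```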

Performance Definitions

 KPI
o Key Performance Indicators. Metrics that enable measuring the performance results and success, according
to parameters chosen to be relevant and important.
 SLA
o Service Level Agreement. An agreement or contract between an organization and the IT department that
details the obligations and expectations in terms of response times and user experience.
o The SLA should include a description of the services to be provided and their expected service levels,
metrics by which the services are measured, and the duties and responsibilities of each party


 Business critical transactions
o Transactions identified by the client that:
 Are executed frequently
 Are critical for the business
 Have high resource utilization
 Workload related data
o Business critical transaction historical data or expected transactional data:
 Application usage patterns identified by the client
 Transactional volumes
 Response times
 APM tools
o Application Performance Management/Monitoring tools.
 Refers to the monitoring and management of the performance and availability of software
applications. APM strives to detect and diagnose complex application performance problems to
maintain an expected level of service.
 Monitors or Counters
o Used to continuously keep track of the status of the system under test, to have live warning of failures,
defects, or problems and to improve them. There are monitors for servers, networks, databases, security,
performance, websites and internet usage, applications, etc.
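Tying these definitions together, a load test of business-critical transactions might look like the sketch below, assuming Locust as the performance testing tool; the host, endpoints, task weights, and the 2-second SLA threshold are hypothetical.

```python
# Minimal Locust sketch: weighted business-critical transactions with
# a hypothetical SLA check. Host and endpoints are placeholders.

from locust import HttpUser, task, between


class BankingUser(HttpUser):
    host = "https://app.example.com"     # hypothetical system under test
    wait_time = between(1, 3)            # models user think time

    @task(3)                             # weight: executed frequently
    def account_summary(self):
        with self.client.get("/accounts/summary",
                             catch_response=True) as resp:
            if resp.elapsed.total_seconds() > 2.0:
                resp.failure("exceeded hypothetical 2s SLA")

    @task(1)
    def bill_payment(self):
        self.client.post("/bills/pay",
                         json={"biller": "demo", "amount": 10})
```

Run with `locust -f locustfile.py`, the resulting response-time and failure metrics map directly onto the KPI and SLA definitions above and can be cross-checked against APM monitors during the run.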

