ServiceNow - QA Reference


Automated Test Framework (ATF)

Topics covered:
• What is ATF in ServiceNow?
• Benefits of automation in ITSM testing
• Creating and running ATF test cases
• Automating Incident, Request, and Change workflows
• Debugging and scheduling ATF tests

1. Automated Test Framework (ATF) - What is ATF in ServiceNow?

• Definition:
ATF is a built-in application in ServiceNow used to automate the testing of applications
and configurations.
• Purpose:
Ensures that customizations or updates don’t break existing functionalities after
upgrades or deployments.
• Example:
Automatically test if the Incident form loads correctly after a UI policy change.

Benefits of Automation in ITSM Testing

• Faster regression testing


o Run hundreds of tests in minutes, saving time compared to manual QA.
• Improved accuracy
o Removes human error from repetitive testing tasks.
• Early defect detection
o Run tests during development to catch issues before production.
• Reusable test steps
o Common test components can be reused across different tests.
• Supports Agile and CI/CD
o Integrates well with DevOps pipelines for faster releases.

Creating and Running ATF Test Cases

• Test Case:
A test scenario built in ATF to simulate user actions and verify expected results.
• Steps to Create:
1. Navigate to Automated Test Framework > Test Cases.
2. Click New, give the test a name.
3. Add Test Steps like Open Form, Set Field Values, Submit Form, Assert Field
Value.
4. Save and Run Test from the UI or schedule it.
• Example:
Test that creating a new Incident with priority 1 auto-assigns to the correct support group (see the script sketch below).
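
A minimal server-side sketch of that example is shown below, written in background-script style (the same logic could also sit in an ATF "Run Server Side Script" step). The category value and the "Network Support" group name are assumptions for illustration; in practice this check is usually built from declarative ATF steps (Open a New Form, Set Field Values, Submit a Form, Field Values Validation).

```javascript
// Sketch only: approximates the auto-assignment check in background-script style.
// The category value and the 'Network Support' group name are assumptions.
var inc = new GlideRecord('incident');
inc.initialize();
inc.short_description = 'ATF check: priority 1 auto-assignment';
inc.category = 'network';   // assumed category that drives assignment
inc.impact = 1;             // impact + urgency = 1 normally yields Priority 1
inc.urgency = 1;
var incSysId = inc.insert();

// Re-read the record so assignment rules / business rules have applied.
var check = new GlideRecord('incident');
if (check.get(incSysId)) {
    var group = check.assignment_group.getDisplayValue();
    if (group === 'Network Support') {
        gs.info('PASS: incident auto-assigned to ' + group);
    } else {
        gs.error('FAIL: expected Network Support, got "' + group + '"');
    }
}
```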
Automating Incident, Request, and Change Workflows

• Incident Workflow Automation:


o Test that a new Incident is assigned to the correct team based on the category.
o Example step: Assert that Assignment Group = 'Network Support'.
• Request Workflow Automation:
o Validate that a request item creates tasks and sends notifications correctly.
o Example: Automate end-to-end flow for a laptop request.
• Change Workflow Automation:
o Ensure that when a normal Change is created, it requires approval.
o Example: Test that CAB approval is requested for a Normal Change (see the sketch below).
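
A hedged sketch of the Change approval check follows. On many base instances approval is requested when the Change moves into Assess/Authorize rather than at creation, so the sketch advances the state first; the choice values used ('normal', '-4', 'requested') are assumptions to verify against the instance's change state model.

```javascript
// Sketch only: create a Normal change, move it toward Assess, and check the
// approval field. The choice values ('normal', '-4', 'requested') are
// assumptions; approvals may also be generated asynchronously by a flow,
// in which case a short wait or re-check is needed.
var chg = new GlideRecord('change_request');
chg.initialize();
chg.short_description = 'ATF check: Normal change approval';
chg.type = 'normal';        // assumed choice value for a Normal change
var chgSysId = chg.insert();

var assess = new GlideRecord('change_request');
if (assess.get(chgSysId)) {
    assess.setValue('state', '-4');   // assumed value for the Assess state
    assess.update();
}

// Re-query so values written by business rules or the approval engine are visible.
var verify = new GlideRecord('change_request');
if (verify.get(chgSysId)) {
    var approval = verify.getValue('approval');
    gs.info(approval === 'requested'
        ? 'PASS: approval requested for the Normal change'
        : 'FAIL: approval state is ' + approval);
}
```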

Debugging and Scheduling ATF Tests

• Debugging ATF Tests:


o Use Step-by-Step execution to identify which step fails.
o View Logs and Error messages from the test results page.
o Add Breakpoints to isolate issues.
• Scheduling ATF Tests:
o Use Test Suites to group related test cases.
o Create scheduled jobs to run tests after deployments or nightly.
o Example: Run a suite of Incident tests every day at 2 AM (a results-review sketch follows).
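
As a rough sketch of how a nightly run could be reviewed, the script below lists recent ATF failures. The table and field names (sys_atf_test_result, 'status', value 'failure') are assumptions that should be confirmed on the instance; the same information is available from the ATF test results module in the UI.

```javascript
// Sketch only: list ATF failures from roughly the last day so a scheduled
// nightly suite can be reviewed quickly. Table and field names are assumptions.
var result = new GlideRecord('sys_atf_test_result');
result.addQuery('status', 'failure');
result.addQuery('sys_created_on', '>=', gs.daysAgoStart(1));
result.query();
while (result.next()) {
    gs.info('ATF failure: ' + result.test.getDisplayValue() +
            ' (' + result.getValue('sys_created_on') + ')');
}
```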

Test Suites and Reusability

• Test Suites:
Group multiple test cases to run together.
• Reusable Test Steps:
Create shared test steps for logging in, opening forms, setting values.

Upgrade Testing

Topics covered:
• Importance of testing after ServiceNow upgrades
• Running ATF for regression validation
• Upgrade impact analysis
• Clone testing and post-clone validation

Upgrade Testing – Definition & Explanation

• Definition:
Upgrade testing in ServiceNow ensures that customizations, configurations,
workflows, and integrations continue to work properly after the platform or
application upgrade.
• Purpose:
To validate system stability post-upgrade and prevent issues from affecting production.
• Example:
After upgrading to the Tokyo release, testing if the Incident assignment rules, UI
policies, and email notifications still work as expected.

Importance of Testing After ServiceNow Upgrades

• Upgrades can override customizations


o Core files or scripts may be replaced by base system versions.
• Prevent business disruption
o Ensures ITSM processes like Incident, Change, and Request remain functional.
• Check integration stability
o Validates third-party integrations (e.g., LDAP, Azure AD, email, APIs).
• Regression risk
o New features or changes can unintentionally break existing functionalities.
• Maintain compliance and SLAs
o Ensures no violations due to broken workflows or rules.

Running ATF for Regression Validation

• Regression Testing with ATF:


o Use Automated Test Framework (ATF) to re-run existing test cases after
upgrade.
• Steps:
1. Create or reuse ATF Test Suites.
2. Execute test suites to verify critical flows (e.g., create Incident, approve Change).
3. Review pass/fail logs and fix issues.
• Advantages:
o Saves time vs manual testing.
o Detects breakages quickly and consistently.
o Supports parallel testing of multiple apps (ITSM, HR, CSM).
• Example:
Test suite that validates:
o Incident form loads correctly
o Assignment group auto-populates
o SLA is applied properly

Upgrade Impact Analysis

• Definition:
Identifying what parts of the instance (e.g., scripts, workflows, UI components) might be
impacted by the upgrade.
• Tools Used:
o Upgrade Monitor
o Upgrade History
o Compare Versions (on Script Includes, Business Rules, etc.)
• QA Role:
o Review Skipped Changes (customizations skipped during upgrade).
o Compare before and after versions of updated records.
o Work with dev/admin teams to remediate or accept base changes.
• Example:
A customized incident.create business rule is skipped during the upgrade; QA validates that it still triggers correctly (see the review sketch below).
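
A review-script sketch for the skipped-change scenario is below. It assumes the upgrade history tables and fields commonly found on instances (sys_upgrade_history, sys_upgrade_history_log, 'upgrade_history', 'disposition', 'file_name', 'to_version'); confirm these names first, or simply review the same data through the Upgrade History module.

```javascript
// Sketch only: pull the most recent upgrade record and list entries marked as
// skipped so QA can target them for review. Table and field names are assumptions.
var upgrade = new GlideRecord('sys_upgrade_history');
upgrade.orderByDesc('sys_created_on');
upgrade.setLimit(1);
upgrade.query();
if (upgrade.next()) {
    gs.info('Latest upgrade: ' + upgrade.getValue('to_version'));
    var log = new GlideRecord('sys_upgrade_history_log');
    log.addQuery('upgrade_history', upgrade.getUniqueValue());
    log.query();
    while (log.next()) {
        // Compare the display value to avoid guessing the stored choice value.
        if (log.getDisplayValue('disposition') === 'Skipped') {
            gs.info('Skipped: ' + log.getValue('file_name'));
        }
    }
}
```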

Clone Testing and Post-Clone Validation

• What is Cloning?
Copying production data into a sub-production instance (e.g., Dev, Test, QA) for safe
testing.
• Why QA Cares:
o Ensures the clone environment reflects production accurately.
o Prevents invalid test results due to broken data or missing configurations.
• Post-Clone Validation Tasks:
o Check that MID Servers, integrations, and email configurations are disabled or re-pointed.
o Validate user roles, assignment groups, and sample records.
o Run ATF test cases to validate core functionality.
• Example:
After a clone from Prod to QA, ATF tests confirm that Incident creation, email
notifications, and Change approvals work without error.
Summary Table:

Test Area | QA Activity
Upgrade Testing | Test functionality after a version upgrade
ATF for Regression | Run automated tests to validate system behavior post-upgrade
Upgrade Impact Analysis | Analyze and validate skipped or changed scripts/configurations
Clone Testing | Validate cloned environments for accuracy and functionality

Post-Upgrade Testing – QA Checklist

General Post-Upgrade Checks

• Validate that the instance is accessible and stable.


• Check system logs for upgrade errors or warnings.
• Review skipped changes using Upgrade History.
• Confirm all integrations (LDAP, Email, MID Server, APIs) are functioning.
• Validate user login and role-based access.

Functional Regression Testing (Using ATF or Manual)

Incident Management

• Create a new Incident – verify form loads and mandatory fields.


• Auto-assignment works based on category/subcategory.
• SLAs are attached correctly (see the validation sketch after this checklist).
• Notifications are triggered on assignment/update.
• Incident can be resolved and closed without error.
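
The SLA item in this checklist can be validated with a short server-side sketch like the one below. It assumes at least one out-of-the-box SLA definition matches a Priority 1 incident; adjust the filter to the instance's actual SLA definitions.

```javascript
// Sketch only: create a test incident and confirm an SLA record was attached
// by querying task_sla. Assumes an SLA definition matches Priority 1 incidents.
var inc = new GlideRecord('incident');
inc.initialize();
inc.short_description = 'Post-upgrade check: SLA attachment';
inc.impact = 1;
inc.urgency = 1;            // impact + urgency = 1 normally yields Priority 1
var incSysId = inc.insert();

var sla = new GlideRecord('task_sla');
sla.addQuery('task', incSysId);
sla.query();
if (sla.hasNext()) {
    gs.info('PASS: ' + sla.getRowCount() + ' SLA record(s) attached');
} else {
    gs.error('FAIL: no SLA attached to the test incident');
}
```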

Request Management

• Submit a new Service Catalog request (e.g., Laptop request).


• Requested Item (RITM) and Tasks are generated properly (see the sketch after this checklist).
• Approval flow is triggered.
• Request closure process works as expected.
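
A sketch for verifying RITM and task generation is below. The request number is a placeholder for a request created by a catalog-order test step or manually; the tables used (sc_request, sc_req_item, sc_task) are the standard catalog tables.

```javascript
// Sketch only: given the number of a request created during the test
// ('REQ0010001' is a placeholder), confirm that a Requested Item and at
// least one catalog task were generated.
var req = new GlideRecord('sc_request');
if (req.get('number', 'REQ0010001')) {
    var ritm = new GlideRecord('sc_req_item');
    ritm.addQuery('request', req.getUniqueValue());
    ritm.query();
    gs.info('RITMs generated: ' + ritm.getRowCount());

    var task = new GlideRecord('sc_task');
    task.addQuery('request', req.getUniqueValue());
    task.query();
    gs.info('Catalog tasks generated: ' + task.getRowCount());
}
```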

Change Management

• Create Normal Change – validate form and default values.


• Approval flow (CAB/Manager) is initiated.
• Change moves through states: New → Assess → Authorize → Implement →
Review.
• Check for Change conflicts or blackout windows.

ATF Test Suite – Post-Upgrade Validation (Example)

Suite Name: Post-Upgrade ITSM Regression Suite


Test Case Name | Description
Incident Form Load Test | Validates that the Incident form opens correctly.
Incident Auto Assignment Test | Checks assignment logic based on category/subcategory.
Incident SLA Test | Ensures SLAs apply correctly after submission.
Service Catalog Order Test | Verifies RITM and Task creation from the catalog item.
Change Request Creation Test | Confirms Change record creation with correct defaults.
Change Approval Workflow Test | Ensures approval records are created as expected.
User Role Validation Test | Verifies assigned roles still allow proper access.
Email Notification Test | Confirms notifications are sent for Incident updates.

Post-Clone Validation Checklist (Optional after Upgrade + Clone)

• Disable outbound email to avoid sending test emails (see the property-check sketch after this checklist).


• Validate test data and reference fields.
• Reconfigure integrations to point to test/staging endpoints.
• Run ATF smoke test suite in cloned environment.
• Confirm MID Servers are not pointing to production.
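
Two of these checks can be scripted as a quick post-clone smoke test, sketched below. The property and table names (glide.email.smtp.active, ecc_agent) are the usual ones but should be confirmed on the instance; clone profiles and data preservers often handle email deactivation automatically.

```javascript
// Sketch only: post-clone spot checks for outbound email and MID Servers.
// Property and table names are assumptions to confirm on the instance.
var smtpActive = gs.getProperty('glide.email.smtp.active');
gs.info('Outbound SMTP active: ' + smtpActive);   // expect 'false' in a cloned sub-prod

// List MID Servers and their status so none are left pointing at production.
var mid = new GlideRecord('ecc_agent');
mid.query();
while (mid.next()) {
    gs.info('MID Server ' + mid.getValue('name') + ' status: ' + mid.getValue('status'));
}
```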

Reporting & Dashboards

Topics covered:
• Creating reports in ServiceNow
• Validating report data and filters
• Performance Analytics overview
• Testing widgets and dashboards

Reporting in ServiceNow

• Definition: Reports in ServiceNow are used to present data from tables in a visual or
tabular format for analysis, tracking, or auditing.
• QA Focus:
o Ensure report generation aligns with business requirements.
o Validate the data source (table), filter conditions, and output format.
o Confirm permissions and visibility are restricted to correct roles.
• Examples:
o Report: “Open Incidents by Assignment Group”
▪ QA Validation:
▪ Check if only "incident" table is used.
▪ Verify status is filtered to "New", "In Progress", etc.
▪ Confirm data matches actual incidents on the system.

Creating Reports in ServiceNow

• Definition: The process of configuring a new report by choosing a source table, defining
filters, selecting a visualization type, and saving it for users.
• QA Tasks:
o Validate that created reports pull the correct data using proper filters.
o Check chart types (e.g., bar, pie, time series) render correctly.
o Test if reports load without performance issues or data delays.
• Example:
o A Bar Chart Report showing “Number of Requests per Department”
▪ Ensure departments match request table data.
▪ Confirm color legends and x/y axes are labeled correctly.

Validating Report Data and Filters

• Definition: QA ensures that the data displayed in reports is accurate and matches the
filtered criteria.
• Key QA Checks:
o Cross-check data in the report with table records.
o Validate each filter (date ranges, assignment group, state).
o Ensure filter logic (AND/OR conditions) is applied as expected.
• Example:
o Report Filter: "Created > Last 30 Days AND Priority = 1"
▪ Validate report does not show records older than 30 days.
▪ Confirm only “Priority 1” tickets are included (a cross-check script sketch follows).
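
A cross-check sketch for that filter is below: it counts incidents directly from the table using the same conditions, so the result can be compared with the total shown on the report.

```javascript
// Sketch only: count incidents matching "Created > Last 30 Days AND Priority = 1"
// straight from the table and compare the number with the report total.
var ga = new GlideAggregate('incident');
ga.addQuery('priority', 1);
ga.addQuery('sys_created_on', '>=', gs.daysAgoStart(30));
ga.addAggregate('COUNT');
ga.query();
if (ga.next()) {
    gs.info('Incidents created in the last 30 days with Priority 1: ' +
            ga.getAggregate('COUNT'));
}
```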

Performance Analytics (PA) Overview

• Definition: Performance Analytics is an advanced analytics feature in ServiceNow that provides trend-based visualizations using historical data, scorecards, indicators, and breakdowns.
• QA Focus:
o Validate Indicator Source: Confirm it pulls the correct data from the correct table.
o Test Data Collection Jobs: Ensure jobs run as scheduled and collect expected
data.
o Scorecard Testing: Check if historical trends match expectations.
o Validate Widgets: Ensure that charts, KPIs, and breakdowns work correctly.
• Example:
o PA Indicator: “Average Resolution Time (7 days)”
▪ QA Task: Compare historical resolution times to calculated PA score.
▪ Confirm PA dashboards update daily with the correct trend.

Testing Widgets and Dashboards

• Definition: Widgets are individual components (charts, scorecards, lists, etc.) on a dashboard. Dashboards are collections of widgets providing a visual summary of data.
• QA Tasks:
o Verify widgets load data correctly and quickly.
o Check visual elements (charts, legends, labels) for accuracy.
o Validate filter interactivity – applying a global filter updates all widgets.
o Role-based access: Ensure only authorized users can see certain widgets.
• Example:
o Dashboard: “Change Management Overview”
▪ Widgets:
▪ “Open Changes by State”
▪ “Average Approval Time”
▪ QA Checks:
▪ Click filters like “This Month” and validate widgets update.
▪ Confirm calculations for averages or percentages are correct.
Summary Table (Quick Reference)

Feature | QA Responsibility | Example Scenario
Reports | Validate data, filters, visibility | “Open Incidents by Assignment Group”
Creating Reports | Test configuration, filters, output type | Bar chart of requests by department
Validating Filters | Cross-check records vs filters applied | “Created > 30 Days AND Priority = 1”
Performance Analytics | Test indicators, data jobs, scorecards | “Average Resolution Time” scorecard
Widgets/Dashboards | Check visual accuracy, interactivity, access | “Change Management Dashboard” with filters
