JMeter
A Gentle Introduction to writing,
running and maintaining Test
Scripts
JMeter concepts
Test Plan
Thread Group (Test Script)
Thread (Users)
Configuration (input values, usernames/IDs, etc)
Sampler (Requests, Clicks, Debugging, etc)
Assertion (Tests for validity or pass/fail of script)
Listener (Reports, execution summary, etc)
Controller (Throughput, Filter % app usage, etc)
Timer (Scheduling, delays, simulate user reaction time, etc)
Test Plan
Project/feature-specific
Thread Group
Application/webpage functionality-specific
Thread
IMPORTANT: Thread = User
Sampler
Event(s) to perform and observe
Assertion
What to check for each Test
Listener
Type(s) of monitoring to do during
Tests, e.g. View Results Tree
Controller
Any custom code to run or values to
set before finishing each Test script
Timer
When to run each test
Should there be a delay in between,
or should they all run at the same
time?
Should it simulate a real human user
clicking, reading/waiting, then clicking
again, or flow through quickly
like a robot?
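The human-vs-robot pacing above is what JMeter timers control. Outside JMeter, the same idea can be sketched in a few lines of Python (the function names here are illustrative, not part of any JMeter API; the 1-3 second range mimics a Uniform Random Timer):

```python
import random
import time

def think_time(min_s=1.0, max_s=3.0):
    """Return a random 'think time' in seconds, like JMeter's
    Uniform Random Timer: a human pauses between clicks."""
    return random.uniform(min_s, max_s)

def simulate_user(actions, humanlike=True):
    """Run a list of zero-argument actions in order, pausing
    between them when simulating a human; a 'robot' user
    flows straight through with no delay."""
    for action in actions:
        action()
        if humanlike:
            time.sleep(think_time())
```

With `humanlike=False` all actions fire back-to-back, which is exactly the unrealistic "robot" load pattern timers exist to prevent.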
DEMO
We'll run a few sample Performance
Tests against the version of the
website delivered so far
Performance Metrics
Throughput = Requests/sec
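The throughput metric is just completed requests divided by elapsed wall-clock time, which is how JMeter's summary reports derive it. A minimal sketch (the function is illustrative, not a JMeter API):

```python
def throughput(num_requests, elapsed_seconds):
    """Throughput in requests/sec: completed requests over
    elapsed wall-clock time."""
    if elapsed_seconds <= 0:
        raise ValueError("elapsed time must be positive")
    return num_requests / elapsed_seconds

# e.g. 1200 requests completed over a 60 s run
rate = throughput(1200, 60)  # 20.0 requests/sec
```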
Application Optimizations
Databases are often the choke point
for application performance
JMeter properties
Launch parameter values from the command line to
customize, for example HTTP Request Defaults
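A typical non-GUI run looks like the sketch below; `-n` (non-GUI), `-t` (test plan), `-l` (results log) and `-J` (set a JMeter property) are standard JMeter CLI options, while the file names and the `users`/`rampup`/`host` property names are placeholders you would define yourself and read back inside the plan:

```shell
# Run a test plan headless (-n), overriding JMeter properties (-J)
# from the command line. plan.jmx, results.jtl and the property names
# (users, rampup, host) are illustrative placeholders.
# Inside the plan, read them with ${__P(users,10)} etc.,
# where 10 is the default if the property isn't set.
jmeter -n -t plan.jmx -l results.jtl \
  -Jusers=50 -Jrampup=30 -Jhost=staging.example.com
```

This lets one script drive many scenarios: the same plan can simulate 50 or 500 users just by changing a launch parameter.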
Selenium
Front-end testing & automation tool
How to record your front-end
interactions and import them into
JMeter, two options:
Selenium IDE
BlazeMeter
JMeter Shortcuts
CTRL+E = Clear all Test Results from
past runs
CTRL+SHIFT+E = Clear current Test
Results
CTRL+T = Toggle an item of Test
Plan on or off
Questions?
I have a few for you:
What types of input data would you need to
Performance Test your respective business
units?
What should happen to your business unit(s)
when the application fails and in what cases
would that be related to usage/performance?
What kind of performance are you expecting?
Any minimum level that would prevent launch?