PC-Dex
Team: Richard Rivera & Kevin Swei
Parts of Testing
1. What is your target audience? (age, occupation, comfort level with technology,
etc.)
The target audience for the PC parts e-commerce website is quite broad, aiming to cater
to anyone who has an interest in PC gaming and computers. This includes everyone from
casual enthusiasts just starting to explore the world of PC building to experienced system
administrators. The age range is also varied, hoping to attract both younger audiences in
their teens and twenties looking to customize their first gaming rig, as well as older
professionals looking to upgrade their workstations. While some potential customers may
have a lower comfort level with technology if they are new to PC assembly, the site also
aims to provide value for experienced users already adept at picking components. By
stocking a wide selection of parts at competitive prices and including educational guides,
the goal is to make the purchasing process simple and accessible for all levels of interest
and technical proficiency. Whether a casual hobbyist or a seasoned professional, any consumer with an interest in PCs should feel at home on the site.
2. Who will be testing your project? If you have a user group, please state
everyone's names (first names okay) and describe how they fit your target
audience demographics. Otherwise, just do this part for your client.
To gather initial feedback and test the usability of their website project, Richard and
Kevin demonstrated the site to various family members and friends. Kevin's younger
cousin Mark, who is 17 and heavily into PC gaming, was able to easily navigate the
catalog and checkout process. He provided feedback on improving some tooltip text and
product descriptions. Their friend Michael, a 28-year-old Best Buy computer salesman, focused on testing the site's search and component filtering. Because he constantly researches upgrades for customers, he felt the site would be a useful one-stop shop.
Richard's cousins John and Christian, both of whom work in IT, also had a look and
ensured all technical specifications were displayed correctly for professionals. Overall,
the target demographic responded positively to the clean interface focused only on parts.
Both novices like Mark and experts like Michael liked the simple concept of the website. This early user testing provided invaluable guidance that Richard and Kevin are now applying as they refine the site.
3. What are the main tasks you would like your client/users to be able to complete
while testing? Your tasks should be specific and measurable (i.e. Can they sign
up and log in? Can they navigate to one specific part of the site? Are they able to
play through the first level of the game?). To get good feedback, you should have
3-5 tasks to test them on.
To gather useful feedback, Richard and Kevin have identified several important tasks for
their clients and test users to complete on the PC parts e-commerce website. First, they
want to ensure the overall site design and structure are simple and intuitive to navigate.
Users should be able to easily browse product categories and find what they need without
confusion. A second task is to check that all product information like specifications,
description details, and images are displaying correctly on individual pages. Thirdly,
users will review the order of pages like the cart, checkout, and payment screens to make
sure the purchasing process flows seamlessly. A final task involves attempting to create
an account and log in to the administrator panel to manage orders and inventory. From
testing these core website functions, Richard and Kevin hope to determine if any
improvements need to be made. While initial feedback has been positive, more user
testing may reveal opportunities to enhance the user experience further before the full
launch.
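The browse-to-cart-to-checkout flow the testers are asked to exercise can be sketched as a minimal cart model. This is an illustrative sketch only; the product names and prices below are invented, not taken from the actual site:

```python
class Cart:
    """Minimal cart model: add items, then total them at checkout."""

    def __init__(self):
        self.items = {}  # product name -> [unit_price, quantity]

    def add(self, name, price, qty=1):
        # Adding the same product again just bumps its quantity.
        if name in self.items:
            self.items[name][1] += qty
        else:
            self.items[name] = [price, qty]

    def total(self):
        return sum(price * qty for price, qty in self.items.values())


cart = Cart()
cart.add("RTX 4070", 549.99)          # hypothetical product and price
cart.add("16GB DDR5 RAM", 79.99, qty=2)
print(f"${cart.total():.2f}")         # prints "$709.97"
```

A real checkout would also track shipping and tax, but even this skeleton makes the "add to cart, then verify the total" task concrete and measurable for testers.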
4. Observe and take notes while your tester tries out your product. Testing is best
done face-to-face. It also helps if your testers speak out loud so you can hear
their thought process as they attempt your tasks. That way, if they get stuck or
aren't able to complete a task, you have an idea of why and it will be easier for
you to fix.
Richard and Kevin observed enthusiastically as their test users put the e-commerce site
through its paces. They had prepared notepads to document any issues observed. As their
friend Michael began browsing product listings, they listened closely to his verbalized
thoughts. Michael commented positively on the clear category filters but suggested
adding search autocomplete for ease of finding specific items. The other testers provided
equally valuable feedback, such as requests to reorganize the admin dashboard and add
product reviews. By actively watching users and hearing where they got stuck, Richard
and Kevin gained key insights. Making notes of problems and pain points experienced
will allow them to strategically prioritize improvements. With further iterative testing and
refinements based on user feedback, they can continue enhancing the customer
experience.
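Michael's search-autocomplete suggestion could be prototyped with simple case-insensitive prefix matching. The catalog entries here are made-up examples, and a production version would likely query the product database rather than an in-memory list:

```python
def autocomplete(query, catalog, limit=5):
    """Return up to `limit` catalog names that start with the query (case-insensitive)."""
    q = query.lower()
    return [name for name in catalog if name.lower().startswith(q)][:limit]


catalog = ["Ryzen 7 7800X3D", "Ryzen 5 7600", "RTX 4070", "Radeon RX 7800 XT"]
print(autocomplete("ry", catalog))   # ['Ryzen 7 7800X3D', 'Ryzen 5 7600']
print(autocomplete("rtx", catalog))  # ['RTX 4070']
```

Prefix matching is the simplest starting point; fuzzy or substring matching could follow later if testers still struggle to find specific items.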
1. Set up a meeting with your client to go over your work. This should ideally be
face-to-face so you can get their initial reactions and see how they navigate your
product in real-time. If they test on your device, you may also be able to record your
screen and audio to capture their test to review later (with their permission). Other
good alternatives would be to have them share their screens over Zoom, etc. while
testing or have them record themselves using the product and narrating what
they're doing.
Richard and Kevin were eager to demo their PC parts e-commerce website for their client
Jason. As this was a beginner project for the capstone festival, they kept the design and
features very simple. During the face-to-face meeting, they had Jason navigate the site to
see how easy it was to use. As Jason browsed product listings and placed a mock order,
Richard and Kevin observed his basic reactions. They noticed Jason seemed pleased with
how organized everything was laid out for someone new to PC building. When checking
out, Jason stated the process was straightforward as intended for their target audience. To
conclude, Jason told the team the site appeared like a good starting point for novices,
with its minimal but focused presentation. While still very basic, Jason's positive
feedback confirmed Richard and Kevin had achieved their goal of creating an accessible initial version of the site.
2. Share your tasks with the client. Encourage them to share their thought process
aloud so you have an idea of how your project will be used. If they get stuck, resist
the urge to give them help or hints. Encourage them to find the solution on their
own, and if they can't, remind them that your project is still a work in progress, it's
not their fault that they are not able to complete the task, and then move on to the
next one.
Richard and Kevin presented the testing tasks to client Jason, encouraging him to think
out loud as he interacted with the site. The first task was to browse product listings and understand the differences between components. For the second task of adding items to his cart,
Jason got momentarily stuck but then figured out on his own how to click the "Add to
Cart" button. When checkout presented some confusion, Richard and Kevin reminded
Jason that this was still a work in progress, and not to feel bad if he couldn't complete
everything successfully. They moved on to the admin dashboard, where Jason smoothly
simulated adding a new product. Throughout, Jason's narration of his thought process
was invaluable feedback to help Richard and Kevin further enhance the user experience.
3. Report on their feedback. Were they able to complete all of the tasks? Did they get
stuck along the way? What will you do to improve your project in the short term
before the festival? In the long term?
Overall, Jason was able to complete most of the tasks Richard and Kevin outlined for
testing their PC parts e-commerce website. He breezed through browsing products and
adding items to his cart with no issues. However, Jason did encounter some confusion
while going through the checkout process, specifically with entering his billing address
details. Based on getting stuck at that point, Richard and Kevin plan to redesign the
checkout forms, likely changing to dropdowns for fields like country and state. In the
short term before their capstone festival demo, they will focus testing and fixes on
smoothing out any remaining rough edges to the purchasing workflow. In the long term,
Jason also suggested adding more product information and reviews to provide more value
to customers. Richard and Kevin are excited to continue expanding on these basics as
they further develop their skills. They appreciate the valuable feedback that will help guide the site's growth.
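The planned switch from free-text billing fields to dropdowns amounts to validating input against a fixed option list. The abbreviated country and state tables below are placeholders for illustration, not the site's real data:

```python
# Hypothetical, abbreviated option lists backing the country and state dropdowns.
STATES_BY_COUNTRY = {
    "US": ["CA", "NY", "TX"],
    "CA": ["BC", "ON", "QC"],
}

def valid_billing_region(country, state):
    """A dropdown-backed field can only hold a listed option, so validation is a lookup."""
    return state in STATES_BY_COUNTRY.get(country, [])


print(valid_billing_region("US", "TX"))  # True
print(valid_billing_region("US", "ON"))  # False: Ontario is not a US state
```

Because the dropdown constrains what users can enter in the first place, the kind of free-text address confusion Jason hit largely disappears, and the server-side check becomes a simple membership test.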
1. Although called a user group, you may want to conduct one person test at a time.
You should have at least 3 people in your target audience test your product.
To get feedback from their target audience, Richard and Kevin had 3 people test their PC
parts website. First was Kevin's friend Mark, who is learning about PC building as a
gaming hobby. Mark felt the site was easy to navigate and helpful for a novice like him. Next, they tested with Richard's cousin John, who works in IT. John reviewed the admin side of the site. Finally, their god-brother Michael tested it. As an avid PC gamer constantly upgrading his rig, Michael
appreciated being able to easily compare different component specs. While all 3
encountered minor issues like typos, overall the clean interface was positively received.
Their feedback will help refine areas like product pages and checkout. Going forward,
Richard and Kevin plan wider audience testing. However, this initial round with family
and friends who share an interest in PCs provided valuable perspectives on usability and design.
2. Set up a meeting(s) with your test users to go over your work. This should ideally be
face-to-face so you can get their initial reactions and see how they navigate your
product in real-time. If they test on your device, you may also be able to record your
screen and audio to capture their test to review later (with their permission). Other
good alternatives would be to have them share their screens over Zoom, etc. while
testing or have them record themselves using the product and narrating what
they're doing.
Richard and Kevin scheduled face-to-face meetings with their test users to get feedback
on the PC parts website. When they met with Mark, John, and Michael, the developers
explained they wanted honest reactions but wouldn't be taking any photos out of respect
for privacy. As their project was still early-stage, Richard and Kevin knew adjustments
would be needed based on usability testing. They had each user navigate the site while
thinking aloud so any issues could be directly observed and addressed. Though recording
screens may have provided additional insights, the testers appreciated not being filmed
due to the preliminary nature of the tests. All three were still able to provide extremely
helpful commentary to improve the customer experience. While photos could have
documented initial impressions, Richard and Kevin felt the personal meetings without
recordings respected their testers' wishes while yielding quality feedback to refine the
work-in-progress site.
3. Share your tasks with your test users. Encourage them to share their thought
process aloud so you have an idea of how your project will be used. If they get stuck,
resist the urge to give them help or hints. Encourage them to find the solution on
their own, and if they can't, remind them that your project is still a work in
progress, it's not their fault that they are not able to complete the task, and then
move on to the next one.
Richard and Kevin met with Mark, John, and Michael to have them test specific tasks on
the PC parts website. Before getting started, they explained that as a work in progress,
there were likely to be some errors or incomplete aspects. They wanted candid feedback
but didn't want the testers to feel bad about any issues encountered. Richard outlined the
tasks - browsing products, adding to cart, checkout, and admin functions. They
encouraged thinking aloud so any problems could be understood. When Mark hit a snag
checking out, Richard reminded him not to feel discouraged, as bugs were to be expected
at this stage. John ran into a similar difficulty but was able to work around it on his own.
Michael smoothly completed most tasks. Their open communication set appropriate
expectations that the project was still in development. The testers' patience and honesty made the sessions productive for everyone.
4. Report on their feedback. Were they able to complete all of the tasks? Did they get
stuck along the way? What will you do to improve your project in the short term
before the festival? In the long term?
When Richard and Kevin reviewed the feedback from Mark, John, and Michael, they
found it very helpful for improving their PC parts website project. While all testers
encountered some issues along the way, they were generally able to complete the
designated tasks. Mark and John both got stuck on certain steps of the checkout process, while Michael suggested adding more detailed component specifications to aid research and comparison shopping. For the short term before their
capstone festival demo, Richard and Kevin will focus on fixing bugs like those
encountered at checkout. They also plan to bolster the information on product pages per the testers' advice. Looking further ahead, the testers recommended expanding the
categories and catalog. Richard and Kevin are grateful for the suggestions, which will only serve to strengthen their project for the Capstone Festival. The testers' feedback confirmed which areas still need work, and Richard and Kevin plan to keep improving the site right up to the festival.