
Introductory Note

The attached report, Usability Evaluation of PeopleSoft Administrative Software, is
posted here with the generous consent of its authors, Steven Lahoda, Ashlee Sroba,
and one other anonymous group member. They submitted this report last year to Jo-
Anne Andre's section of this course.

As you'll see, it's a first-rate report in nearly every respect. I am especially impressed by
the care with which the researchers explained their assumptions and research criteria,
justified their research methods, and presented and interpreted their findings. While the
report isn't flawless, it is, for the most part, superbly written with excellent attention to
editorial detail. The report flows effortlessly from its opening to its final
recommendations, which are clearly anchored in the results and analysis presented in
the report. In short, it's an intelligently written, informative, and well organized report that
is a pleasure to read.

As noted, there are a few minor flaws in the report. Specifically:

• The title wording shifts on p. 1, and the report includes a sprinkling of misused colons.
• After the “Introduction” heading, rather than launch directly into a subsection headed
“Background”, I would recommend including a short paragraph introducing the topic
and the report. In this case, one possibility would be to simply delete the
“Background” heading and to add a sentence after this opening paragraph to
introduce the report itself (e.g., “This report…”). Generally, it’s best to include an
overview after a major section heading rather than beginning immediately with a
subheading.
• I would recommend discussing any concerns about bias in your research methods
section. Alternatively, you might briefly discuss limitations to your study as part of
your conclusions section. (Note: I'm also not keen on "Bias Concerns" as the heading.)
• The "outline" section is rather late. This section would be fine after the purpose
section. It could also work (without the heading) as the second paragraph of the
introduction section.

Regardless, these minor flaws do not detract significantly from an otherwise excellent
piece of work.

Your report doesn't have to be quite this good (or have this slick a design) to still get an
A. But it gives you a sense of something to aim for.

Doug Brent (Jeremy Leipert)


University of Calgary
2500 University Drive N.W.
Calgary, Alberta
T2N 1N4

December 7th, 2007

Matt Gray
Marker, COMS 363
2500 University Drive N.W.
Calgary, Alberta
T2N 1N4

Dear Matt:

Re: PeopleSoft Student Center Usability Study

We have conducted a simple test to evaluate the usability of the PeopleSoft Student Center
website. Our research and conclusions can be found in the attached report: Usability evaluation
of PeopleSoft administrative software.

Our analysis suggests that PeopleSoft is a highly functional website. PeopleSoft has useful as
well as new features, some of which are superior to its predecessor: Infonet. However, the modes
of navigating the site are unsuitable. In order to increase usability and user satisfaction, three
main features should be reconsidered: swapping the contents of the left side tab bar with the
drop-down menus; removing all dead links; and enabling the use of the back button.

We began our study by first familiarizing ourselves with the suggested course readings and
websites, followed by the PeopleSoft site itself. The core of our research consists of usability
tests conducted on non-university students, which allowed us to form the majority of our
conclusions about the website.

If there are any questions or concerns about our report please feel free to contact us at . . . [emails
included here]. We look forward to hearing your views on our report.

Sincerely,

Ashlee Sroba
Steven Lahoda
& Anonymous


Usability evaluation of PeopleSoft administrative software

Coms 363: Final Report
Submitted: 12/06/2007

Prepared for: Matt Gray, Coms 363 Marker

Prepared by Ashlee Sroba, Steven Lahoda, & Anonymous
University of Calgary

Executive Summary

Following its implementation in 2007, the new administrative software PeopleSoft became the target of much end-user criticism. Our hypothesis was that most of the user criticism resulted from unfamiliarity with the software. The goal of this project was to objectively evaluate the usability of PeopleSoft and to provide recommendations for any major flaws identified. We focused our research on the student users of PeopleSoft.

We analyzed the usability of PeopleSoft in two stages: a preliminary evaluation of the software, which was conducted by us, and a usability test conducted on a sample high school student population. Bias concerns were immediately discussed to address the potential for conflict of interest which we, as both investigators and users, may introduce to the study. It was decided that the participants of our research must not be university students so that they are introduced to PeopleSoft without prejudice. High school students were chosen for this role because they are both unfamiliar with PeopleSoft and likely to use it sometime in the future.

The purpose and target audience of PeopleSoft were analyzed, followed by an evaluation of its layout and content. It was found that PeopleSoft boasted several very relevant and functional features (i.e. class swap, extra course information) and that it effectively united almost all aspects of student administration. The particularly useful features of PeopleSoft were generally hidden in small lists or drop-down menus. These strengths were contrasted by multiple dead links, restricted applications, irrelevant features and illogical section organization.

All six of our participants successfully completed the usability test and answered usability related questions. Participants were also timed, and we attempted to keep track of the navigational errors made during their tasks. Users were asked to specifically identify features they found effective and ineffective; frustration caused by the lack of functionality of the back button was the most common response. The results show that users were generally unsatisfied with their experience despite being able to perform effectively and efficiently.

We have provided three simple suggestions for improvement which cover the general findings of our research:

1. Swap the content of the left-hand side tab bar and the drop-down menus
2. Remove all links leading to dead ends or inaccessible applications
3. Enable the use of the back button

We feel that the functional objectives of this project were successfully accomplished. Further details can be found inside the report.

Table of Contents

Introduction
  Background
  Purpose
  Bias Concerns
  Outline of Report
Research Methods
  Preliminary Assumptions
  Criteria for Analysis
  Layout Evaluation
  Content Evaluation
  Usability Test
Results & Analysis
  Effectiveness

  Efficiency
  Satisfaction & Preference
  User Feedback
Conclusions
Recommendations
References
Appendix A: Usability Test Provided to Participants
Appendix B: Ethics Document Provided to Participants

List of Figures

Figure 1. Standard tab positioning as found in PeopleSoft
Figure 2. Clearly visible frequented links contrast hidden frequented links
Figure 3. Two different sections titled Personal Information
Figure 4. Responsive measures taken by the Office of the Registrar
Figure 5. PeopleSoft displaying all of its customizable content
Figure 6. Participant responses regarding usability of PeopleSoft
Figure 7. Time required for participants to complete usability test tasks
Figure 8. Participant responses regarding learnability of PeopleSoft
Figure 9. Participant responses regarding their satisfaction with PeopleSoft


PeopleSoft Usability: Final Report

INTRODUCTION

Background

Until February 6, 2007, the University of Calgary used the Student Information System (SIS) as a platform for its student administrative software, Infonet. Although generally functional for students, SIS's unique code and dated framework became more difficult to keep up-to-date and made the software far too expensive to maintain. Subsequently, between 2006 and 2007, the university invested in new administrative software: PeopleSoft.

The introduction of PeopleSoft was met with a great deal of frustration from both student and administrative users. Because Infonet had been used for several years, students had become accustomed to the interface and had accepted it despite its flaws. In general, people can become frustrated when the norm changes and they are forced to learn something new. This change in software may have caused the large amount of criticism PeopleSoft has received, simply due to their unfamiliarity. However, much of the criticism PeopleSoft has received has been directed at its lack of usability, specifically that it is difficult or confusing to use, or a combination of the two.

Usability, as defined by Joseph Dumas and Janice (Ginny) Redish (n.p.), means that people who use the product can do so quickly and easily to accomplish their tasks. Usability.gov (n.d.) defines usability as follows:

Usability measures the quality of a user's experience when interacting with a product or system —whether a website, a software application, mobile technology, or any user-operated device. In general, usability refers to how well users can learn and use a product to achieve their goals and how satisfied they are with that process. Usability may also consider such factors as cost-effectiveness and usefulness.

Purpose

The goal of this research project was to analyze the usability of the PeopleSoft Student Centre. We intended to determine whether criticism from its end-users was truly merited. Our aim was to identify specific attributes of PeopleSoft that were particularly effective and others which were particularly ineffective. We also wished to provide suggestions for improvement for any weaknesses that were found.

Bias Concerns

It was extremely important to us to conduct this research as objectively as possible. Most current students, whether aware of it or not, are affected by a bias against PeopleSoft. The amount of criticism of PeopleSoft in casual conversation around the university has likely influenced most students' view of the software. Most students no longer give PeopleSoft the benefit of any doubt they may encounter, and this has not resulted exclusively from their personal experience. Jerz (2002) offers a similar example:

…if all your test subjects were students in the same class, and the professor has a bee in his bonnet about why HTML frames suck, then the attitude of all your test subjects on that particular topic will probably not accurately reflect the attitudes of the general public. (n.p.)

We have attempted to shed as much bias as possible while conducting our evaluation of PeopleSoft. We recognize that our own experience can be an advantage, because we have enough experience with the software that we know of certain features that a short survey may not get into enough depth to uncover. It can also be a disadvantage, for the bias reasons mentioned above, and also because we have already become accustomed to the software and may have overlooked important features a new user may notice.

Although we appreciate the experience that current students do have with PeopleSoft, we felt that it was important to conduct our research on individuals similar to students, but who were not, without having been exposed to an environment of prejudice toward PeopleSoft. We have chosen high school students as our population for this research. High school students are ideal candidates because they are realistic users who share much in common with current students. They generally possess the same levels of computer knowledge as university students, and will likely be using PeopleSoft or similar software sometime in the near future. Ideally, a more thorough analysis could have been conducted with both current students and non-students; however, due to time constraints our usability test population consisted exclusively of non-students. The preliminary evaluation of PeopleSoft was still carried out by us, allowing some input from current students. We felt further justified in this decision because we knew of other PeopleSoft research reports being conducted for this same course which were focusing entirely on current students. Our alternative approach may present novel findings.

Outline of Report

We begin our report by discussing the basic assumptions made about PeopleSoft and its users. We then present the methodology used in defining criteria for analysis, followed by an initial evaluation of the layout and content of the software. We then describe in detail the usability test which we conducted, followed by the results and analysis of the findings. Finally, we draw conclusions regarding the usability of PeopleSoft and make recommendations for improvement.

RESEARCH METHODS

Preliminary Assumptions

We began our research into the usability of PeopleSoft by defining the purpose and intended users of the software. This was accomplished by answering the following fundamental questions:

• Why is PeopleSoft provided? PeopleSoft is provided so that university administrative tasks can be carried out electronically, resulting in less face-to-face time with university administrators. The benefits of electronic administration are that users can access it from any computer with an internet connection, and are able to complete tasks by themselves. The changes made through this method are virtually instantaneous. Although PeopleSoft is technically software, users access it in the form of a webpage; it will be referred to as both throughout the report.

• Who is PeopleSoft meant for? PeopleSoft is meant for almost everyone at the university, whether student or employee. For the purposes of our study, we will be focusing on the student end-users of PeopleSoft.

• Why would someone consult PeopleSoft? Users would log onto PeopleSoft for a variety of reasons, such as to access payroll for employees, or to check classes for students. The more common reasons a student would use PeopleSoft include: to add/change/drop classes, to pay fees, to check their class schedule, or to check their grades.

• What relevant background will users have? University students have a variety of backgrounds. For the purpose of this study, relevant background would include familiarity with web browsers and administrative software. Most users are accustomed to electronic navigation, but not all. However, students are expected to navigate Blackboard if enrolled in University courses; therefore we have assumed that users will generally have electronic capabilities at a level which allows them to comfortably navigate Blackboard.

After answering these questions and consulting online sources which specialize in usability analysis, we were prepared to define our criteria for analysis upon which we would base our evaluations and usability test.

Criteria for Analysis

Although we did not wish to limit the features of PeopleSoft we would consider in our analysis, we established criteria to help us focus our research on the most relevant aspects. Traditional measures of usability include: time efficiency, ease of learning, error tolerance and user satisfaction (Usability.gov, n.d.).

The time efficiency of software refers to the speed at which users can accomplish their task(s). Ease of learning considers whether or not users require training to effectively use the software. Errors can refer to both user and software errors. User errors, although not necessarily the user's fault, are mistakes that are made because of incorrect user action, and include mistakes like: clicking on a wrong link, inputting incorrect data, or neglecting to fill in required data. These errors could result from unclear instruction from the software. An example of a software error is unexpected behavior (e.g. internal miscalculation). In our usability test, we defined what we would consider an error as well as a method of quantifying them.

User satisfaction is a slightly more controversial criterion. If users are able to accomplish their tasks effectively and efficiently, we would expect them to be satisfied with their experience. However, research has shown that these two items, performance and preference, do not fully correlate (Barnum, Henderson, Hood & Jordan, 2004). A study was conducted which concluded that up to 30% of users' satisfaction did not correspond to the level at which they performed with a software (Usability.gov, n.d.). That means that almost a third of those tested either enjoyed their experience but did not perform well, or disliked their experience but did perform well. We felt, however, that preference should be considered in our analysis; there is no reason why a website should not be enjoyable as well as effective. Therefore, we attempted to evaluate user satisfaction, but avoided inferring conclusions about performance from our findings.

For our usability test, we attempted to quantify measurements of efficiency, error frequency and user satisfaction. We also included two open-ended questions in the usability test so that participants could comment on any area they wished, not specific to our criteria.

Layout Evaluation

Before developing the usability test, we conducted an evaluation of the layout and content of PeopleSoft. Graves and Graves (2007) suggest that "one way to make a document more usable is to provide structuring devices such as tabs" (p. 248). Websites traditionally contain tabs either on the left-hand side or the top of their interface to simplify navigation. These locations are usually where people look for links to major sections of a webpage. In PeopleSoft, however, the links located in these positions lead to sections which an average student user would scarcely use, if at all. These links, seen in Figure 1, include: My Favourites, eMerge, Self Service, Set Up SACR and Reporting Tools.

Figure 1. Standard tab positioning as found in PeopleSoft. Adapted from https://prdrps2.ehs.ucalgary.ca

Under the heading Self Service of the tab bar on the left, a section called Personal Information can be found. This title implies that its contents are rarely used, when in fact it contains several important and frequently used links (i.e. exam schedule, grade point average calculator and tax data). Another section, also titled Personal Information, can be found by simply scrolling down the main Student Centre page. Although these two sections carry the same name, their contents are not identical, as can be seen in Figure 3.

The sections of the website that a student would frequent (i.e. schedule, add/drop classes, exam schedule, credits and grades) are quite clearly marked under the Academics section of Student Centre (orange box in Figure 2). This is conveniently located because it is the page that users see when first arriving in PeopleSoft. Below these links is a drop-down menu titled "other academic…" (green box in Figure 2).

Figure 2. Clearly visible frequented links (orange) contrast hidden frequented links (green). Adapted from https://prdrps2.ehs.ucalgary.ca

Figure 3. Two different sections titled Personal Information. Adapted from https://prdrps2.ehs.ucalgary.ca

By clicking the Home link at the top right-hand corner, users can access a section called Personalize Layout. In this section users are able to switch between the two-column and three-column display options. This is the only feature of the Personalize Layout section.

After 15 minutes of inactivity, PeopleSoft automatically signs users out, leading to a page explaining the situation and providing a single link: "Sign in to PeopleSoft". As of December 1, 2007, this was a dead link and simply refreshed the page. If users wish to sign back into PeopleSoft, they must do so through the My UofC portal.

Content Evaluation

Compared to its predecessor Infonet, PeopleSoft boasts many more features relevant to students, as well as many irrelevant features. PeopleSoft brings together many aspects of student life and packages them all into one website. These aspects include: course registration, class schedules, payroll information, degree requirements, grade information, awards information, locker registration, contact information and tax information.

One feature of PeopleSoft that was particularly welcomed by the student community was the ability to swap classes. Previously, students were required to drop one class to make room to add another. This sometimes resulted in classes filling up during the few moments between dropping and adding a course. Now students are able to swap one class out for another all in one step. An improved course search tool is also an improvement over the late Infonet. Many more course details are also now documented in the add/drop/swap tool, removing the need for students to continuously refer to the university calendar and master timetable.

PeopleSoft also contains technical support features in the form of a Help button which links directly to the University of Calgary Registrar's website. Supporting material has been put in place by the university to help users with their PeopleSoft experience. This material includes tips, tutorials, top issues and contact information if you are still having trouble. These support features are not features of PeopleSoft but rather responsive measures put in place by the university in anticipation of technical difficulties (e.g. Figure 4).

Figure 4. Responsive measures taken by the Office of the Registrar. Adapted from http://www.ucalgary.ca/registrar/students

It is also possible for users to customize the content of their PeopleSoft page by clicking on the Home link at the top of a webpage, as can be seen in Figure 5. It allows users to add a list of the reports they have submitted, a list of major sections of PeopleSoft, or both. The list of major sections is the exact same as the tab bar on the left of the screen, except with icons.

Figure 5. PeopleSoft displaying all of its customizable content. Adapted from https://prdrps2.ehs.ucalgary.ca

These new items only appear on the Home page, which is not the same as the page that a user starts on. PeopleSoft also contains several extra content links which students either would not likely use or do not have access to. The Set Up SACR link leads to a dead end with an empty folder which may be of use for technical personnel. The Reporting Tools link similarly leads to a dead end containing only an empty folder. Finally, the eMerge link leads to a migration management application which students are not authorized to access.

Usability Test

We conducted a usability test to help us evaluate PeopleSoft. Our aim was to set up a test which collected both quantitative and qualitative data. We wished to focus the test on our criteria for analysis while including an open-ended component allowing any criteria we may have overlooked to be addressed. Graves and Graves (2007) report that "research by web usability experts Jakob Nielsen and Tom Landauer has shown that five users will usually uncover up to 85 percent of the problems with a website" (p. 250). Jerz (2002) supports this figure by stating that five users should be able to identify 80 percent of significant errors. Subsequently, our goal was to conduct our usability test on five high school student participants. In fact, six participants were able to participate in our test and thus improved our sample size. Out of respect for their time, we only asked our participants to explore a few sections of the website, so we do not expect our results to identify quite as many errors as our sources suggest.
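As a brief aside, the 80 to 85 percent figures quoted above come from a standard problem-discovery model rather than from anything we measured ourselves. Assuming the average per-user discovery rate of roughly 31 percent commonly attributed to Nielsen and Landauer's data (a value our own sources do not state explicitly), the expected proportion of problems uncovered by n independent users works out as:

    P(n) = 1 - (1 - 0.31)^n, so P(5) = 1 - (0.69)^5 ≈ 0.84, or roughly 85 percent.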

In the usability test (Appendix A), participants were asked to assume the role of a student and to complete three tasks while logged onto PeopleSoft. These tasks, representing commonly used features of PeopleSoft, included: checking grades, checking class schedule and enrolling in a course. We wished to identify the strong points of PeopleSoft as well as the weak. Participants were then asked to rate: their satisfaction of the experience, the usability of the website, and their confidence with the website on a scale from one to five. Finally they were asked two open-ended questions giving them the opportunity to identify any aspects of PeopleSoft which they found particularly effective and particularly ineffective. It was important for us to write the questions in an objective manner so that participants did not feel they had to conform to any biased hidden agenda of the test. This was achieved by carefully editing the task descriptions on the test page to minimize any potentially unclear components.

The participants completed the usability test in a setting similar to where they might use PeopleSoft in real life, as research has suggested should be the case (Graves & Graves, 2007). This setting was at a computer in the break room of their workplace. We also wanted to have minimal interaction with the participants once the test had begun. It was emphasized to the participants that the evaluation being conducted was of the website and not their abilities. The word "test" was not used during the process so that participants were not subject to any unnecessary stress.

Informed consent was obtained from all participants before each test. Participants were informed that they had the right to refuse or quit at any point, and participants were notified of the anonymity of the evaluations (see Appendix B). Following each test, a debriefing took place in which participants had an opportunity to voice any questions or further comments they had relating to any aspect of our research. Participants were thanked for their time and cooperation and were informed that they could request to be informed of the results of the study.

We also wished to measure the efficiency and error frequency of the tests. Without making it obvious to the participants, the test supervisor timed the three tasks of each participant and counted the number of errors they encountered. For this test, we only counted navigational errors, defined by a participant attempting to press the Back button or returning to a previous page. The supervisor also noted any comments or questions participants had.
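The timing and error counts described above were recorded by hand by the test supervisor. Purely as an illustration of how such observation records could be tabulated, the sketch below uses placeholder numbers and a record format of our own invention; it is not the study's actual data or tooling.

    # Sketch only: tabulating hypothetical usability-test observations.
    # The record format and the numbers are placeholders, not the study's data.
    from statistics import mean

    observations = [
        {"participant": "P1", "task": "check grades",     "seconds": 45,  "nav_errors": 0},
        {"participant": "P1", "task": "weekly schedule",  "seconds": 70,  "nav_errors": 1},
        {"participant": "P1", "task": "enroll in course", "seconds": 300, "nav_errors": 3},
        {"participant": "P2", "task": "check grades",     "seconds": 60,  "nav_errors": 1},
        {"participant": "P2", "task": "weekly schedule",  "seconds": 90,  "nav_errors": 0},
        {"participant": "P2", "task": "enroll in course", "seconds": 240, "nav_errors": 2},
        # ...one record per participant per task
    ]

    def summarize(records):
        # Group observations by task, then report the mean completion time
        # and the total number of navigational errors for each task.
        by_task = {}
        for rec in records:
            by_task.setdefault(rec["task"], []).append(rec)
        for task, rows in by_task.items():
            avg_time = mean(row["seconds"] for row in rows)
            total_errors = sum(row["nav_errors"] for row in rows)
            print(f"{task}: mean time {avg_time:.0f}s, navigational errors {total_errors}")

    summarize(observations)

Running the sketch prints one line per task with the mean completion time and the total number of navigational errors observed, which is the kind of summary reported in the Results & Analysis section below.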

RESULTS & ANALYSIS

Effectiveness

All of the participants of the usability test were able to successfully complete all three tasks we assigned them. Immediately after completing the tasks, users were asked to rate the usability of PeopleSoft on a scale from one to five. The results of this question can be seen in Figure 6.

Figure 6. Participant responses regarding usability of PeopleSoft

Participants were then asked if there were any aspects of the website that they found particularly helpful or effective in terms of ease of use and, if so, what they were. Only one out of the six participants responded to this question, stating that "the windows, …options given for choices… [and] choices for next steps were [all] easy to use."

When asked if there were any aspects of the website that they found particularly confusing or ineffective in terms of ease of use, all six participants responded. Three of the respondents specifically mentioned that the fact that the back button did not work was particularly frustrating. Three respondents also found that there was too much information on each page to easily find relevant information. Four of the participants also stated that it was unclear how to proceed to the next step, particularly when enrolling in courses. One respondent provided the following example: "There were certain parts that were confusing that you wouldn't think to do like press the 'Change' button when you switch from fall to winter session."

Five out of the six participants experienced at least one navigational error (i.e. attempting to press the Back button). Before losing track, four of these five participants had made at least three navigational errors; we were unable to obtain exact values because the supervisor was unable to keep up with the speed at which the errors were made. It was noted that the participants generally would attempt to hit the Back button after completing one task and beginning another. In PeopleSoft, pressing the Back button causes the page to expire. Participants would not know this until they experienced it at least once, but it was noted that most participants would continue hitting the Back button throughout the other tasks out of habit. Some users proceeded to attempt to navigate the tab bar on the left side of the screen to return to the beginning, while others discovered that pressing Back a second time took them to the previous page.

In general, it was observed that the participants who struggled more with the tasks attempted to complete them hastily and without fully reading the information on each page. It was noted by the supervisor that the participants who performed the tasks better and had fewer negative comments about PeopleSoft were also the ones who took more time reading all of the options available on each page.

Efficiency

During the usability test, participants were asked to complete three different tasks (see Appendix A for exact wording). Task one asked users to check their grades for the winter 2007 semester. Task two asked users to find a weekly calendar view of their fall 2007 schedule. Task three asked users to enroll in Coms 363 lecture 7 and tutorial 7 for the winter 2008 semester. A supervisor took note of the amount of time it took each user to complete each task. These times can be viewed in Figure 7.

Figure 7. Time required for participants to complete usability test tasks

The data shows that it took participants far longer to enroll in the course than to complete the other two tasks. This was expected, because enrolling in a course comprises far more steps and therefore more possible navigational errors than the other two. Our observations revealed that because the third task was more complex than the others, users were forced to slow down and read all the available options before proceeding. Although we do not have anything to compare these time values against, we did ask participants how quickly they felt they learned the software. These results are presented in Figure 8.

Figure 8. Participant responses regarding learnability of PeopleSoft

From the data presented in Figure 8, we see that users feel fairly confident navigating the site after only exploring three features. One participant explicitly stated that it "takes time to work out the site, but once it's known it's easy to use." The participants of the study had no previous experience or training with PeopleSoft, and yet they felt confident after only ten to twenty minutes worth of experience.

Satisfaction & Preference

We also asked the participants to rate their satisfaction of their experience with the site. The results of this question can be seen in Figure 9.

Figure 9. Participant responses regarding their satisfaction with PeopleSoft

In general, our results show that participants were unsatisfied with their experience. Our participants commented that it seemed overly complex to complete tasks which they expected to be fairly simple. Several were frustrated that it took so long to complete basic tasks.

User Feedback

After the usability test, we allowed our participants to voice any general comments they had about PeopleSoft, or to address any items that were not covered by the questions. More than one participant stated that the features which would be regularly used by students were relatively hard to find; they were surprised that the common tasks were not more clearly marked. Participants also noticed that much of the important content of PeopleSoft is found in drop-down menus and that these menus are easy to miss due to the amount of other information on the pages. One participant felt that the font used by the website was unnecessarily hard to read. Another participant said that it took too long for the pages to load, although this could have been caused by a slow internet connection during the test. Finally, two participants simply stated that the whole process of selecting a course for enrolment was overly complicated.

While observing the participants during the tests, we noted that users became particularly frustrated when they attempted to hit the Back button and were presented with an error screen. Our participants expected the Back button to function in a way they were used to, and became frustrated when it did not. This may have been caused by the fact that this was the only aspect of the website that the users were familiar with; the rest of the website was completely new, so they had no notion of whether a feature was working correctly or not.

CONCLUSIONS

After evaluating the layout of PeopleSoft, we have concluded that its navigation methods are not suitable for its purpose. We are referring specifically to the tabs on the left-hand side of the page, the placement of links on each page, and the breakdown of sections. It seems that PeopleSoft includes several features which would be relevant to other administrative contexts, such as for a business, but not for university student administration. Three of the five major sections of the website are not even accessible by students. Finally, the breakdown of sections within the Student Centre does not seem to follow traditional logic.

We also evaluated the content of PeopleSoft and found that it contained many relevant and necessary features for students. Compared to its predecessor InfoNet, PeopleSoft has several improved features, such as the ability to swap classes and greatly enhanced course information. PeopleSoft brings together most of the administrative tasks a student would be concerned with and packages them all together in one place. All of the information a student would ever require is somewhere within the Student Centre section. Within the Student Center, excellent features are available to users, but they are scattered throughout the page and often hidden in drop-down menus, rendering the tab bar on the left effectively obsolete. Plenty of technical support is available through the Office of the Registrar for any users who may encounter difficulties.

From the usability test we conducted we were able to draw additional conclusions. In general, users were not satisfied with their experience using PeopleSoft. Our observations and feedback from users agreed that this lack of satisfaction was generally due to the tedious navigation of the website. In particular, the lack of functionality of the Back key, as described above, was a large cause of this frustration. Five out of six of our participants attempted to use the Back button during the test, and three of them specifically identified it as particularly frustrating. It is clear that the non-functionality of the Back key is a very ineffective feature of PeopleSoft's usability. Users also disliked this software because links that they expected to easily find were hidden or in illogical locations. Our test also found that too much information was presented on each page, making it difficult for users to identify what to do so that they may proceed. The overall layout and organization of PeopleSoft's content played a large role in the dissatisfaction of our surveyed users.

We also found that users became confident with the software quicker than we anticipated. After attempting only the three tasks of our usability test, our participants felt quite confident navigating the remainder of the site. This suggests that although the software may seem illogical or confusing to begin with, users are generally able to learn the site quickly and do not require training.

It can also be seen from our results that the satisfaction of our participants was generally lower than the level at which they were able to effectively and efficiently complete their tasks and learn the software. This suggests that although PeopleSoft may not be preferable to use, it does address the needs of students and deals with them effectively and efficiently. If the site was in fact completely unusable, our data should have been clustered toward the lower end of the usability rating question. Instead what we see is a relatively even distribution across the possible answers. One general trend that can be seen in the data is that our results cover a wide range of possible responses. In fact, our data is not extremely concentrated for any of our questions, suggesting that the participants' responses were a function of themselves as well as of the website. This suggests that the criteria that we were testing were strongly influenced by the individual participants. For example, the subject who was able to complete all three tasks in 5 minutes gave the site higher than average ratings overall. This participant's employment requires him/her to use administrative software on a daily basis, and this experience was reflected in our results. This is further emphasized by the fact that some suggestions were made by our participants that did not reflect the feelings of the group as a whole. An example of this type of suggestion would be that one of our participants felt the font was hard to read. Although the font may in fact be hard to read, this suggestion was not echoed by any other participants, and therefore was not considered a real problem based on our population sample.

The conclusions presented in this report certainly do not cover all aspects of the usability of PeopleSoft. "Several research studies have shown that about half of the usability issues identified by evaluators are not truly problems and that evaluators miss at least 25% of the real usability issues" (Usability.gov, n.d., n.p.). We are suggesting that our results not only reflect PeopleSoft, but our participants as well. From our findings, it seems that the antagonistic position held by many students towards PeopleSoft is not entirely to blame on its usability. Although far from perfect, PeopleSoft performed far better in our usability test with non-students than we expected.

All three tasks were also effectively completed by all of our participants, none of whom had used PeopleSoft previously. After less than twenty minutes of familiarization with the software, five out of our six participants were at least half way to fully confident in their abilities to use the software. From this we can conclude that the frustration of current students upon implementation of PeopleSoft was not exclusively due to its usability. Although we are unable to quantify the level of frustration current students have had with PeopleSoft, their frustration seemed significantly higher than that experienced by our participants. Their frustration was also likely fuelled by other factors, most likely the difference between the new and old administrative systems, as was discussed at the beginning of our report.

RECOMMENDATIONS

We would like to conclude this report by suggesting a small number of specific recommendations which we feel would improve the usability of PeopleSoft.

• Swap the content of the left-hand side tab bar and the drop-down menus
Currently the left-hand side tab bar, the logical place to look for site navigation, is populated by links which lead to empty folders and inaccessible tools. The content students would be interested in is hidden in small lists and drop-down menus scattered throughout the Student Center page. The current tab bar content is inappropriate for student users and would be far more effective and logical if it contained links to the sections and subsections students are interested in. This would help to increase the organization of the site and make its layout more clear to users.

• Remove all links leading to dead ends or inaccessible applications
The links that lead to empty folders and applications to which students are denied access are a waste of students' time. These links include the page to which users are redirected if their session times out. All such links should be removed for obvious reasons, which would also reduce the areas which users might mis-navigate to.

• Enable the use of the Back button
The fact that the Back button does not work is one of the major reasons for user frustration. It is necessary to disable the Back button on pages where important changes have taken place (e.g. payment of fees, addition of a course), but there is no reason it should not work for simple navigation. Enabling the Back button in even part of the website would greatly increase the satisfaction and preference of PeopleSoft users; a brief sketch of the page-caching mechanism that typically governs this behavior follows at the end of this section.

Although we are sure there are many issues that have not been addressed by this project, we feel we have succeeded in providing recommendations for increased usability of PeopleSoft.
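As an illustration of the mechanism referred to in the Back button recommendation: whether a browser's Back button can redisplay a page is typically governed by the HTTP cache-control headers the server sends, not by the button itself. The toy sketch below uses Flask as a stand-in web framework; the routes, header values and framework choice are our own assumptions and are not how PeopleSoft is actually implemented.

    # Sketch only: a toy Flask app (not PeopleSoft's actual stack) showing how
    # HTTP cache headers typically decide whether the Back button can redisplay
    # a page or instead forces a "page expired" style re-request.
    from flask import Flask, make_response

    app = Flask(__name__)

    @app.route("/schedule")
    def schedule():
        # Read-only page: allowing private caching lets the Back button
        # redisplay it without contacting the server again.
        resp = make_response("Weekly schedule (read-only view)")
        resp.headers["Cache-Control"] = "private, max-age=300"
        return resp

    @app.route("/enroll", methods=["POST"])
    def enroll():
        # State-changing step: forbid caching so Back cannot silently
        # re-show (or encourage re-submitting) a completed enrolment.
        resp = make_response("Enrolment confirmed")
        resp.headers["Cache-Control"] = "no-store, no-cache, must-revalidate"
        return resp

    if __name__ == "__main__":
        app.run()

Under a policy like this, pressing Back on a read-only page such as a schedule simply redisplays the cached view, while state-changing steps such as enrolment still force a fresh request, which is the split our recommendation describes.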

References

Barnum, C., Henderson, E., Hood, A., & Jordan, R. (2004). Index versus full-text search: A usability study of user preference and performance. Technical Communication, 51(2), 185-206.

Graves, H., & Graves, R. (2007). A strategic guide to technical communication. Peterborough, ON: Broadview Press.

Jerz, D. (2002). Usability testing: What is it? Retrieved December 3, 2007, from http://jerz.setonhill.edu/design/usability/intro.htm

University of Calgary. (n.d.). PeopleSoft. Retrieved December 3, 2007, from https://prdrps2.ehs.ucalgary.ca

University of Calgary, Enrolment Services. (n.d.). PeopleSoft support. Retrieved December 3, 2007, from http://www.ucalgary.ca/registrar/students

Usability.gov. (n.d.). Retrieved December 3, 2007, from http://www.usability.gov

APPENDIX A: USABILITY TEST PROVIDED TO PARTICIPANTS

Thanks so much for taking part in our study of the PeopleSoft Student Center. We very much appreciate your volunteering and would like to respect your time, so you are under no obligation to finish the evaluation if at any point you feel you can't continue. We would like to make it clear to you that it is the website that we are evaluating and not your abilities.

You will begin at the front page of PeopleSoft Student Center, logged in as a University of Calgary student. Although this person is not you, please pretend that the account is yours. When you are ready, please complete the following tasks:

• Check your grades for the Winter 2007 semester.
• You wish to see what your timetable looks like. Pull up a weekly calendar view of your Fall 2007 class schedule.
• Add a course to your Winter 2008 schedule. Add the course COMS 363 with lecture section 7 (L07) and tutorial section 7 (T07). The course you wish to add, COMS 363, has a lecture and a tutorial component, and because there are several of both available, you must specify which you want.

We would now like to ask you a few quick questions about your experience completing the tasks:

1. How satisfied do you feel about your experience with PeopleSoft Student Center?
   1 2 3 4 5 (1 = Not at all satisfied, 5 = Completely satisfied)

2. How would you describe the usability of this website?
   1 2 3 4 5 (1 = Not at all user-friendly, 5 = Very user-friendly)

3. Having attempted the three tasks, how confident would you feel about navigating the rest of the website?
   1 2 3 4 5 (1 = Not at all confident, 5 = Very confident)

4. Were there any aspects of the website that you found particularly helpful or effective in terms of ease of use? If so, what were they?

5. Were there any aspects of the website that you found particularly confusing or ineffective in terms of ease of use? If so, what were they?

APPENDIX B: ETHICS DOCUMENT PROVIDED TO PARTICIPANTS

October 9, 2007

PeopleSoft Student Centre Usability Study
Principal Investigator

Dear Participant:

Thank you for agreeing to participate in my research project on the usability of the University of Calgary's PeopleSoft Student Centre. This consent form, a copy of which has been given to you, is only part of the process of informed consent. It should give you an idea of what the project is about and what your participation will involve. Please take the time to read this carefully and to understand any accompanying information. If you would like more detail about something mentioned here, or information not included here, you should feel free to ask.

Informed consent: Before we can begin the PeopleSoft usability study, I need your informed consent. You can provide this by reading and signing this form.

Purpose of study: The goal of the project is to conduct a small, unbiased study investigating the usability of the PeopleSoft Student Center and to identify a small number of specific areas that could be improved upon (if any are found).

Your participation in the study: Your participation will involve you logging onto PeopleSoft Student Center and completing a set number of tasks while verbalizing your mental procedure. The study will be based on the efficiency and ease with which you are able to complete these tasks. You may be asked for your comments and overall feedback on the system as we conduct our study. Your participation is entirely voluntary and you can withdraw at any time, including after the interview begins and any time up to the submission of my (our) assignment. If you withdraw, any material collected during my contact with you will be destroyed and will not be used in any way in the analysis and writing of the research results.

Publication of results: Your interview, and any other material we collect, will be used as the basis for our research project on the usability of PeopleSoft Student Center, which I am completing as part of my course in Professional and Technical Communication (COMS 363) at the University of Calgary. The information given will be accessible to the instructor, the marker, and potentially to other students in the Coms 363 course. The information we obtain will be used only for the purpose of completing the Coms 363 final report.

Risk to you: If you allow me to identify you and your organization by name in my writing, then that information will be shared with the audience specified. Apart from that, there are no risks associated with this project.

Confidentiality and anonymity: Unless you agree to allow me to use your name or your organization's name in my assignment, any information collected will remain anonymous and confidential. Names and identities will be disguised in my submitted assignment, and care will be taken to ensure that any descriptions of situations or direct quotations cannot be connected to you. Of course, you have the right to waive anonymity.

Storage of materials: All materials, including tapes, transcripts of tapes, and any notes I (or my group) might make, will remain confidential. Only I (or my group) plus the course instructor will be able to access that information. Any research notes and consent forms will be stored for two years under lock and key.

Your continued participation should be as informed as your initial consent, so you should feel free to ask for clarification or new information throughout your participation. You are free to request more information about the study or to withdraw from it at any time, including during the data collection phase, and you are also free to refuse to answer any specific questions during the interview. If you have further questions concerning matters related to this research, please contact me at (968-4274) or hotsauce1523@hotmail.com, or the instructor of COMS 363, Jo-Anne Andre, at andre@ucalgary.ca or by phone at 403-220-7429. If you have any issues or concerns about this project that are not related to the specifics of the research, you may also contact Bonnie Scherrer at the Research Services Office at 220-3782.

Your signature on this form indicates that you have understood to your satisfaction the information regarding participation in the research project and agree to participate as a subject. In no way does this waive your legal rights nor release the investigator or the university from their legal and professional responsibilities.

1. Do you agree to participate in the study according to the conditions outlined above? YES NO
2. May I identify you by name in my assignment? YES NO
3. May I identify your organization's name? YES NO

Participant's Signature __________________________ Date _________________

Investigator's Signature _________________________ Date _________________