PROGRAM EDUCATIONAL OBJECTIVES DEFINITION AND ASSESSMENT FOR QUALITY AND ACCREDITATION
34 http://www.i-jep.org
ments. Typical educational objectives cover the following:

• Achievement in terms of technical skills required in the profession for which the program prepares students (software engineers, computer programmers, database architects, etc.).
• Achievement in terms of professional, ethical, and communication skills required by the profession for which the program prepares students (team work, ethical behavior, effective communication, etc.).
• Achievement in terms of management and leadership skills (project managers, directors, CTOs, CEOs, etc.).
• Achievement in terms of life-long learning and continuing education (certifications, attendance of conferences and workshops, etc.).
• Achievement in terms of pursuing advanced and graduate studies (graduate studies, research careers, etc.).
• Other aspects can be considered when defining educational objectives, such as the ability to engage in entrepreneurship activities (for engineering programs in particular).

Defining the program educational objectives should be done with the full involvement of all key stakeholders, including faculty members, students, advisory board members, alumni, and employers of graduates. Usually the faculty members involved in the program propose a first draft of educational objectives, and the other stakeholders then intervene in the process and bring their additions to the first draft. Input from the different stakeholders can be collected through various means, including meetings (such as the advisory board meeting), surveys (such as the employer survey), etc.

A list of program educational objectives defined for a software engineering program, following the guidelines mentioned above, is given in Table I. Note that program educational objectives must be consistent, short statements. They should be published and accessible to all key stakeholders, including students, faculty members, members of the advisory board, parents, etc. They should be published at least on the web site of the program and in the program handbook/catalog. In addition, they can be posted on various new media such as Facebook, Twitter, etc. The idea is that all stakeholders and people interested in the program should be aware of its program educational objectives.

TABLE I.
EXAMPLE OF PEOS DEFINED FOR A SOFTWARE ENGINEERING PROGRAM

PEO 1: Possess essential professional software engineering skills that make them confident to develop high-quality software solutions in various application domains under various realistic constraints.
PEO 2: Engage and succeed in their professional careers through team work, ethical behavior, proactive involvement, and effective communication.
PEO 3: Demonstrate an understanding of the importance of life-long learning through professional development, practical training, and specialized certifications.
PEO 4: Assume progressively managerial, leading, and influential roles in their organizations and communities.
PEO 5: Pursue postgraduate studies and succeed in academic and research careers.

IV. LINKING PROGRAM EDUCATIONAL OBJECTIVES TO INSTITUTIONAL MISSION

Program educational objectives should be in total symbiosis with the mission of the institution in which the program is offered. As depicted in Fig. 1, the institutional mission can be at different levels, including the university mission, the college mission, the department mission, and the program mission. The program mission serves the department mission; the department mission serves the college mission; and the college mission serves the university mission. To show clearly these links between the different missions, a breakdown of each mission into its key elements is necessary (Table II).

An example of missions at different levels, along with their mapping to the program educational objectives defined for a software engineering program, is given below.

University mission: To provide students with a quality education, conduct valuable research, serve the national and international societies and contribute to Saudi Arabia's knowledge society through learning, creativity, the use of current and developing technologies and effective international partnership.

College mission: To advance the frontiers of knowledge and to prepare creative minds in computing disciplines with commitment to serve the community in order to participate in moving towards a knowledge-based economy through developing an environment that stimulates excellence, creativity, and innovation in education and research.

Department mission: To make a significant contribution to the national goal of promoting the knowledge society through high-quality education, innovative research, and services to the community in the field of software engineering.

Program mission: To produce highly qualified software engineers who serve the society's needs and contribute to transforming the society into a knowledge society.

Figure 2. Example of Relationships between missions of the Program, the Department, the College, and the University

The university, college, department, and program missions, along with the program educational objectives, must be public and known to all stakeholders, including students, faculty members, the advisory board, employers, etc. They
can be available in students' handbooks and university catalogs as well as on the web sites of the university, college, and department. The different missions are linked and aligned with each other as follows:

• The mission of the program contributes towards achieving the mission of the department. In fact, the mission of the program focuses on producing high-quality software engineers and transforming the society into a knowledge society, which is in total alignment with the mission of the department, which insists in particular on high-quality teaching and contribution towards the knowledge society.
• The mission of the department contributes towards achieving the mission of the college. In fact, the mission of the department focuses on high-quality teaching and contribution towards the knowledge society, which is in total alignment with the mission of the college, which insists in particular on the preparation of analytic and creative minds as well as transforming the society into an avant-garde society.
• The mission of the college contributes towards achieving the mission of the university. In fact, the mission of the college focuses on the preparation of analytic and creative minds as well as transforming the society into an avant-garde society, which is in total alignment with the mission of the university, which insists in particular on providing distinctive education and building the knowledge economy.

In addition, in order to show clearly the links between program educational objectives and the institutional mission, a mapping should be provided between each program educational objective and the different elements of the institutional mission at its different levels (university, college, department, and program). Table II shows an example of mapping between program educational objectives and the institutional mission. It shows the contribution of each PEO towards achieving the university, college, department, and program missions. All missions defined at the university, college, department, and program levels insist on the importance of moving the society towards a society based on the knowledge economy. The PEOs of the program insist on professional skills, team work, ethical behavior, continuous learning, leadership, and graduate studies. These dimensions of the PEOs of the program are obviously consistent with the missions of the program, department, college, and university. They strongly contribute towards moving the society towards a knowledge-based society.

V. LINKING PROGRAM EDUCATIONAL OBJECTIVES TO STUDENT OUTCOMES AND CURRICULUM

While student outcomes represent the knowledge, skills, and capabilities students should possess by the time of graduation, program educational objectives represent the achievements graduates should attain a few years (3 to 5 years and more) after graduation. Student outcomes should necessarily contribute to preparing students to achieve the program educational objectives. In order to show clearly the link between student outcomes and program educational objectives, a mapping should be provided to show which outcome contributes to the achievement of which program educational objective.

Table III provides an example of mapping between program educational objectives and student outcomes done for a software engineering program. Note that student outcomes correspond to criterion 3 of the ABET general accreditation criteria. For more details about these outcomes, refer to [1, 4, 8].

Note that student outcomes are mapped to different courses in the curriculum of a specific program. Consequently, there is a link between the program curriculum and the program educational objectives via student outcomes. In fact, specific courses in a curriculum will cover specific student outcomes, and these student outcomes will participate in preparing students to achieve specific educational objectives. Table V at the end of this paper shows a sample of mapping between student outcomes and courses in a Software Engineering curriculum. For example, from Table V we can see that student outcome (e), which deals with the ability of students to identify and solve software engineering problems, is covered in several courses including software requirements engineering, software process and modeling, software architecture and design, software construction, software project management, and the capstone project. From Table III, we can see that student outcome (e) contributes to preparing students towards the achievement of PEO 1 (Possess essential professional software engineering skills that make them confident to develop high-quality software solutions in various application domains under various realistic constraints) and PEO 5 (Pursue postgraduate studies and succeed in academic and research careers).
TABLE II.
EXAMPLE OF MAPPING BETWEEN PROGRAM EDUCATIONAL OBJECTIVES AND INSTITUTIONAL MISSION

Mission element                                  PEO 1  PEO 2  PEO 3  PEO 4  PEO 5
University Mission
  Provide quality education                        H      H      H      H      H
  Contribute to the knowledge society              H      H      H      H      H
College Mission
  Prepare creative minds                           H      M      M      M      H
  Participate in moving towards a
  knowledge-based economy                          H      H      H      H      H
Department Mission
  Provide high-quality teaching                    H      H      H      M      H
  Promote the knowledge society                    H      H      H      H      H
Program Mission
  Produce highly qualified software engineers      H      H      H      M      M
  Contribute to transform the society into a
  knowledge society                                H      H      H      H      H

(H: High; M: Medium; L: Low)
VI. ASSESSING PROGRAM EDUCATIONAL OBJECTIVES

A. Assessment Instruments and Processes

Assessing program educational objectives is very important and aims to discover whether the defined educational objectives are being attained by graduates after they enter the employment market or not. If not, corrective and improvement actions must be taken. Fig. 2 depicts the instruments (methods) to be used in the assessment and evaluation processes related to PEOs and shows how they will be used in closing the assessment loop:

• The instruments intended to be used to collect data for the assessment of educational objectives are: advisory board meetings, the focus group survey, the alumni survey, the employer survey, and face-to-face meetings with alumni at their employment site. Table VI summarizes these assessment instruments along with other important attributes, including frequency of assessment, who collects the data, from whom the data is collected, how the data is maintained, etc.
• The data collected through the PEO instruments are analyzed to clearly identify potential issues. A set of metrics is calculated (depending on the data) in order to serve as indicators of the attainment of targets.
• An improvement plan consisting of a list of improvement actions is then decided based on the analysis findings for the non-attained targets. These improvement actions can impact any element of the program, including the curriculum, PEOs, SOs, facilities, faculty, etc.
• Then another assessment cycle is conducted in order to assess and evaluate whether the changes brought to the program, based on the assessment and evaluation in the previous cycle, have resolved the identified issues or not.

PEO assessment must be evidence-based. That is, each assessment applied to PEOs must be documented and kept as proof. The involvement of faculty members in the assessment process is very important and can be organized through various committees. The main committees that should be involved in the assessment and evaluation of program educational objectives include at least an assessment & continuous improvement committee as well as the department council for final approval.
TABLE III.
EXAMPLE OF MAPPING BETWEEN STUDENT OUTCOMES (SOS) AND PROGRAM EDUCATIONAL OBJECTIVES (PEOS)
TABLE IV.
EXAMPLES OF LEVEL OF SATISFACTION OR TARGET DEFINED FOR EACH EDUCATIONAL OBJECTIVE
Program educational objectives will not all be attained in the same way and at the same level. For example, if we consider PEO #1 in Table I, the target is 70%, while for PEO #4 the target is 30%. Targets or levels of satisfaction for the different PEOs are summarized in Table IV. The target, or level of satisfaction, defines the minimum level of satisfaction required to consider a PEO attained. A target expresses the minimum percentage of respondents (for a specific instrument) who answered "agree" or "strongly agree" (in surveys where the possible answers are "strongly agree", "agree", "neutral", "disagree", and "strongly disagree"). For example, for PEO #1 in Table I, 70% of respondents should answer "agree" or "strongly agree" in order to consider PEO #1 met, while for PEO #4, 30% of respondents should answer "agree" or "strongly agree" in order to consider PEO #4 met.

1) Advisory Board Meetings

The advisory board is one of the main stakeholders or constituents of a program. It plays a key role in the assessment of the PEOs. A meeting should be held with the members of the advisory board on a regular basis (every 2 or 3 years … or even every year) to get their views. Evidence related to advisory board meetings could be the minutes of the meetings held and the improvement actions taken based on these meetings.

Clearly defining the objectives and roles of the members of the advisory board is extremely important in order to increase its effectiveness [9]. To satisfy the ABET accreditation criteria with respect to program educational objectives, the advisory board members, as one of the key stakeholders, must be fully involved in defining and assessing the program educational objectives for a particular degree program. However, the role of the advisory board is not necessarily limited to the definition and assessment of program educational objectives. It can also include influencing the curriculum itself, helping in recruiting graduates, participating in fundraising campaigns, etc.

2) Focus Group Survey

A focus group survey with the main constituents – external members of the advisory board, faculty members, and students – can be conducted to measure the views of each group about the relevance and attainment of the PEOs. Evidence related to the focus group survey could be the answer sheets of all constituents as well as the analysis and improvement actions taken based on the findings from this survey.

3) Alumni Survey

The alumni survey should be conducted with graduates from the program a few years after they graduate (3 to 5 years at least). The alumni survey aims to measure clearly whether those graduates attained the PEOs defined for the program they graduated from or not. Evidence related to the alumni survey could be the answer sheets of all alumni who participated as well as the analysis and improvement actions taken based on the findings from this survey.

Note that some alumni could also be members of the advisory board. The research survey described in [9] did not find any negative aspect resulting from the inclusion of alumni in the advisory board; it even indicates that alumni, when they are members of the advisory board of a program, are the best advocates of the program.

4) Employer Survey

The employer survey should be conducted with employers of alumni from the program. The employer survey aims to measure clearly whether those graduates attained the PEOs defined for the program they graduated from or not. Evidence related to the employer survey could be the answer sheets of all employers who participated as well as the analysis and improvement actions taken based on the findings from this survey.

5) "Face-to-Face" Meetings with Alumni at Their Employment Site

A face-to-face meeting with alumni could be a stronger instrument than simply an alumni survey, as it allows, to some extent, direct contact and discussion with alumni. The face-to-face meeting with alumni aims to measure clearly whether those graduates attained the PEOs defined for the program they graduated from or not. Evidence related to face-to-face meetings could be the minutes of the meeting, a video recording of the meeting, as well as the analysis and improvement actions taken based on the findings from this meeting.

There is an important note about newly established programs. In the ABET Accreditation Policy and Procedures Manual (APPM) [3], it is clearly indicated that, in order to apply for accreditation, a program must have graduates prior to the on-site visit. For a newly established program, once the first batch of students graduates, one would normally need to wait at least 3 years in order to assess correctly the attainment of the program educational objectives. Fortunately, ABET allows new programs to apply for accreditation immediately after having the first batch of graduates, although the PEOs cannot really be assessed by then. However, a preliminary assessment can be done. This preliminary assessment could, for example, measure the confidence or expectations of the different constituents in attaining the PEOs a few years later, based on what they have seen in the program so far: curriculum, faculty, facilities, etc. Such a preliminary assessment can be conducted using a focus group survey only (among the instruments cited above), conducted with students, faculty members, and external members of the advisory board (alumni and employers are not yet there).

B. Metrics

Once data is collected through the different instruments, it must be analyzed in order to decide whether each PEO is attained or not. Two important questions arise here:
• How to combine the results given by different instruments?
• What metrics should be used in order to decide whether a PEO is met or not?

1) Combination of different instruments

Different instruments can be combined or not. When combining all instruments, there is a need to assign a weight to each instrument and calculate a final score for each educational objective as a weighted average of the scores given by each instrument. If we decide not to combine them, they can be seen as a multidimensional view of each PEO. The decision then would be to consider the results of the majority of instruments. In the Software Engineering program we used the second option.

Note that the focus group survey, the alumni survey, and the employer survey give quantitative results, while the advisory board meetings and the face-to-face meetings with alumni at their employment site give qualitative results. So only the instruments holding quantitative results can be combined. The instruments holding qualitative results must be analyzed separately.

Instruments holding quantitative results can be designed as questionnaires and surveys. A good guide on how to design effective surveys is described in [10]. In the Software Engineering program we designed surveys with a 5-point scale: "Strongly Agree" corresponds to 5, "Agree" to 4, "Neutral" to 3, "Disagree" to 2, and "Strongly Disagree" to 1.

2) Decision Metrics

Different metrics can be calculated from the data collected. In the Software Engineering program we used two metrics:
• Average score: the average score on a scale of 5, computed as a weighted average of the percentages of respondents for each scale point (Strongly Agree, Agree, Neutral, Disagree, and Strongly Disagree).
• Percentage of respondents achieving the satisfactory level or above: here we consider only respondents who answered "Strongly Agree" or "Agree".

C. Sample of results and analysis

As explained earlier, when analyzing the data collected through the instruments mentioned, two (2) metrics will be calculated:
• The average score of all respondents.
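The two decision metrics just described can be sketched in a few lines. This is an illustrative sketch only: it assumes survey responses have already been tallied per scale label, and the counts below are hypothetical, not data from the paper.

```python
# Sketch of the two decision metrics on the paper's 5-point survey scale.
SCALE = {"Strongly Agree": 5, "Agree": 4, "Neutral": 3,
         "Disagree": 2, "Strongly Disagree": 1}

def average_score(counts):
    """Average score on a scale of 5, weighted by how many respondents chose each label."""
    total = sum(counts.values())
    return sum(SCALE[label] * n for label, n in counts.items()) / total

def pct_satisfactory(counts):
    """Percentage of respondents at or above the satisfactory level
    (those who answered 'Strongly Agree' or 'Agree')."""
    total = sum(counts.values())
    return 100.0 * (counts.get("Strongly Agree", 0) + counts.get("Agree", 0)) / total

# Hypothetical tallies for one PEO from one quantitative instrument
# (e.g. the alumni survey); not actual survey data.
peo_counts = {"Strongly Agree": 12, "Agree": 18, "Neutral": 6,
              "Disagree": 3, "Strongly Disagree": 1}

print(average_score(peo_counts))     # 3.925
print(pct_satisfactory(peo_counts))  # 75.0
```

If the quantitative instruments were combined instead (the first option above), each instrument's score would enter a weighted average using an agreed weight per instrument.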
• The percentage of respondents above the satisfactory level (those who answered "strongly agree" or "agree").

The average score is used as indicative only, while the percentage above the satisfactory level is used to judge the attainment or not of an educational objective. The level of satisfaction, or target, depends on each educational objective, as mentioned above (see Table IV).

Table VII shows a sample of results from the focus group survey conducted with 3 groups of key constituents: students, faculty members, and external members of the advisory board. From the results in Table VII, we can see that, globally, external members of the advisory board, faculty members, and students have high confidence in the attainment of the educational objectives of the program. PEO #4 and PEO #5 have a lower score compared to PEO #1, PEO #2, and PEO #3, as the former have a lower target (see Table IV) than the latter (not all graduates will assume leadership positions or pursue graduate studies). Based on the level of satisfaction/target defined for each educational objective (see Table IV), we can say that all key program constituents have high confidence in the attainment of the educational objectives. However (from Table VII):

• For PEO 1, the score is 65% for students when using the percentage of students who are above the satisfactory level. This means that some students are not confident enough that they possess the essential Software Engineering skills. A possible explanation is that, at the time of conducting this survey, students were in their 6th semester. This means that, at that time, they had not yet taken all the courses and graduation projects in the curriculum. So the perception of some of them was that they do not possess all essential Software Engineering skills. Communication actions should be undertaken with students to discuss and eventually adjust this perception.
• For PEO 3, the score is 66% for faculty when using the percentage of faculty who are above the satisfactory level. This indicates that some faculty members are not confident enough that all students, after graduation, will be life-long learners. So life-long learning skills should be reinforced in the program.

D. Improvement Actions

Examples of improvement actions are given below. These improvement actions are derived from the assessment of PEOs for a software engineering program using two instruments: the focus group survey and meetings with the advisory board.

Improvement #1: Vision changed
Source: First advisory board meeting + focus group survey with advisory board members
Description/Rationale: Make the program vision "international" instead of "national" or "regional". The program vision was as follows: "The Software Engineering Program aims to become the top education program in Software Engineering at the national and regional levels." It was changed to: "The Software Engineering Program aims to become the top education program in Software Engineering at the national, regional, and international levels."

Improvement #2: PEO #2 updated
Source: First advisory board meeting + focus group survey with advisory board members
Description/Rationale: Include in the objectives the concepts of pro-activity and anticipation. According to the advisory board members, graduates must have a proactive attitude in order to predict and solve problems. PEO #2 was as follows: "Engage and succeed in their professional careers through team work, ethical behavior, and effective communication." It was changed to: "Engage and succeed in their professional careers through team work, ethical behavior, proactive involvement, and effective communication."

Improvement #3: Curriculum update – include "different types of acquisition strategies and contracts in software projects" as a topic in 2 courses
Source: First advisory board meeting + focus group survey with advisory board members
Description/Rationale: Include topics related to different types of acquisitions and contracts (outsourcing, internal development, customizing, etc.) in the "Software Project Management" course as well as in the "Software Engineering Ethics and Professional Practice" course.

Improvement #4: Communication with students to explain PEOs and other related topics
Source: Focus group survey with students
Description/Rationale: Efforts have been doubled to explain to students the educational objectives we expect them to achieve after they graduate. An open communication meeting was organized with students to explain to them the PEOs in particular, and other topics such as student outcomes, as well as the importance and the role of students (through indirect assessment) in improving the program.

Improvement #5: Include students' representatives in the advisory board
Source: Focus group with faculty members
Description/Rationale: Representatives of the students were added as members of the advisory board. They are responsible for bringing the point of view of the students on various subjects related to the program.

Improvement #6: Improve laboratories and practical work
Source: Focus group with students
Description/Rationale: The program will have at least 3 dedicated laboratories in which the various software systems and tools needed in the program must be installed and used in the different Software Engineering courses. These tools include Rational RequisitePro, Rational Software Modeler, Rational Software Architect, Oracle DBMS, Oracle Designer, Oracle SQL Developer, Eclipse/Java, etc.

In order to close the so-called "assessment loop", once these improvement actions are implemented, a new assessment cycle should start in order to see whether the identified issues have been solved or not. Depending on the outcome of this assessment, a new list of improvement actions might be defined. And the assessment process continues!
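The attainment decision applied in the analysis above reduces to comparing the percentage of respondents at or above the satisfactory level against each PEO's target. In this sketch, the 70% and 30% targets and the students' 65% figure for PEO 1 come from the text; the figure used for PEO 4 is hypothetical.

```python
# Judge PEO attainment: the percentage of respondents at or above the
# satisfactory level must reach the PEO's target (see Table IV).
TARGETS = {"PEO 1": 70.0, "PEO 4": 30.0}  # targets quoted in the text, in %

def attained(pct_at_or_above: float, target: float) -> bool:
    """True when the observed percentage meets the PEO's target."""
    return pct_at_or_above >= target

print(attained(65.0, TARGETS["PEO 1"]))  # False: students' 65% misses the 70% target
print(attained(40.0, TARGETS["PEO 4"]))  # True: a hypothetical 40% clears the 30% target
```

In practice this comparison would be repeated per constituent group and per instrument, since, as noted above, the program considered the instruments separately rather than combining them.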
E. Systematicity and Sustainability of the Assessment Process

The assessment process is a time- and effort-consuming activity that may be perceived by faculty members as a burden and overhead. This may make some faculty members resistant and refrain from fully engaging in this process. In order to make the process easier, it is definitely helpful to make it systematic and well organized so that, with time, faculty members and all stakeholders get used to it through a cumulative learning process.

Furthermore, if the assessment process is not well designed and faculty members are not fully engaged in it, it may become unsustainable, in the sense that the assessment process will not continue in the future, or it will continue but with degraded quality.

Definitely, making the process systematic and engaging faculty members and stakeholders in it will help to make the assessment process sustainable. We believe the approach proposed in this paper goes a step in this direction, as it defines a list of guidelines to define and assess PEOs.

Starting from the 2013-2014 accreditation cycle, it is no longer required to assess the attainment of PEOs. Still, it is important to define them correctly, link them correctly with the institutional mission, and involve the various constituents in their definition and revision. We believe that even if ABET has lessened the requirements in terms of PEO assessment, programs can still assess their PEOs in order to know whether their graduates are attaining the PEOs defined for the program or not and, if not attained, eventually adjust the curriculum and other program components in order to solve the resulting issues.

ACKNOWLEDGMENT

We thank our colleagues, members of the ABET-Software Engineering Department Committee and the ABET-College Committee at the College of Computer and Information Sciences (CCIS) at King Saud University, for the fruitful discussions during our regular meetings. We also thank the Research Center at the College of Computer and Information Sciences for its support.
VII. CONCLUSION

We discussed in this paper some guidelines related to the definition and assessment of program educational objectives for ABET accreditation, focusing on computing and engineering programs. PEOs are linked to student outcomes as well as to the curriculum; hence the importance of defining and assessing them correctly. Examples of PEO definition and assessment were given from a software engineering program. The definition of PEOs should cover both the technical and professional aspects of the profession for which the program prepares students and should involve the main constituents or stakeholders of the program, including faculty members, students, the advisory board, alumni, and employers. PEOs must be in total symbiosis and alignment with the mission of the institution where the program is offered. The institutional mission can be at different levels, including the university level, the college level, the department level, and the program level. The PEOs must also be mapped to student outcomes and, for each PEO, a list of student outcomes that helps in preparing students to achieve that PEO must be clearly identified. PEOs must be published and accessible to all stakeholders and the interested public.

Assessment of PEOs is fundamental and includes data collection, data analysis, and an improvement action plan. Data collection can be done through different instruments, including advisory board meetings, the focus group survey, the alumni survey, the employer survey, etc. Data analysis can be done through the calculation of two metrics: the average score and the percentage of respondents achieving the satisfactory level. Based on the results of the analysis, an improvement plan can be decided, including a list of improvement actions to be implemented progressively in order to resolve the eventual issues identified during the analysis. Then, another assessment cycle should start after implementing the list of improvement actions, to assess whether those improvement actions were effective and efficient in resolving the identified issues or not. Depending on the outcome of this assessment, a new list of improvement actions might be defined. And the assessment process continues!

REFERENCES

[1] Accreditation Board of Engineering and Technology (ABET): http://www.abet.org
[2] ABET Computing Accreditation Criteria (CAC) and Engineering Accreditation Criteria (EAC), 2013-2014 Accreditation Cycle: http://www.abet.org/forms.shtml
[3] ABET Accreditation Policy and Procedures Manual (APPM), 2013-2014 Accreditation Cycle: http://www.abet.org/forms.shtml
[4] N. Abbadeni, A. Alghamdi, M. Batouche, and A. Ghoneim, "Combining Multiple Methods and Metrics for the Assessment of ABET Student Outcomes – The Case of a Software Engineering Program", to appear in the International Conference on Engineering Education (ICEE), International Network of Engineering Education and Research (iNEER), August 21-26, Belfast, UK, 2011.
[5] N. Abbadeni and A. Alghamdi, "A Practical Review of ABET Accreditation Criteria – The Case of Software Engineering Programs", to appear in the International Conference on Engineering Education (ICEE), International Network of Engineering Education and Research (iNEER), August 21-26, Belfast, UK, 2011.
[6] N.J. Mourtos, "Program Educational Objectives and Outcomes: How to Design a Sustainable, Systematic Process for Continuous Improvement", Proceedings of the 36th ASEE/IEEE Frontiers in Education Conference, San Diego, CA, USA, October 28-31, 2006.
[7] N.J. Mourtos, "A Systematic Approach for Defining and Assessing Program Educational Objectives and Outcomes", Proceedings of the World Congress on Computer Science, Engineering, and Technology Education, March 7-10, 2006.
[8] M. Besterfield-Sacre, L.J. Shuman, H. Wolfe, C.J. Atman, J. McGourty, R.L. Miller, B.M. Olds, and G. Rogers, "Defining the Outcomes: A Framework for EC 2000", IEEE Transactions on Education, Vol. 43, No. 2, 2000, pp. 100-110. http://dx.doi.org/10.1109/13.848060
[9] S.R. Genheimer and R.L. Shehab, "A Survey of Industry Advisory Board Operation and Effectiveness in Engineering Education", Journal of Engineering Education (JEE), pp. 169-180, April 2009.
[10] D.S. Walonick, "Survival Statistics", StatPac Inc., 1997-2010. A short guide (24 pages) from this book is available at: http://www.statpac.com/surveys
[11] Y. Wang, "The Development of the IEEE/ACM Software Engineering Curricula", IEEE Canadian Review, 51(2), 16-20, 2005.
[12] T. Lethbridge et al., "SE2004: Recommendations for Undergraduate Software Engineering Curricula", IEEE Software, November/December 2006. http://dx.doi.org/10.1109/MS.2006.171
[13] ACM/IEEE-CS/AIS Joint Task Force, "Computing Curricula –
The Overview Report”, 2005.
process continues.
[14] ACM/IEEE-CS Joint Task Force “Guidelines for Software Engi-
Finally, let’s note that ABET has changed its require- neering Curricula”, 2004.
ments regarding the assessment of PEOs. In fact, effective [15] SWEBOK – Software Engineering Body of Knowledge, 2010.
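The two metrics described above can be illustrated with a minimal sketch. The function name, the sample ratings, and the choice of a 5-point Likert scale with 4 as the satisfactory level are all illustrative assumptions, not part of the program's actual instruments:

```python
def assess_peo(ratings, satisfactory=4, scale_max=5):
    """Compute the two PEO assessment metrics from survey ratings.

    ratings: Likert-scale ratings (1..scale_max) given by respondents
    (alumni, employers, advisory board members, ...) for one PEO.
    Returns (average score as a percentage of the scale maximum,
    percentage of respondents at or above the satisfactory level).
    """
    if not ratings:
        raise ValueError("no survey responses collected")
    avg_pct = 100.0 * sum(ratings) / (len(ratings) * scale_max)
    sat_pct = 100.0 * sum(1 for r in ratings if r >= satisfactory) / len(ratings)
    return avg_pct, sat_pct

# Hypothetical alumni-survey ratings for one PEO on a 5-point scale.
avg_pct, sat_pct = assess_peo([5, 4, 3, 4, 5, 2, 4, 5])
print(f"average score: {avg_pct:.1f}%, satisfactory: {sat_pct:.1f}%")
# prints "average score: 80.0%, satisfactory: 75.0%"
```

Both metrics are then compared against the target level of satisfaction defined for the objective to decide whether improvement actions are needed.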
AUTHORS
Dr. Noureddine Abbadeni is an associate professor and vice-chair of the Department of Software Engineering, College of Computer and Information Sciences, King Saud University. He is also an ABET Program Evaluator. (emails: nabbadeni@ksu.edu.sa; noureddine.abbadeni@usherbrooke.ca)
Dr. Ahmed Ghoneim is an assistant professor in the Department of Software Engineering, College of Computer and Information Sciences, King Saud University. (email: ghoneim@ksu.edu.sa)
Dr. Abdullah Alghamdi is a professor and chair of the Department of Software Engineering, College of Computer and Information Sciences, King Saud University. (email: ghamdi@ksu.edu.sa)
This article is an extended and modified version of a paper presented at the EDUCON 2013 conference held at Technische Universität Berlin, Berlin, Germany, March 13-15, 2013.
APPENDIX
See next pages.
TABLE V.
SAMPLE OF MAPPING BETWEEN STUDENT OUTCOMES AND COURSES IN A SOFTWARE ENGINEERING PROGRAM

Courses (table columns): Requirements; Architecture & Design; Construction; Quality Assurance; Maintenance; Process & Modeling; Project Management; Ethics & Practice; Capstone Project.

Student outcomes (table rows), each mapped to courses with a contribution level of H (high), M (medium), or L (low):
(a) Ability to apply knowledge of mathematics, science, and engineering
(b) Ability to design and conduct experiments, as well as to analyze and interpret data
(c) Ability to design a system, component, or process to meet desired needs within realistic constraints such as economic, environmental, social, political, ethical, health and safety, manufacturability, and sustainability
(d) Ability to function on multidisciplinary teams
(e) Ability to identify, formulate, and solve engineering problems
(f) Understanding of professional and ethical responsibility
(g) Ability to communicate effectively
(h) Broad education necessary to understand the impact of engineering solutions in a global, economic, environmental, and societal context
(i) Recognition of the need for, and an ability to engage in, life-long learning
(j) Knowledge of contemporary issues
(The individual H/M/L cell values could not be recovered from the extracted text.)
TABLE VI.
SUMMARY OF INSTRUMENTS USED TO ASSESS THE ATTAINMENT OF PROGRAM EDUCATIONAL OBJECTIVES IN THE SOFTWARE ENGINEERING PROGRAM

| Method | Frequency | Level of Satisfaction | Where Results Are Maintained | How Collected? | From Whom? | Responsibility / Decision |
|---|---|---|---|---|---|---|
| Alumni Survey | Every 3 years | Depends on each Educational Objective (see Table 4) | PEOs Assessment Portfolio | Paper/Electronic | Alumni | Assessment Committee; Department Council |
| Employer Survey | Every 3 years | Depends on each Educational Objective (see Table 4) | PEOs Assessment Portfolio | Paper/Electronic | Employers of program's graduates | Assessment Committee; Department Council |
| Advisory Board Meeting | Every 3 years | Depends on each Educational Objective (see Table 4) | PEOs Assessment Portfolio | Paper/Meeting Minutes | External members of the Advisory Board | Assessment Committee; Department Council |
| Focus Group Survey | Every 3 years | Depends on each Educational Objective (see Table 4) | PEOs Assessment Portfolio | Paper/Electronic | All constituents: students, faculty members, external members of the Advisory Board, employers, alumni | Assessment Committee; Department Council |
| "Face-to-Face" Meetings with Alumni (at their employment site) | Every 3 years | Depends on each Educational Objective (see Table 4) | PEOs Assessment Portfolio | Verbal/Interview; video recordings | Alumni | Assessment Committee; Department Council |
TABLE VII.
EXAMPLE OF ASSESSMENT RESULTS FROM ONE ASSESSMENT INSTRUMENT USED IN THE SOFTWARE ENGINEERING PROGRAM: ASSESSMENT OF PEOS ATTAINMENT BY DIFFERENT CONSTITUENTS USING A FOCUS GROUP SURVEY (ALL NUMBERS REPRESENT PERCENTAGES)

Each row lists a PEO followed by its seven reported percentages in the original column order; the recovered column headers are three "Average %" / "Satisfactory %" pairs (the per-column constituent labels could not be recovered from the extracted text):

PEO 1 - Possess essential professional software engineering skills that make them confident to develop high-quality software solutions in various application domains under various realistic constraints: 70, 73, 65, 86, 100, 92, 100
PEO 2 - Engage and succeed in their professional careers through team work, ethical behavior, proactive involvement, and effective communication: 70, 90, 93, 84, 100, 80, 80
PEO 3 - Demonstrate an understanding of the importance of life-long learning through professional development, practical training, and specialized certifications: 70, 78, 72, 75, 66, 80, 80
PEO 4 - Assume progressively managerial, leading, and influential roles in their organizations and communities: 30, 76, 75, 79, 77, 76, 60
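The decision step that follows such results — flagging the PEOs for which some constituent group falls below the target satisfactory level, so that improvement actions can be planned — can be sketched as follows. The data structure, the function name, and the specific target/percentage values are illustrative assumptions only loosely inspired by the survey results:

```python
# Hypothetical summary of one assessment cycle: for each PEO, the
# target satisfactory level and the satisfactory percentages reported
# by the surveyed constituent groups.
results = {
    "PEO 1": {"target": 70, "satisfactory": [73, 86, 92]},
    "PEO 2": {"target": 70, "satisfactory": [90, 84, 80]},
    "PEO 3": {"target": 70, "satisfactory": [78, 66, 80]},
    "PEO 4": {"target": 30, "satisfactory": [76, 79, 76]},
}

def peos_needing_action(results):
    """Return the PEOs for which at least one constituent group
    reported a satisfactory percentage below the target, i.e. the
    candidates for the next improvement-action plan."""
    return [peo for peo, r in results.items()
            if min(r["satisfactory"]) < r["target"]]

print(peos_needing_action(results))  # prints ['PEO 3']
```

Each flagged PEO would then be examined during the analysis phase to define the concrete improvement actions for the next assessment cycle.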